Scientific Misconduct

This Nature article discusses the results of a survey on scientific misconduct, and an accompanying editorial comments on the findings.

Quote: “The 2,212 researchers we surveyed observed 201 instances of likely misconduct over a three-year period. That’s 3 incidents per 100 researchers per year. A conservative extrapolation from our findings to all DHHS-funded researchers predicts that more than 2,300 observations of potential misconduct are made every year.” Almost 9% of the respondents had witnessed some sort of misconduct, and 37% of those incidents went unreported.
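As a quick sanity check of the quoted arithmetic (a back-of-the-envelope sketch of my own, assuming the three-year window applies equally to all 2,212 respondents), the headline rate works out as follows; the extrapolation to 2,300 observations per year would additionally need the total number of DHHS-funded researchers, which the quote does not give.

```python
# Back-of-the-envelope check of the survey figures quoted above.
# Assumption (mine): all 2,212 respondents reported on the same three-year period.
incidents = 201
respondents = 2212
years = 3

rate_per_100_per_year = incidents / (respondents * years) * 100
print(f"{rate_per_100_per_year:.1f} incidents per 100 researchers per year")  # ~3.0
```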

The authors conclude that, besides protecting whistleblowers better, it is necessary “to create a zero-tolerance culture”. The editor, however, holds the opinion that one also needs to take a look at “the environment that has allowed misconduct to flourish”. In his opinion, it should be possible to find a solution without ruining a scientist’s career, especially in mild cases.

I tend to follow the editor’s reasoning. In my opinion, a zero-tolerance culture already exists to a certain extent, because a scientist convicted of, for example, faking data can forget about his or her career. But the result of such a policy is clear: no one wants to blow the whistle on a colleague, because they don’t want to end somebody else’s career and because they would make themselves very unpopular. The real problem is the way misconduct is treated at the moment: we want to identify the guilty scientist and punish him or her.

While this makes sense for the worst cases of fraud, in milder cases one should ask *why* the misdeed was committed. Take, for example, the way hospitals treat mistakes nowadays: they try to find out how a mistake could happen and how it can be avoided in the future. This is very sensible, because it treats the problem proactively: instead of reacting to an incident by punishing somebody, future incidents are reduced by tackling the things that cause them in the first place.

If there is a lot of pressure in a research group to produce as much data as possible, it is tempting to cut a corner once in a while. Can this not partly be considered the professor’s fault? In the same way, one should address the working atmosphere in the group in question. The problem with the academic system is that there is no informal institution to turn to, besides your boss, if you witness a case of scientific misconduct. So we come back to the old issue: the only person you can contact in case of problems has all the power over you.

At the University of Toronto, a “Graduate Student Oath”, similar to the Hippocratic Oath, has been tried as a means of strengthening scientific ethics (Science). Although this is an interesting idea, I doubt it will change people’s behaviour very much.

6 Comments

  1. I kind of wonder what they would have said if asked about the ethical conduct of their own research.

    Is it ethical for synthetic chemists to publish only their maximum yield instead of their average yield with the associated standard deviation?

    Is it ethical for physical chemists to use a Q-test at 90% confidence to clean up outliers when 95% confidence would be scientifically meaningful? And is using error bars based on the standard deviation of the data points really more useful than propagating all the different types of systematic and human error instead?

    • I think you’re making some interesting points there. One of the biggest problems with unethical behaviour is that there are no clear-cut boundaries. There seems to be a kind of “grey area” where some practices are quite common, although they could be interpreted as misbehaviour.

      The example about the published yield is a very good one. It is so common to publish the best yield instead of an average one that this could already be regarded as “normal”. On the other hand, if I read a methodology paper, I expect the yields to be reproducible. As for the Q-test question, the short sketch below shows how much the choice of confidence level can matter.
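      A rough, hypothetical illustration of that point (made-up replicate data; critical values taken from the commonly tabulated Dixon Q table for n = 5):

      ```python
      # Dixon's Q test on five made-up replicate measurements, showing how the
      # same suspect point is rejected at 90% confidence but kept at 95%.

      def dixon_q(values):
          """Q = gap / range for the most extreme value in the data set."""
          v = sorted(values)
          gap = max(v[1] - v[0], v[-1] - v[-2])   # gap next to the suspect point
          return gap / (v[-1] - v[0])             # divided by the full range

      # Commonly tabulated critical values for n = 5 replicates
      Q_CRIT_90 = 0.642
      Q_CRIT_95 = 0.710

      replicates = [0.20, 0.21, 0.22, 0.23, 0.30]  # invented measurements
      q = dixon_q(replicates)                      # 0.07 / 0.10 = 0.70

      print(f"Q = {q:.2f}")
      print("Discard the 0.30 point at 90% confidence:", q > Q_CRIT_90)  # True
      print("Discard the 0.30 point at 95% confidence:", q > Q_CRIT_95)  # False
      ```

      So the same data set keeps its awkward outlier under one convention and loses it under the other, which is exactly why the choice of cut-off is an ethical question and not just a statistical one.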

  2. But only for that science you don’t like! The good stuff, like global warming, we know they’re all honest on that!

  3. What about the ethical misconduct of professional societies? The American Chemical Society is having some problems, including trying to kill off Open Access to protect the bonuses of its leadership.

    http://www.sourcewatch.org/index.php?title=American_Chemical_Society

    And Rudy Baum, the editor of C&EN, is also a man with severe ethical issues.

    http://www.sourcewatch.org/index.php?title=Rudy_Baum

  4. Pingback: Chemistry Blog » Blog Archive » Please hand me your final product
