
$cience is Important

Last week Kei Koizumi, Assistant Director for Federal Research and Development at the White House Office of Science and Technology Policy, paid a visit to the University of North Carolina at Chapel Hill (my new home as of mid-June). The visit included a tour of several laboratories, where everyone did their very best to convince him that our funding (e.g., my salary) is worthwhile, as well as a presentation by Mr. Koizumi outlining the President's plans, goals, and vision for scientific funding.

On many occasions, President Obama has voiced his strong support of the sciences. In an address to the National Academy of Sciences on April 27, 2009, he emphasized the importance of science by stating, “Science is more essential for our prosperity, our security, our health, our environment, and our quality of life than it has ever been before.” In addition to this kind of powerful rhetoric, we have seen significant action. Perhaps the most obvious example is the scientific funding boost that came through with the American Recovery and Reinvestment Act (ARRA, purple block in the graph below). Beyond that quick boost, there is a continuing effort by the administration to double the 2006 budgets of the Department of Energy, the National Institute of Standards and Technology, and the National Science Foundation by the year 2017. The approved 2011 budget continues this upward funding trend, as outlined in the graph below.

So what does the future hold? Every year a memorandum is sent from the Office of Management and Budget to the major funding agencies requesting their budget proposals for the upcoming year. In this memorandum the administration outlines how it intends to direct its funding. In the 2012 memorandum, the Obama administration emphasized the following six areas of focus:

  • Promoting sustainable economic growth and job creation.
  • Defeating the most dangerous diseases and achieving better health outcomes for all while reducing health care costs.
  • Moving toward a clean energy future to reduce dependence on energy imports while curbing greenhouse gas emissions.
  • Understanding, adapting to, and mitigating the impacts of global climate change.
  • Managing the competing demands on land, fresh water, and the oceans for the production of food, fiber, biofuels, and ecosystem services based on sustainability and biodiversity.
  • Developing the technologies to protect our troops, citizens, and national interests.

The proposal-writing process is no doubt an exercise in balancing wishful thinking and self-control. Along these lines, it is not unusual for an agency to submit several versions of its budget (above, below, and the same as the previous year). However, because of the recent economic troubles, the administration specifically asked all agencies to submit funding requests reduced by 5 percent relative to the previous year. The new budget proposals for the 2012 fiscal year were due on Monday, September 13th. Over the next several months, negotiations between the Office of Management and Budget and the Office of Science and Technology Policy will determine the 2012 funding situation. The 2012 budget will then be announced in the first week of February 2011. Although it is unlikely that every agency's budget will be reduced by 5%, 2012 is likely to be a tough year for many researchers.

tl;dr: All you have to do to guarantee funding in 2012 is submit a solid proposal for a commercially viable, bulletproof, CO2-detecting solar energy converter that cures diseases while still maintaining the ecosystem.

September 19, 2010 · science policy

How many ways can you say something without plagiarizing?

In a recent post by Derek Lowe on a Chinese journal’s finding that 31% of its submitted papers contained plagiarized material, an editor for a scientific journal noted in the comments that he randomly selected a Tetrahedron Letters paper from a developing country and Googled the first sentence. That sentence (“Multicomponent reactions (MCRs) are important for generating high levels of diversity…”) shows up in very similar form in three different papers, all from institutions in Iran and China. In two of those papers, the second sentence is also exactly the same, all 22 words of it.

Also, compare the two first sentences below, the first by Shaterian et al.[1] and the second by Adib et al.[2] Note how much of the wording is identical.

“Multi-component reactions (MCRs) are important for the achievement of high levels of brevity and diversity. They allow more than two simple and flexible building blocks to be combined in practical, time-saving one-pot operations, giving rise to complex structures by simultaneous formation of two or more bonds, according to the domino principle.”

“Multicomponent reactions (MCRs) are important for generating high levels of diversity, as they allow more than two building blocks to be combined in practical, time-saving one-pot operations, giving rise to complex structures by simultaneous formation of two or more bonds.”

While cutting and pasting other people’s introductory sentences is certainly embarrassing and almost certainly plagiarism, there is some difficulty in summarizing a set of facts in a different way each time. It certainly can be done: below are three different labs’ introductory sentences for chemistry towards the total synthesis of the azaspiracids, which are marine natural products. Again, compare how much wording the three actually share.
Nicolaou et al.[3]: “The azaspiracids are a group of notorious marine neurotoxins whose accumulation in mussels causes serious human poisoning known as azaspiracid poisoning syndrome (AZP) upon their consumption.”
Geisler, Nguyen and Forsyth[4]: “The azaspiracids are remarkable natural products that combine a unique, complex structure with an acute and perhaps chronic human health hazard.”
Evans et al.[5]: “(-)-Azaspiracid-1 is a structurally complex marine neurotoxin that is implicated in seafood poisoning.”
You can see that Nicolaou, Forsyth, and Evans all have specific ideas they’re trying to get across: what the compound is, where it comes from, and what it does to people. Yet they’ve all managed to have relatively few words actually overlap.

Is this sort of cutting and pasting ‘real’ plagiarism? It’s just the quotation of a particularly useful string of words, one might assert, not the stealing of ideas. I don’t think that is a very good way of looking at it, but I can’t quite articulate why. In addition, I doubt that any of the authors of the MCR papers were native speakers of English. Clearly, that plays some role in their choice to cut and paste; again, not an excuse, but a contributing factor. I keep trying to come up with extenuating circumstances, but I just can’t.

My adviser in graduate school held out “the same five words in a row” as a general rule of thumb for how to spot and/or avoid plagiarism — what about the same five ideas in a row? What do you think, reader? How do you avoid cutting and pasting? And what should we do (if we should) to stop this sort of thing? Do we need TurnYourJournalSubmissionIn.Com?
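
Out of curiosity, here is a minimal Python sketch of that five-words-in-a-row rule. The function names, the tokenizer, and the n = 5 threshold are my own illustrative choices, not any journal's actual screening tool; the two test strings are simply the MCR openers quoted above.

import re

def word_ngrams(text, n=5):
    # Lowercase the passage and collect every run of n consecutive words.
    words = re.findall(r"[a-z0-9'-]+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_runs(text_a, text_b, n=5):
    # n-word runs (five by default) that appear verbatim in both passages.
    return word_ngrams(text_a, n) & word_ngrams(text_b, n)

shaterian = ("Multi-component reactions (MCRs) are important for the "
             "achievement of high levels of brevity and diversity.")
adib = ("Multicomponent reactions (MCRs) are important for generating "
        "high levels of diversity.")

for run in sorted(shared_runs(shaterian, adib)):
    print(run)  # prints: reactions mcrs are important for

Run on the two MCR openers it flags the shared run “reactions mcrs are important for,” while the three azaspiracid sentences above share no five-word run at all, which matches the eyeball test.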

References:
1. Shaterian, H. R.; Yarahmadi, H.; Ghashang, M. Arkivoc 2007, 16, 298-313.
2. Adib, M.; Mahdavi, M.; Bagherzadeh, S.; Zhu, L.-G.; Rahimi-Nasrabadi, M. Tetrahedron Lett. 2010, 51, 27-29.
3. Nicolaou, K. C.; Frederick, M. O.; Petrovic, G.; Cole, K. P.; Loizidou, E. Z. Angew. Chem. Int. Ed. 2006, 45, 2609-2615.
4. Geisler, L. K.; Nguyen, S.; Forsyth, C. J. Org. Lett. 2004, 6, 4159-4162.
5. Evans, D. A.; Kvaerno, L.; Mulder, J. A.; Raymer, B.; Dunn, T. B.; Beauchemin, A.; Olhava, E. J.; Juhl, M.; Kagechika, K. Angew. Chem. Int. Ed. 2007, 46, 4693-4697.
September 17, 2010 · general chemistry, science policy

The Good, The Bad, and The Ugly

Does anyone else have a difficult time trying to separate “good science” from “bad science”? I’m a very black-and-white person. I love facts and truths and logic, and that drives most of my family crazy. Perhaps that’s why I struggle with identifying bad science; there’s seemingly no clear-cut, concise way of identifying junk that ends up published. To be clear, I’m not talking about retractions for blatant disregard of scientific ethics. I’d classify those situations (e.g., the Xenobe controversy, Sames’ retractions, Bell Labs, etc.) as “ugly.” I’m particularly concerned with cases where, during a presentation, everyone sort of looks at each other, raises an eyebrow, frowns, and collectively mumbles, “Hmm.”

It seems the term “junk science” has been in use in the legal profession since the 1980s. Yet, despite its long use, “junk science” remains an ambiguous concept. In 1998, legal experts Edmond and Mercer attempted to conquer this beast by first identifying “good science,” then considering outlying cases “bad.” Here’s what they considered “the good”:

“‘Good science’ is usually described as dependent upon qualities such as falsifiable hypotheses, replication, verification, peer-review and publication, general acceptance, consensus, communalism, universalism, organized skepticism, neutrality, experiment/empiricism, objectivity, dispassionate observation, naturalistic explanation, and use of the scientific method.”

Does this list really mean that everything else is considered “junk”? I can think of a few brilliant studies that used trial-and-error methods in lieu of the scientific method. Conversely, I’m aware of peer reviewers who simply check the “publish” box without actually reading the manuscript. As is argued on several other blogs, identifying “junk science” is a very gray area.

Perhaps one way to define junk science is to take the Jacobellis v. Ohio approach. In a 1964 US Supreme Court case involving obscenity, Justice Potter Stewart wrote in his opinion, “I shall not today attempt to define the kinds of material I understand to be [pornography]…but I know it when I see it.” The same frame of thought could be applied to junk science, but I am less inclined to accept the Jacobellis approach because it offers nothing tangible.

There must be some empirical qualities that separate the good from the bad. Despite all the skills I’ve picked up over a mere decade of lab experience, I am disheartened to admit that I never perfected the skill of detecting bad science. So, like a responsible, up-and-coming assistant professor of chemistry, I went crawling through the literature to determine what separates the good from the bad. Below is a list of a few things I learned.

In the spirit of Jeff Foxworthy, science might be “junk” if…

Researchers are more concerned with holding press conferences than with publishing results in reputable, peer-reviewed journals. One might assume that “breakthroughs” ought to be showcased in the most prestigious journals after being subjected to a rigorous peer-review process. Fast-tracking all the way to the press-conference phase certainly raises some red flags about credibility. I’ve seen this phenomenon first-hand, and when the science is questionable, the ensuing public announcement can get really ugly (and entertaining, for that matter).

Something about the research seems off-kilter. If you think something doesn’t feel right, you may well be correct. Although going with your gut will only get you so far, analysis guides such as “Tipsheet: For Reporting on Drugs, Devices and Medical Technologies” identify specific points for journalists to consider when examining the veracity of medical therapies. Cook and co-workers suggested that similar checklists might likewise serve the general scientific community when evaluating the credibility of reported work.

Conflicts of interest are not explicitly disclosed. In these cases, scientific integrity might be compromised by financial, political, or other external motivations. In developing this article, I encountered journals, funding agencies, and governing bodies that require authors to declare any potential conflicts of interest when publishing or applying for grants. Although editors and referees try to uphold strict transparency policies, authors can still fail to report external influences and biases. These cases touch essentially every facet of research: cancer studies, pesticide testing (Berkeley Sci. J. 2009, 13, 32-34), and even drug development. The onus is on the audience to look into an author’s sources of funding.

The flow of logic doesn’t make sense. Junk science may have gaping holes in its experimental descriptions or proposed models. Fortunately, overly simplistic and inaccurate scientific explanations usually evoke sharp criticism from scientific experts. Credible “debunkers” often attack the logic of an issue by, for example, discrediting cited authoritative opinions, identifying unstated assumptions, and/or offering overlooked hypotheses.

Colleagues in the field are widely skeptical of the work. Mix it up with your cohorts. A simple, “Hey, what did you think about the most recent (insert name of researcher here) article in JOC?” can shed some light on the context of published or presented findings. “[He] hasn’t published anything reproducible in the past 20 years,” my PI once said. “I sincerely doubt that this latest paper is anything new.”

NSF Reauthorization

The bill that will reauthorize the NSF had a markup by the Subcommittee on Research and Science Education. Since what happens in the policy world can have repercussions in the science world, here is a list of policy changes to the NSF that caught my eye.

The Bill: NSF Reauthorization 2010

  • 5% of the NSF research budget has to be used for high-risk, high-reward proposals. (SEC. 201. SUPPORT FOR POTENTIALLY TRANSFORMATIVE RESEARCH)

The Amendments: NSF Reauthorization Amendments 2010

Daniel Lipinski (D-IL)

  • Wants the NSF to give cash prizes for high-risk, high-reward research challenges. (SEC. 207. PRIZE REWARDS)
    • The prizes will range from $1 million to $3 million.

I don’t care what the topic turns out to be, but for that much money Chemistry Blog will organize and field an open team for the competition. How the NSF goes about implementing these contests will be interesting to see.

The NSF Reauthorization will be wrapped into the America COMPETES Act and will likely be voted on by the full House of Representatives before May 31st. The America COMPETES Act shouldn’t face any major hurdles to passage.

Mitch

April 15, 2010 · science policy