
Common Student Difficulties in Organic Chemistry

While cleaning out my newly assigned “war room” (the setting where I’ll strategize on how best to torture students this fall), I came across some fairly interesting documents buried in far corners of crowded file cabinets.  They’re nothing personal or incriminating (sorry, TMZ), but I saw them as material I could use in upcoming classes.

One of the several I found, titled “Common Student Difficulties in Organic Chemistry,” caught my attention more than the others.  The document, which appears to have been assembled using a typewriter (for the unfamiliar, typewriters are worth a quick web search), lists problems students encounter while navigating the dreaded “O Chem.”  In any case, at the bottom of the page, in bold, is the following message:

If you start to get into trouble in this course review this sheet.  Knowing what has gone wrong allows you to fix it.

This closing interested me from a historical perspective.  Did enough students bomb the course to warrant this document’s assembly?  Did the professor discover this or a similar list at an ACS meeting and feel it was prudent to include it in his/her course?  Did the document actually help students better understand the course material?

Although I can speculate until the cows come home, I’m throwing it out to you, the blogosphere.  Do you agree with this list?  Would you change anything on it?  I’m curious to see what the blogger generation thinks (FYI, I believe this list was developed in the 1980’s).

  1. Lack of organization
  2. Difficulty in keeping up with lecture while taking notes
  3. Failure to finish exams
  4. Inability to manipulate three-dimensional structures on paper
  5. Too little drill – lack of repetitive practice
  6. Falling behind
  7. Poor problem analysis
  8. Inability to see and mentally manipulate three-dimensional objects
  9. Insufficient energy and/or motivation for the challenges of this course

Your Academic Lineage

Over dinner the other night, my uncle and I started comparing and contrasting our academic experiences.  He’s a fascinating person who earned a bachelor’s degree in computer science in the late 1970’s.

After discussing the finer points of Moore’s Law, and how he agonized over purchasing a 20 MB hard drive in the 1980’s for $400, the conversation took a turn.  “Have you ever researched your Ph.D. lineage?” he asked.

“I’ve gone as far back as Breslow,” I replied, completely forgetting that he probably didn’t know this “Breslow” character.

It turns out that several of his doctoral computer buddies had recently taken on this task, many of them somehow descending (academically) from Charles Babbage.

Our discussion prompted me to further examine my own background.  I soon discovered that several university websites trace the academic lineage of their chemistry faculty.  Being an organic chemist, I was interested to learn that E.J. Corey worked for John Sheehan (I admit it…I’m nerdly).  In any case, the lineage pages I dug up made for interesting browsing.

The Good, The Bad, and The Ugly

Does anyone else have a difficult time separating “good science” from “bad science”?  I’m a very black-and-white person.  I love facts and truths and logic, and that drives most of my family crazy.  Perhaps that’s why I struggle with identifying bad science; there’s seemingly no clear-cut, concise way of spotting junk that ends up published.  To be clear, I’m not talking about retractions for blatant disregard of scientific ethics.  I’d classify those situations (e.g., the Xenobe controversy, Sames’ retractions, Bell Labs, etc.) as “ugly.”  I’m particularly concerned with the cases where, during a presentation, everyone sort of looks around, raises an eyebrow, frowns, and collectively mumbles, “Hmm.”

It seems the term “junk science” has been in use in the legal profession since the 1980’s.  Yet despite its currency, “junk science” remains an ambiguous concept.  In 1998, legal experts Edmond and Mercer attempted to conquer this beast by first identifying “good science” and then considering the outlying cases “bad.”  Here’s what they considered “the good”:

“‘Good science’ is usually described as dependent upon qualities such as falsifiable hypotheses, replication, verification, peer-review and publication, general acceptance, consensus, communalism, universalism, organized skepticism, neutrality, experiment/empiricism, objectivity, dispassionate observation, naturalistic explanation, and use of the scientific method.”

Does this list really mean that everything else is considered “junk”?  I can think of a few brilliant studies that used trial-and-error methods in lieu of the scientific method.  Conversely, I’m aware of peer reviewers who simply check the “publish” box without actually reading the manuscript.  As is argued on several other blogs, identifying “junk science” is a very gray area.

Perhaps one way to define junk science is to take the Jacobellis v. Ohio approach.  In a 1964 US Supreme Court case involving obscenity, Justice Potter Stewart wrote in his opinion, “I shall not today attempt to define the kinds of material I understand to be [pornography]…but I know it when I see it.”  The same frame of thought could be applied to junk science, but I’m less inclined to accept the Jacobellis approach because it offers nothing tangible.

There must be some empirical qualities that set the good apart from the bad.  Despite all the skills I’ve picked up over a mere decade of lab experience, I’m disheartened to admit that I never perfected the knack for detecting bad science.  So, like a responsible, up-and-coming assistant professor of chemistry, I went crawling through the literature to determine what separates the good from the bad.  Below is a list of a few things I learned.

In the spirit of Jeff Foxworthy, science might be “junk” if…

Researchers are more concerned with holding press conferences than publishing results in reputable, peer-reviewed journals. One might assume that “breakthroughs” ought to be showcased in the most prestigious journals after being subjected to a rigorous peer-review process.  Fast-tracking all the way to the press-conference phase certainly raises some flags about credibility.  I’ve seen this phenomenon first-hand, and when the science is questionable, the ensuing public announcement can get really ugly (and entertaining, for that matter).

Something about the research seems off-kilter. If you think something doesn’t feel right, you might be correct.  Although going with your gut will only get you so far, analysis guides such as “Tipsheet: For Reporting on Drugs, Devices and Medical Technologies” identify specific areas for journalists to consider when examining the veracity of medical therapies.  Cook and co-workers suggested that similar checklists might serve the general scientific community when evaluating the credibility of reported work.

Conflicts of interest are not explicitly disclosed. In these cases, scientific integrity might be compromised by financial, political, or other external motivations.  In developing this article, I encountered journals, funding agencies, and governing bodies that require authors to declare any potential conflicts of interest when publishing or applying for grants.  Although editors and referees try to uphold strict transparency policies, authors can still fail to report external influences and biases.  These cases touch essentially every facet of research: cancer studies, pesticide testing (Berkeley Sci. J. 2009, 13, 32-34), and even drug development.  The onus falls on the audience to look into the authors’ sources of funding.

The flow of logic doesn’t make any sense. Junk science may have gaping holes in its experimental descriptions or proposed models.  Fortunately, overly simplistic and inaccurate scientific explanations usually evoke sharp criticism from experts in the field.  Credible “debunkers” often attack the logic of an issue by, for example, discrediting cited authoritative opinions, identifying assumptions, and/or offering overlooked hypotheses.

Colleagues in the field are widely skeptical of the work. Mix it up with your cohorts.  A simple “Hey, what did you think about the most recent (insert name of researcher here) article in JOC?” can shed some light on the context of published or presented findings.  “[He] hasn’t published anything reproducible in the past 20 years,” my PI once said.  “I sincerely doubt that this latest paper is anything new.”

Re-issuing Classic Chemistry

I recently bought a 2009 re-issued copy of Pearl Jam’s first album “Ten,” originally released back in 1991.  Those who know me well are aware of my interest in Pearl Jam; I enjoy collecting demos and live versions of their music.  Anyhow, the official re-issue contains a remixed version of the 1991 album, and (in my opinion) parts of it sound distinctly different from the original mix.  For you music buffs out there in internet land, Brendan O’Brien, who handled the remix, dumped the supplemental reverb applied to the original tracks.  As a result, the guitars and drums sound much cleaner and less wet (I recommend listening to both versions of “Why Go” or “Oceans” for a good example of the remixing).

Thinking about the whole concept of a “re-issue” got me thinking about organic chemistry (big surprise).  How often do scientists report fantastic results, table the idea, and then revisit it at a later date to make vast improvements?  Or better yet, how much “new” chemistry has derived from “re-issuing” processes developed in the late 19th or early 20th century?  My PI refers to this particular phenomenon as “teaching an old dog new tricks.”  In writing my dissertation (an ongoing process), I had the pleasure of reading Lipshutz’s recent review of cuprate chemistry (Synlett 2009, 509-524; DOI: 10.1055/s-0028-1087923).  This personalized narrative discusses the Lipshutz group’s efforts and contributions to the field of copper(I) hydride chemistry.

The article is of particular interest to me quite apart from discussing it at length in the ol’ thesis.  A few months back, I had a conversation with a colleague who claimed that since Stryker’s contributions, “conjugate reduction chemistry has (basically) fallen by the wayside.”  I recall laughing out loud at his remark.  “What about Lipshutz or Riant or even Buchwald?” I asked.  He claimed, with a sense of arrogance, that their work was “just a new twist on Stryker’s original work.”  By this logic, if someone successfully synthesized Taxol from table sugar in three steps, would it be considered a new twist on Nicolaou’s or Holton’s contributions?  Arrogance aside, this idea of “re-issuing” is a common phenomenon in research chemistry.  It’s done frequently, often to the tune of 10-20 additional publications (apart from the seminal contribution).  Perhaps it’s these instances that call the process of “re-issuing” chemistry into question.

That said, re-issued chemistry can result in significant new discoveries and improvements on original methods.  Take the conjugate reduction example: Stryker’s catalytic reactions, performed under a high pressure of H2, were plagued by over-reduced products.  By switching the stoichiometric hydride source from hydrogen gas to PMHS (polymethylhydrosiloxane), Lipshutz reported a vast improvement in reaction times and overall yields (Tetrahedron 2000, 56, 2779-2788; DOI: 10.1016/S0040-4020(00)00132-0).  This change has spawned a whole new area of carbon-carbon bond formation, particularly in the field of reductive alkylation reactions.

While I’m genuinely interested in inventing new and exciting reactions, the thought of tweaked processes resulting in “re-issued” chemistry is equally appealing (when done responsibly).  A prominent neutron chemist once told me that real chemistry lies in unexplored places.  “We want to be doing things that others aren’t,” he said.  I agree.  But on occasion, it’s necessary to explore the landscapes previously claimed by others for the betterment of the (scientific) community as a whole.
