Scientific Fraud: Why Do Researchers Do It?
In 1985, Margot O’Toole, a new postdoc in the laboratory of Thereza Imanishi-Kari at the Massachusetts Institute of Technology (MIT), was asked to carry out experiments that would expand on previous work done in the lab and reported in a paper published in the journal Cell. Her inability to reproduce the procedures described in the paper, together with the discrepancies she discovered between the laboratory notebooks and the published results, made her suspect that something was wrong with the experimental setup and the data reported in the paper.
Many researchers can easily relate to a situation like this one, as a new postdoc or graduate student is often asked to use protocols established in the lab and reproduce results that have already been reported in a paper. By making her suspicions public, O’Toole stirred controversy and vigorous discussion about fraud in science and triggered what came to be known as “The Baltimore Case” (Kevles, 2000; Lang, 2015). The most famous of the authors of the paper at the center of this storm, David Baltimore, had been awarded the 1975 Nobel Prize in Physiology or Medicine for discoveries concerning how tumor viruses interact with the genetic material of the cell, work that included the discovery of reverse transcriptase, the enzyme retroviruses use to copy their RNA into DNA. Baltimore subsequently became interested in studying the immune system, and the Cell paper reported results that could lead to a way of inducing genetic modifications of the immune system.
The case was the subject of federal investigations and congressional hearings, which led to the establishment of regulations for research misconduct. The investigation concluded that the paper did not contain fraudulent data, and the authors were exonerated, but the process took a significant toll on everyone involved. At the time, Baltimore was the recently appointed president of the prestigious Rockefeller University; a year later, he resigned, exhausted by the long battle over these accusations of fraud. He later (1997-2006) served as president of the California Institute of Technology (Caltech). What did become clear from the investigations and the congressional hearings, though, was that the notebooks in the lab had not been kept in an orderly and consistent manner.
A chapter in my book, Managing Scientific Information and Research Data, describes different forms of scientific misconduct and how publishers are trying to detect it. One of the examples included in the book is the case of Jan Hendrik Schön, a young scientist at Bell Labs who, in only a few years, published 17 papers in prestigious scientific journals. The research reported in these papers was hailed as extraordinary, a breakthrough in physics: the claimed creation of molecular-scale transistors would indeed have been a great achievement. Many already considered Schön a contender for the Nobel Prize. As it turned out, the articles were based on fiction, not on actual experiments.
A book, Plastic Fantastic: How the Biggest Fraud in Physics Shook the Scientific World (Reich, 2009), shows how Schön was able to mislead so many people: the reviewers who did not notice that the same graph had been reused in several different articles, as well as the editors of Science, Nature, and other prestigious journals that published his papers and then had to deal with the fallout from the scandal. Schön’s fabrication of data had a devastating effect on the many researchers who wasted years of their careers trying to reproduce his results.
Another notorious case is that of Hyung-In Moon, a South Korean researcher of plant compounds, who became known for the way he misled the editors of many peer-reviewed journals. Scientific journals usually ask authors to suggest potential reviewers; Moon made up reviewers and supplied email addresses for them that were actually his own. Without checking the identities of these “reviewers,” the journals sent his articles out for review, to none other than their author. The truth came out when one of the editors grew suspicious after receiving a review only 24 hours after sending out the article, something unheard of in the journal’s practice. By July 9, 2015, the number of Moon’s papers retracted by scientific journals had reached 35 (Retraction Watch, 2015).
Some scientific journals now require that raw experimental data be included in papers submitted for publication. Many publishers use CrossCheck to detect plagiarism and verify the originality of content submitted to them. The service compares the text of manuscripts against a database containing the full text of already published articles, books, and conference proceedings from hundreds of publishers.
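To make the idea of such text comparison concrete, here is a minimal sketch of one common approach: split each document into overlapping word n-grams ("shingles") and score the overlap between the two shingle sets. This is only a toy model, not the actual algorithm used by CrossCheck; the function names and sample sentences are invented for illustration.

```python
# Toy illustration of shingle-based text comparison, the kind of
# matching a plagiarism-detection service performs at much larger
# scale against a full-text database of published works.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of lowercase n-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity (0..1) of the two documents' shingle sets."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical example: a submitted sentence that closely tracks a
# published one scores high, flagging it for a human to inspect.
published = "the cell line was cultured for ten days under standard conditions"
submitted = "the cell line was cultured for ten days under modified conditions"
score = jaccard_similarity(published, submitted)
```

A real service would index billions of shingles and report which published source each matching passage comes from; the scoring principle, however, is the same.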
The Office of Research Integrity (ORI) at the U.S. Department of Health and Human Services provides a number of tools for detecting improperly reused text. ORI’s image analysis tools, such as its Forensic Droplets (small drag-and-drop applications that automate actions in Adobe Photoshop), can be used to examine questionable images. The Committee on Publication Ethics (COPE) has a mission to educate authors and reviewers about scientific misconduct and maintains a database of such ethical cases; the independently run Retraction Watch blog tracks retractions as they are issued.
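Automated image screening often starts from a much simpler idea than full forensic analysis: reduce each image to a compact "perceptual hash" and compare hashes, so that near-duplicate figures surface even across thousands of files. The sketch below shows a difference hash (dHash) over tiny grayscale matrices; the data and function names are invented for illustration, and real forensic workflows operate on full-resolution images inside tools like Photoshop.

```python
# Toy illustration of duplicate-image screening via a difference hash:
# each bit records whether a pixel is darker than its right neighbor,
# and near-duplicate images yield hashes with a small Hamming distance.

def dhash(pixels):
    """Difference hash: one bit per horizontally adjacent pixel pair."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return tuple(bits)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Two 4x4 grayscale "images": the second is a lightly perturbed copy
# of the first, as a reused figure with minor edits might be.
original  = [[10, 20, 30, 40], [40, 30, 20, 10],
             [5, 15, 25, 35], [35, 25, 15, 5]]
near_copy = [[11, 21, 29, 41], [40, 30, 20, 10],
             [5, 15, 25, 35], [35, 25, 15, 5]]
distance = hamming(dhash(original), dhash(near_copy))
```

Because the hash keeps only the pattern of brightness gradients, small edits leave the distance near zero while genuinely different images diverge, which is why hashing is a useful first-pass filter before a human examines the flagged figures.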
This is how Michael Gordin, Director of Graduate Studies in History of Science at Princeton University, discussed scientific fraud in an interview I conducted with him several years ago (Baykoucheva, 2011):
“Fraud, if one subscribes to a particular model of psychology, is a matter of incentives, and it is possible that with intensified safeguards, one can reduce its occurrence. But we almost certainly can’t eliminate it altogether. Peer review is often put forward as a solution to this problem, and it is likely better than having no safeguard at all—at least this guarantees that a few scientists read over the piece before it is published—but the evidence of recent years has shown that it is far from foolproof in catching fraud. But, as in the case of Schön, eventually the misdeeds come to light. Time seems to be our best tool in this matter.”
Graduate students and postdocs depend on their supervisors for current financial support and recommendations for future jobs. Presenting results that differ from those previously published by the lab, or that do not confirm a preliminary hypothesis, might prove very detrimental to their careers. Such situations can create pressure to manipulate data. It is therefore very important that students, from the very beginning of their research, be educated about ethical standards in research and the risks of not abiding by them.
Although unethical behavior cannot be fully detected by any control system, everything possible should be done to prevent it, because the intentional misconduct of a single author can seriously damage the reputation of a department, an institution, and a publication. A chapter in my book discusses Electronic Laboratory Notebooks (ELNs) and how they could prevent scientific fraud and protect intellectual property.
Baykoucheva, S. (2011). Political, cultural, and technological impacts on chemistry. An interview with Michael Gordin, Director of Graduate Studies of the Program in the History of Science, Princeton University. Chemical Information Bulletin, 63, http://acscinf.org/publications/bulletin/63-61/gordin.php.
Kevles, D. J. (2000). The Baltimore Case: A Trial of Politics, Science, and Character. W. W. Norton & Company.
Lang, S. (2015). Questions of scientific responsibility: The Baltimore case (Article published in Ethics and Behavior, 1994, 3 (1), 3-72). Retrieved July 9, 2015, from http://www-gateway.vpr.drexel.edu/files/Gateway_Project_Moshe_Kam/Resource/DBCre/serge.html
Reich, E. S. (2009). Plastic Fantastic: How the Biggest Fraud in Physics Shook the Scientific World. New York: Palgrave Macmillan.
Retraction Watch. (2015). Posts about Hyung-In Moon on Retraction Watch. Retrieved July 9, 2015, from http://retractionwatch.com/category/by-author/hyung-in-moon/
About the Author
Svetla Baykoucheva (Baykousheva) is the head of the White Memorial Chemistry Library at the University of Maryland College Park. For more than 20 years she performed interdisciplinary research in infectious microbiology and biochemistry, publishing more than 40 articles in peer-reviewed scientific journals such as the Journal of Biological Chemistry, Biochemistry, Journal of Chromatography, and FEBS Letters. She was also the editor of the Chemical Information Bulletin (published by the ACS Chemical Information Division) and manager of the Library and Information Center of the American Chemical Society (ACS) in Washington, D.C. In 2005 she returned to academia to head the chemistry library at Maryland, where she teaches scientific information and bibliographic management.
Innovative technologies are changing the way research is performed, preserved, and communicated. Svetla’s book, Managing Scientific Information and Research Data, explores how these technologies are used and provides a detailed analysis of the approaches and tools developed to manage scientific information and data. It is available for purchase on the Elsevier Store; use discount code “STC215” at checkout to save up to 30% off the list price and get free global shipping.