Scientific misconduct

Scientific misconduct is the violation of the standard codes of scholarly conduct and ethical behavior in professional scientific research. A Lancet review on the handling of scientific misconduct in the Scandinavian countries provides the following sample definitions (reproduced in The COPE Report 1999):

  • Danish definition: "Intention or gross negligence leading to fabrication of the scientific message or a false credit or emphasis given to a scientist"
  • Swedish definition: "Intention[al] distortion of the research process by fabrication of data, text, hypothesis, or methods from another researcher's manuscript form or publication; or distortion of the research process in other ways."

    The consequences of scientific misconduct can be damaging both for the perpetrators and for any individual who exposes it. In addition, there are public health implications attached to the promotion of medical or other interventions based on dubious research findings.

    Motivation to commit scientific misconduct

    According to David Goodstein of Caltech, there are motivators for scientists to commit misconduct, which are briefly summarised here.

    Career pressure
    Science is still a very strongly career-driven discipline. Scientists depend on a good reputation to receive ongoing support and funding, and a good reputation relies largely on the publication of high-profile scientific papers. Hence, there is a strong imperative to "publish or perish". Clearly, this may motivate desperate (or fame-hungry) scientists to fabricate results.
    Ease of fabrication
    In many scientific fields, results are often difficult to reproduce accurately, being obscured by noise, artifacts, and other extraneous data. That means that even if a scientist does falsify data, they can expect to get away with it – or at least to claim innocence if their results conflict with others in the same field. There are no "scientific police" trained to fight scientific crimes; all investigations are made by experts in science who are amateurs in dealing with criminals. It is relatively easy to cheat, although it is difficult to know exactly how many scientists fabricate data.

    Forms of scientific misconduct

    The U.S. National Science Foundation defines three types of research misconduct: fabrication, falsification, and plagiarism.

  • Fabrication is making up results and recording or reporting them. This is sometimes referred to as "drylabbing". A lesser form of fabrication is including references that give an argument the appearance of widespread acceptance but that are fake or do not support the argument.
  • Falsification is manipulating research materials, equipment, or processes or changing or omitting data or results such that the research is not accurately represented in the research record.
  • Plagiarism is the appropriation of another person's ideas, processes, results, or words without giving appropriate credit. One form is the appropriation of the ideas and results of others, publishing them so as to make it appear that the author performed all of the work under which the data were obtained. A subset is citation plagiarism – willful or negligent failure to appropriately credit other or prior discoverers, so as to give an improper impression of priority. This is also known as "citation amnesia", the "disregard syndrome" and "bibliographic negligence". Arguably, this is the most common type of scientific misconduct. Sometimes it is difficult to tell whether authors intentionally ignored a highly relevant citation or simply lacked knowledge of the prior work. Discovery credit can also be inadvertently reassigned from the original discoverer to a better-known researcher; this is a special case of the Matthew effect.
  • Plagiarism-fabrication – the act of taking a figure from an unrelated publication and reproducing it exactly in a new publication, claiming that it represents new data. Recent papers from the University of Cordoba have come to light showing how this can go undetected and unchallenged for years.
  • Self-plagiarism – the multiple publication of the same content with different titles and/or in different journals – is sometimes also considered misconduct; scientific journals explicitly ask authors not to do this. It is referred to as "salami" publication (i.e. many identical slices) in the jargon of medical journal editors (MJE). According to some MJE, this includes publishing the same article in a different language.

    Other types of research misconduct are also recognized:

  • The violation of ethical standards regarding human and animal experiments – such as the standard that a human subject of the experiment must give informed consent to the experiment. Failure to obtain ethical approval for clinical studies characterised the case of Joachim Boldt.
  • Ghostwriting – the phenomenon where someone other than the named author(s) makes a major contribution. Typically, this is done to mask contributions from drug companies. It incorporates plagiarism and has an additional element of financial fraud.
  • Conversely, research misconduct is not limited to failing to list authors; it also includes conferring authorship on those who have not made substantial contributions to the research. This is done by senior researchers who muscle their way onto the papers of inexperienced junior researchers, as well as by others who stack authorship in an effort to guarantee publication. This is much harder to prove because of a lack of consistency in defining "authorship" or "substantial contribution".
  • In addition, some academics consider suppression – the failure to publish significant findings because the results are adverse to the interests of the researcher or his/her sponsor(s) – to be a form of misconduct.

  • Bare assertions – the making of entirely unsubstantiated claims – may also be considered a form of research misconduct, although there is no evidence that cases of this form have ever led to a finding of misconduct.
  • Scientific misconduct may in some cases also constitute a violation of the law, but not always. Being accused of the activities described in this article is a serious matter for a practicing scientist, with severe consequences should it be determined that a researcher intentionally or carelessly engaged in misconduct. However, in most countries, committing research misconduct, even on a large scale, is not a criminal offense.

    Three percent of the 3,475 research institutions that report to the US Department of Health and Human Services' Office of Research Integrity (ORI) indicate some form of scientific misconduct. However, the ORI will only investigate allegations of impropriety where research was funded by federal grants. It routinely monitors such research publications for red flags. Other private organizations, like the Committee of Medical Journal Editors (COJE), can only police their own members.

    The validity of the methods and results of scientific papers is often scrutinized in journal clubs. In this venue, members can decide amongst themselves, with the help of peers, whether a scientific paper's ethical standards have been met.

    Responsibility of authors and of coauthors

    Authors and coauthors of scientific publications have a variety of responsibilities. Contravention of the rules of scientific authorship may lead to a charge of scientific misconduct. All authors, including coauthors, are expected to have made reasonable attempts to check findings submitted to academic journals for publication. Simultaneous submission of scientific findings to more than one journal or duplicate publication of findings is usually regarded as misconduct, under what is known as the Ingelfinger rule, named after the editor of the New England Journal of Medicine 1967-1977, Franz Ingelfinger.

    Guest authorship (where there is stated authorship in the absence of involvement, also known as gift authorship) and ghost authorship (where the real author is not listed as an author) are commonly regarded as forms of research misconduct. In some cases coauthors of faked research have been accused of inappropriate behavior or research misconduct for failing to verify reports authored by others or by a commercial sponsor. Examples include the case of Gerald Schatten, who co-authored with Hwang Woo-Suk; the case of Professor Geoffrey Chamberlain, named as guest author of papers fabricated by Malcolm Pearce (Chamberlain was exonerated of collusion in Pearce's deception); and the coauthors of Jan Hendrik Schön at Bell Laboratories. More recent cases include that of Charles Nemeroff, then the editor-in-chief of Neuropsychopharmacology, and a well-documented case involving the drug Actonel.

    Authors are expected to keep all study data for later examination even after publication. The failure to keep data may be regarded as misconduct. Some scientific journals require that authors provide information to allow readers to determine whether the authors might have commercial or non-commercial conflicts of interest. Authors are also commonly required to provide information about ethical aspects of research, particularly where research involves human or animal participants or use of biological material. Provision of incorrect information to journals may be regarded as misconduct. Financial pressures on universities have encouraged this type of misconduct. The majority of recent cases of alleged misconduct involving undisclosed conflicts of interest or failure of the authors to have seen scientific data involve collaborative research between scientists and biotechnology companies (Nemeroff, Blumsohn).

    Responsibilities of research institutions

    In general, determining whether an individual is guilty of misconduct requires a detailed investigation by the individual's employing academic institution. Such investigations require detailed and rigorous processes and can be extremely costly. Furthermore, the more senior the individual under suspicion, the more likely it is that conflicts of interest will compromise the investigation. In many countries (with the notable exception of the United States), acquisition of funds on the basis of fraudulent data is not a criminal offence, and there is consequently no regulator to oversee investigations into alleged research misconduct. Universities therefore have few incentives to investigate allegations in a robust manner, or to act on the findings of such investigations if they uphold the allegation.

    Well-publicised cases illustrate the potential role that senior academics in research institutions play in concealing scientific misconduct. A King's College (London) internal investigation found research findings from one of its researchers to be 'at best unreliable, and in many cases spurious', but the college took no action, such as retracting the relevant published research or preventing further episodes from occurring. It was only 10 years later, when an entirely separate form of misconduct by the same individual was being investigated by the General Medical Council, that the internal report came to light.

    In a more recent case an internal investigation at the National Centre for Cell Science (NCCS), Pune determined that there was evidence of misconduct by Dr. Gopal Kundu, but an external committee was then organised which dismissed the allegation, and the NCCS issued a memorandum exonerating the authors of all charges of misconduct. Undeterred by the NCCS exoneration, the relevant journal (Journal of Biological Chemistry) withdrew the paper based on its own analysis.

    Responsibilities of scientific colleagues who are "bystanders"

    Some academics believe that scientific colleagues who suspect scientific misconduct should consider taking informal action themselves, or reporting their concerns. This question is of great importance since much research suggests that it is very difficult for people to act or come forward when they see unacceptable behavior, unless they have help from their organizations. A "User-friendly Guide," and the existence of a confidential organizational ombudsman may help people who are uncertain about what to do, or afraid of bad consequences for their speaking up.

    Responsibility of journals

    Journals are responsible for safeguarding the research record and hence have a critical role in dealing with suspected misconduct. This is recognised by the Committee on Publication Ethics (COPE) which has issued clear guidelines on the form (e.g. retraction) that concerns over the research record should take.

  • The COPE guidelines state that journal editors should consider retracting a publication if they have clear evidence that the findings are unreliable, either as a result of misconduct (e.g. data fabrication) or honest error (e.g. miscalculation or experimental error). Retraction is also appropriate in cases of redundant publication, plagiarism and unethical research.
  • Journal editors should consider issuing an expression of concern if they receive inconclusive evidence of research or publication misconduct by the authors, there is evidence that the findings are unreliable but the authors' institution will not investigate the case, they believe that an investigation into alleged misconduct related to the publication either has not been, or would not be, fair and impartial or conclusive, or an investigation is underway but a judgement will not be available for a considerable time.
  • Journal editors should consider issuing a correction if a small portion of an otherwise reliable publication proves to be misleading (especially because of honest error), or the author / contributor list is incorrect (i.e. a deserving author has been omitted or somebody who does not meet authorship criteria has been included).
  • Recent evidence has emerged that journals, on learning of cases where there is strong evidence of possible misconduct, with issues potentially affecting a large portion of the findings, frequently fail to issue an expression of concern or to correspond with the host institution so that an investigation can be undertaken. In one case the Journal of Clinical Oncology issued a Correction despite strong evidence that the original paper was invalid. In another case, Nature allowed a Corrigendum to be published despite clear evidence of image fraud. Subsequent retraction of the paper required the actions of an independent whistleblower.

    The recent cases of Joachim Boldt and Yoshitaka Fujii in anaesthesiology have focussed attention on the role that journals play in perpetuating scientific fraud as well as how they can deal with it. In the Boldt case, the Editors-in-Chief of 18 specialist journals (generally anaesthesia and intensive care) made a joint statement regarding 88 published clinical trials conducted without Ethics Committee approval. In the Fujii case, involving nearly 200 papers, the journal Anesthesia & Analgesia, which published 24 of Fujii's papers, has accepted that its handling of the issue was inadequate. Following publication of a Letter to the Editor from Kranke and colleagues in April 2000, along with a non-specific response from Dr. Fujii, there was no follow-up on the allegation of data manipulation and no request for an institutional review of Dr. Fujii's research. Anesthesia & Analgesia went on to publish 11 additional manuscripts by Dr. Fujii following the 2000 allegations of research fraud, with Editor Steven Shafer stating in March 2012 that subsequent submissions to the Journal by Dr. Fujii should not have been published without first vetting the allegations of fraud. In April 2012 Shafer led a group of editors to write a joint statement, in the form of an ultimatum made available to the public, to a large number of academic institutions where Fujii had been employed, offering these institutions the chance to attest to the integrity of the bulk of the allegedly fraudulent papers.

    Photo manipulation

    Compared to other forms of scientific misconduct, image fraud (manipulation of images to distort their meaning) is of particular interest since it can frequently be detected by external parties. In 2006, the Journal of Cell Biology gained publicity for instituting tests to detect photo manipulation in papers that were being considered for publication. This was in response to the increased usage of programs such as Adobe Photoshop by scientists, which facilitate photo manipulation. Since then more publishers, including the Nature Publishing Group, have instituted similar tests and require authors to minimize and specify the extent of photo manipulation when a manuscript is submitted for publication. However, there is little evidence to indicate that such tests are applied rigorously. One Nature paper published in 2009 has subsequently been reported to contain around 20 separate instances of image fraud.
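
    One simple illustration of the kind of automated check involved – a minimal sketch, not any journal's actual screening pipeline – is a search for exactly duplicated regions, the sort of trace a careless clone/stamp edit leaves behind. The 8-pixel block size and grayscale input below are illustrative assumptions.

```python
# Minimal sketch (not any journal's actual screening tool): flag exactly
# duplicated tiles in a grayscale image, the kind of trace left by careless
# use of a clone/stamp tool. Block size and alignment are illustrative choices.
from collections import defaultdict

import numpy as np


def find_duplicate_blocks(image: np.ndarray, block: int = 8):
    """Return groups of top-left (row, col) positions whose block x block
    tiles are byte-for-byte identical but appear at more than one location."""
    seen = defaultdict(list)
    rows, cols = image.shape
    for y in range(0, rows - block + 1, block):
        for x in range(0, cols - block + 1, block):
            tile = image[y:y + block, x:x + block]
            seen[tile.tobytes()].append((y, x))
    return [locs for locs in seen.values() if len(locs) > 1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    img[0:8, 0:8] = img[32:40, 16:24]   # simulate a cloned patch
    print(find_duplicate_blocks(img))   # the duplicated tiles are reported
```

    Real screening also has to cope with rescaled, rotated, or recompressed regions, which is one reason journal checks rely on trained staff and specialised forensic tools rather than a script of this kind.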

    Although the type of manipulation that is allowed can depend greatly on the type of experiment that is presented and also differ from one journal to another, in general the following manipulations are not allowed:

  • splicing together different images to represent a single experiment
  • changing brightness and contrast of only a part of the image
  • any change that conceals information, even when it is considered to be non-specific, which includes:
  • changing brightness and contrast to leave only the most intense signal
  • using clone tools to hide information
  • showing only a very small part of the photograph so that additional information is not visible

    Consequences for science

    The consequences of scientific fraud vary based on the severity of the fraud, the level of notice it receives, and how long it goes undetected. For cases of fabricated evidence, the consequences can be wide-ranging, with others working to confirm (or refute) the false finding, or with research agendas being distorted to address the fraudulent evidence. The Piltdown Man fraud is a case in point: The significance of the bona-fide fossils that were being found was muted for decades because they disagreed with Piltdown Man and the preconceived notions that those faked fossils supported. In addition, the prominent paleontologist Arthur Smith Woodward spent time at Piltdown each year until he died, trying to find more Piltdown Man remains. The misdirection of resources kept others from taking the real fossils more seriously and delayed the reaching of a correct understanding of human evolution. (The Taung Child, which should have been the death knell for the view that the human brain evolved first, was instead treated very critically because of its disagreement with the Piltdown Man evidence.)

    In the case of Prof Don Poldermans, the misconduct occurred in reports of trials of treatment to prevent death and myocardial infarction in patients undergoing operations. The trial reports were relied upon to issue guidelines that applied for many years across North America and Europe.

    In the case of Dr Alfred Steinschneider, two decades and tens of millions of research dollars were lost trying to find the elusive link between infant sleep apnea, which Steinschneider said he had observed and recorded in his laboratory, and sudden infant death syndrome (SIDS), of which he claimed it was a precursor. The cover was blown in 1994, 22 years after Steinschneider's 1972 Pediatrics paper claiming such an association, when Waneta Hoyt, the mother of the patients in the paper, was arrested, indicted and convicted on five counts of second-degree murder for the smothering deaths of her five children. While that in itself was bad enough, the paper, presumably written as an attempt to save infants' lives, was ironically used as a defense by parents suspected in multiple deaths of their own children in cases of Münchausen syndrome by proxy. The 1972 Pediatrics paper was cited in 404 papers in the interim and is still listed on PubMed without comment.

    Consequences for those who expose misconduct

    The potentially severe consequences for individuals who are found to have engaged in misconduct also reflect on the institutions that host or employ them, and on the participants in any peer review process that has allowed the publication of questionable research. This means that a range of actors in any case may have a motivation to suppress any evidence or suggestion of misconduct. Persons who expose such cases, commonly called whistleblowers, can find themselves open to retaliation by a number of different means. These negative consequences for exposers of misconduct have driven the development of whistleblower charters, designed to protect those who raise concerns. A whistleblower is almost always alone in their fight – their career becomes completely dependent on the decision about the alleged misconduct. If the accusations prove false, their career is completely destroyed, but even after a favourable decision the whistleblower's career can be in question: a reputation as a "troublemaker" will prevent many employers from hiring them. There is no international body to which a whistleblower could bring their concerns. If a university fails to investigate suspected fraud, or conducts a sham investigation to save its reputation, the whistleblower has no right of appeal.

    Exposure of fraudulent data

    With the advancement of the internet, there are now several tools available to aid in the detection of plagiarism and multiple publication within the biomedical literature. One tool, developed in 2006 by researchers in Dr. Harold Garner's laboratory at the University of Texas Southwestern Medical Center at Dallas, is Déjà vu, an open-access database containing several thousand instances of duplicate publication. All of the entries in the database were discovered through the use of the text data-mining algorithm eTBLAST, also created in Dr. Garner's laboratory. The creation of Déjà vu and the subsequent classification of several hundred articles contained therein have ignited much discussion in the scientific community concerning issues such as ethical behavior, journal standards, and intellectual copyright. Studies of this database have been published in journals such as Nature and Science, among others.
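
    The core idea behind this kind of text-similarity screening can be shown with a minimal sketch; it is not the eTBLAST algorithm itself, and the word 3-gram shingles and Jaccard score used here are illustrative stand-ins for whatever a production tool actually computes.

```python
# Minimal sketch of text-similarity screening for duplicate publication.
# Not the eTBLAST algorithm; word 3-grams and Jaccard similarity are
# illustrative stand-ins for a production tool's scoring method.
import re


def shingles(text: str, n: int = 3) -> set:
    """Lower-cased word n-grams of the text."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def similarity(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets: 0 = disjoint, 1 = identical."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


abstract_1 = "We report that compound X markedly reduces tumour growth in mice."
abstract_2 = "We report that compound X markedly reduces tumour growth in mice and in rats."
# Scores close to 1.0 flag candidate duplicate publications for manual review.
print(round(similarity(abstract_1, abstract_2), 2))
```

    Tools such as eTBLAST apply this kind of comparison at the scale of the MEDLINE database and flag only the highest-scoring pairs for human classification.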

    Other tools which may be used to detect fraudulent data include error analysis. Measurements generally have a small amount of error, and repeated measurements of the same item will generally result in slight differences in readings. These differences can be analyzed, and follow certain known mathematical and statistical properties. Should a set of data appear to be too faithful to the hypothesis, i.e., the amount of error that would normally be in such measurements does not appear, a conclusion can be drawn that the data may have been forged. Error analysis alone is typically not sufficient to prove that data have been falsified or fabricated, but it may provide the supporting evidence necessary to confirm suspicions of misconduct.
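
    A minimal sketch of this "too good to be true" check is given below, assuming the measurement error is known and the predictions were not fitted to the data themselves; the numbers are invented for illustration, and a suspiciously small chi-square supports, but does not prove, fabrication.

```python
# Minimal sketch of detecting data that fit the hypothesis "too well".
# The readings, predictions, and known measurement error are invented for
# illustration; the prediction is assumed not to have been fitted to the data.
import numpy as np
from scipy.stats import chi2

observed = np.array([10.01, 19.99, 30.00, 40.02, 49.99])  # reported readings
predicted = np.array([10.0, 20.0, 30.0, 40.0, 50.0])      # values the hypothesis predicts
sigma = 0.5                                                # known per-reading measurement error

chi_sq = float(np.sum(((observed - predicted) / sigma) ** 2))
p_fit_this_good = chi2.cdf(chi_sq, df=len(observed))       # P(honest noise fits at least this well)
print(f"chi-square = {chi_sq:.4f}, P = {p_fit_this_good:.1e}")
# A vanishingly small P says the reported scatter is far smaller than the
# measurement error allows, which is grounds for closer scrutiny, not proof.
```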

    Data sharing

    Kirby Lee and Lisa Bero suggest, "Although reviewing raw data can be difficult, time-consuming and expensive, having such a policy would hold authors more accountable for the accuracy of their data and potentially reduce scientific fraud or misconduct."

    Brazil

  • Denis de Jesus Lima Guerra was fired from the Federal University of Mato Grosso in 2014 after having 13 papers retracted, in what is considered the biggest case of scientific fraud in Brazil.

    Canada

  • Gideon Koren's publication of an article without the informed consent of co-author Nancy Olivieri.
  • In 2015, Ranjit Chandra was stripped of his Order of Canada membership after accusations of scientific wrongdoing.

    China

  • H. Zhong, T. Liu, and their co-workers at Jinggangshan University have retracted numerous papers published in Acta Crystallographica following systematic checking which revealed that the organic structures claimed in these papers were impossible or implausible. The supporting data appeared to have been taken from valid cases which had then been altered by substituting different atoms into the structures.

    Denmark

  • Milena Penkowa (neuroscience), resigned her professorship after accusations of scientific misconduct, fraud, and embezzlement of research funds.

    Germany

  • Joachim Boldt (drug research), stripped of his professorship, under criminal investigation for possible forgery of up to 90 research studies.
  • Silvia Bulfone-Paus (immunology), 13 peer-reviewed journal articles retracted following investigations of alleged misconduct.
  • Jan Hendrik Schön (physics of semiconductors), forged results, using the same graph image in different contexts.
  • Friedhelm Herrmann (cancer research). A fraud investigation concluded that self-regulation had failed.

    Iran

    There are many networks buying and selling ISI articles and theses in Iran. Iran does not follow a clear policy on plagiarism and lacks a comprehensive law in this regard. The head of Tehran University has described this as a national crisis.

  • Masoumeh Ebtekar, head of the Iranian Department of Environment, has served as a faculty member at Tarbiat Modares University in Tehran. On 7 October 2008, eTBLAST, a text-similarity search engine for the MEDLINE database, found that 85% of a paper published by Masoumeh Ebtekar came from several previously published articles. The paper, on cytokines and air pollution, was published in 2006 in the Iran Journal of Allergy Asthma Immunology (IJAAI 2006;5:47-56).
  • In November 2016, 58 scientific publications from Iran-based researchers were retracted by publishers due to probable plagiarism.
  • In another case, against Ali Akbar Mehrabian, Hossein Hashemi – a reformist MP and head of parliament's industry committee, now governor of Tehran – said that after scrutiny of the "safe room" project the court had ruled in favor of Ali Akbar Mehrabian and rejected the plaintiff's claims.

    Israel

  • Alexander Spivak, a tenured senior lecturer at the Holon Institute of Technology (HIT), plagiarized a paper written in 2001 by his former postdoctoral adviser and two other researchers from Tel Aviv University. Two chapters of their original paper were copied and pasted and published, as two separate articles, in the International Journal of Pure and Applied Mathematics (IJPAM) seven years later. After the plagiarism was discovered in 2014, both papers were retracted by the IJPAM Managing Editor. The HIT administration's handling of the affair received harsh criticism in Israel and abroad after the plagiarist was given a sabbatical. In May 2015, yet another paper by Spivak was retracted from the NumAn-2014 Conference Proceedings.

    Japan

  • Teruji Cho (Plasma Physics). Cho and three coworkers falsified raw data reported in a research paper. Cho was dismissed by the University of Tsukuba.
  • Yoshitaka Fujii (anaesthesiology) was found to have fabricated data in at least 172 scientific papers, setting what is believed to be a record for the number of papers by a single author requiring retractions.
  • Haruko Obokata (stem cell biology)
  • Akio Sugino (molecular biology) was fired by Osaka University after an investigating committee found he had fabricated research data.
  • Kazunari Taira (molecular biology). An investigation by the University of Tokyo found a "high possibility" that fraud was involved in a series of RNA studies from Taira's laboratory. Taira's contract was not renewed.

    Netherlands

  • Mart Bax (anthropology) – various kinds of serious scientific misconduct. For example, in two cases Bax stated that he had relied on a single local informant who told him improbable stories about public events that were not confirmed by anyone else. Bax did not check the stories and wrote them down in detail as if they were historical facts. The commission that investigated Bax's research was unable to interview these two informants, so data fabrication by Bax could not be proven.
  • Jens Förster (social psychology) – Fabricated data. Förster is a German social psychologist who held a chair as professor of psychology at the University of Amsterdam from 2007 to May 2014. An anonymous whistleblower alerted the university in 2012 to a strong regularity in data from studies published by Förster that was extremely unlikely for data obtained in the field. After examination of the evidence, an integrity committee at the university concluded that the patterns in the published data were "practically impossible". One of the studies was retracted in November 2014, after Förster had already left for a prestigious appointment as Alexander von Humboldt Professor at Ruhr University Bochum in Germany. In April 2015, Ruhr University announced that Förster had withdrawn his candidacy for the professorship. A subsequent report by an investigative panel commissioned by the University of Amsterdam published in June 2015 found "strong evidence for low veracity" in eight further studies authored by Förster.
  • Diederik Stapel (social psychology) – Fabricated data in high-publicity studies of human behaviour. Stapel committed scientific fraud in at least 55 of his papers, as well as in 10 Ph.D. dissertations written by his students. According to the New York Times, Stapel "perpetrated an audacious academic fraud by making up studies that told the world what it wanted to hear about human nature."

    Norway

  • A researcher employed by a Norwegian hospital (Stavanger universitetssjukehus) analyzed samples of spinal fluid from patients after having added a substance to the samples.
  • Jon Sudbø fabricated data for a study that reported "nonsteroidal anti-inflammatory drugs reduced the risk of oral cancer".

    Romania

  • Dănuț Marcu, a Romanian mathematician and computer scientist, was banned from several journals due to plagiarism. He had submitted a manuscript that was more or less word for word the same as a paper written by another author.
  • Ioan Mang, a computer scientist at the University of Oradea, plagiarized a paper by cryptographer Eli Biham, Dean of the Computer Science Department of Technion, Haifa, Israel. He was accused of extensive plagiarism in at least eight of his academic papers.

    Singapore

  • Alirio Menendez, a professor of immunology, was found guilty of misconduct on an "unprecedented" scale by a committee at the National University of Singapore (NUS), by having fabricated, falsified or plagiarized at least 21 research papers published in international academic journals. Menendez originally worked at NUS but moved to the UK in 2007, where he first worked at the University of Glasgow and next the University of Liverpool.

    South Korea

  • Woo-Suk Hwang (Hwang Woo-Suk) (cloning).

    Spain

  • Juan Carlos Mejuto and Gonzalo Astray (chemical physics). Two papers in the Journal of Chemical and Engineering Data were withdrawn by the editor because of plagiarism.
  • José Román-Gómez, University of Córdoba (Spain), appropriation of gel images in claimed work on signalling and DNA methylation in leukaemia.

    Switzerland

  • Bruno Frey was criticised for self-plagiarism in articles about the Titanic disaster.

    United Kingdom

  • Researchers at Cranfield University – aircraft cabin air contamination – fudged data and inappropriate methodology.
  • Richard Eastell (medicine) – Actonel affair; resigned after allegations of financial irregularities.
  • Malcolm Pearce – fraudulent description of a successful reimplantation of an ectopic pregnancy.
  • Andrew Wakefield - Fraudulent paper in the Lancet associating the MMR vaccine with autism.

    United States

  • Edward Awh and graduate student David Anderson (neuroscience) retracted nine empirical papers in 2015 and 2016 due to data fabrication. This was named one of the Top 10 Retractions of 2015 by The Scientist.
  • John Darsee (cardiology) – data fabrication as well as errors/discrepancies on 16 of 18 full-length research articles, and an unknown number of over 100 additional abstracts and book chapters.
  • Dipak Das was found guilty of 145 counts of fabrication or falsification of data at the University of Connecticut Health Center.
  • Terry Elton was found guilty of misconduct by both Ohio State University and the Office of Research Integrity.
  • Doctoral student Roxana Gonzalez (social psychology) engaged in scientific misconduct in research supported by the National Institute of Mental Health (NIMH) and the National Institutes of Health (NIH). The United States Office of Research Integrity found that data falsification altered five published articles first-authored by Jennifer Lerner. As a result, articles were retracted from the Personality and Social Psychology Bulletin, Biological Psychiatry, and the Journal of Experimental Psychology.
  • Dong-Pyou Han was an assistant professor of biomedical sciences at Iowa State University who spiked samples of rabbit blood with human antibodies to make an experimental HIV vaccine appear to have great promise.
  • Marc Hauser (evolutionary psychology).
  • Industrial Bio-Test Laboratories fabricated research data to the extent that upon FDA analysis of 867 studies, 618 (71%) were deemed invalid, including many of which were used to gain regulatory approval for widely used household and industrial products.
  • H.M. Krishna Murthy, while at the University of Alabama at Birmingham, had nine papers on protein structures retracted because his experimental findings appeared to be false or fabricated.
  • Victor Ninov (nuclear physics)
  • Ohio University had a plagiarism crisis in the 2000s when severe plagiarism in MS theses was discovered. This resulted in the firing of two tenured professors, Dr. Gunasekara and Dr. Mehta of the Mechanical Engineering Department, and in multiple institutional changes. The plagiarism included that of a current professor at Miami University.
  • Leo Paquette (chemistry), an Ohio State University professor who plagiarized sections from an unfunded NIH grant application for use in his own NIH grant application and plagiarized an NSF proposal for use in one of his scientific publications.
  • Eric Poehlman, a researcher on aging at the University of Vermont, was prosecuted for grant fraud in 2005 after falsifying data in as many as 17 grant applications in a period of over eight years. He was the first academic in the United States to be jailed for falsifying data in a grant application.
  • Anil Potti (cancer researcher), formerly at Duke University, is the subject of a scientific misconduct investigation alleging that he falsified results of cancer genomics data. To date, ten of his publications have been retracted.
  • Scott Reuben (medical management of pain)
  • Karen M. Ruggiero (social psychology) fabricated data on at least five experiments while at Harvard University, in research regarding gender and discrimination supported by the NIH.
  • Eric J. Smart, nutrition researcher, associate professor and vice-chairman of the Department of Pediatrics and the Barnstable-Brown Chair in Diabetes Research at the University of Kentucky, was censured in November 2012 by the US government’s Office of Research Integrity for a career of scientific misconduct that lasted over 10 years. According to the allegations in the report, published in the Federal Register, he falsified data in at least 10 papers and many grant applications.
  • Alfred Steinschneider (sleep apnea) – pursued a theory on the causes of SIDS even though his research never really supported it. He and his disciples eventually channeled tens of millions of dollars in federal grant money to research programs based on Steinschneider's hunch, even though the theory was never demonstrable or duplicable in the lab.
  • Marc Straus has admitted that as lead researcher he took responsibility for fabricated data in oncology studies at the Boston University School of Medicine in the late 1970s.
  • Luk Van Parijs (biology) – multiple retractions and criminal conviction for grant fraud
  • Weishui Weiser (immunology) falsified data in biomedical research supported by two Public Health Service (PHS) grants.

    Non-institutional and non-corporate research

  • Michael LaCour, a graduate student in political science at UCLA, was the lead author of an article published in Science with Donald P. Green (2014) that purported to demonstrate that it is possible to change people's minds on socially divisive issues such as gay marriage via direct contact and conversations with gay people. The study's findings made international headlines and received wide media attention. It was critiqued by Donald P. Green's students David Broockman, Joshua Kalla, and Peter Aronow in a report of May 19, 2015, titled "Irregularities in LaCour (2014)". Co-author Donald Green subsequently requested that the paper be retracted. This was despite the fact that Donald Green, as senior author on the Science paper, had certified that he had examined the raw/original data on his Science/AAAS Authorship Form and Statement of Conflicts of Interest. The Science terms include, prominently, the following statement: "The senior author from each group is required to have examined the raw data their group has produced." On May 28, 2015, the study was retracted by Science without the agreement of the lead author.
  • Andrew Wakefield, who claimed links between the MMR vaccine, autism and inflammatory bowel disease. He was found guilty of dishonesty in his research and banned from medicine by the UK General Medical Council following an investigation by Brian Deer of the London Sunday Times.