Why We Need Whistleblowers–One-third US Scientists Admit Unethical Research Practices

Sat, 11 Jun 2005

Although the major focus of these AHRP Infomails has been the deplorable misconduct of the pharmaceutical industry and government health care agencies, whenever we have pointed a flashlight at academia and biomedical scientists, we have raised the discomfort level of apologists who regard anyone who criticizes scientific misconduct as a yahoo endangering the research enterprise.

The fact is, Big Pharma could not possibly have succeeded in undermining the integrity of American medicine without the complicity of leading academics at premier medical institutions–including Harvard, Yale, Columbia, Johns Hopkins, the University of California, and the National Institutes of Health.

A survey of 3,000 government-funded scientists, who responded anonymously, has been published in the journal Nature, confirming our observations and the underlying reason for a continuing stream of news reports about medical research scandals. Brian C. Martinson, the lead author of the study, said: “We found a striking level and breadth of misbehavior…this really causes us to call into question the assumption that it is just a few bad apples.”

The scientists admitted to “circumventing the rules on using human subjects in experiments, and not properly disclosing ties with companies.”

The Boston Globe notes: “Trust and integrity lie at the heart of the scientific process, with published experimental results making careers, determining whether scientists win research grants, and shaping spending priorities in the nearly $30 billion budget of the National Institutes of Health. At a time when scandals have shaken the worlds of business, politics, and journalism, the authors of the new report said that similar factors — such as intense competition and human failings such as greed and cynicism — threaten the fundamental working of science.”

Indeed, those who conducted the survey said that the problem goes well beyond the egregious cases that the government is authorized to investigate.

Scientists can no more be trusted than their partners in industry. Without viable independent checks and balances to monitor and enforce compliance with ethical standards in health care, research, business, and government, we must rely on whistleblowers and lawyers to rein in major misconduct.

Contact: Vera Hassner Sharav
212-595-8974

The Boston Globe
Surveyed scientists admit misconduct
One-third cite research tactics
By Gareth Cook, Globe Staff | June 9, 2005

A third of American biomedical scientists have engaged in questionable research practices, according to survey results released yesterday that raise questions about the integrity of the nation’s multibillion-dollar quest to understand the human body and cure diseases.

The study, based on a survey of about 3,000 government-funded scientists, is the first broad, quantitative examination of misconduct that asked researchers to admit their own misdeeds. The scientists, who participated anonymously, were asked whether they had done any of 33 actions in the three years before the 2002 survey. Asked about the most serious misconduct, 0.3 percent said they had falsified data, and 1.4 percent said they had used another’s ideas without gaining permission or giving credit. In addition, 15.5 percent said they had changed how they conducted an experiment or its results in response to pressure from a funding source, raising the prospect that companies are influencing scientific papers to support their commercial interests. The scientists also admitted a range of other misdeeds, such as circumventing the rules on using human subjects in experiments, and not properly disclosing ties with companies.

“We found a striking level and breadth of misbehavior,” said lead author Brian C. Martinson, a researcher at HealthPartners Research Foundation in Minneapolis. “I think this really causes us to call into question the assumption that it is just a few bad apples.”

There is no way, said Martinson, to gauge how much of the nation’s research was compromised by the misconduct. And several specialists on scientific conduct said that it was difficult to know from the study how common scientific misbehavior is because many of the questions were worded vaguely, and could include behavior that is not objectionable. For example, a scientist might have changed the design of an experiment after a legitimate suggestion from a government funding source.

But the specialists welcomed the work, which was published in the journal Nature, saying more research like it is needed at a time when science is becoming increasingly commercialized.

Trust and integrity lie at the heart of the scientific process, with published experimental results making careers, determining whether scientists win research grants, and shaping spending priorities in the nearly $30 billion budget of the National Institutes of Health. At a time when scandals have shaken the worlds of business, politics, and journalism, the authors of the new report said that similar factors — such as intense competition and human failings such as greed and cynicism — threaten the fundamental working of science. They said the problem goes well beyond the egregious cases that the government is authorized to investigate.

Editors of prominent medical journals have been increasingly vocal about financial conflicts of interest that they say are hampering science, causing researchers to hype positive results and downplay negative ones. Yet the topic of misconduct tends to make scientists uneasy, and that has led to a dearth of research on the subject.

“I think this is a very important step,” said C. K. Gunsalus, a special counsel at the University of Illinois at Urbana-Champaign, and one of the nation’s leading specialists on research integrity. “The discomfort means that we don’t like to talk about it, and that means we don’t have good data.”

Surveying misconduct was controversial even before the current study was done. In 2002, the US government’s Office of Research Integrity proposed conducting a survey of scientific misconduct, but several scientific groups, including the Association of American Medical Colleges, objected. They said that the survey questions were vague and might be misused, and that the federal government’s role should be restricted to policing fabrication, falsification, and plagiarism.

The government study was eventually canceled, and that same year the editors of Nature harshly criticized the scientific groups for their role in stopping it, saying they gave “a good impersonation of aged, out-of-touch special interests with something to hide.”

The survey reported yesterday was done with government funding, including money from the Office of Research Integrity, but it was conducted by an independent scientific team. An official with the Association of American Medical Colleges, which represents some of the nation’s leading biomedical research institutions, said the group had no objection to the survey being done this way, but she declined to comment on the results of the study, saying she had not had time to review it carefully.

Martinson said his team designed the survey based on interviews with scientists about the kinds of misbehavior they believe are most common. In these interviews, he said, he was surprised at how candid scientists were in describing a wide range of problems. Some said they felt guilty crossing ethical lines, but that they needed to in order to succeed. One scientist, he said, described coming across a case where his own work had been systematically plagiarized, but the scientist did not report it because the person who had done it was a powerful figure in the field.

The team designated 10 of the behaviors as the most serious types of misconduct, based on interviews with officials at universities who oversee research integrity. Thirty-three percent of scientists admitted to at least one of these 10 behaviors in the three years before the survey, according to the paper.

In the report, titled “Scientists Behaving Badly,” the most common misbehavior was making changes in response to pressure from a funder. There have been cases, now public, where drug firms have pressured scientists to rewrite or not publish papers because they would harm the market for one of their products.

Two of the most common practices found in the survey are likely to raise red flags because they hint at a breakdown of the basic checks and balances that are supposed to correct the scientific record. Of the scientists surveyed, 12.5 percent admitted to “overlooking others’ use of flawed data or questionable interpretation of data,” and 6 percent admitted to “failing to present data that contradict one’s own previous research.” The paper also reported other behaviors beyond what it called the “top 10” most serious offenses. Ten percent admitted to “inappropriately assigning authorship credit” and 15.3 percent admitted to “dropping observations or data points from analyses based on a gut feeling that they were inaccurate.”

But because of the vagueness of many of the questions, it is impossible to know how serious an infraction the scientists were admitting to, or even if it was an infraction, said Dr. Drummond Rennie, a deputy editor of the Journal of the American Medical Association who has been a longtime advocate for more study of misconduct. Rennie said that he welcomed the work and hoped there would now be more rigorous study of the issue.

Another problem, Rennie and others said, is that the survey relies on scientists to report on themselves, and even with the promise of anonymity, the results depend on the honesty of the people filling it out. Also, only about half of the scientists responded to the survey, which was mailed.

Martinson said that he agreed there were flaws in the study, but that he hoped it would inspire more discussion of the problem.

“I don’t have all the answers,” Martinson said. “What I think I have here is some evidence that suggests we need to begin a more broad-based conversation.”

Scientists study scientists behaving badly
Bad practices pose threat to integrity of profession
The Associated Press
Updated: 1:59 p.m. ET June 8, 2005

WASHINGTON – While rare cases of scientific fraud grab headlines, more mundane misbehaviors are so common among researchers that they pose a threat to the integrity of the scientific enterprise, a new report asserts.

One-third of scientists surveyed said that within the previous three years, they’d engaged in at least one practice that would probably get them into trouble, the report said. Examples included circumventing minor aspects of rules for doing research on people and overlooking a colleague’s use of flawed data or questionable interpretation of data.

Such behaviors are “primarily flying below the radar screen right now,” said Brian C. Martinson of the HealthPartners Research Foundation in Minneapolis, who presents the survey results with colleagues in a commentary in Thursday’s issue of the journal Nature.

Scientists “can no longer remain complacent about such misbehavior,” the commentary says.

But “I don’t think we’ve been complacent,” said Mark S. Frankel, director of the Scientific Freedom, Responsibility & Law Program at the American Association for the Advancement of Science.

Frankel, who wasn’t involved in the survey, said its results didn’t surprise him. But he said that the survey sampled only a slice of the scientific community and shouldn’t be taken as applying to all scientists.

More than 3,000 scientists surveyed

The survey included results from 3,247 scientists, roughly 40 percent of those who were sent the questionnaire in 2002. They were researchers based in the United States who’d received funding from the National Institutes of Health. Most were studying biology, medicine or the social sciences, with others in chemistry and a smaller group in math, physics or engineering.

Of the 10 practices that Martinson’s study described as the most serious, less than 2 percent of respondents admitted to falsifying data, plagiarism or ignoring major aspects of rules for conducting studies with human subjects. But nearly 8 percent said they’d circumvented what they judged to be minor aspects of such requirements. The survey questions didn’t name those specific points.

Nearly 13 percent of those who responded said they’d overlooked “others’ use of flawed data or questionable interpretation of data,” and nearly 16 percent said they had changed the design, methods or results of a study “in response to pressure from a funding source.”

Confusing questions?

Martinson said the first question referred to other researchers in their own lab, and the second question referred to pressure from companies funding their work.

But David Clayton, vice president and chief scientific officer at the Howard Hughes Medical Institute, which focuses on biomedical research, said he found both questions worded so vaguely that they could be referring to perfectly acceptable activities.

Clayton also says it’s not clear whether the behaviors addressed in the survey have been increasing or declining over time.

© 2005 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

PLoS Medicine
Why we whistleblowers are passionate in our convictions and grateful for assistance to reveal them
Stefan P Kruszewski MD
Published: 10 June 2005

Whistleblowers serve no function if they cannot tell their stories. The present story of whistleblowing, as revealed in part in PLoS Medicine, involving the pharmaceutical industry, pharmaceutical benefit management corporations, the managed care industry, and the political and lobbying forces that zealously guard their secrets, cannot be told without the help of courageous men and women. (1-2) For that reason, those of us who congregated in Washington, DC, on May 15th, 2005, at the invitation and with the support of the Public Library of Science (PLoS) and the Government Accountability Project feel particularly humbled by, and grateful for, your reception. Our convictions could not be aired were it not for the essential First Amendment work of responsible journalists who exemplify the best in investigatory research.

For me, whistleblowing is not a theoretical exercise. It has a face. In fact, it has many faces: children and adults who have been injured or killed by misrepresented pharmaceuticals; clinical research trial results that have been sequestered from the scientific community and whose incomplete findings cause injury; and pharmaceuticals that are detailed to physicians and the public not to save lives or necessarily to improve the health or welfare of the recipients, but to make money.

In the lonely and, at times, discouraging world of whistleblowing, we whistleblowers are passionate, and often successful, because our efforts have a different goal than those of the corporations and political interests whose operations we occasionally challenge. Our goal is to tell the truth. That honest effort is the source of any ethical difference we can or might earn. It is the basis for the power of the whistleblower, a power that can withstand the assault of unprecedented odds against its voice being heard: that sum of political power, expediency, and money. Whistleblower success depends upon competent and articulate media like PLoS. The debate to improve the status quo, be it in pharmaceutical marketing or managed care decision-making, cannot proceed or flourish without it.

Ralph Waldo Emerson, American essayist and philosopher, 1803-1882, commented about success. I’ve adapted his comments for all of us who gathered in Washington in mid-May 2005: “To leave the world a bit better, whether by a healthy child, a garden patch or a redeemed social condition; To know even one life breathed easier because you have lived; this is to have succeeded (as a whistleblower).”

Stefan P. Kruszewski
Psychiatrist
Harrisburg, PA

Competing Interests: I declare that I have no competing interests.
Submitted Date: 10 June 2005

(1) Barbour V, Cohen B, Yamey G (2005) Why PLoS Sponsored a Roundtable of Medical Whistleblowers. PLoS Med 2(7): e208.

(2) Lenzer J (2005) What Can We Learn from Medical Whistleblowers? PLoS Med 2(7): e209.

FAIR USE NOTICE: This may contain copyrighted (©) material the use of which has not always been specifically authorized by the copyright owner. Such material is made available for educational purposes, to advance understanding of human rights, democracy, scientific, moral, ethical, and social justice issues, etc. It is believed that this constitutes a ‘fair use’ of any such copyrighted material as provided for in Title 17 U.S.C. Section 107 of the US Copyright Law. This material is distributed without profit.