Protecting People with Mental Disabilities and Impairments against Biomedical Research Abuse

Presented at the 29th International Congress on Law and Mental Health, Paris, France, July 6, 2005

John H. Noble, Jr., Ph.D., The Catholic University of America, Washington, DC, USA

Vera Hassner Sharav, MLS, Alliance for Human Research Protection, New York, USA

People with mental disabilities and impairments historically have been targeted by biomedical researchers and their governmental and industrial sponsors for exposure to experiments that impose high risks without offsetting therapeutic benefits, on the theory that the future good of society justifies what was done to them. The final report of the Advisory Committee on Human Radiation Experiments (ACHRE)[1] systematically reviewed and condemned much of what was done to institutionalized persons with mental retardation at such places in the United States as Willowbrook in New York State and the Walter Fernald School in Massachusetts. Abusive biomedical research has continued since the publication of the ACHRE report and persists to this day.

This paper reviews several recent cases, including symptom provocation experiments on schizophrenic patients; respirator tidal volume experiments on cognitively impaired and unconscious patients with Acute Lung Injury (ALI) and Acute Respiratory Distress Syndrome (ARDS); experiments on children with HIV under municipal or state guardianship; and lead poisoning experiments on healthy poor children that caused neurological impairments. It notes the ubiquitous invocation of “surrogate consent” to enrol persons incapable of understanding the risks and benefits of the research to which they are subjected. Accordingly, we make a number of recommendations to strengthen the protections afforded to people with mental disabilities and impairments, including the definition of these human subjects as members of a protected class who require judicial registration and the appointment of a court officer to assess the benefits and risks of proposed research and to monitor how the research is actually conducted.

Existing Ethical Standards

The sources of ethical standards governing U.S. research that involves human subjects are: the Nuremberg Code,[2] the Declaration of Helsinki,[3] and the Belmont Report of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research,[4] on which current federal law and regulations are based.[5]

The Nuremberg Code was promulgated by the WWII war crimes tribunal in reaction to the atrocities committed by Nazi physicians on concentration camp inmates before and during the war. The Code lays down the basic conditions and limits under which experiments on human subjects are permitted. First and foremost is that the human subject be legally capable of giving voluntary informed consent to participation “without the intervention of any element of force, fraud, deceit, duress, overreaching, or other ulterior form of constraint or coercion; and should have sufficient knowledge and comprehension of the elements of the subject matter involved as to enable him to make an understanding and enlightened decision” (Principle 1). Second, the human subject “should be at liberty to bring the experiment to an end if he has reached the physical or mental state where continuation of the experiment seems to him to be impossible” (Principle 9). Third, the experiment must offer the prospect of yielding “fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature” (Principle 2). Fourth, the risk entailed should “never exceed that determined by the humanitarian importance of the problem to be solved by the experiment” (Principle 6). Fifth, the experiment should be terminated if the scientist in charge “has probable cause to believe, in the exercise of good faith, superior skill and careful judgment required of him that a continuation of the experiment is likely to result in injury, disability, or death to the experimental subject” (Principle 10).[2]

The Declaration of Helsinki of the World Medical Association builds on the Nuremberg Code and amplifies its provisions with respect to the conduct of (1) medical research combined with professional care and (2) non-therapeutic research involving human subjects. Generally, Principle 6 of the Declaration enunciates “the right of the research subject to safeguard his or her integrity” and the duty of the scientifically qualified persons under the supervision of a clinically competent medical person “to respect the privacy of the subject and to minimize the impact of the study on the subject's physical and mental integrity and on the personality of the subject.” Principle 7 indicates that “physicians should abstain from engaging in research involving human subjects unless they are satisfied that the hazards involved are believed to be predictable” and that they “should cease any investigation if the hazards are found to outweigh the potential benefits.” Principle 9 affirms that “each potential subject must be adequately informed of the aims, methods, anticipated benefits and potential hazards of the study and the discomfort it may entail; that he or she is at liberty to abstain from participation in the study and that he or she is free to withdraw his or her consent to participation at any time.”

The Declaration addresses a number of issues on which the Nuremberg Code is silent. Principle 8 requires that the physician “preserve the accuracy of the results” in publications and that reports “not in accordance with the principles laid down in this Declaration should not be published.” With respect to medical research combined with professional care, the Declaration states that “the potential benefits, hazards and discomfort of a new method should be weighed against the advantages of the best current diagnostic and therapeutic methods” and that “every patient, including those in a control group, if any, should be assured of the best proven diagnostic and therapeutic method.” With respect to non-therapeutic biomedical research involving human subjects, the Declaration asserts that the physician is “to remain the protector of life and health of that person on whom biomedical research is carried out;” that the subjects should be volunteers regardless of health status; and that “the interest of science and society should never take precedence over considerations relating to the well-being of the subject.”[3]

The Belmont Report of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research expands the scope of the Nuremberg Code and the Declaration of Helsinki to include behavioural research involving the use of human subjects. It adopts a three-fold classification of the basic ethical principles governing biomedical and behavioural research involving the use of human subjects in terms of (1) respect for persons, (2) beneficence, and (3) justice. The Belmont Report's concern for “justice” goes beyond the provisions of the Nuremberg Code and the Declaration of Helsinki – explicitly reflecting sensitivity to the infamous Tuskegee Institute study of the progression of untreated syphilis among poor African-American males well after the discovery of penicillin.[6]

Under the rubric, “respect for persons,” the Belmont Report recognizes two related ethical values – (1) the human subject's right to autonomy or self-determination in deciding whether or not to participate in a specific biomedical or behavioural research endeavour and (2) the need to protect those with diminished ability to decide. In doing so, the Belmont Report reformulates Principle 1 of the Nuremberg Code and combines Principles 6 and 9 of the Declaration of Helsinki.

According to the Belmont Report, “beneficence” is the obligation to do no harm and to maximize possible benefits while minimizing possible harms. The obligation extends beyond the immediate human subject. Specific research endeavours are justified by future benefit to society through discovery of new and better ways of preventing or treating illness or by promoting healthy development – even though the human research subject may not directly benefit. The Belmont Report recognizes the ethical dilemma that arises when exposing healthy human subjects with no prospect of direct benefit to more than minimal risk for the possible future good of society.[4]

By permitting possible harm to occur to otherwise healthy human subjects – albeit for the possible future good of society – the Belmont Report qualifies and curtails the application of both Principle 4 of the Nuremberg Code, which requires that research be conducted so as to avoid all unnecessary physical and mental suffering and injury,[2] and Principle 6 of the Declaration of Helsinki, which unambiguously declares that “the interests of the subject must always prevail over the interests of science and society.”[3] By permitting individual researchers to take into account the always speculative “possible future good of society” when deciding to do research on otherwise healthy human subjects, the Belmont Report opened the Pandora's Box of rationalizations in the United States about what constitutes “minimal” and “more than minimal risk” in such research. At the same time, it raised a red flag about the ethics of research that cannot provide an answer to the question that justified its undertaking because of poor design or faulty implementation and, increasingly, the suppression, misrepresentation, or paring of findings in published reports.[7] [8] [9] [10] [11] [12] Faulty or distorted research cannot contribute to the possible future good of society and thus fails the test of beneficence.

The Belmont Report defines “justice” as “fairness in distribution” of the benefits and burdens of research. It also states that “equals ought to be treated equally.” Examples of inequalities are offered to give context and clarity to what is meant by the fair and unfair distribution of benefits and burdens. They include the selection of vulnerable populations, such as welfare patients, specific racial and ethnic minorities, or institutionalized populations, for research use. Easy availability, compromised status, and manipulability are the cited reasons for their selection. It is unfair for human research subjects to be used to develop high quality medical care that others can afford to pay for but they cannot. Unfortunately, “justice” in the selection of research subjects remains an elusive goal – given the economics of medical care in the United States, which relies largely on ability to pay to ration a scarce resource.[13] The easy availability, compromised status, and manipulability of human subjects are an ever-present moral hazard for biomedical researchers in the search for sufficient numbers of subjects to populate expanding clinical trials.

Recent Research Abuses

We describe as abusive any research that violates the normative standards of the Nuremberg Code, the Declaration of Helsinki, or the Belmont Report. We use the three-fold Belmont Report classification to denote the kind of violation – dignity (respect for persons), beneficence, justice, or a combination.

Abuse of Dignity

Psychiatric symptom provocation experiments: In direct violation of Principle 1 of the Nuremberg Code and Principle 6 of the Declaration of Helsinki, a large number of symptom provocation experiments have been conducted in various places[14] to induce psychosis and “flash-back” in severely disabled psychiatric patients, many of whom were incoherent and psychotic when recruited at the time of admission to a state mental hospital. Principle 1 of the Nuremberg Code states: “The voluntary consent of the human subject is absolutely essential . . . the person involved should have legal capacity to give consent.” Principle 6 of the Declaration of Helsinki states: “The right of the research subject to safeguard his or her integrity must always be respected. Every precaution should be taken . . . to minimize the impact of the study on the subject's physical and mental integrity and on the personality of the subject.”

Clearly, patients who are subjected to abrupt withdrawal or washout of their prescribed anti-psychotic medications to induce a reported 40% to 67% rate of relapse suffer an assault on their physical and mental integrity and personality. Some symptom provocation experiments have used chemical probes, such as amphetamine, L-dopa, methylphenidate and the PCP-derivative ketamine, among others, to provoke psychosis. Other experiments have exposed detoxified U.S. war veterans to the addictive drugs amphetamine and cocaine. None of these experiments has served a therapeutic purpose; rather, they have diminished the health status of patients who mistakenly perceived their physician-researcher as healer.[15] Indeed, the legal capacity of mentally impaired patients to give informed consent, as required by Principle 1 of the Nuremberg Code, is arguably suspect in many if not most cases.

Ventilation Pressure Experiments: In direct violation of Principle 1 of the Nuremberg Code and Principle 6 and Clinical Research Principle 3 of the Declaration of Helsinki, the National Institutes of Health (NIH) and private, for-profit sources funded 12 major American research centers to conduct an experiment[16] comparing the effects of two static, extreme ventilator tidal volume levels on 861 critically ill patients with acute respiratory distress syndrome (ARDS). Most patients were too incapacitated to give informed consent. There was no control group receiving the individualized standard treatment provided by primary care physicians. Clinical Research Principle 2 of the Declaration of Helsinki states: “The potential benefits, hazards and discomfort of a new method should be weighed against the advantages of the best current diagnostic and therapeutic methods.” A formal complaint by the Alliance for Human Research Protection (AHRP) about the ethics and methodology of the study led to its suspension and that of another related study involving patients with Acute Lung Injury (ALI).[17] Investigation by the Office for Human Research Protections (OHRP), the ethics oversight agency of the U.S. Department of Health and Human Services (HHS), found that many ARDS patients had been enrolled without personal informed consent or surrogate consent by a legally authorized agent.

Internal NIH critics[18] of the experiment argued that both the extreme low tidal volume and the extreme high tidal volume, which the researchers defined as “traditional,” were outside the range of what most physicians normally prescribe and were, therefore, incorrectly characterized. In effect, the study compared two static experimental treatments without benefit of a control group receiving the individualized standard treatment of a primary care physician. The critics argued further that all human subjects in the study might have been endangered by random assignment to static experimental ventilator tidal volumes either higher or lower than those prescribed under the individualized standard of care. Reanalysis of the original data by these same critics[19] confirmed that some ARDS patients had indeed been harmed. The critics also offered an explanation. After enrollment in the trial, patients' earlier routine care was stopped. Patients who received high tidal volumes had a mortality rate of 41 percent, compared with the lower rate of 32 percent among eligible patients who received routine standard care. In patients with more injured, less elastic lungs, increasing tidal volumes above routine care levels after randomization increased mortality rates. Similarly, in patients with less injured, more elastic lungs, reducing tidal volumes below routine care levels also increased mortality.[20]

Foster Care Children Experiments: In direct violation of Principle 1 of the Nuremberg Code and Principle 6 of the Declaration of Helsinki, more than four dozen Phase I and Phase II drug trials were conducted on infants and children who were either diagnosed with HIV infection or, in some cases, “presumed” to be infected. These children were in the foster care guardianship of the New York City Administration for Children's Services (ACS) and of child welfare agencies in at least seven states – Illinois, Louisiana, Maryland, New York, North Carolina, Colorado and Texas. Phase I and Phase II experiments serve to test the safety, toxicity, and maximum dose tolerance of drugs and are not meant to have therapeutic value for the patient. They present the greatest level of risk and discomfort for human subjects. Contrary to Clinical Research Principle 5 of the Declaration of Helsinki, the studies neither obtained informed consent nor communicated the fact and rationale for not doing so to an independent committee for consideration, comment and guidance. Contrary to Clinical Research Principle 6, the studies could not be justified by reference to their potential diagnostic or therapeutic value for the patient. The estimated 700 to 1,400 largely African-American foster children enrolled in the experiments were reported to have suffered rashes, vomiting, precipitous drops in infection-fighting blood cells, and, in one study, a much higher death rate among those who received larger drug doses.[21]

Not only did the foster care children experiments violate Principle 1 of the Nuremberg Code and Principle 6 of the Declaration of Helsinki, they also ignored the U.S. Code of Federal Regulations (45 CFR 46.409 and 21 CFR 50.56) prohibition against subjecting children who are wards of the state to experiments involving greater than minimal risk, as well as the related provisions for granting an exception – subject to the appointment of an independent qualified advocate for each child in addition to any other individual acting on behalf of the child as guardian or in loco parentis.[22]

Lead Poisoning Experiments: In direct violation of Principle 1 of the Nuremberg Code and Principle 6 and Non-Clinical Biomedical Research Principles 1-4 of the Helsinki Declaration, the Kennedy Krieger Institute and Johns Hopkins University conducted an experiment on 108 poor, primarily African-American children to test the effects of varying levels of household lead paint dust exposure on their blood lead levels at 2, 6, 12, and 18 months, compared with unexposed children. The developing brains of children under 5 years of age are known to be particularly sensitive to even low blood levels of lead, which have been linked to reduced intelligence (IQ).[23] Sadly, the experiment was jointly sponsored by the U.S. Environmental Protection Agency and the Maryland Department of Housing and Community Development. The economic payoff for slum landlords, and for the city governments throughout the U.S. that collect their taxes, would be enormous if a policy of less than 100 percent lead paint abatement could be justified by research disclosing a possible minimal safe level.

Two families sued the Kennedy Krieger Institute for negligent harm and were initially rebuffed by a lower court, even before the plaintiffs' attorneys could complete gathering information. On appeal, the Maryland Court of Appeals sent their cases back to the lower court for trial, with scathing criticism of the Kennedy Krieger Institute and Johns Hopkins University, under whose supervision the study was conducted.[24] The Court of Appeals reaffirmed the applicable provisions of the Nuremberg Code and the Declaration of Helsinki – especially as they relate to non-therapeutic research on healthy human subjects.[25] Non-therapeutic research still requires: (1) that the physician “remain the protector of the life and health of that person on whom biomedical research is being carried out,” (2) that “subjects should be volunteers,” (3) that research already initiated be discontinued if it is later determined that it may be harmful to the individual, and (4) that “the interest of science and society should never take precedence over considerations related to the well-being of the subject.”

The Maryland Court of Appeals explicitly compared the lead poisoning experiment to the infamous Tuskegee, Alabama, experiment that withheld effective treatment from black men infected with syphilis. It further castigated the Kennedy Krieger Institute for failing to inform parents of the risks of participation – in effect, using their children as “canaries in the mine.” Particularly telling was the Court's criticism of the researchers and the Johns Hopkins University Institutional Review Board (IRB) for working together “to miscast the characteristics of the study in order to avoid the responsibility inherent in non-therapeutic research involving children.” In the Court's opinion, this was no case of innocent error but one of calculated malfeasance. The Court further dismissed the defendants' argument that the parents had consented to their children's participation in the study, denying that parents have the right to put their children in harm's way:

Otherwise healthy children, in our view, should not be enticed into living in, or remaining in, potentially lead-tainted housing and intentionally subjected to a research program, which contemplates the probability, or even the possibility, of lead poisoning or even the accumulation of lower levels of lead in blood, in order for the extent of the contamination of the children’s blood to be used by scientific researchers to assess the success of lead paint or lead dust abatement measures. Moreover, in our view, parents, whether improperly enticed by trinkets, food stamps, money or other items, have no more right to intentionally and unnecessarily place children in potentially hazardous nontherapeutic research surroundings, than do researchers. In such cases, parental consent, no matter how informed, is insufficient.[25] (para. 21)

Abuse of Beneficence

As previously mentioned, faulty or misrepresented research cannot contribute to the future good of society and thus fails the test of beneficence. Unfortunately, faulty and misrepresented research is a growing rather than diminishing problem in the biomedical research community.[7-12] One recent survey[8] indicates that, overall, one-third of biomedical researchers admit to engaging in one or more kinds of unethical behaviour in the previous three years – scientists in early career reporting 28 percent and those in mid-career reporting 38 percent. Perversely, practice appears to perfect the bad behaviour of scientists as they gain experience. The survey data call into question the common belief of many U.S. government funding and regulatory agency officials that ethics education is the key to improving the protection of human research subjects. Truth be told, scientists cannot help but notice the lack of consequences for ethics violations and may well be encouraged by current policy and practice to adopt a cavalier attitude toward both the protection of human subjects and the integrity of science. They may be adopting the prevailing business practice of weighing the cost of settling lawsuits for harm done to consumers or the environment against the profits to be made.

Unethical or marginally ethical behaviours erode the integrity of science and should be of great concern to everybody who values the scientific enterprise. Such behaviour obstructs scientific progress by encouraging false leads or by temporarily or permanently blocking promising avenues of investigation. Particularly obnoxious is the trade secret status of adverse events in U.S. law, which threatens the lives and safety of human subjects and the public health.[7] Physicians and consumers are owed full, complete, and accurate information about the benefits and risks of drugs and medical devices to guide their safe use. The all-too-common practice of changing the design, methodology or results of a study under pressure from a funding source, or of not disclosing others' use of flawed data or questionable interpretations of data, suggests widespread collusion in deceit and deception.[8] Public reaction to increased scientific abuse revealed by multiple sources could result in major cuts in government research budgets. Fraud and abuse in public programs invite disillusionment, cutbacks and disinvestment in needed biomedical science and thereby threaten the future well-being of society.

Abuse of Justice

Despite the ACHRE[1] exposure and condemnation of research abuses at the Walter Fernald School and Willowbrook, vulnerable institutionalized populations subject to the surrogate consent of their caretakers are still viewed as targets of opportunity by some members of the biomedical research community. Many of the psychiatric symptom provocation experiments targeted institutionalized psychiatric patients or war veterans served by U.S. federal Veterans Administration (VA) facilities and programs. The AIDS child experiments recruited poor African-American children in foster care who had been placed by family courts into the protective custody of state or municipal governments. The child lead poisoning experiment recruited poor, primarily African-American children who, while not institutionalized, had surrogate-consenting parents who were susceptible, in the opinion of the Maryland Court of Appeals, to enticement by trinkets, food stamps, money or other items. By way of exception, poverty and race did not figure into the recruitment of patients for the ARDS ventilator tidal volume experiment; the inclusion and exclusion criteria were strictly physiological.[26] It was the severe cognitive impairment or unconscious state of the subjects that was the source of the exploitable vulnerability and of the moral hazard for the researchers.

In each of these cases of research abuse, the responsible Institutional Review Board (IRB) failed in its duty to apply and oversee implementation of the applicable U.S. law and regulations governing the protection of human research subjects. Media and congressional attention[27] to research abuses in the U.S. and elsewhere, some believe, is increasing the difficulty and cost of recruiting sufficient numbers of human subjects for government and private enterprise biomedical research – so much so that American pharmaceutical companies are planning to move new clinical trials overseas to locations in Eastern Europe, South America and India.[28] Whatever the explanation, the trend portends a worsening of biomedical research abuses of human subjects outside of the U.S. and, indeed, may represent a new form of colonialism in view of India's recent loosening of rules governing clinical trials in response to demands from multinational drug companies seeking relief from the “strict regulations, elaborate safety and compensation requirements, and small populations” in Western countries.[29] Africa's scientific elite is being asked to speak out against new clinical trials that exploit vulnerable populations in their countries.[30]

Recommended System Reforms

Grouped by reference to the Belmont Report ethical principle primarily addressed, we make a number of recommendations for system reform and give their rationale. There is no claim of originality. Others have recognized the same flaws in the structure and functioning of the existing system of protection for human subjects in biomedical and behavioural research. Resistance to the adoption of these recommendations is expected from those who benefit from the system as it now operates. The opposition has “deep pockets” and is well positioned to resist.[31] [32] As recently as June 28, 2005, the U.S. Environmental Protection Agency issued proposed regulations that would permit testing the safety of pesticides on children, pregnant women, and newborns while rejecting establishment of an independent ethics review board because it would “unnecessarily confine EPA's discretion.”[33]

Dignity

1. Define vulnerable populations, including persons requiring surrogate consent because of cognitive impairment, as a protected class in need of judicial oversight under uniform federal law.

Rationale: While current federal regulations define “vulnerable subjects” who require “additional safeguards” (45 CFR 46.111; 21 CFR 56.111), including informed consent from a “legally authorized representative” (45 CFR 46.116; 21 CFR 56.116), the safeguards are unclear, and most states have not specified who is authorized to provide informed consent.[34] As our case examples illustrate, continuing uncertainty permits variable interpretations, which often reflect the interest of the researchers and their sponsors rather than the best interest of the human subjects.

2. Require appointment by a court of an ombudsman for members of vulnerable populations, including those who require surrogate consent, to assure Institutional Review Board (IRB) and researcher compliance with all federal regulations, from inception of the research to publication and archiving of reported findings and the original data on which statistical or other inferences rely.

Rationale: The 30-year history of IRB review and oversight of biomedical and behavioural research reveals weaknesses that require an additional overlay of protections for human subjects. There was IRB review and approval in all of our case examples. In the child lead poisoning case, the Maryland Court of Appeals castigated the Johns Hopkins University IRB for coaching the researchers on how to evade federal regulations to protect human subjects.[25] Located in the very institutions that compete for government and private, for-profit research funds, IRBs are not free and independent agents that can look solely to the protection of the best interest of human subjects. One might legitimately ask about the feasibility of creating a new layer of protection for vulnerable populations. Existing federal regulatory and oversight agencies are not adequately funded and staffed to monitor IRB performance and rely instead on outside complaints to detect flagrant ethics violations.

What would the new layer of protection add, and how could it be financed? First, it would add a second opinion and thereby break the IRB monopoly on determining compliance with federal law and regulations. Further, the threat of being found wanting would likely improve the quality of IRB review and oversight of individual research projects. Second, the sponsors of biomedical research have enormous resources at their disposal and can well afford to pay for court-appointed ombudsmen and the associated administrative costs. Some of the millions of dollars now spent on product advertising and promotion by pharmaceutical companies could be diverted to improving protections for the human research subjects who made those products marketable in the first place. If existing federal law permits industry to reimburse the U.S. Food and Drug Administration (FDA) for the expense of reviewing applications for approval of new drugs or medical devices,[35] why not permit payment for additional needed protections of human research subjects?

A low-cost complementary addition would be the strengthening of Qui Tam[36] and whistle-blower protection laws, which encourage insiders to reveal violations of federal law and regulations. The federal False Claims Act of 1863, as amended in 1986, could be interpreted or extended to define violation of the 45 CFR 46 and 21 CFR 50 human research subject protection provisions as evidence of “fraud and abuse.” This change would reward whistle-blowers who come forward to reveal research abuses, including employees of private, for-profit and non-profit entities that benefit from FDA approval of applications for new drugs or medical devices.

3. Amend existing Social Security Act Titles IV-B and IV-E child protection and foster care provisions to require training of child protection and foster care agencies and workers about the appropriate use of surrogate consent for enrolment of children in foster care in biomedical and behavioural research.

Rationale: As the unethical experiments involving foster care children clearly demonstrate, the child protection agencies involved were either ignorant of or chose to ignore the relevant federal regulations (45 CFR 46.409 and 21 CFR 50.56) that specify the conditions under which children who are wards of the state or of any other agency, institution, or entity can be included in greater than minimal risk research. They should have known that the children could have been legitimately enrolled only if the research had been approved under 45 CFR 46.406 or 45 CFR 46.407 as (1) related to their status as wards or (2) conducted in schools, camps, hospitals, institutions, or similar settings in which the majority of children involved as subjects were not wards. They should also have known that the federal regulations require, in addition to any other individual acting on behalf of the child as guardian or in loco parentis, appointment by the IRB of a qualified, independent advocate for each child who is a ward to assure that the child's best interests are served during the time of participation in the research.

Beneficence

4. Abolish trade secret status for adverse events.

Rationale: Granting protected trade secret status to adverse events that must otherwise be reported pursuant to 21 CFR 312.32 investigational new drug (IND) safety reports contradicts the very purpose of the provision, which is to secure the public health. As argued by Dr. Jerome Hoffman, Professor of Medicine and Emergency Medicine, University of California, Los Angeles, it is “unconscionable that the FDA appears to be prevented by law from carrying out what we all surely believe is its primary role in this process, which is to safeguard the interests of the public. The fact that the law not only does not make this a requirement – of the drug company itself, no less of the FDA – but that it actually makes it forbidden, clearly turns the function of this government agency on its head from protector of the public health to protector of industry.”[7] Publication of partial or “asymmetrical” data in medical journals to protect trade secrets violates Principle 8 of the Declaration of Helsinki and blocks independent observation and replication of reported findings – one of the cardinal principles of empirical science.[37] The law in this regard makes worse an already compromised situation wherein the typically underpowered clinical trial with a small sample size is biased toward reporting “no statistically significant” evidence of adverse events.[38] As previously noted, trade secret status also frustrates communication of what physicians need to know about the benefits and risks of new drugs and medical devices in reaching decisions about their use for specific patients – a judgment in each case about the potential benefit versus the likely risk of an adverse reaction.
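
To see the arithmetic behind this bias, consider a minimal sketch in Python using purely hypothetical figures (an adverse event that truly occurs in 1 of every 500 exposed patients and a trial arm of 300 subjects; neither number is drawn from any study cited here):

    # Probability that an underpowered trial arm records no cases of a rare adverse event.
    # All figures are hypothetical and serve only to illustrate the point in the text.

    def prob_no_events(incidence: float, n_subjects: int) -> float:
        """Chance that a trial arm of n_subjects observes zero adverse events."""
        return (1.0 - incidence) ** n_subjects

    incidence = 1 / 500   # assumed true rate of the adverse event
    n_subjects = 300      # assumed number of exposed subjects in the trial arm

    chance = prob_no_events(incidence, n_subjects)
    print(f"Chance the trial records no such events: {chance:.0%}")  # roughly 55%

With these numbers the trial is more likely than not to observe no events at all and so to report “no statistically significant” risk; only much larger samples, or the pooling of fully disclosed data across trials, can reliably surface such harms.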

5. Mandate open-access publishing and archiving of original data for reanalysis and meta-analysis.

Rationale: Beneficence requires transparency and full disclosure of the knowledge that is obtained by the use of human subjects in biomedical and behavioural research. To withhold or misrepresent the findings of such research is to render futile the risks and burdens borne by the human subjects who made it possible, and flies in the face of Principle 2 of the Nuremberg Code that “the experiment should be such as to yield fruitful results for the good of society, unprocurable by other methods or means of study . . .” Further, in the view of a recent report of the UK House of Commons Health Committee,[39] (p. 98) “traditional secrecy in the drug regulatory process” and “closeness between regulators and pharmaceutical companies” have “deprived the industry of rigorous quality control and audit.” It has also been argued that full and open access to reported research findings, and to the original data on which they depend, is needed to facilitate meta-analysis and systematic reviews of the primary research on which the evidence-based practice of medicine depends.[38] An open-access publication policy assures unfettered communication of the results of research for use by all.[40]

6. Promote systematic end-product peer review and cumulative meta-analysis of biomedical research for dissemination by the Cochrane Collaboration.

Rationale: Despite outlays of billions of dollars per year for biomedical and behavioural research, there is insufficient investment in documenting the quality of the output. The size of the problem is large and growing, if one considers the following recently published statistics[8] about the behaviour of biomedical scientists in the previous three years: failing to present data that contradict one's previous research, 6%; overlooking others' use of flawed data or questionable interpretation of data, 12.5%; changing the design, methodology or results of a study in response to pressure from a funding source, 15.5%; inappropriately assigning authorship credit, 10%; withholding details of methodology or results in papers or proposals, 10.8%; using inadequate or inappropriate research designs, 13.5%; dropping observations or data points from analyses based on a gut feeling that they were inaccurate, 15.3%. Generally, mid-career scientists admitted these behaviours more frequently than early careerists.

End-product peer review taking the threats-to-inference approach[41] to documenting the quality of government-funded research, with wide dissemination of the results of review,[42] might well provide a powerful incentive for researchers to avoid being exposed as guilty of any of the aforementioned problem behaviours. The threats-to-inference approach focuses on research design elements, as well as on how the study was actually conducted, to determine whether there may be more plausible alternative explanations for the study's findings than the one presented by the investigator. Correlatively, greater investment by government funding agencies in cumulative meta-analysis of the findings reported by studies in their research portfolios – again, taking the threats-to-inference approach – could take advantage of the capabilities of the Cochrane Collaboration[43] to conduct and disseminate systematic reviews for use in evidence-based medical practice.

Such systematic reviews could be expanded to incorporate the requirement that reviewers check and control for author authenticity and financial conflicts – perhaps going so far as to suggest that some or all tainted reports be either thrown out or discounted by means of a suitably weighted sensitivity analysis. Indeed, the most conservative approach for combining studies in meta-analysis is to employ the laboratory or researcher as the smallest unit of analysis.[44] Contemporary information technology permits tagging of compromised authors and research institutions for use in evaluating subsequent publications. In this regard, the meta-analyst can utilize available reports of violations of the International Committee of Medical Journal Editors (ICMJE) standards on accountability, access to data, and control of publication.[45]
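
What such a weighted sensitivity analysis might look like is sketched below – an illustration, not a prescribed method: a fixed-effect, inverse-variance meta-analysis in which studies flagged for integrity or conflict-of-interest concerns are down-weighted or excluded. The study names, effect sizes, standard errors, and the 25 percent reliability factor are all hypothetical.

    # Hypothetical data; illustrates a suitably weighted sensitivity analysis only.
    from dataclasses import dataclass

    @dataclass
    class Study:
        name: str
        effect: float          # reported effect size (e.g., a log odds ratio)
        se: float              # standard error of the reported effect
        flagged: bool = False  # True if integrity or conflict concerns were identified

    def pooled_effect(studies, flag_weight=1.0):
        """Fixed-effect inverse-variance pooling; flagged studies have their weight
        multiplied by flag_weight (1.0 = full weight, 0.0 = excluded)."""
        weights = [(flag_weight if s.flagged else 1.0) / s.se ** 2 for s in studies]
        return sum(w * s.effect for w, s in zip(weights, studies)) / sum(weights)

    studies = [
        Study("Trial A", effect=-0.40, se=0.10),
        Study("Trial B", effect=-0.35, se=0.15),
        Study("Trial C", effect=-0.05, se=0.12, flagged=True),  # suspect report
    ]

    print("All studies at full weight:  ", round(pooled_effect(studies), 3))
    print("Flagged study at 25% weight: ", round(pooled_effect(studies, 0.25), 3))
    print("Flagged study excluded:      ", round(pooled_effect(studies, 0.0), 3))

Comparing the three pooled estimates shows directly how much a compromised report pulls the overall evidence toward or away from a finding – the kind of transparency that systematic end-product review would make routine.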

It is well known that commercialization of the biomedical research enterprise is the principal source of flawed research designs and manipulated presentation of findings.[46] [47] Government regulatory decisions have been known to facilitate the marketing strategies of individual companies. Most recently, the U.S. Food and Drug Administration (FDA) permitted a confounded study of a new drug, torcetrapib, only in combination with another patented drug, atorvastatin (Lipitor). The FDA decision created a monopoly by preventing sale of torcetrapib for separate use or in combination with a generic statin or with one available from a competing source.[48]

Systematic end-product peer review and meta-analysis taking the threats-to-inference approach are among the few non-draconian antidotes to behaviours that undermine the integrity of science while violating the ethical principle of beneficence required by the Nuremberg Code, the Declaration of Helsinki and the Belmont Report of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research.

Justice

7. Adopt deterrents in the form of monetary penalties for non-compliance with law and regulations for the protection of human research subjects to compel fair and honest practice.

Rationale: As these examples of research abuse illustrate, Institutional Review Board (IRB) voluntarism and self-policing have repeatedly failed to protect human subjects in biomedical and behavioural research. Practices that erode the integrity of science increase rather than decrease as biomedical scientists advance in their careers.[8] If nothing more punishing than a slap on the wrist can be expected, why change behaviour? Deterrents in the form of monetary penalties and suspension of research privileges for major non-compliance with law and regulations for the protection of human subjects seem needed to reinforce education in how to conduct ethical biomedical and behavioural research.

Conclusion

Given the poor track record of the existing U.S. system of human research subject protection, there is need for an additional layer of protection that defines people with mental disabilities and impairments as a protected class of vulnerable subjects who require the appointment of a court officer to oversee their enrolment and treatment in research. It is the most important of our recommendations for needed reform. At stake in all of this is the public trust that is absolutely essential for the advancement of biomedical science and the future good of society.

References

[1]Final Report of the Advisory Committee on Human Radiation Experiments. No. 061-000-00-848-9. Washington, DC. U.S. Government Printing Office, October 1995. Available at: http://www.eh.doe.gov/ohre/roadmap/achre/index.html Accessed June 30, 2005.

[2]The Nuremberg Code (1947) In: Mitscherlich A, Mielke F. Doctors of infamy: the story of the Nazi medical crimes. New York: Schuman, 1949: xxiii-xxv. Available at: http://www.cirp.org/library/ethics/nuremberg/ Accessed June 30, 2005.

[3]World Medical Association. Declaration of Helsinki. British Medical Journal (7 December) 1996; 313(7070):1448-1449. Available at: http://www.cirp.org/library/ethics/helsinki/ Accessed June 30, 2005.

[4]The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. No. (OS) 78-0014. Washington, DC: U.S. Government Printing Office, April 18, 1979. Available at: http://ohsr.od.nih.gov/guidelines/belmont.html Accessed June 30, 2005.

[5]45 CFR 46 and 21 CFR 50.

[6]Jones JH. Bad blood: the Tuskegee syphilis experiment. New York: The Free Press, 1993. Tuskegee study historical timeline available at: http://www.cdc.gov/nchstp/od/tuskegee/time.htm Accessed June 30, 2005.

[7]Lenzer J, Pyke N. Was Traci Johnson driven to suicide by anti-depressants? That's a trade secret, say U.S. officials. On-line Independent (UK), 19 June 2005. Available at: https://ahrp.org/infomail/05/06/19.php Accessed June 30, 2005.

[8]Martinson BC, Anderson MS, deVries R. Scientists behaving badly. Nature 435, 737-738 (9 June 2005). Available at: http://www.nature.com/nature/journal/v435/n7043/full/435737a.html Accessed June 30, 2005.

[9]Flanagin A, Carey LA, Fontanarosa PB, Phillips SG, Pace BP, Lundberg GD, Rennie D. Prevalence of articles with honorary authors and ghost authors in peer-reviewed medical journals. JAMA 1998; 280: 222-224.

[10]Schulman KA, Seils DM, Timbie JW, Sugarman J, Dame LA, Weinfurt KP, Mark DB, Califf RM. A national survey of provisions in clinical trial agreements between medical schools and sponsors. New Engl. J. Med. 2002; 347: 1335-1341.

[11]Melander H, Ahlqvist-Rastad J, Meijer G, Beermann B. Evidence b(i)ased medicine – selective reporting from studies sponsored by pharmaceutical industry: review of studies in new drug applications. BMJ 2003; 326: 1171-1173.

[12]Lexchin J, Bero LA, Djulbegovic B, Clark O. Pharmaceutical industry sponsorship and research outcome and quality: systemic review. BMJ 2003; 326: 1167-1170.

[13]Ramsay M. Ethical dilemmas in health care rationing. Newcastle-upon-Tyne, UK: Political Studies Association, 2005. Available at: http://www.psa.ac.uk/cps/1995%5Crams.pdf Accessed June 30, 2005.

[14]Sharav VH. Federally-funded relapse producing experiments in psychiatry: drug washout/chemical provocation – a partial bibliography, January 2000. Available at: https://ahrp.org/testimonypresentations/InducedPsychosisBiblio.php Accessed June 30, 2005.

[15]Sharav VH. Chemically induced psychosis experiments: an inhumane paradigm in psychiatric research. Testimony for the record of the Public Health & Safety Subcommittee of the Senate Health, Education, Labor & Pensions Committee Hearing, February 2, 2000. Available at: https://ahrp.org/testimonypresentations/InducedPsychosis.php Accessed June 30, 2005.

[16]Acute Respiratory Distress Syndrome Network. Ventilation with lower tidal volumes compared with traditional tidal volumes for acute lung injury and the acute respiratory distress syndrome. N Engl J Med 2000; 342:1301-1308.

[17]AHRP testimonies re: fatal ARDS lung experiment, June 26, 2003. Available at: https://ahrp.org/infomail/0603/26.php Accessed June 30, 2005.

[18]Eichacker, PQ, Gerstenberger, EP, Banks, SM, Cui, X, Natanson, C. Meta-analysis of acute lung injury and acute respiratory distress syndrome trials testing low tidal volumes. Am J Respir Crit Care Med 2002; 166:1510-1514. Available at: http://ajrccm.atsjournals.org/cgi/search?fulltext=eichacker&volume=166&issue=11&journalcode=ajrccm Accessed June 30, 2005.

[19]Deans K, Minneci P, Cui X, Banks S, Natanson C, Eichacker P. Editorial: Mechanical ventilation in ARDS: One size does not fit all. Crit Care Med 2005; 33: 1141-1143.

[20]AHRP. Mechanical ventilation in ARDS: One size does not fit all. Editorial, AJCCM, May 20, 2005. Available at: https://ahrp.org/infomail/05/05/20.php Accessed June 30, 2005.

[21]Solomon J. Researchers tested AIDS drugs on children. Associated Press, May 4, 2005. Available at: https://ahrp.org/infomail/05/05/04.php Accessed June 30, 2005.

[22]AHRP letter of complaint to Dr. Michael Carome, Chief Compliance Officer, Office of Human Research Protection, DHHS, March 10, 2004. Available at: https://ahrp.org/ahrpspeaks/HIVkids0304.php See also: https://ahrp.org/infomail/05/04/23a.php Accessed June 30, 2005.

[23]National Institute of Environmental Health Sciences. Lead and your health, 2005. Available at: http://www.niehs.nih.gov/oc/factsheets/pdf/lead.pdf Accessed June 30, 2005.

[24]Roig-Franzia M. My kids were used as guinea pigs. Washington Post, August 2001. Available at: http://www.sskrplaw.com/publications/pigs.html Accessed June 30, 2005.

[25]Grimes v. Kennedy Krieger Institute, Inc., No. 128 September Term, 2000 (Md. 08/16/2001). Available at: http://biotech.law.lsu.edu/cases/research/grimes_v_KKI.htm Accessed June 30, 2005.

[26]ARDS Clinical Network. ARDSNet study 03, version I: A phase II/III, randomized, double-blind, placebo-controlled trial of Lisofylline in patients with acute lung injury and adult respiratory distress syndrome, January 5, 1998.

[27]AHRP. Senator Grassley to offer mandatory drug data registry bill, December 10, 2004. Available at: https://ahrp.org/infomail/04/12/10.php Accessed June 30, 2005.

[28]Kaufman M. Clinical trials of drugs fewer, study says: report also notes decline in number of principal investigators in U.S. Washington Post, May 4, 2005, A2. Available at: https://ahrp.org/infomail/05/05/04a.php Accessed June 30, 2005.

[29]Nundy S, Gulhati CM. A new colonialism? – conducting clinical trials in India. N Engl J Med 2005; 352: 16.

[30] Richards T. Conduct of drug trials in poor countries must improve. BMJ 2005; 330: 1466. Available at: http://bmj.bmjjournals.com/cgi/content/full/330/7506/1466-a Accessed June 30, 2005.

[31]Mulkern AC. When advocates become regulators: President Bush has installed more than 100 top officials who were once lobbyists, attorneys or spokespeople for the industries they oversee. The Denver Post, May 24, 2004. Available at: https://ahrp.org/infomail/04/05/24.php Accessed June 30, 2005.

[32]Willman D. Stealth merger: Drug companies and government medical research: some of the National Institutes of Health’s top scientists are also collecting paychecks and stock options from biomedical firms. Increasingly, such deals are kept secret. Los Angeles Times, December 7, 2003. Available at: https://ahrp.org/infomail/03/12/07.php Accessed June 30, 2005.

[33]Eilperin J. EPA proposal would allow human tests of pesticides: draft rule omits some recommended safeguards. Washington Post, June 28, 2005, A13.

[34]Karlawish JHT. Research involving cognitively impaired adults. N Engl J Med 2003; 348:1389-1392.

[35]McCabe AR. A precarious balancing act – the role of the FDA as protector of public health and industry wealth. Suffolk U Law Rev 2003; 36: 787-819.

[36] Who can file a Qui Tam action, June 28, 2005. Available at: http://www.quitam.com/quitam4.html Accessed June 30, 2005.

[37]Hammerschmidt DE, Franklin M. Secrecy in medical journals. Minnesota Medicine March 2005; Commentary: 34-35.

[38]Noble JH. Meta-analysis: methodology, strengths, weaknesses, and political uses. Unpublished manuscript. Washington, DC: Catholic University of America, April 8, 2005.

[39]House of Commons Health Committee. The influence of the pharmaceutical industry: fourth report of session 2004-2005, volume I. London: United Kingdom Parliament, 2005. Available at: http://www.publications.parliament.uk/pa/cm200405/cmselect/cmhealth/42/42.pdf Accessed June 30, 2005.

[40]Pincock S. Wellcome insists on open access. The Scientist, May 19, 2005. Available at: http://www.the-scientist.com/news/20050519/01 Accessed June 30, 2005.

[41]Campbell D, Stanley J. Experimental and quasi-experimental designs for research. Chicago: Rand McNally, 1963.

[42]Noble JH. Peer review: quality control of applied social research. Science 1974; 185: 916-921.

[43]Cochrane Collaboration: the reliable source of evidence in health care, June 29, 2005. Available at: http://www.cochrane.org/index0.htm Accessed June 30, 2005.

[44]Cooper H. The integrative research review. Beverly Hills: Sage Publications, 1997: 15.

[45]Schulman KA, Seils DM, Timbie JW, Sugarman J, Dame LA, Weinfurt KP, Mark DB, Califf RM. A national survey of provisions in clinical trial agreements between medical schools and sponsors. New Engl. J. Med. 2002; 347: 1335-1341.

[46]Editorial. Lying, cheating and stealing in clinical research. Clin Trials 2004; 1:475-476.

[47]Lemmens T. Leopards in the temple: restoring scientific integrity to the commercialized research scene. Int Comp Health Law Ethics 2004; Winter: 641-657.

[48]Avorn J. Torcetrapib and atorvastatin – should marketing drive the research agenda? N Engl J Med 2005; 352: 2573-2576.
