From the issue dated June 25, 1999
POINT OF VIEW
In the early 1990s, plastic surgeons at a prominent hospital in New York City decided to test the relative merits of two techniques that had been used for years in face-lifts. In one technique, the "superficial" approach, surgeons work on muscles closer to the surface of the skin; the technique involves few risks, but its effects may not last long. The other, a "deep" approach, allows surgeons to work on muscles farther below the skin; it has a greater risk of infection and nerve damage, but potentially longer-lasting results. Without the knowledge or consent of the patients, and without any oversight by an institutional review board or any other research-review body, the surgeons tested the two techniques on 21 patients. One half of each patient's face was lifted with the deep approach, the other with the superficial technique.
The surgeon who led the group, himself a member of the hospital's I.R.B., did not ask the board to approve the research, because, as he later told The New York Times, "This was not an experiment. ... These were not experimental procedures we were using."
The surgeon, of course, was wrong. Comparing two different standard therapies on patients is research, precisely because it puts the interests of the patients second, and the interests of science first.
But it is even more shocking that the surgeon wouldn't have needed the I.R.B.'s approval even if he had realized that what he was doing was research, and the I.R.B. had known about it. No I.R.B. involvement was required, because federal protections don't automatically cover human subjects in research that is supported only by private funds. In the surgeon's case, the only money involved was that provided by the patients, their insurance companies, or other private sources.
The news has been filled lately with stories of federal crackdowns on research with human subjects. The Veterans Administration has had to suspend research at a hospital in Los Angeles for inappropriate studies of the mentally ill, and the Office for Protection from Research Risks, at the National Institutes of Health -- which monitors human experiments conducted with N.I.H. funds -- has announced an investigation of a Cincinnati hospital that conducted research that involved inducing symptoms of mental illness.
Most dramatic was the O.P.R.R.'s announcement in May that it had identified, at the prestigious Duke University Medical Center, 20 violations of federal regulations designed to protect human subjects, and had temporarily suspended all of the center's research that was supported by the N.I.H. -- some 2,000 studies in all.
But an even bigger story is that an unknown number of people have been used as subjects in privately financed research without their knowledge, and without the essential protections built into the federal regulations. Those regulations grew out of the Belmont Report, issued in 1979 by the first U.S. bioethics commission, the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The report called for basic protections for all human research subjects, including the informed consent of subjects to their participation in research; independent review of studies to ensure that risks have been minimized and are justified by the potential benefits to the subjects or to society; and justice in the selection of participants, so that the poor and the powerless do not make up a disproportionate percentage of subjects.
However, the resulting regulations -- which were slowly worked out by various federal agencies throughout the 1980s, until a "common rule" was developed in 1991 to guarantee informed consent and review by an institutional review board -- don't apply to all human subjects. Indeed, few people realize that the regulations apply only to studies sponsored by one of the 17 agencies and departments following the common rule, and to studies of drugs, devices, and other items regulated by the Food and Drug Administration.
It is true that voluntary adherence to the regulations for research conducted without federal funds is widespread among major universities and research institutions, which often promise to abide by federal standards in all of their employees' research, in exchange for some paperwork relief from the government.
But that leaves many settings in which research can occur without federal oversight. Colleges and universities that do not receive federal research funds are not necessarily subject to the common rule. Nor are state psychiatric hospitals; many in vitro-fertilization and weight-loss clinics; companies that develop genetic tests; physicians, dentists, and psychotherapists in private practice; or industrial and corporate programs that promote employees' health and safety. Indeed, the common rule does not even apply throughout the federal government: Its adoption is at the discretion of Cabinet secretaries and agency administrators, and only the 17 agencies and departments follow it.
How many people have been used in studies that lacked the basic protections of the common rule? Nobody knows, because no law mandates the collection of data on human subjects used in research. How many of those people thought they were patients rather than research subjects? Nobody knows. How many were injured or received substandard care? Nobody knows. How many of them paid, privately or through their insurance, for the privilege of being unwitting subjects? Nobody knows.
To answer those questions, John Glenn, who recently became one of the most famous research subjects in the United States when he went back into space, sponsored the Human Research Subject Protection Act of 1997. If it had been enacted, the bill would have required that all research on human subjects be governed by federal protections, and that data be collected that, for the first time, would document the extent and conditions of research on human beings in the United States. Anyone using human subjects without their consent would have been subject to criminal penalties, and the I.R.B. system would not only have been extended to cover all research in the United States, but would also have been strengthened with new resources for training, audits, and oversight.
Critics of the bill argued that there was no evidence that the absence of regulation had caused significant harm to substantial numbers of people, and asserted that the measure was a solution in search of a problem. Organizations such as the Association of American Medical Colleges and the Pharmaceutical Research and Manufacturers of America objected to strengthening the oversight powers of I.R.B.'s. They asserted that such changes would unduly hamper scientific progress. The bill died without even a hearing.
But, of course, one cannot assess the problem without a mandate to collect the data. The files of the Office for Protection from Research Risks are filled with cases of research irregularities over which it has no jurisdiction. The plastic-surgery case described above reached the office in the form of a letter from a whistle-blower at the hospital. The office had to tell the letter writer -- as it must tell many other informants -- that the research was not covered by federal law.
A committee recently recommended that the O.P.R.R. be moved from the N.I.H. to the Department of Health and Human Services, and that the office receive more federal funds. Those changes certainly would help improve the regulation of research that is already subject to I.R.B. review, but they would not deal with the problem of unregulated research carried out or financed by state governments or the private sector.
Fifty-two years ago, following revelations of the Nazis' gruesome medical "experiments" upon helpless prisoners, the judges of the Nuremberg court announced their verdict in the trial of a number of Nazi doctors. To answer the arguments made by the defense at the trial, the judges listed 10 conditions for permissible experimentation on humans, a list that came to be known as the Nuremberg Code. It stated that no one should be enrolled in research without his or her knowledge and consent. The code was never formally incorporated into U.S. law, but it did provide the political and philosophical underpinnings for subsequent resolutions and regulations, such as the World Medical Association's 1964 adoption of the Declaration of Helsinki, reaffirming the basic human right to be enrolled in research only when the potential benefits justify the risks and when informed consent has been obtained.
Twenty-five years ago, Congress passed the National Research Act of 1974, which mandated I.R.B. review for research conducted under the Public Health Services Act, thus offering basic protections to human subjects in most health research supported by federal funds. The act also created the commission that issued the Belmont Report 20 years ago. That report made no distinction between the ethical obligations owed to subjects in federally sponsored studies and those owed to people in studies supported by private funds. It simply listed the basics: informed consent, minimization and justification of risk, and I.R.B. review.
Two years ago, the National Bioethics Advisory Commission took note of the dual standard of protection in the United States: one for subjects in federally regulated research and the other for those in unregulated research. It called for a single standard of basic protections, and resolved unanimously that "no person in the United States should be enrolled in research without the twin protections of informed consent by an authorized person and independent review of the risks and benefits of the research."
And in June 1997, the journal Science published the text of a speech by President Clinton, in which he said, "Science must always respect the dignity of every American. We must never allow our citizens to be unwitting guinea pigs in scientific experiments that put them at risk without their consent and full knowledge."
Ironically, every guinea pig -- and cat, dog, rabbit, hamster, and monkey -- used in research is protected by federal law, whether the research is supported by public or private funds. We have oversight committees to minimize unnecessary use and abuse of the animals, and we know precisely how many of them are used, how they are treated, and how many have been injured. Certainly, extending similar protections to all human subjects would be costly, requiring additional resources for I.R.B.'s already buckling under the pressures of inadequate staffing and training. But that is simply the price of doing ethical research.
Americans shouldn't fear being treated like guinea pigs in research. They should clamor to be treated as well.
R. Alta Charo is a professor of law and medical ethics at the University of Wisconsin at Madison, and a member of the National Bioethics Advisory Commission.