Ethical Imperialism: IRBs and the Social Sciences, 1965–2009
A history of IRBs in the social sciences
While The Censor's Hand is a sustained argument against IRBs for both social and biomedical research, "Ethical Imperialism" by Zachary Schrag is a history of IRBs in the social sciences, documenting the minute regulatory decisions that have had huge impacts downstream.
Since I've documented the birth of IRBs in a previous post, I'll focus less on the history itself and more on what the book reveals about research-regulation bureaucracy. Four lessons stood out.
First, as intrusive as biomedical IRBs can be, at least physicians, researchers, and psychologists had a seat at the table when the Belmont Report was being drafted; social scientists (excluding psychologists) largely didn't. Second, government bureaucrats can and do break promises and lie, with no consequence. Third, the case for IRBs in the social sciences is much weaker than the case for medical IRBs. Fourth, unless a well-connected and passionate opponent of a government bureaucracy emerges, federal officials win battles with academics through sheer attrition, even when the evidence is squarely against them.
Schrag writes of social scientists being mostly ignored in the formative discussions and commissions on research ethics, even though many disciplines had long had vibrant internal debates on the subject, over questions ranging from how much anonymity subjects were owed, to the ethical use of deception to illuminate important subjects, to how consent would work in ethnographic settings. These debates had been ongoing since about the 1940s and would periodically flare up as more risqué studies made the news. The most controversial of these was the Tearoom Trade study, an ethnography of anonymous male gay sex conducted by Laud Humphreys, a gay (though at the time closeted) male sociologist. Humphreys did not tell the men he observed that he was a researcher and later interviewed them while pretending to be a healthcare worker. The ethics of this study were hotly debated by his peers, but the possibility of each discipline developing its own ethics was effectively foreclosed by the resulting IRB system.
Social scientists made multiple attempts to streamline and rationalize the IRB system, led by professors at leading universities who had served on IRB commissions, and federal officials rejected all of them. The proposals varied in form: Harvard professor Edward Pattullo sought to exempt all social science research by default unless it met certain specific conditions; Berkeley professor Herbert Phillips would have allowed researchers to certify by affidavit that a study didn't endanger human subjects and thereby avoid IRB review; Paul Mishkin, also at Berkeley, proposed compiling a handbook of previously approved IRB cases and allowing researchers to certify via affidavit that their study matched one of them. Mishkin's proposal could have been the beginning of a common law for the social sciences.
Even before the 1974 commission and the subsequent Belmont Report, there had been a growing tendency for federal agencies to assert control over research:
Since 1966, American scholars have faced four different regimes of IRB review. The first, initiated in 1966 by the Public Health Service, applied mainly to medical and psychological researchers with PHS grants. By 1972, however, IRB oversight had begun to spread to social scientists, even those without direct federal funding. (Location 3950)
Social Scientists’ Exclusion
After the Tuskegee scandal, various government commissions, usually filled with physicians, psychologists, and biomedical researchers, and conspicuously missing substantial social scientist representation, imposed their biomedical model on everybody. Sometimes this neglect was benign, and the commission members intended to exempt social science from most IRB regulation. At other times, especially when career NIH bureaucrats like Charles McCarthy got involved, there emerged a tendency to try to bring all social science under IRB control.
From biomedical research to all research
This tendency was spurred along by various research scandals, almost all of which were biomedical, not social scientific. The 1966 Beecher paper that kickstarted much of the concern about research ethics contained no examples of social science misconduct: every case was biomedical!
And yet the consistent trend was for all research to come under stricter control. My best explanation is the following: the original justification for more control is a scandal that gets substantial media attention, which makes members of Congress care about the issue. Congress either passes a law pertaining to the issue or, more commonly, writes letters and holds hearings that prod federal officials into action. Those officials then use the incident to justify control over huge swathes of science, secure in the knowledge that they can outlast any scientist backlash and that Congress is usually uninterested in the details of implementation. When Congress relies on informal mechanisms like hearings or strongly worded letters to prod the administrative state, the result depends crucially on the career staff who implement it; actual legislation, by contrast, can limit bureaucratic jurisdiction.
An example of federal officials pushing their own agendas in the face of scientist backlash and congressional rebuke is Charles McCarthy's successful attempt in the late 1980s to effectively reverse many of the concessions social scientists had won from the Office for Protection from Research Risks (OPRR, the predecessor of today's Office for Human Research Protections, OHRP). In the late 1970s and early 80s, Professor Ithiel de Sola Pool, who was well connected and articulate, had successfully pushed back against IRBs by lobbying Congress, criticizing proposed regulations, having professional associations put out statements of support, and garnering elite media (New York Times and WSJ) coverage. The result had been an informal ceasefire, a promise that subsequent regulations would adopt most of Pool's ideas, and a sustained "backing off" of IRB attention to the social sciences.
And then McCarthy lied: he broke the promises he had made to Pool and many others and tricked the Reagan transition team into signing off on stricter regulations. Here is the story in his own words:
"Then we went to the transition team, and we said would the transition team endorse regulations that are less stringent than the previous regulations? And, of course, they weren't, but they looked like they were because we wrote some exceptions. And so when we sent the package down to Harris, we said "Diminished Regulations for the Protection of Human Subjects." And that was the title. And, of course, we knew nobody down there in the last weeks of the Harris administration getting ready to leave office would actually read it. So they didn't know what all that was about, but they could read the title."
That is how an ostensibly deregulatory president (Reagan) presided over an expansion of federal oversight of research via IRBs.
He used other strategies as well: for instance, instead of mandating that universities sign assurance forms requiring IRB review of all research (as opposed to only federally funded research), he merely suggested it, which let him sidestep federal notification requirements [emphasis mine]:
Yet because the model assurances were not legally binding regulations, HHS did not have to submit them to the public comment process that had so empowered dissenters in 1978 and 1979. McCarthy had found a way to cut critics Pool and Pattullo out of the loop. (Location 2604)
In 1991, after a few years of interagency meetings (which were not publicized), the Common Rule was enacted under Charles McCarthy's leadership:
The new regulations went into effect on 18 June 1991, not just for the Department of Health and Human Services, but for fifteen other departments and agencies as well, earning the regulations the moniker Common Rule. What had begun in 1966 as a policy specifically for Public Health Service grants was now, twenty-five years later, widespread throughout the federal executive branch. (Location 2669)
Never let a scandal go to waste
The stage had been quietly set for stricter research regulation, but it would take a 1993 article on the plutonium research scandal (which took place in the 1940s, in wartime, under direct government control, not in a university setting!) and a death in a gene therapy trial for a sustained crackdown to begin, helped along by public outrage and rumblings from Congress. OHRP sent sternly worded letters to university IRBs for real or imagined infractions (though it often failed to identify actual harm to subjects, as opposed to possible harms), which spurred an era that some have called "hypercompliance". As before, though the impetus was a medical research scandal, social scientists were also put under increased scrutiny.
Universities & associations accede
OHRP also sent out non-binding guidance suggesting that all research, not just federally funded research, should be subject to review, and local IRBs complied. Universities were quicker to obey than they had been thirty years earlier, likely for two reasons: NIH funding had doubled between 1998 and 2003, and university staff and administration had become increasingly professionalized and non-academic:
IRBs were just one part of a broader trend in universities that shifted power from faculty to professional staff in matters ranging from classroom allocation to business attire. As political scientist William Waugh noted in 2003, “the [university] bureaucracy is increasingly made up of people who have little or no academic experience and do not understand the academic enterprise,” (Location 2944)
Universities were not the only "captured" institutions. The same professional organizations that had decisively rejected federal micromanagement of research ethics in the 70s quietly accepted the new state of affairs. This does not appear to have been an organic shift in their researchers' attitudes but mostly the result of Felice Levine, at that time the executive director of the American Sociological Association, pushing through a new Code of Ethics in 1997 with little input from the group's members.
This would result in substantial professional rewards for Levine:
The victor in this debate was Levine, who later became a leading voice for trying to work within the IRB system. In subsequent years she gained appointments to official committees, and several associations—including the American Political Science Association, the American Psychological Association, the American Sociological Association, the Consortium of Social Science Associations, and the Law and Society Association—endorsed her comments on OHRP proposals. (Location 3199)
In the early 2000s a group of social scientists, mostly historians, tried to push back, asking not for a blanket exemption from IRB review but merely for expedited review of oral history projects. After negotiating with OHRP and speaking with congressional staff, OHRP apparently came close to announcing that concession, but then reneged on the deal. Ironically, even after OHRP had stated that oral history still required IRB review, when OHRP later conducted an oral history project itself, it decided the study was exempt from review. IRBs for thee but not for me!
IRBs restrict academic freedom
The price of stricter IRBs was substantial, as the following anecdotes demonstrate:
Such hypersensitivity to controversy threatened some of the most important research conducted by social scientists. Anthropologist Scott Atran, for example, sought to interview failed suicide bombers in order to learn how to negotiate with other potential terrorists. (Location 3513)
Mark Kleiman was limited in his study of California’s probation system. “After considerable delay because one of the ‘community members’ of the IRB hated the criminal justice system and decided to express that hatred by blocking research about it,” Kleiman wrote, “I finally got permission to interview probation officers and judges.” But the IRB refused him permission to interview probationers, since it couldn’t decide how he might approach them without coercion or risk of exposure. In the name of protecting the probationers, the board denied them the chance to record their views of the system that governed their lives. (Location 3496)
IRBs were not simply overcautious but deeply mistaken. They relied on their own intuitions and prejudices to dream up potential sources of harm to subjects. Merely conversing about distressing subjects came to count as "harm", though this fear was entirely baseless. IRBs also viewed their role as reputation management for their institutions and openly admitted to censoring research on those grounds:
Perhaps the most serious clashes are those—as predicted by Ithiel de Sola Pool—in which IRBs sought to suppress unpopular ideas. At Florida State University, the IRB application bluntly asked, “Is the research area controversial and is there a possibility your project will generate public concern?” (Location 3483)
Schrag is pessimistic about OHRP moving toward smarter regulation of the social sciences on its own and instead recommends legislative action:
My own hope, then, would be for Congress to relieve health regulators of the responsibility for overseeing the social sciences, a task they have mishandled for decades. Congress should amend the National Research Act to restrict its scope to the research discussed during the 1973 Senate hearings that still serve as the evidentiary record for that (Location 4040)
I heartily endorse his conclusion. Relying on good-faith negotiation with OHRP when the fruits of that approach have historically been temporary and limited is misguided. Bypass them entirely, write exemptions for social science research into law, and ignore the outrage from organizations whose very existence (like PRIM&R and AAHRPP) depends on overregulation.
Insights into Bureaucracy
I found this book valuable in large part because of the focus Schrag places on the intricate details of bureaucratic process, which are usually illegible to outsiders. Here are a few:
The composition of a commission is hugely influential. This is obvious, but seeing how consequential the absence of social scientists was for subsequent research regulation makes it vivid. Being left off a commission does not exempt you from oversight; it just removes your voice from the process.
Career bureaucrats can subvert their political masters, as McCarthy did. This can be good or bad, depending on how much democratic accountability you think the bureaucracy should be subject to.
A deregulatory agenda enacted through administrative fiat can be effective if the political appointees are knowledgeable and motivated and have sustained political support. Absent those factors, career bureaucrats can often win through attrition or inattention. McCarthy's lying to Reagan's transition team is merely the most striking example.
The existence of ostensibly private organizations like PRIM&R, which function effectively as an arm of the federal government, can further obscure IRB authority. This resembles the non-binding "guidance" OHRP issues, which is not required to go through the level of review and public comment that full-fledged regulation requires. You have a private organization, often led by former high-level officials (Charles McCarthy served on PRIM&R's board), issuing "best practices" for IRBs that OHRP all but endorses.
Organizations like the American Sociological Association, to the degree that they are co-opted by people (like Felice Levine) who don't really represent their members' views, can effectively astroturf a profession's consent to federal oversight. Founding new professional associations is hard and network effects are durable, which makes this an underrated channel by which science gets politicized, distorted, or captured.
Legislative ambiguity in regulation means that bureaucrats, and whatever thermostatic response they elicit from other actors, become the true arbiters of "what the law is". Whenever legislative action is hard to accomplish, the administrative state takes on more importance. That is bad news in politically polarized times.