“It sounds as if the donor knows who he is,” wrote Francis Collins, former director of the then-called National Center for Human Genome Research, in a 1996 email. “That’s not the way it should have been done.”
This quote appears in Undark and STAT’s recent, deeply reported exposé on how the first human genome was sequenced in the late 1990s and early 2000s by the Human Genome Project. Collins was referring to the provenance of one of the initial DNA samples donated for the project, but I reckon that he would have objected just as vehemently had any of the donors been able to spot their own DNA within the final “reference” genome. This includes one prominent donor: the subject of the Undark/STAT story, an anonymous man from Buffalo, New York, known as RP11, who wound up being the project’s primary DNA source. Although RP11 signed a consent form saying the researchers expected that no single person’s DNA would account for more than 10 percent of the reference genome, his DNA made up 74 percent of that genome.
Collins’ quote implies that it would be intrinsically wrong for a DNA donor to know their sample number or some other identifier to which they might link their genome. Collins was acting in line with principles that the National Center for Human Genome Research and the Department of Energy would formalize in their 1996 guidance on the matter, which suggested that for someone to eat from the tree of genomic knowledge would be to risk “unexpected, unwelcome or unauthorized use of information about them” and that attaching phenotypic or demographic information to DNA sequence data “will rarely be useful.” In other words, identifiable genomic information could only lead to either bad outcomes or no outcomes.
What if RP11 knew about his role in the Human Genome Project? Stanford University law and biosciences scholar Hank Greely told Undark, “If I were NIH, I would worry — hey, if this guy knows, he might sue us or make trouble for us.” Contrary to the informed consent form he signed, RP11’s DNA made up the majority of the reference genome; thus, he might have standing to shake down the agency. So is it better to mitigate liability (as institutions love to do) and compound a deceptive act than to risk having a participant know the truth? Meanwhile, the Roswell Park Cancer Institute’s institutional review board, whose mandate was to protect the interests and ensure the ethical treatment of the project’s research participants, opted not to let RP11 and others in the donor pool know what was going on, let alone reconsent them.
To me, these two quotes perfectly encapsulate the biomedical and academic genetics research establishment’s mostly unyielding and reflexive stance toward the people they study: opacity on the part of researchers and compulsory ignorance on the part of participants.
One problem here is false advertising. The ideals of the government-sponsored Human Genome Project were predicated on and marketed as something that stood in stark contrast to this kind of arm’s length treatment of DNA donors. In June 2000, President Bill Clinton brought Collins and his private sector rival, Celera Genomics Group’s J. Craig Venter, together at a White House ceremony to celebrate the publication of the draft human genome sequence. There, Clinton said, “We, all of us, share a duty to ensure that the common property of the human genome is used freely for the common good of the whole human race.” For his part, Collins asked, “What more powerful form of study of mankind could there be than to read our own instruction book?” This instruction book, he observed, was “previously known only to God.”
But in 2000, we were not yet ready to allow mere mortals to read their own instruction books. When Venter admitted a couple of years later that the particular human genome that he and his company had sequenced was largely derived from his own DNA, an editorial in Science magazine referred to this revelation as “Not Wicked, Perhaps, but Tacky.” Thus, for Venter, having unfettered access to his own genomic data was akin to wearing an electric pink polyester tuxedo to the prom. He was a scientist, after all; he could handle the truth. For RP11, on the other hand, Promethean self-knowledge would have been unethical or dangerous or, heaven forfend, might have posed a financial risk to Big Science. The message was clear: Genomes for me, but not for thee.
A second problem stems from the rules that scientists themselves agree to play by whenever they walk into a lab or sit down at a computer. In 1942, the sociologist Robert Merton outlined the norms of science — among these was the idea that the fruits of research are for everyone. “Secrecy is the antithesis of this norm,” Merton wrote, “full and open communication its enactment.” To their credit, the major players in the public Human Genome Project in 1996 honored this notion by agreeing to quickly upload (ideally within 24 hours) the sequence data they generated and to place them in the public domain (even if this sometimes turned out to be more aspirational than a guarantee). RP11’s data, like the other volunteers’, would be sequenced at a furious pace. But the person whose own biology had played such an outsized role in decoding the so-called “Language of God” was never brought into the fold.
In 2007, I was fortunate to be among the early enrollees in the first Personal Genome Project. All 10 people in the original cohort were trained in science or medicine. Like RP11, I had my genome sequenced and made public. Unlike RP11, however, all of the other PGP participants and I knew who we were from day one — we had access to our own DNA, and the project took no heroic measures to de-identify us. We were given a very long, brutally honest consent form that let us know that we might find out, for example, that we were related to criminals or “other notorious figures.”
We then took an exam meant to demonstrate that we had read and understood it — not a perfect system by any means, but at least one not rooted in secrecy and asymmetric power dynamics. The PGP sequences have since become the basis of the National Institute of Standards and Technology’s Genome in a Bottle Consortium, which aims to authoritatively characterize human genomes to enable their use in clinical practice and other applications. One donor to that program has since made his identity public and discussed his excitement at being part of the “leading edge” of science.
So is this approach a panacea? Should the model be for all donors to get sequenced and post their genomes on government websites (or anywhere else)? Hell no. We have more than enough stories of DNA collection and surveillance by law enforcement, governments, aggrieved rich people, and hackers targeting specific racial or ethnic groups to give us pause. The risks of misuse of people’s genomic information are real. And to be sure, not everyone wants to know, and no one should be forced to know.
But what about the potential upsides? What if RP11 carried a genetic variant that raised his risk of developing a treatable disease? The point is, in PGP, we participants got to do the cost-benefit analysis for ourselves; RP11 did not. (More recent efforts, like NIH’s All of Us Research Program, have made real progress.) Would RP11 have wanted to know? The fact that the original consent forms have long since been lost suggests that no one was ever going to ask.
And what if RP11 wanted recompense, as Henrietta Lacks’ descendants did for the enormous success of the HeLa cell line, derived from their now-famous relative without her consent? In 2023, Thermo Fisher Scientific, a company worth more than $200 billion, settled with the Lacks family for an undisclosed sum. But 10 years earlier, NIH managed to reach an accommodation with the Lacks family that recognized Henrietta’s importance to science without paying the family a dime. That sort of stipulation is easy to enshrine in a consent form: if you participate in research, you might get a flat fee, a gift card, or reimbursement for an Uber.
But the Lacks story notwithstanding, no entity that I know of is going to give a study participant a cut of the profits from something like a new monoclonal antibody developed from a tissue sample, or a continuous glucose monitor tested on them. Statements such as “Dr. Smith stands to gain from commercial applications of this work, but you will not” are a staple of 21st century American consent forms. But they do not preclude honest conversations with participants about a researcher’s goals and ambitions.
I think it also bears mentioning that, like Henrietta Lacks, RP11 is of African American descent (in addition to European ancestry). I am not suggesting that race played any role whatsoever in NIH’s behavior toward him. But given the long history of shameful treatment toward people of color in research, RP11’s story feels like a missed opportunity to have done better.
In the Undark and STAT story on RP11, Aristides Patrinos, the leader of the Department of Energy’s human genome efforts at the time, conceded that “it probably would be a good idea to come out in the open and tell everybody what happened. And give as many specifics as possible.”
I suspect that Robert Merton would agree. When researchers ask taxpayers to give us biological samples to study and interpret, when we make lofty pronouncements about the importance of our science, when we tell participants without any evidence that their identities will be hidden from everyone — including them — forever and for their own good, and when we upload our findings to a massive, global public database that effectively minimizes their contributions while shedding extraordinary light on their biology, is it still reasonable to elide all of those specifics more than 25 years later?
Must the altruistic act of research participation depend on the erasure of the participant? Is there a world in which RP11 is finally celebrated for what he did — before anyone else and in much murkier circumstances — rather than kept invisible for eternity?
Misha Angrist is a senior fellow at the Initiative for Science & Society and associate professor of the practice at the Social Science Research Institute at Duke University.
This article was originally published on Undark.