Some thoughts after reading “Bad Blood: Secrets and Lies in a Silicon Valley Startup”

I just read the above-titled John Carreyrou book, and it’s as excellent as everyone says it is. I suppose it’s the mark of any compelling story that it will bring to mind other things you’ve been thinking about, and in this case I saw many connections between the story of Theranos—a company that raised billions of dollars based on fake lab tests—and various examples of junk science that we’ve been discussing for the past ten years or so.

Before going on, let me emphasize three things:

1. Theranos was more fake than I’d realized. On the back cover of “Bad Blood” is this quotation from journalist Bethany McLean:

No matter how bad you think the Theranos story was, you’ll learn that the reality was actually far worse.

Indeed. Before reading Carreyrou’s book, I had the vague idea that Theranos had some high-tech ideas that didn’t work out, and that they’d covered up their failures in a fraudulent way. The “fraudulent” part seems about right, but it seems that they didn’t have any high-tech ideas at all!

Their claim was that they could do blood tests without a needle, just using a drop of blood from your finger.

And how did they do this? They took the blood, diluted it, then ran existing blood tests. Pretty obvious, huh? Well, there’s a reason why other companies weren’t doing this: you can’t do 100 tests on a single drop of blood. Or, to put it another way, you can’t do 1 test on 1/100th of a drop of blood: there’s just not enough there, using conventional assay technology.

I think that, with care in data collection and analysis, you could do a lot better than standard practice—I’d guess that it wouldn’t be hard at all to reduce the amount of blood needed by a factor of 2, by designing and analyzing your assays more efficiently (see here, for example, where we talk about all the information available from measurements that are purportedly “below detection limit”). But 100 tests from one drop of blood: no way.
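To make that point concrete, here's a minimal sketch in Python, with entirely made-up numbers and no connection to any real assay: instead of throwing away values below a detection limit, you can enter them into the likelihood as left-censored observations, and they still inform the estimates.

    # Minimal sketch, hypothetical numbers: "below detection limit" readings
    # treated as left-censored observations rather than discarded.
    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(0)
    true_mu, true_sigma, limit = 2.0, 1.5, 1.0        # invented assay scale
    raw = rng.normal(true_mu, true_sigma, size=200)
    observed = raw[raw >= limit]                      # quantified readings
    n_censored = np.sum(raw < limit)                  # only "below limit" is recorded

    def neg_log_lik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        ll = stats.norm.logpdf(observed, mu, sigma).sum()       # exact values
        ll += n_censored * stats.norm.logcdf(limit, mu, sigma)  # censored values
        return -ll

    fit = optimize.minimize(neg_log_lik, x0=[observed.mean(), 0.0])
    mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
    print(f"estimated mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
    # Dropping the censored points instead would bias mu upward, because the
    # low end of the distribution is exactly what got censored away.

The point isn't this particular model; it's just that censored measurements carry information if you model them instead of deleting them.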

So I’d just assumed Theranos was using a new technology entirely, maybe something with gene sequencing or microproteins or some other idea I’d never heard of. No, not at all. What they actually had was an opaque box containing several little assay setups, with a mechanical robot arm to grab and squeeze the pipette to pass around the blood. Unsurprisingly, the machine broke down all the time. But even if it worked perfectly, it was a stupid hack. Or, I should say, stupid from the standpoint of measuring blood; not so stupid from the standpoint of conning investors.

You’ve heard about that faked moon landing, right? Well, Theranos really was the fake moon landing. They got billions of dollars for, basically, nothing.

So, yeah, the reality was indeed far worse than I’d thought!

It would be as if Stan couldn’t really fit any models, as if what we called “Stan” was just an empty program that scanned in models, ran them in Bugs, and then made up values of R-hat in order to mimic convergence. Some key differences between Stan and Theranos: (a) we didn’t do that, (b) Stan is open source so anyone could check that we didn’t do that by running models themselves, and (c) nobody gave us a billion dollars. Unfortunately, (c) may be in part a consequence of (a) and (b): Theranos built a unicorn, and we just built a better horse. You can get more money for a unicorn, even though—or especially because—unicorns don’t exist.
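For readers who haven't run into R-hat: it's a convergence check computed directly from the simulation draws, comparing between-chain to within-chain variance, so anyone who reruns an open-source model can recompute it themselves. Here's a simplified sketch in Python (leaving out the split-chain and rank-normalization refinements of current practice), with toy simulated chains:

    # Simplified R-hat sketch on toy data; real implementations add split-chain
    # and rank-normalization steps.
    import numpy as np

    def rhat(draws):
        """draws: array of shape (n_chains, n_iterations) for one parameter."""
        n = draws.shape[1]
        chain_means = draws.mean(axis=1)
        B = n * chain_means.var(ddof=1)        # between-chain variance
        W = draws.var(axis=1, ddof=1).mean()   # average within-chain variance
        var_plus = (n - 1) / n * W + B / n     # pooled variance estimate
        return np.sqrt(var_plus / W)

    rng = np.random.default_rng(1)
    mixed = rng.normal(0, 1, size=(4, 1000))          # four chains that agree
    stuck = mixed + np.array([0, 0, 0, 3])[:, None]   # one chain shifted away
    print(rhat(mixed), rhat(stuck))                   # ~1.00 vs. well above 1.01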

2. Clarke’s Law. Remember Clarke’s Third Law: Any sufficiently crappy research is indistinguishable from fraud. Theranos was an out-and-out fraud, as has been the case for some high-profile examples of junk science. In other cases, scientists started out by making inadvertent mistakes and then only later moved to unethical behavior of covering up or denying their errors. And there are other situations where there is enough confusion in the literature that scientists could push around noise and get meaningless results without possibly ever realizing they were doing anything wrong. From the standpoint of reality, it hardly matters. The Theranos story is stunning, but from the perspective of understanding science, I don’t really care so much if people are actively cheating, whether they’re deluding themselves, or whether it’s something in between. For example, when that disgraced primatologist at Harvard was refusing to let other people look at his videotapes, was this the action of a cheater who didn’t want to get caught, or just a true believer who didn’t trust unbelievers to evaluate his evidence? I don’t care: either way, it’s bad science.

3. The role of individual personalities. There are a lot of shaky business plans out there. It seems that what kept Theranos afloat for so long was the combination of Elizabeth Holmes, Ramesh Balwani, and David Boies, three leaders who together managed to apply some mixture of charisma, money, unscrupulousness, and intimidation to keep the narrative alive in contradiction to all the evidence. It would be nearly impossible to tell the story of Theranos without the story of these personalities, in the same way that it would be difficult to try to understand the disaster that was Cornell University’s Food and Brand Lab without considering the motivations of its leader. People matter, and it took a huge amount of effort for Holmes, Balwani, and their various cheerleaders and hired guns to keep their inherently unstable story from exploding.

That said, my own interest in Theranos, as in junk science, lies not so much in the charismatic and perhaps self-deluded manipulators, or in the disgusting things people will do for money or prestige (consider the actions of Theranos’s legal team to intimidate various people who were concerned about all the lying going on).

Rather, I’m interested in the social processes by which obviously ridiculous statements just sit there, unchallenged and even actively supported, for years, by people who really should know better. Part of this is the way that liars violate norms—we expect scientists to tell the truth, so a lie can stand a long time before it is fully checked—part of it is wishful thinking, and part of it seems to be an attitude by people who are already overcommitted to a bad idea to protect their investment, if necessary by attacking truth-tellers who dispute their claims.

Bad Blood

OK, on to the book. There seem to have been two ingredients that allowed Theranos to work. And neither of these ingredients involved technology or medicine. No, the two things were:

  1. Control of the narrative.

  2. Powerful friends.

Neither of these came for free. Theranos’s leaders had to work hard, for long hours, for years and years, to maintain control of the story and to attract and maintain powerful friends. And they needed to be willing to lie.

One thing I really appreciated about Carreyrou’s telling of the tale was the respect he gives to the whistleblowers, people who told the truth and often were attacked for their troubles. Each of them is a real person with complexities, decisions, and a life of his or her own. Sometimes in such stories there’s such a focus on the perpetrators that the dissenters and whistleblowers are presented just as obstacles in the way of someone’s stunning success. Carreyrou doesn’t do that; he treats the critics with the respect they deserve.

When reading through the book, I took a lot of little notes, which I’ll share here.

p.35: “Avie had asked more pointed questions about the pharmaceutical deals and been told they were held up in legal review. When he’d asked to see the contracts, Elizabeth had said she didn’t have the copies readily available.” This reminds me of how much of success comes from simply outlasting the other side, from having the chutzpah to lie and just carrying it out, over and over again.

p.37: various early warnings arise, suspicious and odd patterns. What strikes me is how clear this all was. It really is an Emperor’s New Clothes situation in which, from the outside, the problems are obvious. Kind of like a lot of junk science, for example that “critical positivity ratio” stuff that was clearly ridiculous from the start. Or that ESP paper that was published in JPSP, or the Bible Code paper published in Statistical Science. None of these cases were at all complicated; it was just bad judgment to take them seriously in the first place.

p.38: “By midafternoon, Ana had made up her mind. She wrote up a brief resignation letter and printed out two copies . . . Elizabeth emailed her back thirty minutes later, asking her to please call her on her cell phone. Ana ignored her request. She was done with Theranos.” Exit, voice, and loyalty. It’s no fun fighting people who don’t fight fair; easier just to withdraw. That’s what I’ve done when I’ve had colleagues who plagiarize, or fudge their data, or more generally don’t seem to really care if their answers make sense. I walk away, and sometimes these colleagues can then find new suckers to fool.

p.42: “Elizabeth was conferenced in by phone from Switzerland, where she was conducting a second demonstration for Novartis some fourteen months after the faked one that had led to Henry Mosley’s departure.” And this happened in January, 2008. What’s amazing here is how long it took for all this to happen. Theranos faked a test in 2006, causing one of its chief executives to leave—but it wasn’t until nearly ten years later that this all caught up to them. Here I’m reminded of Cornell’s Food and Brand Lab, where problems had been identified several years before the scandal finally broke.

p.60: Holmes gave an interview to NPR’s “BioTech Nation” back in 2005! I guess I shouldn’t be surprised that NPR got sucked into this one: they seem to fall for just about anything.

p.73: Dilution assays! Statistics is a small world. Funny to think that my colleagues and I have worked on a problem that came up in this story.

p.75: “Chelsea’s job was to warm up the samples, put them in the cartridges, slot the cartridges into the readers, and see if they tested positive for the virus.” This is so amusingly low-tech! Really I think the best analogy here is the original “Mechanical Turk,” that supposedly automatic chess-playing device from the 1700s that was really operated by a human hiding inside the machine.

p.86: “Hunter was beginning to grow suspicious.” And p.86: “The red flags were piling up.” This was still just in 2010! Still many years for the story to play out. It’s as if Wile E. Coyote had run off the cliff and was standing midair for the greater part of a decade before finally crashing into the Arizona desert floor.

p.88: “Walgreens suffered from a severe case of FoMO—the fear of missing out.” Also there was a selection effect. Over the years, lots of potential investors decided not to go with Theranos—but they didn’t matter. Theranos was able to go with just the positive views and filter out the negative. Here again I see an analogy to junk science: Get a friend on the editorial board or a couple lucky reviews and you can get your paper published in a top scientific journal. Get negative reviews, and just submit somewhere else. Once your paper is accepted for publication, you can publicize. Some in the press will promote your work, others will be skeptical—but the skeptics might not bother to share their skepticism with the world. In that way, the noisiness and selection of scientific publication and publicity have the effect of converting variation into a positive expectation, hence rewarding big claims even if they only have weak empirical support.
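Here's a toy simulation of that selection mechanism, with invented numbers: many noisy studies of a small true effect, where only the estimates that clear a conventional significance threshold get published and publicized.

    # Toy simulation, invented numbers: selection on "significance" turns
    # noise into a large positive published effect.
    import numpy as np

    rng = np.random.default_rng(2)
    true_effect, se, n_studies = 0.5, 2.0, 100_000   # small effect, noisy studies
    estimates = rng.normal(true_effect, se, n_studies)
    published = estimates[estimates > 1.96 * se]     # only these survive the filter

    print(f"true effect:            {true_effect}")
    print(f"mean of all estimates:  {estimates.mean():.2f}")   # near 0.5
    print(f"mean of published ones: {published.mean():.2f}")   # several times too big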

p.97: “To be sure, there were already portable blood analyzers on the market. One of them, a device that looked like a small ATM called the Piccolo Xpress, could perform thirty-one different blood tests and produce results in as little as twelve minutes.” Hey: so it already existed! It seems that the only real advantage Theranos had over Piccolo Xpress was a willingness to lie. I guess that’s worth a lot.

p.98: “Nepotism at Theranos took on a new dimension in the spring of 2011 when Elizabeth hired her younger brother, Christian, as associate director of project management. . . . Christian had none of his sister’s ambition and drive; he was a regular guy who liked to watch sports, chase girls, and party with friends. After graduating from Duke University in 2009, he’d worked as an analyst at a Washington, D.C., firm that advised corporations about best practices.” That’s just too funny. What better qualifications to advise corporations about best practices, right?

p.120: It seems that in 2011, Theranos wanted to deploy its devices for the military in Afghanistan. But the devices didn’t exist! It makes me wonder what Theranos was aiming for. My guess is that they wanted to get some sheet of paper representing military approval, so they could then claim they were using their devices in Afghanistan, a claim which they could then use to raise more money from suckers in Silicon Valley and Wall Street. If Theranos had actually received the damn contract, they’d’ve had to come up with some excuse for why they couldn’t fulfill it.

p.139: Aggressive pitbull lawyer David Boies “had accepted stock in lieu of his regular fees”! Amusing to see that he got conned too. Funny, as he probably saw himself as a hard-headed, street-smart man of the world. Or maybe he didn’t get conned at all; maybe he dumped some of that stock while it was still worth something.

p.151: Theranos insists on “absolute secrecy . . . need to protect their valuable intellectual property.” This one’s clever, kind of like an old-school detective story! By treating the MacGuffin as if it has value, we make it valuable. The funny thing is, academic researchers typically act in the opposite way: We think our ideas are soooo wonderful but we give them away for free.

p.154: “Elizabeth had stated on several occasions that the army was using her technology [Remember, they had no special technology—ed.] on the battlefield in Afghanistan and that it was saving soldiers’ lives.” Liars are scary. I understand that people can have different perspectives, but I’m always thrown when people just make shit up, or stare disconfirming evidence in the eye and then walk away. I just can’t handle it.

p.155: “After all, there were laws against misleading advertising.” I love the idea that it was the ad agency that had the scruples here, contrary to our usual stereotypes.

p.168: “Elizabeth and Sunny decided to dust off the Edison and launch with the older device. That, in turn, led to another fateful decision—the decision to cheat.” The Armstrong principle!

p.174: An article (not by Carreyrou) in the Wall Street Journal described Theranos’s processes as requiring “only microscopic blood volumes” and as “faster, cheaper, and more accurate than the conventional methods.” Both claims were flat-out false. I’m disappointed. As noted above, I find it difficult to deal with liars. But a news reporter should be used to dealing with liars, right? It’s part of the job. So how does this sort of thing happen? This is flat-out malpractice. Sure, I know you can’t fact-check everything, but still.

p.187: “To Tyler’s dismay, data runs that didn’t achieve low enough CVs (coefficients of variation) were simply discarded and the experiments repeated until the desired number was reached.” Amusing to see some old-school QRPs coming up. Seems hardly necessary given all the other cheating going on. But I guess that’s part of the point: people who cheat in one place are likely to cheat elsewhere too.
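A toy illustration, with invented numbers and no claim about Theranos’s actual protocol, of why this particular trick matters: if you discard runs until the sample CV clears a threshold, the numbers you report will flatter the device.

    # Toy illustration, invented numbers: discard-and-repeat until the CV
    # looks good, and the reported precision understates true variability.
    import numpy as np

    rng = np.random.default_rng(3)
    true_mean, true_cv, replicates = 100.0, 0.15, 6   # device's real CV is 15%

    def one_run():
        vals = rng.normal(true_mean, true_cv * true_mean, replicates)
        return vals.std(ddof=1) / vals.mean()         # sample CV of this run

    honest = [one_run() for _ in range(10_000)]

    def cherry_picked(threshold=0.10):
        while True:                                   # discard runs until one "passes"
            cv = one_run()
            if cv < threshold:
                return cv

    reported = [cherry_picked() for _ in range(10_000)]
    print(f"mean CV, all runs kept:  {np.mean(honest):.3f}")    # close to 0.15
    print(f"mean CV, kept runs only: {np.mean(reported):.3f}")  # well under 0.10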

p.190: “Elizabeth and Sunny had decided to make Phoenix their main launch market, drawn by Arizona’s pro-business reputation and its large number of uninsured patients.” Wow—that’s pretty upsetting to see people getting a direct financial benefit from other people being in desperate straits. Sure, I know this happens, but it still makes me uncomfortable to see it.

p.192: “Tyler conceded her point and made a mental note to check the vitamin D validation data.” Always a good idea to check the data.

p.199: “George said a top surgeon in New York had told him the company was going to revolutionize the field of surgery and this was someone his good friend Henry Kissinger considered to be the smartest man alive.” This one’s funny. American readers of a certain age will recall a joke involving Kissinger himself being described as “the smartest man in the world.”

p.207: “He talked to Shultz, Perry, Kissinger, Nunn, Mattis and to two new directors: Richard Kovacevich, the former CEO of the giant bank Wells Fargo, and former Senate majority leader Bill Frist.” This is interesting. You could imagine one or two of these guys getting conned, but all of them? How could that be? The key here, I think, is that these seven endorsements seem like independent evidence, but they’re not. It’s groupthink. We’ve seen this happen with junk science, too: respected scientists with reputations for careful skepticism endorse shaky claims on topics that they don’t really know anything about. Why? Because these claims have been endorsed by other people they trust. You get these chains of credulity (here’s a particularly embarrassing example, but I’ve seen many others) with nothing underneath. There’s a real statistical fallacy at work: multiple endorsements get counted as independent pieces of evidence when they’re anything but.
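To spell out the fallacy: the uncertainty in the average of n equally correlated signals shrinks far more slowly than the familiar 1/sqrt(n), so seven highly correlated endorsements are worth little more than one. A quick sketch, with purely illustrative numbers:

    # Illustrative numbers only: correlated endorsements are worth far less
    # than the same number of independent ones.
    import numpy as np

    def sd_of_mean(n, sigma=1.0, rho=0.0):
        """Standard deviation of the mean of n equally correlated signals."""
        return np.sqrt(sigma**2 * (1 + (n - 1) * rho) / n)

    for n in [1, 2, 7]:
        print(f"n={n}:  independent sd = {sd_of_mean(n):.2f}   "
              f"correlated (rho=0.9) sd = {sd_of_mean(n, rho=0.9):.2f}")
    # With rho = 0.9, seven endorsements give sd 0.96, barely better than one
    # endorsement's 1.00; truly independent endorsements would give 0.38.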

p.209: “President Obama appointed [Holmes] a U.S. ambassador for global entrepreneurship, and Harvard Medical School invited her to join its prestigious board of fellows.” Harvard Medical School, huh? What happened to First Do No Harm?

p.253: “It was frustrating but also a sign that I [Carreyrou] was on the right track. They wouldn’t be stonewalling if they had nothing to hide.” This reminds me of so many scientists who won’t share their data. What are they so afraid of, indeed?

p.271: “In a last-ditch attempt to prevent publication [of Carreyrou’s Wall Street Journal article], Boies sent the Journal a third lengthy letter . . .” The letter included this passage: “That thesis, as Mr. Carreyrou explained in discussions with us, is that all of the recognition by the academic, scientific, and health-care communities of the breakthrough contributions of Theranos’s achievements is wrong . . .” Indeed, and that’s the problem: again, what is presented as a series of cascading pieces of evidence is actually nothing more than the results of a well-funded, well-connected echo chamber. Again, this reminds me a lot of various examples of super-hyped junk science that was heavily promoted and faced no serious opposition. It was hard for people to believe there was nothing there: how could all these university researchers, and top scientific journals, and scientific award committees, and government funders, and NPR get it all wrong?

p.278: “Soon after the interview ended, Theranos posted a long document on its website that purported to rebut my reporting point by point. Mike and I went over it with the standards editors and the lawyers and concluded that it contained nothing that undermined what we had published. It was another smokescreen.” This makes me think of two things. First, what a waste of time, dealing with this sort of crap. Second, it reminds me of lots and lots of examples of scientists responding to serious, legitimate criticism with deflection and denial, never even considering the possibility that maybe they got things wrong the first time. All of this has to be even worse when lawyers are involved, threatening people.

p.294: “In January 2018, Theranos published a paper about the miniLab . . . The paper described the device’s components and inner workings and included some data . . . But there was one major catch: the blood Theranos had used in its study was drawn the old-fashioned way, with a needle in the arm. Holmes’s original premise—fast and accurate test results from just a drop or two pricked from a finger—was nowhere to be found in the paper.” This is an example of moving the goalposts: When the big claims get shot down, retreat to small, even empty claims, and act as if that’s what you cared about all along. We see this a lot with junk science after criticism. Not always—I think those ESP guys haven’t backed down an inch—but in a lot of other cases. A study is done, it doesn’t replicate, and the reply from the original authors is that they didn’t ever claim a general effect, just something very very specific.

Dogfooding it

Here’s a question to which I don’t know the answer. Theranos received reputational support from various bigshots such as George Shultz, Henry Kissinger, and David Boies. Would these guys have relied on Theranos to test their own blood? Maybe so: maybe they thought that Theranos was the best, most high-tech lab out there. But maybe not: maybe they thought that Theranos was just right for the low end, for the schmoes who would get their blood tested at Safeway and who couldn’t afford real health care.

I think about this with a lot of junk science: its promoters seem to think it applies to other people, not to them. For example, that ridiculous (and essentially unsupported by data) claim that certain women were 20 percentage points more likely to support Barack Obama during certain times of the month: Do you think the people promoting this work thought that their own political allegiances were so weak?

Summary

“Bad Blood” offers several take-home points. In no particular order:

– Theranos’s claims were obviously flawed, just ridiculous. Yet the company thrived for nearly a decade after various people inside it had realized the emptiness of its efforts.

– Meanwhile, Theranos spent tons of money: I guess that even crappy prototypes are expensive to build and maintain.

– In addition to all the direct damage done by Theranos to patients, I wonder how much harm arose from crowding-out effects. If it really took so much $ to build crappy fake machines, just imagine how much money and organization it would take to build real machines using new technology. You’d either need a major corporation or some really dedicated group of people with access to some funding. Theranos sucked up resources that could’ve gone to such efforts.

– There are lots of incentives against criticizing unscrupulous people. People who will lie and cheat might also be the kind of people who will retaliate against criticism. I admire Carreyrou for sticking with this one, as it can be exhausting to deal with these situations.

– The cheaters relied for their success on a network of clueless people and team players, the sort of people who would express loyalty toward Theranos without knowing or even possibly caring what was in its black boxes, people who just loved the idea of entrepreneurship and wanted to be part of the success story. I’ve seen this many times with junk science: mediocre researchers will get prominent scientists and public figures on their side, people who want the science to be true or who have some direct or indirect personal connection or who just support the idea of science and don’t want to see it criticized. The clueless defenders and team players can do a lot of defending without ever looking carefully into that black box. They just want some reassurance that all is ok.

The stakes with Theranos were much bigger, in both dollar and public health terms, than with much of the junk science that we’ve discussed on this blog. Beauty and sex ratio, advice on eating behavior, Bible codes, ESP, ovulation and voting, air rage, etc.: These are small potatoes compared to the billions spent on blood testing. The only really high-stakes example I can think of that we’ve discussed here is the gremlins guy: To the extent he muddies the water enough to get people to think that global warming is good for the economy, I guess he could do some real damage.

Again, I applaud Carreyrou for tracking down the amazing story of Theranos. As with many cases of fraud or self-delusion, one of the most amazing aspects of the story is how simple and obvious it all was, how long the whole scheme stayed afloat given that there was nothing there at all but a black box with a robot arm squeezing pipettes of diluted blood.