Theranos, whistleblowing and speaking truth to power
-
0:00 - 0:04So, I had graduated
seven years ago from Berkeley -
0:04 - 0:09with a dual degree in molecular
and cell biology and linguistics, -
0:09 - 0:11and I had gone to a career fair
here on campus, -
0:11 - 0:15where I'd gotten an interview
with a start-up called Theranos. -
0:16 - 0:17And at the time,
-
0:17 - 0:20there wasn't really that much
information about the company, -
0:20 - 0:24but the little that was there
was really impressive. -
0:24 - 0:28Essentially, what the company
was doing was creating a medical device -
0:28 - 0:33where you would be able
to run your entire blood panel -
0:33 - 0:34on a finger-stick of blood.
-
0:34 - 0:38So you wouldn't have to get
a big needle stuck in your arm -
0:38 - 0:40in order to get your blood test done.
-
0:40 - 0:44So this was interesting
not only because it was less painful, -
0:44 - 0:49but also, it could potentially
open the door to predictive diagnostics. -
0:49 - 0:50If you had a device
-
0:50 - 0:55that allowed for more frequent
and continuous diagnosis, -
0:55 - 1:01potentially, you could diagnose disease
before someone got sick. -
1:01 - 1:05And this was confirmed in an interview
that the founder, Elizabeth Holmes, -
1:05 - 1:07had given to the Wall Street Journal.
-
1:07 - 1:09"The reality within
our health-care system today -
1:10 - 1:13is that when someone
you care about gets really sick, -
1:13 - 1:15by the time you find out
it's [most often] too late -
to do anything about it.
-
1:17 - 1:18It's heartbreaking."
-
1:18 - 1:20This was a moon shot
that I really wanted to be a part of -
1:20 - 1:22and I really wanted to help build.
-
1:23 - 1:27And there was another reason
why I think the story of Elizabeth -
1:27 - 1:30really appealed to me.
-
1:30 - 1:32So there was a time
that someone had said to me, -
1:32 - 1:34"Erika, there are two types of people.
-
1:34 - 1:37There are those that thrive
and those that survive. -
1:37 - 1:39And you, my dear, are a survivor."
-
1:40 - 1:42Before I went to university,
-
1:42 - 1:46I had grown up in a one-bedroom trailer
with my six family members, -
1:46 - 1:48and when I told people
I wanted to go to Berkeley, -
1:48 - 1:51they would say, "Well, I want
to be an astronaut, so good luck." -
1:52 - 1:56And I stuck with it, and I worked hard,
and I managed to get in. -
1:56 - 1:59But honestly, my first year
was very challenging. -
1:59 - 2:01I was the victim of a series of crimes.
-
2:01 - 2:04I was robbed at gunpoint,
I was sexually assaulted, -
2:04 - 2:07and I was sexually assaulted a third time,
-
2:07 - 2:09spurring on very severe panic attacks,
-
2:10 - 2:11where I was failing my classes,
-
2:11 - 2:13and I dropped out of school.
-
2:13 - 2:16And at this moment, people had said to me,
-
2:16 - 2:18"Erika, maybe you're not cut out
for the sciences. -
2:18 - 2:21Maybe you should reconsider
doing something else." -
2:21 - 2:24And I told myself, "You know what?
-
2:24 - 2:26If I don't make the cut,
I don't make the cut, -
2:26 - 2:29but I cannot give up on myself,
and I'm going to go for this, -
2:29 - 2:33and even if I'm not the best for it,
I'm going to try and make it happen." -
2:33 - 2:37And luckily, I stuck with it,
and I got the degree, and I graduated. -
2:38 - 2:41(Applause and cheers)
-
2:41 - 2:42Thank you.
-
2:42 - 2:44(Applause)
-
2:45 - 2:50So when I heard Elizabeth Holmes
had dropped out of Stanford at age 19 -
2:50 - 2:52to start this company,
-
2:52 - 2:54and it was being quite successful,
-
2:54 - 2:56to me, it was a signal
-
2:56 - 3:00of, you know, it didn't matter
what your background was. -
3:00 - 3:03As long as you committed
to hard work and intelligence, -
3:03 - 3:06that was enough to make
an impact in the world. -
3:06 - 3:08And this was something,
for me, personally, -
3:08 - 3:10that I had to believe in my life,
-
3:10 - 3:13because it was one of the few
anchors that I had had -
3:13 - 3:15that got me through the day.
-
3:16 - 3:17So you can imagine,
-
3:17 - 3:22when I received this letter,
I was so excited. -
3:22 - 3:24I was over the moon.
-
3:24 - 3:28This was finally my opportunity
to contribute to society, -
3:28 - 3:32to solve the problems
that I had seen in the world, -
3:32 - 3:34and really, when I thought about Theranos,
-
3:34 - 3:37I really anticipated
that this would be the first -
3:37 - 3:40and the last company
that I was going to work for. -
3:41 - 3:44But I started to notice some problems.
-
3:45 - 3:50So, I started off as
an entry-level associate in the lab. -
3:50 - 3:53And we would be sitting in a lab meeting,
-
3:53 - 3:57reviewing data to confirm
whether the technology worked or not, -
3:57 - 3:59and we'd get datasets like this,
-
3:59 - 4:01and someone would say to me,
-
4:01 - 4:04"Well, let's get rid of the outlier
-
4:04 - 4:06and see how that affects
the accuracy rate." -
4:07 - 4:09So what constitutes an outlier here?
-
4:09 - 4:11Which one is the outlier?
-
4:12 - 4:14And the answer is, you have no idea.
-
4:14 - 4:17You don't know. Right?
-
4:17 - 4:18And deleting a data point
-
4:18 - 4:22is really violating one of the things
that I found so beautiful -
4:22 - 4:24about the scientific process --
-
4:24 - 4:29it really allows the data
to reveal the truth to you. -
4:29 - 4:33And as tempting as it might be
in certain scenarios -
4:33 - 4:37to place your story on the data
to confirm your own narrative, -
4:37 - 4:43when you do this, it has really bad
future consequences. -
4:43 - 4:47So this, to me, was almost
immediately a red flag, -
4:47 - 4:50and it kind of folded in
to the next experience -
4:50 - 4:51and the next red flag
-
4:51 - 4:54that I started to see
within the clinical laboratory. -
4:54 - 4:56So a clinical laboratory
-
4:56 - 4:59is where you actively process
patient samples. -
4:59 - 5:01And so before I would run
a patient's sample, -
5:01 - 5:05I would have a sample
where I knew what the concentration was, -
5:05 - 5:08and in this case, it was 0.2 for tPSA,
-
5:08 - 5:11which is an indicator
of whether someone has prostate cancer, -
5:11 - 5:13or is at risk of prostate cancer or not.
-
5:13 - 5:17But then, when I'd run it
in the Theranos device, -
5:17 - 5:19it would come out 8.9,
-
5:19 - 5:22and then I'd run it again,
and it would come out 5.1, -
5:22 - 5:26and I would run it again,
and it would come out 0.5, -
5:26 - 5:28which is technically in range,
-
5:28 - 5:30but what do you do in this scenario?
-
5:31 - 5:33What is the accurate answer?
-
5:34 - 5:38And this wasn't an instance
that I was seeing just one-off. -
5:38 - 5:41This was happening nearly every day,
-
5:41 - 5:44across so many different tests.
-
5:45 - 5:51And mind you, this is for a sample
where I know what the concentration is. -
5:51 - 5:54What happens when I don't know
what the concentration is, -
5:54 - 5:56like with a patient sample?
-
5:56 - 6:02How am I supposed to trust
what the result is, at that point? -
6:03 - 6:08So this led to, sort of,
the last and final red flag for me, -
6:08 - 6:11and this is when we were doing testing,
-
6:11 - 6:14in order to confirm and certify
-
6:14 - 6:17whether we could continue
processing patient samples. -
6:17 - 6:20So what regulators will do
is they'll give you a sample, -
6:20 - 6:23and they'll say, "Run this sample,
-
6:23 - 6:26just like the quality control,
through your normal workflow, -
6:26 - 6:28how you normally test on patients,
-
6:28 - 6:29and then give us the results,
-
6:29 - 6:33and we will tell you:
do you pass, or do you fail." -
6:33 - 6:37So because we were seeing
so many issues with the Theranos device -
6:37 - 6:40that was actively being used
to test on patients, -
6:40 - 6:43what we had done
is we had taken the sample -
6:43 - 6:46and we had run it
through an FDA-approved machine -
6:46 - 6:48and we had run it
through the Theranos device. -
6:49 - 6:50And guess what happened?
-
6:50 - 6:54We got two very, very different results.
-
6:55 - 6:57So what do you think
they did in this scenario? -
6:57 - 7:01You would anticipate
that you would tell the regulators, -
7:01 - 7:04like, "We have some discrepancies here
with this new technology." -
7:04 - 7:09But instead, Theranos had sent the result
of the FDA-approved machine. -
7:11 - 7:13So what does this signal to you?
-
7:13 - 7:17This signals to you
that even within your own organization, -
7:17 - 7:21you don't trust the results
that your technology is producing. -
7:22 - 7:26So how do we have any business
running patient samples -
7:26 - 7:28on this particular machine?
-
7:29 - 7:33So of course, you know,
I am a recent grad, -
7:33 - 7:36I have, at this point,
run all these different experiments, -
7:36 - 7:40I've compiled all this evidence,
and I'd gone into the office of the COO -
7:40 - 7:42and I was raising my concerns.
-
7:43 - 7:46"Within the lab, we're seeing
a lot of variability. -
7:46 - 7:48The accuracy rate doesn't seem right.
-
7:48 - 7:51I don't feel right
about testing on patients. -
7:51 - 7:54These things, I'm just
not comfortable with." -
7:54 - 7:56And the response I got back is,
-
7:57 - 7:59"You don't know
what you're talking about. -
7:59 - 8:01What you need to do
is what I'm paying you to do, -
8:01 - 8:03and you need to process patient samples."
-
8:04 - 8:08So that night, I called up
a colleague of mine -
8:08 - 8:12who I had befriended
within the organization, Tyler Shultz, -
8:12 - 8:17who also happened to have a grandfather
who was on the Board of Directors. -
8:17 - 8:20And so we had decided
to go to his grandfather's house -
8:20 - 8:23and tell him, at dinner,
-
8:23 - 8:26what the company
was telling him was going on -
8:26 - 8:30was actually not what was happening
behind closed doors. -
8:30 - 8:31And not to mention,
-
8:31 - 8:34Tyler's grandfather was George Shultz,
-
8:34 - 8:37the ex-secretary of state
of the United States. -
8:37 - 8:41So you can imagine me
as a 20-something-year-old -
8:41 - 8:44just shaking, like, "What are you
getting yourself into?" -
8:45 - 8:49But we had sat down
at his dinner table and said, -
8:49 - 8:51"When you think that they've taken
this blood sample -
8:51 - 8:55and they put it in this device,
and it pops out a result, -
8:55 - 9:00what's really happening is the moment
you step outside of the room, -
9:00 - 9:03they take that blood sample,
they run it to a back door, -
9:03 - 9:07and there are five people on standby
that are taking this tiny blood sample -
9:07 - 9:10and splitting it amongst
five different machines." -
9:11 - 9:15And he says to us,
"I know Tyler's very smart, -
9:15 - 9:16you seem very smart,
-
9:16 - 9:21but the fact of the matter is I've brought
in a wealth of intelligent people, -
9:21 - 9:25and they tell me that this device
is going to revolutionize health care. -
9:25 - 9:28And so maybe you should consider
doing something else." -
9:29 - 9:33So this had gone through a period
of about seven months, -
9:33 - 9:37and I decided to quit that very next day.
-
9:38 - 9:39And this --
-
9:39 - 9:46(Applause and cheers)
-
9:46 - 9:49But this was a moment
that I had to sit with myself -
9:49 - 9:51and do a bit of a mental health check.
-
9:51 - 9:53I'd raised concerns in the lab.
-
9:54 - 9:57I'd raised concerns with the COO.
-
9:57 - 10:00I had raised concerns with a board member.
-
10:00 - 10:02And meanwhile,
-
10:02 - 10:08Elizabeth is on the cover
of every major magazine across America. -
10:09 - 10:11So there's one common thread here,
-
10:11 - 10:12and that's me.
-
10:12 - 10:14Maybe I'm the problem?
-
10:14 - 10:16Maybe there's something
that I'm not seeing? -
10:16 - 10:18Maybe I'm the crazy one.
-
10:19 - 10:23And this is the part in my story
where I really get lucky. -
10:23 - 10:24I was approached
-
10:24 - 10:26by a very talented journalist,
John Carreyrou -
10:27 - 10:29from the Wall Street Journal, and he --
-
10:30 - 10:36And he had basically said
that he also had heard concerns -
10:36 - 10:39about the company
from other people in the industry -
10:39 - 10:41and working for the company.
-
10:41 - 10:43And in that moment, it clicked in my head:
-
10:43 - 10:45"Erika, you are not crazy.
-
10:45 - 10:47You're not the crazy one.
-
10:47 - 10:50In fact, there are other people
out there just like you -
10:50 - 10:53that are just as scared of coming forward,
-
10:53 - 10:57but see the same problems
and the same concerns that you do." -
10:58 - 11:02So before John's exposé
and investigative report had come out -
11:02 - 11:05to reveal the truth
of what was going on in the company, -
11:05 - 11:09the company decided to go on a witch hunt
for all sorts of former employees, -
11:09 - 11:11myself included,
-
11:11 - 11:18to basically intimidate us from coming
forward or talking to one another. -
11:18 - 11:20And the scary thing,
really, for me in this instance -
11:20 - 11:22was the fact that it triggered,
-
11:22 - 11:26and I realized that they were following me
once I received this letter, -
11:26 - 11:30but it was also, in a way,
a bit of a blessing, -
11:30 - 11:32because it forced me to call a lawyer.
-
11:32 - 11:35And I was lucky enough --
I called a free lawyer, -
11:35 - 11:36but he had suggested,
-
11:36 - 11:39"Why don't you report
to a regulatory agency?" -
11:40 - 11:44And this was something
that didn't even click in my head, -
11:45 - 11:47probably because I was so inexperienced,
-
11:47 - 11:51but once that happened,
that's exactly what I did. -
11:51 - 11:55I had decided to write a letter,
and a complaint letter, to regulators, -
11:55 - 11:59illustrating all the deficiencies
and the problems that I had seen -
11:59 - 12:01in the laboratory.
-
12:01 - 12:04And as endearingly as my dad
kind of notes this -
12:04 - 12:06as being my, like, dragon-slayer moment,
-
12:06 - 12:09where I had risen up
and fought this behemoth -
12:09 - 12:11and it caused this domino effect,
-
12:11 - 12:13I can tell you right now,
-
12:13 - 12:15I felt anything but courageous.
-
12:16 - 12:19I was scared, I was terrified,
-
12:19 - 12:20I was anxious,
-
12:21 - 12:24I was ashamed, slightly,
-
12:24 - 12:26that it took me a month
to write the letter. -
12:26 - 12:28There was a glimmer of hope in there
-
12:28 - 12:30that maybe somehow
no one would ever figure out -
12:30 - 12:32that it was me.
-
12:32 - 12:36But despite all that emotion
and all that volatility, -
12:36 - 12:37I still did it,
-
12:37 - 12:40and luckily, it triggered an investigation
-
12:40 - 12:42that brought to light
-
12:42 - 12:44that there were huge
deficiencies in the lab, -
12:44 - 12:47and it stopped Theranos
from processing patient samples. -
12:47 - 12:54(Applause)
-
12:56 - 12:59So you would hope,
going through a very challenging -
12:59 - 13:02and crazy situation like this,
-
13:02 - 13:05that I would be able
to sort of distill some how-tos -
13:05 - 13:09or recipe for success for other people
that are in this situation. -
13:09 - 13:12But frankly, when it comes
to situations like this, -
13:12 - 13:17the only quote that kind of gets it right
is this Mike Tyson quote that says, -
13:17 - 13:20"Everyone has a plan
until you get punched in the mouth." -
13:20 - 13:22(Laughter)
-
13:22 - 13:25And that's exactly how this is.
-
13:25 - 13:27But today, you know,
-
13:27 - 13:29we're here to kind of
convene on moon shots, -
13:29 - 13:34and moon shots are these highly
innovative projects -
13:34 - 13:35that are very ambitious,
-
13:35 - 13:37that everyone wants to believe in.
-
13:38 - 13:42But what happens
when the vision is so compelling -
13:42 - 13:45and the desire to believe is so strong
-
13:45 - 13:50that it starts to cloud your judgment
about what reality is? -
13:51 - 13:54And particularly
when these innovative projects -
13:54 - 13:57start to be a detriment to society,
-
13:57 - 13:59what are the mechanisms in place
-
13:59 - 14:04in which we can prevent
these potential consequences? -
14:04 - 14:08And really, in my mind,
the simplest way to do that -
14:08 - 14:13is to foster stronger cultures
of people who speak up -
14:13 - 14:15and listening to those who speak up.
-
14:16 - 14:19So now the big question is,
-
14:19 - 14:24how do we make speaking up the norm
and not the exception? -
14:24 - 14:31(Applause and cheers)
-
14:31 - 14:34So luckily, in my own experience,
-
14:34 - 14:37I realized that when it comes
to speaking up, -
14:37 - 14:40the action tends to be
pretty straightforward in most cases, -
14:41 - 14:46but the hard part is really deciding
whether to act or not. -
14:46 - 14:48So how do we frame our decisions
-
14:48 - 14:53in a way that makes it
easier for us to act -
14:53 - 14:55and produce more ethical outcomes?
-
14:56 - 15:00So UC San Diego came up
with this excellent framework -
15:00 - 15:01called the "Three Cs,"
-
15:01 - 15:06and it's called commitment,
consciousness and competency. -
15:06 - 15:09And commitment is the desire
to do the right thing -
15:09 - 15:11regardless of the cost.
-
15:11 - 15:12In my case at Theranos,
-
15:13 - 15:14if I was wrong,
-
15:14 - 15:16I was going to have
to pay the consequences. -
15:16 - 15:18But if I was right,
-
15:18 - 15:20the fact that I could have been a person
-
15:20 - 15:24that knew what was going on
and didn't say something, -
15:24 - 15:25that was purgatory.
-
15:25 - 15:27Being silent was purgatory.
-
15:29 - 15:31Then there's consciousness,
-
15:31 - 15:34the awareness to act consistently
and apply moral convictions -
15:34 - 15:38to daily behavior.
-
15:38 - 15:41And the third aspect is competency.
-
15:41 - 15:45And competency is the ability
to collect and evaluate information -
15:45 - 15:48and foresee potential
consequences and risk. -
15:48 - 15:51And the reason I could trust my competency
-
15:51 - 15:55was because I was acting
in service of others. -
15:55 - 15:59So I think a simple process
is really taking those actions -
15:59 - 16:00and imagining,
-
16:00 - 16:03"If this happened to my children,
-
16:03 - 16:04to my parents,
-
16:04 - 16:06to my spouse,
-
16:06 - 16:09to my neighbors, to my community,
-
16:09 - 16:10if I took that ...
-
16:12 - 16:13How will it be remembered?"
-
16:15 - 16:16And with that,
-
16:16 - 16:18I hope, as we all leave here
-
16:18 - 16:21and venture off
to build our own moon shots, -
16:21 - 16:23we don't just conceptualize them,
-
16:23 - 16:27in a way, as a means
for people to survive -
16:27 - 16:33but really see them as opportunities
and chances for everybody to thrive. -
16:34 - 16:35Thank you.
-
16:35 - 16:36(Applause and cheers)
Title: Theranos, whistleblowing and speaking truth to power
Speaker: Erika Cheung
Description: In 2014, Erika Cheung made a discovery that would ultimately help bring down her employer, Theranos, as well as its founder, Elizabeth Holmes, who claimed to have invented technology that would transform medicine. The decision to become a whistleblower proved a hard lesson in figuring out how to do what's right in the face of both personal and professional obstacles. With candor and humility, Cheung shares her journey of speaking truth to power -- and offers a framework to encourage others to come forward and act in the service of all.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 16:50