
Theranos, whistleblowing and speaking truth to power

  • 0:00 - 0:04
    So, I had graduated
    seven years ago from Berkeley
  • 0:04 - 0:09
    with a dual degree in molecular
    and cell biology and linguistics,
  • 0:09 - 0:11
    and I had gone to a career fair
    here on campus,
  • 0:11 - 0:15
    where I'd gotten an interview
    with a start-up called Theranos.
  • 0:16 - 0:17
    And at the time,
  • 0:17 - 0:20
    there wasn't really that much
    information about the company,
  • 0:20 - 0:24
    but the little that was there
    was really impressive.
  • 0:24 - 0:28
    Essentially, what the company
    was doing was creating a medical device
  • 0:28 - 0:33
    where you would be able
    to run your entire blood panel
  • 0:33 - 0:34
    on a finger-stick of blood.
  • 0:34 - 0:38
    So you wouldn't have to get
    a big needle stuck in your arm
  • 0:38 - 0:40
    in order to get your blood test done.
  • 0:40 - 0:44
    So this was interesting
    not only because it was less painful,
  • 0:44 - 0:49
    but also, it could potentially
    open the door to predictive diagnostics.
  • 0:49 - 0:50
    If you had a device
  • 0:50 - 0:55
    that allowed for more frequent
    and continuous diagnosis,
  • 0:55 - 1:01
    potentially, you could diagnose disease
    before someone got sick.
  • 1:01 - 1:05
    And this was confirmed in an interview
    that the founder, Elizabeth Holmes,
  • 1:05 - 1:07
    had given to The Wall Street Journal:
  • 1:07 - 1:09
    "The reality within
    our health-care system today
  • 1:10 - 1:13
    is that when someone
    you care about gets really sick,
  • 1:13 - 1:15
    by the time you find out
    it's [most often] too late
  • 1:15 - 1:16
    to do anything about it.
  • 1:17 - 1:18
    It's heartbreaking."
  • 1:18 - 1:20
    This was a moon shot
    that I really wanted to be a part of
  • 1:20 - 1:22
    and I really wanted to help build.
  • 1:23 - 1:27
    And there was another reason
    why I think the story of Elizabeth
  • 1:27 - 1:30
    really appealed to me.
  • 1:30 - 1:32
    So there was a time
    that someone had said to me,
  • 1:32 - 1:34
    "Erika, there are two types of people.
  • 1:34 - 1:37
    There are those that thrive
    and those that survive.
  • 1:37 - 1:39
    And you, my dear, are a survivor."
  • 1:40 - 1:42
    Before I went to university,
  • 1:42 - 1:46
    I had grown up in a one-bedroom trailer
    with my six family members,
  • 1:46 - 1:48
    and when I told people
    I wanted to go to Berkeley,
  • 1:48 - 1:51
    they would say, "Well, I want
    to be an astronaut, so good luck."
  • 1:52 - 1:56
    And I stuck with it, and I worked hard,
    and I managed to get in.
  • 1:56 - 1:59
    But honestly, my first year
    was very challenging.
  • 1:59 - 2:01
    I was the victim of a series of crimes.
  • 2:01 - 2:04
    I was robbed at gunpoint,
    I was sexually assaulted,
  • 2:04 - 2:07
    and I was sexually assaulted a third time,
  • 2:07 - 2:09
    spurring on very severe panic attacks,
  • 2:10 - 2:11
    where I was failing my classes,
  • 2:11 - 2:13
    and I dropped out of school.
  • 2:13 - 2:16
    And at this moment, people had said to me,
  • 2:16 - 2:18
    "Erika, maybe you're not cut out
    for the sciences.
  • 2:18 - 2:21
    Maybe you should reconsider
    doing something else."
  • 2:21 - 2:24
    And I told myself, "You know what?
  • 2:24 - 2:26
    If I don't make the cut,
    I don't make the cut,
  • 2:26 - 2:29
    but I cannot give up on myself,
    and I'm going to go for this,
  • 2:29 - 2:33
    and even if I'm not the best for it,
    I'm going to try and make it happen."
  • 2:33 - 2:37
    And luckily, I stuck with it,
    and I got the degree, and I graduated.
  • 2:38 - 2:41
    (Applause and cheers)
  • 2:41 - 2:42
    Thank you.
  • 2:42 - 2:44
    (Applause)
  • 2:45 - 2:50
    So when I heard Elizabeth Holmes
    had dropped out of Stanford at age 19
  • 2:50 - 2:52
    to start this company,
  • 2:52 - 2:54
    and it was becoming quite successful,
  • 2:54 - 2:56
    to me, it was a signal
  • 2:56 - 3:00
    of, you know, it didn't matter
    what your background was.
  • 3:00 - 3:03
    As long as you committed
    to hard work and intelligence,
  • 3:03 - 3:06
    that was enough to make
    an impact in the world.
  • 3:06 - 3:08
    And this was something,
    for me, personally,
  • 3:08 - 3:10
    that I had to believe in my life,
  • 3:10 - 3:13
    because it was one of the few
    anchors that I had had
  • 3:13 - 3:15
    that got me through the day.
  • 3:16 - 3:17
    So you can imagine,
  • 3:17 - 3:22
    when I received this letter,
    I was so excited.
  • 3:22 - 3:24
    I was over the moon.
  • 3:24 - 3:28
    This was finally my opportunity
    to contribute to society,
  • 3:28 - 3:32
    to solve the problems
    that I had seen in the world,
  • 3:32 - 3:34
    and really, when I thought about Theranos,
  • 3:34 - 3:37
    I really anticipated
    that this would be the first
  • 3:37 - 3:40
    and the last company
    that I was going to work for.
  • 3:41 - 3:44
    But I started to notice some problems.
  • 3:45 - 3:50
    So, I started off as
    an entry-level associate in the lab.
  • 3:50 - 3:53
    And we would be sitting in a lab meeting,
  • 3:53 - 3:57
    reviewing data to confirm
    whether the technology worked or not,
  • 3:57 - 3:59
    and we'd get datasets like this,
  • 3:59 - 4:01
    and someone would say to me,
  • 4:01 - 4:04
    "Well, let's get rid of the outlier
  • 4:04 - 4:06
    and see how that affects
    the accuracy rate."
  • 4:07 - 4:09
    So what constitutes an outlier here?
  • 4:09 - 4:11
    Which one is the outlier?
  • 4:12 - 4:14
    And the answer is, you have no idea.
  • 4:14 - 4:17
    You don't know. Right?
  • 4:17 - 4:18
    And deleting a data point
  • 4:18 - 4:22
    is really violating one of the things
    that I found so beautiful
  • 4:22 - 4:24
    about the scientific process --
  • 4:24 - 4:29
    it really allows the data
    to reveal the truth to you.
  • 4:29 - 4:33
    And as tempting as it might be
    in certain scenarios
  • 4:33 - 4:37
    to place your story on the data
    to confirm your own narrative,
  • 4:37 - 4:43
    when you do this, it has really bad
    future consequences.
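
The data-trimming practice described above is easier to see with numbers. Below is a minimal Python sketch using made-up values, not Theranos data, and a hypothetical 10 percent accuracy target; it shows how deleting a single point after the fact, with no predefined criterion, can flip a set of replicate readings from failing to passing.

```python
# Minimal sketch with hypothetical numbers -- not Theranos data.
# Five replicate readings of a control sample whose true value is known.
known_value = 10.0
measurements = [9.8, 10.3, 10.1, 9.7, 14.2]

def mean_pct_error(values, truth):
    """Mean absolute error relative to the known value, in percent."""
    return sum(abs(v - truth) for v in values) / len(values) / truth * 100

# Keeping every point: ~10.2% mean error, which fails a (hypothetical) 10% target.
print(mean_pct_error(measurements, known_value))

# Dropping the "outlier" after the fact, with no predefined criterion:
trimmed = [v for v in measurements if abs(v - known_value) < 2.0]
# Now ~2.3% mean error -- the same device suddenly appears to "pass".
print(mean_pct_error(trimmed, known_value))
```
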
  • 4:43 - 4:47
    So this, to me, was almost
    immediately a red flag,
  • 4:47 - 4:50
    and it kind of folded into
    the next experience
  • 4:50 - 4:51
    and the next red flag
  • 4:51 - 4:54
    that I started to see
    within the clinical laboratory.
  • 4:54 - 4:56
    So a clinical laboratory
  • 4:56 - 4:59
    is where you actively process
    patient samples.
  • 4:59 - 5:01
    And so before I would run
    a patient's sample,
  • 5:01 - 5:05
    I would have a sample
    where I knew what the concentration was,
  • 5:05 - 5:08
    and in this case, it was 0.2 for tPSA,
  • 5:08 - 5:11
    which is an indicator
    of whether someone has prostate cancer,
  • 5:11 - 5:13
    or is at risk of prostate cancer or not.
  • 5:13 - 5:17
    But then, when I'd run it
    in the Theranos device,
  • 5:17 - 5:19
    it would come out 8.9,
  • 5:19 - 5:22
    and then I'd run it again,
    and it would come out 5.1,
  • 5:22 - 5:26
    and I would run it again,
    and it would come out 0.5,
  • 5:26 - 5:28
    which is technically in range,
  • 5:28 - 5:30
    but what do you do in this scenario?
  • 5:31 - 5:33
    What is the accurate answer?
  • 5:34 - 5:38
    And this wasn't something
    I was seeing just as a one-off.
  • 5:38 - 5:41
    This was happening nearly every day,
  • 5:41 - 5:44
    across so many different tests.
  • 5:45 - 5:51
    And mind you, this is for a sample
    where I know what the concentration is.
  • 5:51 - 5:54
    What happens when I don't know
    what the concentration is,
  • 5:54 - 5:56
    like with a patient sample?
  • 5:56 - 6:02
    How am I supposed to trust
    what the result is, at that point?
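
The three readings quoted above are enough to make the problem concrete. This is a small Python sketch of the kind of replicate check a clinical lab might run on such a control; the readings (8.9, 5.1, 0.5 against a known 0.2) come from the talk, while the single-digit CV expectation is a general rule of thumb, not a Theranos specification.

```python
# Small sketch: precision and bias of the three tPSA readings quoted in the talk,
# measured against the control's known concentration of 0.2.
import statistics

known = 0.2
readings = [8.9, 5.1, 0.5]

mean = statistics.mean(readings)                 # ~4.83
cv = statistics.stdev(readings) / mean * 100     # coefficient of variation, in percent (~87%)
bias = (mean - known) / known * 100              # deviation from the known value, in percent

print(f"mean={mean:.2f}  CV={cv:.0f}%  bias={bias:+.0f}%")
# Clinical assays generally aim for coefficients of variation in the single digits;
# with a spread this wide, no individual reading can be trusted.
```
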
  • 6:03 - 6:08
    So this led to, sort of,
    the final red flag for me,
  • 6:08 - 6:11
    and this is when we were doing testing,
  • 6:11 - 6:14
    in order to confirm and certify
  • 6:14 - 6:17
    whether we could continue
    processing patient samples.
  • 6:17 - 6:20
    So what regulators will do
    is they'll give you a sample,
  • 6:20 - 6:23
    and they'll say, "Run this sample,
  • 6:23 - 6:26
    just like the quality control,
    through your normal workflow,
  • 6:26 - 6:28
    how you normally test on patients,
  • 6:28 - 6:29
    and then give us the results,
  • 6:29 - 6:33
    and we will tell you:
    do you pass, or do you fail?"
  • 6:33 - 6:37
    So because we were seeing
    so many issues with the Theranos device
  • 6:37 - 6:40
    that was actively being used
    to test on patients,
  • 6:40 - 6:43
    what we had done
    is we had taken the sample
  • 6:43 - 6:46
    and we had run it
    through an FDA-approved machine
  • 6:46 - 6:48
    and we had run it
    through the Theranos device.
  • 6:49 - 6:50
    And guess what happened?
  • 6:50 - 6:54
    We got two very, very different results.
  • 6:55 - 6:57
    So what do you think
    they did in this scenario?
  • 6:57 - 7:01
    You would anticipate
    that you would tell the regulators,
  • 7:01 - 7:04
    like, "We have some discrepancies here
    with this new technology."
  • 7:04 - 7:09
    But instead, Theranos had sent the result
    of the FDA-approved machine.
  • 7:11 - 7:13
    So what does this signal to you?
  • 7:13 - 7:17
    This signals to you
    that even within your own organization,
  • 7:17 - 7:21
    you don't trust the results
    that your technology is producing.
  • 7:22 - 7:26
    So how do we have any business
    running patient samples
  • 7:26 - 7:28
    on this particular machine?
  • 7:29 - 7:33
    So of course, you know,
    I am a recent grad,
  • 7:33 - 7:36
    I have, at this point,
    run all these different experiments,
  • 7:36 - 7:40
    I've compiled all this evidence,
    and I went into the office of the COO
  • 7:40 - 7:42
    to raise my concerns:
  • 7:43 - 7:46
    "Within the lab, we're seeing
    a lot of variability.
  • 7:46 - 7:48
    The accuracy rate doesn't seem right.
  • 7:48 - 7:51
    I don't feel right
    about testing on patients.
  • 7:51 - 7:54
    These things, I'm just
    not comfortable with."
  • 7:54 - 7:56
    And the response I got back was,
  • 7:57 - 7:59
    "You don't know
    what you're talking about.
  • 7:59 - 8:01
    What you need to do
    is what I'm paying you to do,
  • 8:01 - 8:03
    and you need to process patient samples."
  • 8:04 - 8:08
    So that night, I called up
    a colleague of mine
  • 8:08 - 8:12
    who I had befriended
    within the organization, Tyler Shultz,
  • 8:12 - 8:17
    who also happened to have a grandfather
    who was on the Board of Directors.
  • 8:17 - 8:20
    And so we had decided
    to go to his grandfather's house
  • 8:20 - 8:23
    and tell him, at dinner,
  • 8:23 - 8:26
    that what the company
    was telling him was going on
  • 8:26 - 8:30
    was actually not what was happening
    behind closed doors.
  • 8:30 - 8:31
    And not to mention,
  • 8:31 - 8:34
    Tyler's grandfather was George Shultz,
  • 8:34 - 8:37
    the ex-secretary of state
    of the United States.
  • 8:37 - 8:41
    So you can imagine me
    as a 20-something-year-old
  • 8:41 - 8:44
    just shaking, like, "What are you
    getting yourself into?"
  • 8:45 - 8:49
    But we had sat down
    at his dinner table and said,
  • 8:49 - 8:51
    "When you think that they've taken
    this blood sample
  • 8:51 - 8:55
    and they put it in this device,
    and it pops out a result,
  • 8:55 - 9:00
    what's really happening is the moment
    you step outside of the room,
  • 9:00 - 9:03
    they take that blood sample,
    they run it to a back door,
  • 9:03 - 9:07
    and there are five people on standby
    that are taking this tiny blood sample
  • 9:07 - 9:10
    and splitting it amongst
    five different machines."
  • 9:11 - 9:15
    And he says to us,
    "I know Tyler's very smart,
  • 9:15 - 9:16
    you seem very smart,
  • 9:16 - 9:21
    but the fact of the matter is I've brought
    in a wealth of intelligent people,
  • 9:21 - 9:25
    and they tell me that this device
    is going to revolutionize health care.
  • 9:25 - 9:28
    And so maybe you should consider
    doing something else."
  • 9:29 - 9:33
    So this had gone on for a period
    of about seven months,
  • 9:33 - 9:37
    and I decided to quit the very next day.
  • 9:38 - 9:39
    And this --
  • 9:39 - 9:46
    (Applause and cheers)
  • 9:46 - 9:49
    But this was a moment
    that I had to sit with myself
  • 9:49 - 9:51
    and do a bit of a mental health check.
  • 9:51 - 9:53
    I'd raised concerns in the lab.
  • 9:54 - 9:57
    I'd raised concerns with the COO.
  • 9:57 - 10:00
    I had raised concerns with a board member.
  • 10:00 - 10:02
    And meanwhile,
  • 10:02 - 10:08
    Elizabeth is on the cover
    of every major magazine across America.
  • 10:09 - 10:11
    So there's one common thread here,
  • 10:11 - 10:12
    and that's me.
  • 10:12 - 10:14
    Maybe I'm the problem?
  • 10:14 - 10:16
    Maybe there's something
    that I'm not seeing?
  • 10:16 - 10:18
    Maybe I'm the crazy one.
  • 10:19 - 10:23
    And this is the part in my story
    where I really get lucky.
  • 10:23 - 10:24
    I was approached
  • 10:24 - 10:26
    by a very talented journalist,
    John Carreyrou
  • 10:27 - 10:29
    from the Wall Street Journal, and he --
  • 10:30 - 10:36
    And he had basically said
    that he also had heard concerns
  • 10:36 - 10:39
    about the company
    from other people in the industry
  • 10:39 - 10:41
    and from people working at the company.
  • 10:41 - 10:43
    And in that moment, it clicked in my head:
  • 10:43 - 10:45
    "Erika, you are not crazy.
  • 10:45 - 10:47
    You're not the crazy one.
  • 10:47 - 10:50
    In fact, there are other people
    out there just like you
  • 10:50 - 10:53
    that are just as scared of coming forward,
  • 10:53 - 10:57
    but see the same problems
    and the same concerns that you do."
  • 10:58 - 11:02
    So before John's exposé
    and investigative report had come out
  • 11:02 - 11:05
    to reveal the truth
    of what was going on in the company,
  • 11:05 - 11:09
    the company decided to go on a witch hunt
    for all sorts of former employees,
  • 11:09 - 11:11
    myself included,
  • 11:11 - 11:18
    to basically intimidate us out of coming
    forward or talking to one another.
  • 11:18 - 11:20
    And the scary thing,
    really, for me in this instance
  • 11:20 - 11:22
    was that, once I received this letter,
  • 11:22 - 11:26
    I realized that they were following me,
  • 11:26 - 11:30
    but it was also, in a way,
    a bit of a blessing,
  • 11:30 - 11:32
    because it forced me to call a lawyer.
  • 11:32 - 11:35
    And I was lucky enough --
    I called a free lawyer,
  • 11:35 - 11:36
    but he had suggested,
  • 11:36 - 11:39
    "Why don't you report
    to a regulatory agency?"
  • 11:40 - 11:44
    And this was something
    that didn't even click in my head,
  • 11:45 - 11:47
    probably because I was so inexperienced,
  • 11:47 - 11:51
    but once that happened,
    that's exactly what I did.
  • 11:51 - 11:55
    I decided to write a complaint letter
    to regulators,
  • 11:55 - 11:59
    illustrating all the deficiencies
    and the problems that I had seen
  • 11:59 - 12:01
    in the laboratory.
  • 12:01 - 12:04
    And as endearingly as my dad
    kind of notes this
  • 12:04 - 12:06
    as being my, like, dragon-slayer moment,
  • 12:06 - 12:09
    where I had risen up
    and fought this behemoth
  • 12:09 - 12:11
    and it caused this domino effect,
  • 12:11 - 12:13
    I can tell you right now,
  • 12:13 - 12:15
    I felt anything but courageous.
  • 12:16 - 12:19
    I was scared, I was terrified,
  • 12:19 - 12:20
    I was anxious,
  • 12:21 - 12:24
    I was ashamed, slightly,
  • 12:24 - 12:26
    that it took me a month
    to write the letter.
  • 12:26 - 12:28
    There was a glimmer of hope in there
  • 12:28 - 12:30
    that maybe somehow
    no one would ever figure out
  • 12:30 - 12:32
    that it was me.
  • 12:32 - 12:36
    But despite all that emotion
    and all that volatility,
  • 12:36 - 12:37
    I still did it,
  • 12:37 - 12:40
    and luckily, it triggered an investigation
  • 12:40 - 12:42
    that brought to light
  • 12:42 - 12:44
    that there were huge
    deficiencies in the lab,
  • 12:44 - 12:47
    and it stopped Theranos
    from processing patient samples.
  • 12:47 - 12:54
    (Applause)
  • 12:56 - 12:59
    So you would hope,
    going through a very challenging
  • 12:59 - 13:02
    and crazy situation like this,
  • 13:02 - 13:05
    that I would be able
    to sort of formulate some how-tos
  • 13:05 - 13:09
    or a recipe for success for other people
    who are in this situation.
  • 13:09 - 13:12
    But frankly, when it comes
    to situations like this,
  • 13:12 - 13:17
    the only quote that kind of gets it right
    is this Mike Tyson quote that says,
  • 13:17 - 13:20
    "Everyone has a plan
    until they get punched in the mouth."
  • 13:20 - 13:22
    (Laughter)
  • 13:22 - 13:25
    And that's exactly how this is.
  • 13:25 - 13:27
    But today, you know,
  • 13:27 - 13:29
    we're here to kind of
    convene on moon shots,
  • 13:29 - 13:34
    and moon shots are these highly
    innovative projects
  • 13:34 - 13:35
    that are very ambitious,
  • 13:35 - 13:37
    that everyone wants to believe in.
  • 13:38 - 13:42
    But what happens
    when the vision is so compelling
  • 13:42 - 13:45
    and the desire to believe is so strong
  • 13:45 - 13:50
    that it starts to cloud your judgment
    about what reality is?
  • 13:51 - 13:54
    And particularly
    when these innovative projects
  • 13:54 - 13:57
    start to be a detriment to society,
  • 13:57 - 13:59
    what are the mechanisms in place
  • 13:59 - 14:04
    in which we can prevent
    these potential consequences?
  • 14:04 - 14:08
    And really, in my mind,
    the simplest way to do that
  • 14:08 - 14:13
    is to foster stronger cultures
    where people speak up,
  • 14:13 - 14:15
    and where we listen to those who speak up.
  • 14:16 - 14:19
    So now the big question is,
  • 14:19 - 14:24
    how do we make speaking up the norm
    and not the exception?
  • 14:24 - 14:31
    (Applause and cheers)
  • 14:31 - 14:34
    So luckily, in my own experience,
  • 14:34 - 14:37
    I realized that when it comes
    to speaking up,
  • 14:37 - 14:40
    the action tends to be
    pretty straightforward in most cases,
  • 14:41 - 14:46
    but the hard part is really deciding
    whether to act or not.
  • 14:46 - 14:48
    So how do we frame our decisions
  • 14:48 - 14:53
    in a way that makes it
    easier for us to act
  • 14:53 - 14:55
    and produce more ethical outcomes?
  • 14:56 - 15:00
    So UC San Diego came up
    with this excellent framework
  • 15:00 - 15:01
    called the "Three Cs,"
  • 15:01 - 15:06
    which stand for commitment,
    consciousness and competency.
  • 15:06 - 15:09
    And commitment is the desire
    to do the right thing
  • 15:09 - 15:11
    regardless of the cost.
  • 15:11 - 15:12
    In my case at Theranos,
  • 15:13 - 15:14
    if I was wrong,
  • 15:14 - 15:16
    I was going to have
    to pay the consequences.
  • 15:16 - 15:18
    But if I was right,
  • 15:18 - 15:20
    the fact that I could have been a person
  • 15:20 - 15:24
    that knew what was going on
    and didn't say something,
  • 15:24 - 15:25
    that was purgatory.
  • 15:25 - 15:27
    Being silent was purgatory.
  • 15:29 - 15:31
    Then there's consciousness,
  • 15:31 - 15:34
    the awareness to act consistently
    and apply moral convictions
  • 15:34 - 15:37
    to daily behavior.
  • 15:38 - 15:41
    And the third aspect is competency.
  • 15:41 - 15:45
    And competency is the ability
    to collect and evaluate information
  • 15:45 - 15:48
    and foresee potential
    consequences and risk.
  • 15:48 - 15:51
    And the reason I could trust my competency
  • 15:51 - 15:55
    was because I was acting
    in service of others.
  • 15:55 - 15:59
    So I think a simple process
    is really taking those actions
  • 15:59 - 16:00
    and imagining,
  • 16:00 - 16:03
    "If this happened to my children,
  • 16:03 - 16:04
    to my parents,
  • 16:04 - 16:06
    to my spouse,
  • 16:06 - 16:09
    to my neighbors, to my community,
  • 16:09 - 16:10
    if I took that ...
  • 16:12 - 16:13
    How will it be remembered?"
  • 16:15 - 16:16
    And with that,
  • 16:16 - 16:18
    I hope, as we all leave here
  • 16:18 - 16:21
    and venture off
    to build our own moon shots,
  • 16:21 - 16:23
    we don't just conceptualize them,
  • 16:23 - 16:27
    in a way, as a means
    for people to survive
  • 16:27 - 16:33
    but really see them as opportunities
    and chances for everybody to thrive.
  • 16:34 - 16:35
    Thank you.
  • 16:35 - 16:36
    (Applause and cheers)
Title:
Theranos, whistleblowing and speaking truth to power
Speaker:
Erika Cheung
Description:

In 2014, Erika Cheung made a discovery that would ultimately help bring down her employer, Theranos, as well as its founder, Elizabeth Holmes, who claimed to have invented technology that would transform medicine. The decision to become a whistleblower proved a hard lesson in figuring out how to do what's right in the face of both personal and professional obstacles. With candor and humility, Cheung shares her journey of speaking truth to power -- and offers a framework to encourage others to come forward and act in the service of all.

Video Language:
English
Project:
TEDTalks
Duration:
16:50

