How synthetic biology could wipe out humanity -- and how we can stop it

  • 0:01 - 0:04
    So, there's about
    seven and a half billion of us.
  • 0:05 - 0:09
    The World Health Organization tells us
    that 300 million of us are depressed,
  • 0:09 - 0:12
    and about 800,000 people
    take their lives every year.
  • 0:13 - 0:17
    A tiny subset of them choose
    a profoundly nihilistic route,
  • 0:17 - 0:21
    which is they die in the act of killing
    as many people as possible.
  • 0:21 - 0:23
    These are some famous recent examples.
  • 0:24 - 0:27
    And here's a less famous one.
    It happened about nine weeks ago.
  • 0:27 - 0:28
    If you don't remember it,
  • 0:28 - 0:30
    it's because there's
    a lot of this going on.
  • 0:30 - 0:35
    Wikipedia just last year
    counted 323 mass shootings
  • 0:35 - 0:38
    in my home country, the United States.
  • 0:38 - 0:40
    Not all of those shooters were suicidal,
  • 0:40 - 0:43
    not all of them were maximizing
    their death tolls,
  • 0:43 - 0:44
    but many, many were.
  • 0:45 - 0:49
    An important question becomes:
    What limits do these people have?
  • 0:49 - 0:50
    Take the Vegas shooter.
  • 0:50 - 0:53
    He slaughtered 58 people.
  • 0:53 - 0:55
    Did he stop there because he'd had enough?
  • 0:56 - 1:01
    No, and we know this because
    he shot and injured another 422 people
  • 1:01 - 1:04
    who he surely would have
    preferred to kill.
  • 1:04 - 1:07
    We have no reason to think
    he would have stopped at 4,200.
  • 1:07 - 1:11
    In fact, with somebody this nihilistic,
    he may well have gladly killed us all.
  • 1:11 - 1:12
    We don't know.
  • 1:13 - 1:15
    What we do know is this:
  • 1:15 - 1:18
    when suicidal murderers really go all in,
  • 1:18 - 1:21
    technology is the force multiplier.
  • 1:22 - 1:23
    Here's an example.
  • 1:23 - 1:28
    Several years back, there was a rash
    of 10 mass school attacks in China
  • 1:28 - 1:32
    carried out with things
    like knives and hammers and cleavers,
  • 1:32 - 1:34
    because guns are really hard to get there.
  • 1:34 - 1:38
    By macabre coincidence,
    this last attack occurred
  • 1:38 - 1:42
    just hours before the massacre
    in Newtown, Connecticut.
  • 1:42 - 1:46
    But that one American attack killed
    roughly the same number of victims
  • 1:46 - 1:49
    as the 10 Chinese attacks combined.
  • 1:49 - 1:55
    So we can fairly say,
    knife: terrible; gun: way worse.
  • 1:55 - 1:58
    And airplane: massively worse,
  • 1:58 - 2:02
    as pilot Andreas Lubitz showed
    when he forced 149 people
  • 2:02 - 2:04
    to join him in his suicide,
  • 2:04 - 2:07
    smashing a plane into the French Alps.
  • 2:07 - 2:10
    And there are other examples of this.
  • 2:10 - 2:15
    And I'm afraid there are far more deadly
    weapons in our near future than airplanes,
  • 2:15 - 2:17
    ones not made of metal.
  • 2:17 - 2:22
    So let's consider the apocalyptic
    dynamics that will ensue
  • 2:22 - 2:27
    if suicidal mass murder hitches a ride
    on a rapidly advancing field
  • 2:27 - 2:31
    that for the most part holds
    boundless promise for society.
  • 2:32 - 2:35
    Somewhere out there in the world,
    there's a tiny group of people
  • 2:35 - 2:37
    who would attempt, however ineptly,
  • 2:37 - 2:40
    to kill us all if they
    could just figure out how.
  • 2:41 - 2:43
    The Vegas shooter may or may not
    have been one of them,
  • 2:43 - 2:45
    but with seven and a half billion of us,
  • 2:45 - 2:48
    this is a nonzero population.
  • 2:48 - 2:50
    There are plenty of suicidal
    nihilists out there.
  • 2:50 - 2:51
    We've already seen that.
  • 2:51 - 2:55
    There are people with severe mood disorders
    that they can't even control.
  • 2:55 - 3:00
    There are people who have just suffered
    deranging traumas, etc. etc.
  • 3:01 - 3:03
    As for the corollary group,
  • 3:03 - 3:07
    its size was simply zero forever
    until the Cold War,
  • 3:07 - 3:10
    when suddenly, the leaders
    of two global alliances
  • 3:10 - 3:13
    attained the ability to blow up the world.
  • 3:14 - 3:18
    The number of people
    with actual doomsday buttons
  • 3:18 - 3:20
    has stayed fairly stable since then.
  • 3:20 - 3:22
    But I'm afraid it's about to grow,
  • 3:22 - 3:24
    and not just to three.
  • 3:24 - 3:25
    This is going off the charts.
  • 3:26 - 3:28
    I mean, it's going to look
    like a tech business plan.
  • 3:28 - 3:30
    (Laughter)
  • 3:32 - 3:34
    And the reason is,
  • 3:34 - 3:36
    we're in the era
    of exponential technologies,
  • 3:36 - 3:41
    which routinely take
    eternal impossibilities
  • 3:41 - 3:47
    and make them the actual superpowers
    of one or two living geniuses
  • 3:47 - 3:49
    and -- this is the big part --
  • 3:49 - 3:51
    then diffuse those powers
    to more or less everybody.
  • 3:52 - 3:54
    Now, here's a benign example.
  • 3:54 - 3:58
    If you wanted to play checkers
    with a computer in 1952,
  • 3:58 - 4:02
    you literally had to be that guy,
  • 4:02 - 4:07
    then commandeer one of the world's
    19 copies of that computer,
  • 4:07 - 4:10
    then use your Nobel-adjacent brain
    to teach it checkers.
  • 4:10 - 4:12
    That was the bar.
  • 4:13 - 4:18
    Today, you just need to know someone
    who knows someone who owns a telephone,
  • 4:18 - 4:21
    because computing
    is an exponential technology.
  • 4:21 - 4:23
    So is synthetic biology,
  • 4:23 - 4:25
    which I'll now refer to as "synbio."
  • 4:26 - 4:32
    And in 2011, a couple of researchers
    did something every bit as ingenious
  • 4:32 - 4:35
    and unprecedented as the checkers trick
  • 4:35 - 4:37
    with H5N1 flu.
  • 4:37 - 4:42
    This is a strain that kills
    up to 60 percent of the people it infects,
  • 4:42 - 4:44
    more than Ebola.
  • 4:44 - 4:47
    But it is so uncontagious
  • 4:47 - 4:51
    that it's killed fewer
    than 50 people since 2015.
  • 4:51 - 4:55
    So these researchers edited H5N1's genome
  • 4:55 - 4:59
    and made it every bit as deadly,
    but also wildly contagious.
  • 4:59 - 5:03
    The news arm of one of the world's
    top two scientific journals
  • 5:03 - 5:06
    said if this thing got out,
    it would likely cause a pandemic
  • 5:06 - 5:09
    with perhaps millions of deaths.
  • 5:09 - 5:11
    And Dr. Paul Keim said
  • 5:11 - 5:15
    he could not think of an organism
    as scary as this,
  • 5:15 - 5:18
    which is the last thing
    I personally want to hear
  • 5:18 - 5:22
    from the Chairman of the National
    Science Advisory Board on Biosecurity.
  • 5:23 - 5:26
    And by the way, Dr. Keim also said this --
  • 5:26 - 5:28
    ["I don't think anthrax
    is scary at all compared to this."]
  • 5:28 - 5:30
    And he's also one of these.
  • 5:30 - 5:31
    [Anthrax expert] (Laughter)
  • 5:31 - 5:35
    Now, the good news about the 2011 biohack
  • 5:35 - 5:38
    is that the people who did it
    didn't mean us any harm.
  • 5:38 - 5:39
    They're virologists.
  • 5:39 - 5:41
    They believed they were advancing science.
  • 5:41 - 5:45
    The bad news is that technology
    does not freeze in place,
  • 5:45 - 5:47
    and over the next few decades,
  • 5:47 - 5:50
    their feat will become trivially easy.
  • 5:51 - 5:54
    In fact, it's already way easier,
    because as we learned yesterday morning,
  • 5:54 - 5:56
    just two years after they did their work,
  • 5:56 - 6:00
    the CRISPR system was harnessed
    for genome editing.
  • 6:00 - 6:02
    This was a radical breakthrough
  • 6:02 - 6:05
    that makes gene editing
    massively easier --
  • 6:05 - 6:08
    so easy that CRISPR
    is now taught in high schools.
  • 6:09 - 6:12
    And this stuff is moving
    quicker than computing.
  • 6:12 - 6:15
    That slow, stodgy white line up there?
  • 6:15 - 6:18
    That's Moore's law.
  • 6:18 - 6:21
    That shows us how quickly
    computing is getting cheaper.
  • 6:21 - 6:24
    That steep, crazy-fun green line,
  • 6:24 - 6:28
    that shows us how quickly
    genetic sequencing is getting cheaper.
  • 6:29 - 6:32
    Now, gene editing
    and synthesis and sequencing,
  • 6:32 - 6:35
    they're different disciplines,
    but they're tightly related.
  • 6:35 - 6:37
    And they're all moving
    at these headlong rates.
  • 6:37 - 6:41
    And the keys to the kingdom
    are these tiny, tiny data files.
  • 6:41 - 6:45
    That is an excerpt of H5N1's genome.
  • 6:45 - 6:48
    The whole thing can fit
    on just a few pages.
  • 6:48 - 6:51
    And yeah, don't worry, you can Google this
    as soon as you get home.
  • 6:51 - 6:53
    It's all over the internet, right?
  • 6:53 - 6:55
    And the part that made it contagious
  • 6:55 - 6:57
    could well fit on a single Post-it note.
  • 6:57 - 7:01
    And once a genius makes a data file,
  • 7:01 - 7:04
    any idiot can copy it,
  • 7:04 - 7:06
    distribute it worldwide
  • 7:06 - 7:07
    or print it.
  • 7:07 - 7:10
    And I don't just mean print it on this,
  • 7:10 - 7:13
    but soon enough, on this.
  • 7:13 - 7:15
    So let's imagine a scenario.
  • 7:15 - 7:19
    Let's say it's 2026,
    to pick an arbitrary year,
  • 7:19 - 7:22
    and a brilliant virologist,
    hoping to advance science
  • 7:22 - 7:24
    and better understand pandemics,
  • 7:24 - 7:25
    designs a new bug.
  • 7:26 - 7:28
    It's as contagious as chicken pox,
  • 7:28 - 7:30
    it's as deadly as Ebola,
  • 7:30 - 7:34
    and it incubates for months and months
    before causing an outbreak,
  • 7:34 - 7:38
    so the whole world can be infected
    before the first sign of trouble.
  • 7:39 - 7:41
    Then, her university gets hacked.
  • 7:41 - 7:44
    And of course,
    this is not science fiction.
  • 7:44 - 7:46
    In fact, just one recent US indictment
  • 7:46 - 7:49
    documents the hacking
    of over 300 universities.
  • 7:50 - 7:56
    So that file with the bug's genome on it
    spreads to the internet's dark corners.
  • 7:56 - 7:59
    And once a file is out there,
    it never comes back --
  • 7:59 - 8:03
    just ask anybody who runs
    a movie studio or a music label.
  • 8:04 - 8:07
    So now maybe in 2026,
  • 8:07 - 8:09
    it would take a true genius
    like our virologist
  • 8:09 - 8:12
    to make the actual living critter,
  • 8:12 - 8:14
    but 15 years later,
  • 8:14 - 8:17
    it may just take a DNA printer
    you can find at any high school.
  • 8:18 - 8:19
    And if not?
  • 8:20 - 8:21
    Give it a couple of decades.
  • 8:22 - 8:25
    So, a quick aside:
  • 8:25 - 8:26
    Remember this slide here?
  • 8:26 - 8:28
    Turn your attention to these two words.
  • 8:29 - 8:35
    If somebody tries this
    and is only 0.1 percent effective,
  • 8:35 - 8:37
    eight million people die.
  • 8:37 - 8:41
    That's 2,500 9/11s.
  • 8:41 - 8:43
    Civilization would survive,
  • 8:43 - 8:46
    but it would be permanently disfigured.
  • 8:47 - 8:50
    So this means we need
    to be concerned about anybody
  • 8:50 - 8:52
    who has the faintest shot on goal,
  • 8:52 - 8:54
    not just geniuses.
  • 8:55 - 9:00
    So today, there's a tiny
    handful of geniuses
  • 9:00 - 9:02
    who probably could make a doomsday bug
  • 9:02 - 9:05
    that's 0.1 percent effective
    and maybe even a little bit more.
  • 9:05 - 9:10
    They tend to be stable and successful
    and so not part of this group.
  • 9:10 - 9:13
    So I guess I'm sorta kinda
    barely OK-ish with that.
  • 9:14 - 9:17
    But what about after technology improves
  • 9:17 - 9:19
    and diffuses
  • 9:19 - 9:22
    and thousands of life science
    grad students are enabled?
  • 9:22 - 9:26
    Is every single one of them
    going to be perfectly stable?
  • 9:26 - 9:28
    Or how about a few years after that,
  • 9:28 - 9:31
    when every stress-ridden
    premed is fully enabled?
  • 9:31 - 9:33
    At some point in that time frame,
  • 9:33 - 9:36
    these circles are going to intersect,
  • 9:36 - 9:39
    because we're now starting to talk about
    hundreds of thousands of people
  • 9:39 - 9:40
    throughout the world.
  • 9:40 - 9:44
    And they recently included that guy
    who dressed up like the Joker
  • 9:44 - 9:48
    and shot 12 people to death
    at a Batman premiere.
  • 9:48 - 9:50
    That was a neuroscience PhD student
  • 9:50 - 9:52
    with an NIH grant.
  • 9:53 - 9:55
    OK, plot twist:
  • 9:55 - 10:00
    I think we can actually survive this one
    if we start focusing on it now.
  • 10:00 - 10:03
    And I say this, having spent
    countless hours
  • 10:03 - 10:06
    interviewing global leaders in synbio
  • 10:06 - 10:10
    and also researching their work
    for science podcasts I create.
  • 10:10 - 10:15
    I have come to fear their work, in case
    I haven't gotten that out there yet --
  • 10:15 - 10:17
    (Laughter)
  • 10:17 - 10:20
    but more than that,
    to revere its potential.
  • 10:20 - 10:24
    This stuff will cure cancer,
    heal our environment
  • 10:24 - 10:27
    and stop our cruel treatment
    of other creatures.
  • 10:28 - 10:32
    So how do we get all this without,
    you know, annihilating ourselves?
  • 10:32 - 10:36
    First thing: like it or not,
    synbio is here,
  • 10:36 - 10:38
    so let's embrace the technology.
  • 10:38 - 10:40
    If we do a tech ban,
  • 10:40 - 10:43
    that would only hand
    the wheel to bad actors.
  • 10:43 - 10:45
    Unlike nuclear programs,
  • 10:45 - 10:48
    biology can be practiced invisibly.
  • 10:48 - 10:51
    Massive Soviet cheating
    on bioweapons treaties
  • 10:51 - 10:55
    made that very clear, as does every
    illegal drug lab in the world.
  • 10:56 - 10:58
    Secondly, enlist the experts.
  • 10:58 - 11:00
    Let's sign them up and make more of them.
  • 11:00 - 11:03
    For every million and one
    bioengineers we have,
  • 11:03 - 11:06
    at least a million of them
    are going to be on our side.
  • 11:07 - 11:10
    I mean, Al Capone
    would be on our side in this one.
  • 11:10 - 11:13
    The bar to being a good guy
    is just so low.
  • 11:13 - 11:17
    And massive numerical
    advantages do matter,
  • 11:17 - 11:20
    even when a single bad guy
    can inflict grievous harm,
  • 11:20 - 11:21
    because among many other things,
  • 11:21 - 11:24
    they allow us to exploit
    the hell out of this:
  • 11:24 - 11:30
    we have years and hopefully decades
    to prepare and prevent.
  • 11:30 - 11:33
    The first person to try something awful --
    and there will be somebody --
  • 11:33 - 11:34
    may not even be born yet.
  • 11:35 - 11:39
    Next, this needs to be an effort
    that spans society,
  • 11:39 - 11:41
    and all of you need to be a part of it,
  • 11:41 - 11:46
    because we cannot ask
    a tiny group of experts
  • 11:46 - 11:51
    to be responsible for both containing
    and exploiting synthetic biology,
  • 11:51 - 11:54
    because we already tried that
    with the financial system,
  • 11:54 - 11:57
    and our stewards became
    massively corrupted
  • 11:57 - 12:00
    as they figured out
    how they could cut corners,
  • 12:00 - 12:03
    inflict massive, massive risks
    on the rest of us
  • 12:03 - 12:05
    and privatize the gains,
  • 12:05 - 12:07
    becoming repulsively wealthy
  • 12:07 - 12:09
    while they stuck us
    with the $22 trillion bill.
  • 12:11 - 12:12
    And more recently --
  • 12:12 - 12:14
    (Applause)
  • 12:14 - 12:17
    Are you the ones who have gotten
    the thank-you letters?
  • 12:17 - 12:19
    I'm still waiting for mine.
  • 12:19 - 12:21
    I just figured they were
    too busy to be grateful.
  • 12:21 - 12:23
    And much more recently,
  • 12:24 - 12:28
    online privacy started looming
    as a huge issue,
  • 12:28 - 12:30
    and we basically outsourced it.
  • 12:30 - 12:31
    And once again:
  • 12:31 - 12:34
    privatized gains, socialized losses.
  • 12:34 - 12:36
    Is anybody else sick of this pattern?
  • 12:36 - 12:40
    (Applause)
  • 12:40 - 12:46
    So we need a more inclusive way
    to safeguard our prosperity,
  • 12:46 - 12:47
    our privacy
  • 12:47 - 12:49
    and soon, our lives.
  • 12:49 - 12:52
    So how do we do all of this?
  • 12:52 - 12:55
    Well, when bodies fight pathogens,
  • 12:55 - 12:57
    they use ingenious immune systems,
  • 12:57 - 12:59
    which are very complex and multilayered.
  • 12:59 - 13:02
    Why don't we build one of these
    for the whole damn ecosystem?
  • 13:02 - 13:06
    There's a year of TED Talks that could
    be given on this first critical layer.
  • 13:06 - 13:09
    So these are just a couple
    of many great ideas that are out there.
  • 13:09 - 13:11
    Some R and D muscle
  • 13:11 - 13:16
    could take the very primitive
    pathogen sensors that we currently have
  • 13:16 - 13:19
    and put them on a very steep
    price-performance curve
  • 13:19 - 13:21
    that would quickly become ingenious
  • 13:21 - 13:22
    and networked
  • 13:22 - 13:27
    and gradually as widespread
    as smoke detectors and even smartphones.
  • 13:27 - 13:29
    On a very related note:
  • 13:29 - 13:31
    vaccines have all kinds of problems
  • 13:31 - 13:35
    when it comes to manufacturing
    and distribution,
  • 13:35 - 13:39
    and once they're made, they can't adapt
    to new threats or mutations.
  • 13:39 - 13:42
    We need an agile biomanufacturing base
  • 13:42 - 13:46
    extending into every single pharmacy
    and maybe even our homes.
  • 13:46 - 13:50
    Printer technology for vaccines
    and medicines is within reach
  • 13:50 - 13:52
    if we prioritize it.
  • 13:52 - 13:54
    Next, mental health.
  • 13:54 - 13:57
    Many people who commit
    suicidal mass murder
  • 13:57 - 14:01
    suffer from crippling,
    treatment-resistant depression or PTSD.
  • 14:01 - 14:05
    We need noble researchers
    like Rick Doblin working on this,
  • 14:05 - 14:09
    but we also need the selfish jerks
    who are way more numerous
  • 14:09 - 14:15
    to appreciate the fact that acute
    suffering will soon endanger all of us,
  • 14:15 - 14:17
    not just those afflicted.
  • 14:17 - 14:21
    Those jerks will then
    join us and Al Capone
  • 14:21 - 14:23
    in fighting this condition.
  • 14:23 - 14:29
    Third, each and every one of us
    can be and should be a white blood cell
  • 14:29 - 14:31
    in this immune system.
  • 14:31 - 14:35
    Suicidal mass murderers
    can be despicable, yes,
  • 14:35 - 14:39
    but they're also terribly
    broken and sad people,
  • 14:39 - 14:41
    and those of us who aren't
    need to do what we can
  • 14:41 - 14:43
    to make sure nobody goes unloved.
  • 14:45 - 14:49
    (Applause)
  • 14:50 - 14:52
    Next, we need to make
    fighting these dangers
  • 14:52 - 14:55
    core to the discipline
    of synthetic biology.
  • 14:55 - 14:57
    There are companies out there
    that at least claim
  • 14:57 - 15:00
    they let their engineers
    spend 20 percent of their time
  • 15:00 - 15:02
    doing whatever they want.
  • 15:02 - 15:04
    What if those who hire bioengineers
  • 15:04 - 15:06
    and become them
  • 15:06 - 15:10
    give 20 percent of their time
    to building defenses for the common good?
  • 15:11 - 15:13
    Not a bad idea, right?
  • 15:13 - 15:14
    (Applause)
  • 15:14 - 15:17
    Then, finally: this won't be any fun.
  • 15:17 - 15:21
    But we need to let our minds
    go to some very, very dark places,
  • 15:21 - 15:24
    and thank you for letting me
    take you there this evening.
  • 15:24 - 15:26
    We survived the Cold War
  • 15:26 - 15:31
    because every one of us understood
    and respected the danger,
  • 15:31 - 15:33
    in part, because we had spent decades
  • 15:33 - 15:36
    telling ourselves terrifying ghost stories
  • 15:36 - 15:38
    with names like "Dr. Strangelove"
  • 15:38 - 15:39
    and "War Games."
  • 15:39 - 15:42
    This is no time to remain calm.
  • 15:42 - 15:45
    This is one of those rare times
    when it's incredibly productive
  • 15:45 - 15:47
    to freak the hell out --
  • 15:48 - 15:50
    (Laughter)
  • 15:50 - 15:52
    to come up with some ghost stories
  • 15:52 - 15:57
    and use our fear as fuel
    to fight this danger.
  • 15:58 - 16:02
    Because, all these
    terrible scenarios I've painted --
  • 16:02 - 16:04
    they are not destiny.
  • 16:04 - 16:06
    They're optional.
  • 16:06 - 16:10
    The danger is still kind of distant.
  • 16:10 - 16:13
    And that means it will only befall us
  • 16:13 - 16:15
    if we allow it to.
  • 16:16 - 16:17
    Let's not.
  • 16:17 - 16:19
    Thank you very much for listening.
  • 16:19 - 16:23
    (Applause)
Title:
How synthetic biology could wipe out humanity -- and how we can stop it
Speaker:
Rob Reid
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:36