How synthetic biology could wipe out humanity -- and how we can stop it

  • 0:01 - 0:05
    So there's about
    seven and a half billion of us.
  • 0:05 - 0:09
    The World Health Organization tells us
    that 300 million of us are depressed,
  • 0:09 - 0:12
    and about 800,000 people
    take their lives every year.
  • 0:12 - 0:17
    A tiny subset of them choose
    a profoundly nihilistic route,
  • 0:17 - 0:21
    which is they die in the act
    of killing as many people as possible.
  • 0:21 - 0:24
    These are some famous recent examples.
  • 0:24 - 0:27
    And here's a less famous one.
    It happened about nine weeks ago.
  • 0:27 - 0:28
    If you don't remember it,
  • 0:28 - 0:31
    it's because there's
    a lot of this going on.
  • 0:31 - 0:35
    Wikipedia just last year
    counted 323 mass shootings
  • 0:35 - 0:38
    in my home country, the United States.
  • 0:38 - 0:41
    Not all of those shooters were suicidal,
  • 0:41 - 0:43
    not all of them were maximizing
    their death tolls,
  • 0:43 - 0:45
    but many, many were.
  • 0:45 - 0:49
    An important question becomes,
    what limits do these people have?
  • 0:49 - 0:50
    Take the Vegas shooter.
  • 0:50 - 0:54
    He slaughtered 58 people.
  • 0:54 - 0:56
    Did he stop there because he'd had enough?
  • 0:56 - 1:02
    No, and we know this because
    he shot and injured another 422 people
  • 1:02 - 1:04
    who he surely would have
    preferred to kill.
  • 1:04 - 1:07
    We have no reason to think
    he would have stopped at 4,200.
  • 1:07 - 1:11
    In fact, with somebody this nihilistic,
    he may well have gladly killed us all.
  • 1:11 - 1:13
    We don't know.
  • 1:13 - 1:15
    What we do know is this:
  • 1:15 - 1:19
    when suicidal murderers really go all in,
  • 1:19 - 1:22
    technology is the force multiplier.
  • 1:22 - 1:23
    Here's an example.
  • 1:23 - 1:28
    Several years back, there was a rash
    of 10 mass school attacks in China
  • 1:29 - 1:32
    carried out with things
    like knives and hammers and cleavers
  • 1:32 - 1:34
    because guns are really hard to get there.
  • 1:34 - 1:37
    By macabre coincidence, this last attack
  • 1:37 - 1:42
    occurred just hours before
    the massacre in Newtown, Connecticut.
  • 1:42 - 1:47
    But that one American attack
    killed roughly the same number of victims
  • 1:47 - 1:50
    as the 10 Chinese attacks combined.
  • 1:50 - 1:55
    So we can fairly say,
    knife terrible, gun way worse,
  • 1:55 - 1:58
    and airplane massively worse,
  • 1:58 - 2:02
    as pilot Andreas Lubitz showed
    when he forced 149 people
  • 2:02 - 2:05
    to join him in his suicide
  • 2:05 - 2:07
    smashing a plane into the French Alps.
  • 2:07 - 2:10
    And there are other examples of this.
  • 2:10 - 2:15
    And I'm afraid there are far more deadly
    weapons in our near future than airplanes,
  • 2:15 - 2:17
    ones not made of metal.
  • 2:17 - 2:21
    So let's consider the apocalyptic dynamics
  • 2:21 - 2:22
    that will ensue
  • 2:22 - 2:26
    if suicidal mass murder hitches a ride
  • 2:26 - 2:27
    on a rapidly advancing field
  • 2:27 - 2:31
    that for the most part
    holds boundless promise for society.
  • 2:32 - 2:35
    Somewhere out there in the world,
    there's a tiny group of people
  • 2:35 - 2:37
    who would attempt, however ineptly,
  • 2:37 - 2:41
    to kill us all if they
    could just figure out how.
  • 2:41 - 2:44
    The Vegas shooter may or may not
    have been one of them,
  • 2:44 - 2:46
    but with seven and a half billion of us,
  • 2:46 - 2:48
    this is a non-zero population.
  • 2:48 - 2:50
    There are plenty of suicidal
    nihilists out there.
  • 2:50 - 2:52
    We've already seen that.
  • 2:52 - 2:55
    There are people with severe mood disorders
    that they can't even control.
  • 2:55 - 3:01
    There are people who have just suffered
    deranging traumas, etc. etc.
  • 3:01 - 3:03
    As for the corollary group,
  • 3:03 - 3:06
    its size was simply zero forever
  • 3:06 - 3:08
    until the Cold War,
  • 3:08 - 3:10
    when suddenly the leaders
    of two global alliances
  • 3:10 - 3:14
    attained the ability to blow up the world.
  • 3:14 - 3:18
    The number of people
    with actual doomsday buttons
  • 3:18 - 3:20
    has stayed fairly stable since then,
  • 3:20 - 3:22
    but I'm afraid it's about to grow,
  • 3:22 - 3:23
    and not just to three.
  • 3:23 - 3:26
    This is going off the charts.
  • 3:26 - 3:28
    I mean, it's going to look
    like a tech business plan.
  • 3:28 - 3:30
    (Laughter)
  • 3:32 - 3:34
    And the reason is,
  • 3:34 - 3:36
    we're in the era
    of exponential technologies,
  • 3:36 - 3:41
    which routinely
    take eternal impossibilities
  • 3:41 - 3:47
    and make them the actual superpowers
    of one or two living geniuses
  • 3:47 - 3:49
    and -- this is the big part --
  • 3:49 - 3:52
    then diffuse those powers
    to more or less everybody.
  • 3:52 - 3:54
    Now, here's a benign example.
  • 3:54 - 3:58
    If you wanted to play checkers
    with a computer in 1952,
  • 3:58 - 4:02
    you literally had to be that guy,
  • 4:02 - 4:07
    then commandeer one of the world's
    19 copies of that computer,
  • 4:07 - 4:11
    then use your Nobel-adjacent brain
    to teach it checkers.
  • 4:11 - 4:12
    That was the bar.
  • 4:12 - 4:18
    Today, you just need to know someone
    who knows someone who owns a telephone,
  • 4:18 - 4:21
    because computing
    is an exponential technology.
  • 4:21 - 4:24
    So is synthetic biology,
  • 4:24 - 4:26
    which I'll now refer to as synbio,
  • 4:26 - 4:32
    and in 2011, a couple of researchers
    did something every bit as ingenious
  • 4:32 - 4:35
    and unprecedented as the checkers trick
  • 4:35 - 4:37
    with H5N1 flu.
  • 4:37 - 4:42
    This is a strain that kills
    up to 60 percent of the people it infects,
  • 4:42 - 4:44
    more than Ebola,
  • 4:44 - 4:46
    but it is so uncontagious
  • 4:46 - 4:51
    that it's killed fewer
    than 50 people since 2015.
  • 4:51 - 4:55
    So these researchers edited H5N1's genome
  • 4:55 - 4:59
    and made it every bit as deadly,
    but also wildly contagious.
  • 5:00 - 5:03
    The news arm of one of
    the world's top two scientific journals
  • 5:03 - 5:07
    said if this thing got out,
    it would likely cause a pandemic
  • 5:07 - 5:09
    with perhaps millions of deaths,
  • 5:09 - 5:11
    and Dr. Paul Keim said
  • 5:11 - 5:15
    he could not think of an organism
    as scary as this,
  • 5:15 - 5:18
    which is the last thing
    I personally want to hear
  • 5:18 - 5:23
    from the Chairman of the National
    Science Advisory Board on Biosecurity.
  • 5:23 - 5:25
    And by the way, Dr. Keim also said this --
  • 5:25 - 5:27
    ["I don't think anthrax
    is scary at all compared to this."]
  • 5:27 - 5:29
    and he's also one of these.
  • 5:29 - 5:30
    [Anthrax expert]
    (Laughter)
  • 5:30 - 5:35
    Now, the good news about the 2011 biohack
  • 5:35 - 5:38
    is that the people who did it
    didn't mean us any harm.
  • 5:38 - 5:39
    They're virologists.
  • 5:39 - 5:41
    They believe they were advancing science.
  • 5:41 - 5:45
    The bad news is that technology
    does not freeze in place,
  • 5:45 - 5:47
    and over the next few decades,
  • 5:47 - 5:51
    their feat will become trivially easy.
  • 5:51 - 5:55
    In fact, it's already way easier,
    because as we learned yesterday morning,
  • 5:55 - 5:57
    just two years after they did their work,
  • 5:57 - 6:00
    the CRISPR system was harnessed
    for genome editing.
  • 6:00 - 6:05
    This was a radical breakthrough
    that makes gene editing massively easier,
  • 6:05 - 6:09
    so easy that CRISPR
    is now taught in high schools.
  • 6:09 - 6:12
    And this stuff is moving
    quicker than computing.
  • 6:12 - 6:15
    That slow, stodgy white line up there?
  • 6:15 - 6:18
    That's Moore's Law.
  • 6:18 - 6:22
    That shows us how quickly
    computing is getting cheaper.
  • 6:22 - 6:24
    That steep, crazy fun green line,
  • 6:24 - 6:29
    that shows us how quickly
    genetic sequencing is getting cheaper.
  • 6:29 - 6:31
    Now, gene editing
    and synthesis and sequencing,
  • 6:31 - 6:35
    they're different disciplines,
    but they're tightly related,
  • 6:35 - 6:37
    and they're all moving
    at these headlong rates,
  • 6:37 - 6:41
    and the keys to the kingdom
    are these tiny, tiny data files.
  • 6:41 - 6:45
    That is an excerpt of H5N1's genome.
  • 6:45 - 6:48
    The whole thing can fit
    on just a few pages.
  • 6:48 - 6:52
    And yeah, don't worry, you can Google this
    as soon as you get home.
  • 6:52 - 6:53
    It's all over the internet, right?
  • 6:53 - 6:55
    And the part that made it contagious
  • 6:55 - 6:58
    could well fit on a single post-it note,
  • 6:58 - 7:00
    and once a genius
  • 7:00 - 7:01
    makes a data file,
  • 7:01 - 7:04
    any idiot can copy it,
  • 7:04 - 7:06
    distribute it worldwide,
  • 7:06 - 7:07
    or print it.
  • 7:07 - 7:10
    And I don't mean print it on this,
  • 7:10 - 7:13
    but soon enough on this.
  • 7:13 - 7:16
    So let's imagine a scenario.
  • 7:16 - 7:19
    Let's say it's 2026,
    to pick an arbitrary year,
  • 7:19 - 7:22
    and a brilliant virologist,
    hoping to advance science
  • 7:22 - 7:24
    and better understand pandemics,
  • 7:24 - 7:26
    designs a new bug.
  • 7:26 - 7:28
    It's as contagious as chicken pox,
  • 7:28 - 7:30
    it's as deadly as Ebola,
  • 7:30 - 7:34
    and it incubates for months and months
    before causing an outbreak,
  • 7:34 - 7:37
    so the whole world can be infected
    before the first sign of trouble.
  • 7:37 - 7:39
    Then, her university gets hacked,
    and of course this is not science fiction.
  • 7:39 - 7:46
    In fact, just one recent US indictment
  • 7:46 - 7:50
    documents the hacking
    of over 300 universities.
  • 7:50 - 7:53
    So that file with the bug's genome on it
  • 7:53 - 7:56
    spreads to the internet's dark corners,
  • 7:56 - 7:58
    and once a file is out there,
    it never comes back.
  • 7:58 - 8:04
    Just ask anybody who runs
    a movie studio or a music label.
  • 8:04 - 8:07
    So now maybe in 2026,
  • 8:07 - 8:10
    it would take a true genius
    like our virologist
  • 8:10 - 8:12
    to make the actual living critter,
  • 8:12 - 8:14
    but 15 years later,
  • 8:14 - 8:18
    it may just take a DNA printer
    you can find in any high school.
  • 8:18 - 8:22
    And if not? Give it a couple of decades.
  • 8:22 - 8:25
    So, a quick aside.
  • 8:25 - 8:26
    Remember this slide here?
  • 8:26 - 8:29
    Turn your attention to these two words.
  • 8:29 - 8:31
    If somebody tries this,
  • 8:31 - 8:37
    and is only 0.1 percent effective,
    eight million people die.
  • 8:37 - 8:41
    That's 2,500 9/11s.
  • 8:41 - 8:43
    Civilization would survive,
    but it would be permanently disfigured.
  • 8:43 - 8:47
    So this means we need
    to be concerned about anybody
  • 8:47 - 8:54
    who has the faintest shot on goal,
    not just geniuses.
  • 8:54 - 9:00
    So today, there's
    a tiny handful of geniuses
  • 9:00 - 9:02
    who probably could make a doomsday bug
  • 9:02 - 9:05
    that's 0.1 percent effective
    and maybe even a little bit more.
  • 9:05 - 9:10
    They tend to be stable and successful
    and so not part of this group.
  • 9:10 - 9:14
    So I guess I'm sorta kinda
    barely OK-ish with that.
  • 9:14 - 9:18
    But what about after technology improves
  • 9:18 - 9:19
    and diffuses
  • 9:19 - 9:22
    and thousands of life science
    grad students are enabled?
  • 9:22 - 9:26
    Is every single one of them
    going to be perfectly stable?
  • 9:26 - 9:28
    Or how about a few years after that,
  • 9:28 - 9:32
    when every stress-ridden
    pre-med is fully enabled?
  • 9:32 - 9:34
    At some point in that timeframe,
  • 9:34 - 9:36
    these circles are going to intersect,
  • 9:36 - 9:39
    because we are now starting to talk about
    hundreds of thousands of people
  • 9:39 - 9:41
    throughout the world,
  • 9:41 - 9:43
    and they recently included that guy
  • 9:43 - 9:45
    who dressed up like the Joker
  • 9:45 - 9:48
    and shot 12 people to death
    at a Batman premiere.
  • 9:48 - 9:50
    That was a neuroscience PhD student
  • 9:50 - 9:53
    with an NIH grant.
  • 9:53 - 9:55
    OK, plot twist.
  • 9:55 - 9:57
    I think we can actually survive this one
  • 9:57 - 10:00
    if we start focusing on it now,
  • 10:00 - 10:03
    and I say this having spent
    countless hours
  • 10:03 - 10:06
    interviewing global leaders in synbio
  • 10:06 - 10:10
    and also researching their work
    for a science podcast I created.
  • 10:10 - 10:16
    I have come to fear their work, in case
    I haven't gotten that out there yet
  • 10:16 - 10:17
    (Laughter)
  • 10:17 - 10:20
    but more than that,
    to revere its potential.
  • 10:20 - 10:23
    This stuff will cure cancer,
    heal our environment,
  • 10:23 - 10:28
    and stop our cruel treatment
    of other creatures.
  • 10:28 - 10:33
    So how do we get all this without,
    you know, annihilating ourselves?
  • 10:33 - 10:36
    First thing, like it or not,
    synbio is here,
  • 10:36 - 10:38
    so let's embrace the technology.
  • 10:38 - 10:40
    If we do a tech ban,
  • 10:40 - 10:43
    that would only hand
    the wheel to bad actors.
  • 10:43 - 10:46
    Unlike nuclear programs,
  • 10:46 - 10:49
    biology can be practiced invisibly.
  • 10:49 - 10:51
    Massive Soviet cheating
    on bioweapons treaties
  • 10:51 - 10:56
    made that very clear, as does every
    illegal drug lab in the world.
  • 10:56 - 10:58
    Secondly, enlist the experts.
  • 10:58 - 11:00
    Let's sign them up and make more of them.
  • 11:00 - 11:02
    For every million and one bioengineers,
  • 11:02 - 11:07
    at least a million of them
    are going to be on our side.
  • 11:07 - 11:10
    I mean, Al Capone
    would be on our side in this one.
  • 11:10 - 11:13
    The bar to being a good guy
    is just so low.
  • 11:13 - 11:17
    And massive numerical
    advantages do matter,
  • 11:17 - 11:20
    even when a single bad guy
    can inflict grievous harm,
  • 11:20 - 11:24
    because among many other things, they
    allow us to exploit the hell out of this.
  • 11:24 - 11:27
    We have years and hopefully decades
  • 11:27 - 11:30
    to prepare and prevent.
  • 11:30 - 11:33
    The first person to try something awful,
    and there will be somebody,
  • 11:33 - 11:35
    may not even be born yet.
  • 11:35 - 11:39
    Next, this needs to be an effort
    that spans society,
  • 11:39 - 11:42
    and all of you need to be a part of it,
  • 11:42 - 11:46
    because we cannot ask
    a tiny group of experts
  • 11:46 - 11:51
    to be responsible for both containing
    and exploiting synthetic biology,
  • 11:51 - 11:54
    because we already tried that
    with the financial system,
  • 11:54 - 11:57
    and our stewards became
    massively corrupted
  • 11:57 - 12:00
    as they figured out
    how they could cut corners,
  • 12:00 - 12:03
    inflict massive, massive risks
    on the rest of us
  • 12:03 - 12:05
    and privatize the gains,
  • 12:05 - 12:07
    becoming repulsively wealthy
  • 12:07 - 12:10
    while they stuck us
    with the $22 trillion bill.
  • 12:11 - 12:13
    (Applause)
  • 12:13 - 12:15
    And more recently --
  • 12:15 - 12:16
    Are you the ones who have gotten
    the thank you letters?
  • 12:16 - 12:18
    I'm still waiting for mine.
  • 12:18 - 12:21
    I just figured they were
    too busy to be grateful.
  • 12:21 - 12:25
    And much more recently,
  • 12:25 - 12:28
    online privacy started looming
    as a huge issue,
  • 12:28 - 12:30
    and we basically outsourced it,
  • 12:30 - 12:31
    and once again,
  • 12:31 - 12:34
    privatized gains, socialized losses.
  • 12:34 - 12:36
    Is anybody else sick of this pattern?
  • 12:36 - 12:40
    (Applause)
  • 12:40 - 12:44
    So we need a more inclusive way
    to safeguard our prosperity,
  • 12:44 - 12:50
    our privacy, and soon our lives.
  • 12:50 - 12:54
    So how do we do all of this?
  • 12:54 - 12:55
    Well, when bodies fight pathogens,
  • 12:55 - 12:58
    they use ingenious immune systems
  • 12:58 - 13:00
    which are very complex and multi-layered.
  • 13:00 - 13:02
    Why don't we build one of these
    for the whole damn ecosystem?
  • 13:02 - 13:06
    There's a year of TED Talks that could
    be given on this first critical layer.
  • 13:06 - 13:09
    So these are just a couple
    of many great ideas that are out there.
  • 13:09 - 13:12
    Some R&D muscle
  • 13:12 - 13:17
    could take the very primitive
    pathogen sensors that we currently have
  • 13:17 - 13:19
    and put them on a very steep
    price performance curve
  • 13:19 - 13:22
    that would quickly become ingenious
  • 13:22 - 13:23
    and networked
  • 13:23 - 13:27
    and gradually as widespread
    as smoke detectors and even smartphones.
  • 13:27 - 13:29
    On a very related note,
  • 13:29 - 13:32
    vaccines have all kinds of problems
  • 13:32 - 13:35
    when it comes to manufacturing
    and distribution,
  • 13:35 - 13:39
    and once they're made, they can't adapt
    to new threats or mutations.
  • 13:39 - 13:42
    We need an agile bio-manufacturing base
  • 13:42 - 13:46
    extending into every single pharmacy
    and maybe even our homes.
  • 13:46 - 13:50
    Printer technology for vaccines
    and medicines is within reach
  • 13:50 - 13:52
    if we prioritize it.
  • 13:52 - 13:54
    Next, mental health.
  • 13:54 - 13:57
    Many people who commit
    suicidal mass murder
  • 13:57 - 14:01
    suffer from crippling,
    treatment-resistant depression or PTSD.
  • 14:01 - 14:05
    We need noble researchers
    like Rick Doblin working on this,
  • 14:05 - 14:09
    but we also need the selfish jerks
    who are way more numerous
  • 14:09 - 14:11
    to appreciate the fact
  • 14:11 - 14:14
    that acute suffering
  • 14:14 - 14:17
    will soon endanger all of us,
    not just those afflicted.
  • 14:17 - 14:21
    Those jerks will then
    join us and Al Capone
  • 14:21 - 14:24
    in fighting this condition.
  • 14:24 - 14:29
    Third, each and every one of us
    can be and should be a white blood cell
  • 14:29 - 14:32
    in this immune system.
  • 14:32 - 14:35
    Suicidal mass murderers
    can be despicable, yes,
  • 14:35 - 14:39
    but they're also terribly
    broken and sad people,
  • 14:39 - 14:42
    and those of us who aren't
    need to do what we can
  • 14:42 - 14:44
    to make sure nobody goes unloved.
  • 14:44 - 14:48
    (Applause)
  • 14:50 - 14:54
    Next, we need to make
    fighting these dangers
  • 14:54 - 14:56
    core to the discipline
    of synthetic biology.
  • 14:56 - 14:58
    There are companies out there
    that at least claim
  • 14:58 - 15:00
    they let their engineers
    spend 20 percent of their time
  • 15:00 - 15:02
    doing whatever they want.
  • 15:02 - 15:05
    What if those who hire bioengineers
  • 15:05 - 15:06
    or become them
  • 15:06 - 15:10
    give 20 percent of their time
    to building defenses for the common good?
  • 15:11 - 15:13
    Not a bad idea, right?
  • 15:13 - 15:14
    (Applause)
  • 15:14 - 15:16
    Then finally, this won't be any fun,
    but we need to let our minds
  • 15:16 - 15:21
    go to some very, very dark places,
  • 15:21 - 15:24
    and thank you for letting me
    take you there this evening.
  • 15:24 - 15:27
    We survived the Cold War
  • 15:27 - 15:31
    because every one of us understood
    and respected the danger
  • 15:31 - 15:33
    in part because we had spent decades
  • 15:33 - 15:36
    telling ourselves terrifying ghost stories
  • 15:36 - 15:38
    with names like "Dr. Strangelove"
  • 15:38 - 15:40
    and "War Games."
  • 15:40 - 15:42
    This is no time to remain calm.
  • 15:42 - 15:45
    This is one of those rare times
    when it's incredibly productive
  • 15:45 - 15:47
    to freak the hell out,
  • 15:47 - 15:50
    (Laughter)
  • 15:50 - 15:53
    to come up with some ghost stories
  • 15:53 - 15:58
    and use our fear as fuel
    to fight this danger.
  • 15:58 - 16:02
    Because, all these
    terrible scenarios I've painted,
  • 16:02 - 16:04
    they are not destiny.
  • 16:04 - 16:06
    They're optional.
  • 16:06 - 16:10
    The danger is still kind of distant,
  • 16:10 - 16:13
    and that means it will only befall us
  • 16:13 - 16:16
    if we allow it to.
  • 16:16 - 16:17
    Let's not.
  • 16:17 - 16:20
    Thank you very much for listening.
  • 16:20 - 16:23
    (Applause)
Speaker:
Rob Reid
Video Language:
English
Duration:
16:36