How we can eliminate child sexual abuse material from the internet

  • 0:01 - 0:03
    [This talk contains mature content]
  • 0:06 - 0:07
    Five years ago,
  • 0:07 - 0:11
    I received a phone call
    that would change my life.
  • 0:12 - 0:14
    I remember so vividly that day.
  • 0:15 - 0:17
    It was about this time of year,
  • 0:17 - 0:19
    and I was sitting in my office.
  • 0:20 - 0:23
    I remember the sun
    streaming through the window.
  • 0:24 - 0:25
    And my phone rang.
  • 0:26 - 0:27
    And I picked it up,
  • 0:28 - 0:32
    and it was two federal agents,
    asking for my help
  • 0:32 - 0:34
    in identifying a little girl
  • 0:34 - 0:40
    featured in hundreds of child
    sexual abuse images they had found online.
  • 0:41 - 0:44
    They had just started working the case,
  • 0:44 - 0:47
    but what they knew
  • 0:47 - 0:51
    was that her abuse had been broadcast
    to the world for years
  • 0:51 - 0:57
    on dark web sites dedicated
    to the sexual abuse of children.
  • 0:58 - 1:02
    And her abuser was incredibly
    technologically sophisticated:
  • 1:02 - 1:07
    new images and new videos every few weeks,
  • 1:07 - 1:11
    but very few clues as to who she was
  • 1:11 - 1:12
    or where she was.
  • 1:13 - 1:15
    And so they called us,
  • 1:15 - 1:17
    because they had heard
    we were a new nonprofit
  • 1:17 - 1:21
    building technology
    to fight child sexual abuse.
  • 1:22 - 1:24
    But we were only two years old,
  • 1:24 - 1:27
    and we had only worked
    on child sex trafficking.
  • 1:28 - 1:30
    And I had to tell them
  • 1:30 - 1:31
    we had nothing.
  • 1:32 - 1:36
    We had nothing that could
    help them stop this abuse.
  • 1:37 - 1:41
    It took those agents another year
  • 1:41 - 1:44
    to ultimately find that child.
  • 1:45 - 1:47
    And by the time she was rescued,
  • 1:47 - 1:54
    hundreds of images and videos
    documenting her rape had gone viral,
  • 1:54 - 1:55
    from the dark web
  • 1:55 - 1:58
    to peer-to-peer networks,
    private chat rooms
  • 1:58 - 2:02
    and to the websites you and I use
  • 2:02 - 2:04
    every single day.
  • 2:05 - 2:09
    And today, as she struggles to recover,
  • 2:09 - 2:13
    she lives with the fact
    that thousands around the world
  • 2:13 - 2:16
    continue to watch her abuse.
  • 2:18 - 2:20
    I have come to learn
    in the last five years
  • 2:20 - 2:23
    that this case is far from unique.
  • 2:24 - 2:28
    How did we get here as a society?
  • 2:29 - 2:33
    In the late 1980s, child pornography --
  • 2:33 - 2:39
    or what it actually is,
    child sexual abuse material --
  • 2:39 - 2:40
    was nearly eliminated.
  • 2:41 - 2:46
    New laws and increased prosecutions
    made it simply too risky
  • 2:46 - 2:47
    to trade it through the mail.
  • 2:48 - 2:52
    And then came the internet,
    and the market exploded.
  • 2:53 - 2:57
    The amount of content in circulation today
  • 2:57 - 3:00
    is massive and growing.
  • 3:00 - 3:04
    This is a truly global problem,
  • 3:04 - 3:06
    but if we just look at the US:
  • 3:06 - 3:08
    in the US alone last year,
  • 3:08 - 3:14
    more than 45 million images and videos
    of child sexual abuse material
  • 3:14 - 3:17
    were reported to the National Center
    for Missing and Exploited Children,
  • 3:17 - 3:22
    and that is nearly double
    the amount the year prior.
  • 3:23 - 3:28
    And the details behind these numbers
    are hard to contemplate,
  • 3:28 - 3:34
    with more than 60 percent of the images
    featuring children younger than 12,
  • 3:34 - 3:38
    and most of them including
    extreme acts of sexual violence.
  • 3:39 - 3:44
    Abusers are cheered on in chat rooms
    dedicated to the abuse of children,
  • 3:44 - 3:47
    where they gain rank and notoriety
  • 3:47 - 3:50
    with more abuse and more victims.
  • 3:50 - 3:53
    In this market,
  • 3:53 - 3:57
    the currency has become
    the content itself.
  • 3:58 - 4:02
    It's clear that abusers have been quick
    to leverage new technologies,
  • 4:02 - 4:05
    but our response as a society has not.
  • 4:06 - 4:10
    These abusers don't read
    user agreements of websites,
  • 4:10 - 4:14
    and the content doesn't honor
    geographic boundaries.
  • 4:15 - 4:21
    And they win when we look
    at one piece of the puzzle at a time,
  • 4:21 - 4:25
    which is exactly how
    our response today is designed.
  • 4:25 - 4:28
    Law enforcement works in one jurisdiction.
  • 4:28 - 4:32
    Companies look at just their platform.
  • 4:32 - 4:34
    And whatever data they learn along the way
  • 4:35 - 4:37
    is rarely shared.
  • 4:37 - 4:43
    It is so clear that this
    disconnected approach is not working.
  • 4:44 - 4:48
    We have to redesign
    our response to this epidemic
  • 4:48 - 4:49
    for the digital age.
  • 4:50 - 4:53
    And that's exactly
    what we're doing at Thorn.
  • 4:53 - 4:57
    We're building the technology
    to connect these dots,
  • 4:57 - 4:59
    to arm everyone on the front lines --
  • 4:59 - 5:02
    law enforcement, NGOs and companies --
  • 5:02 - 5:06
    with the tools they need
    to ultimately eliminate
  • 5:06 - 5:08
    child sexual abuse material
    from the internet.
  • 5:10 - 5:11
    Let's talk for a minute --
  • 5:11 - 5:12
    (Applause)
  • 5:12 - 5:14
    Thank you.
  • 5:14 - 5:16
    (Applause)
  • 5:18 - 5:20
    Let's talk for a minute
    about what those dots are.
  • 5:21 - 5:25
    As you can imagine,
    this content is horrific.
  • 5:25 - 5:28
    If you don't have to look at it,
    you don't want to look at it.
  • 5:28 - 5:33
    And so, most companies
    or law enforcement agencies
  • 5:33 - 5:35
    that have this content
  • 5:35 - 5:39
    can translate every file
    into a unique string of numbers.
  • 5:39 - 5:40
    This is called a "hash."
  • 5:40 - 5:42
    It's essentially a fingerprint
  • 5:42 - 5:45
    for each file or each video.
  • 5:45 - 5:49
    And what this allows them to do
    is use the information in investigations
  • 5:49 - 5:52
    or for a company to remove
    the content from their platform,
  • 5:52 - 5:58
    without having to look again
    at every image and every video each time.
    [A minimal hashing sketch appears after the transcript.]
  • 5:58 - 6:00
    The problem today, though,
  • 6:00 - 6:04
    is that there are hundreds
    of millions of these hashes
  • 6:04 - 6:08
    sitting in siloed databases
    all around the world.
  • 6:08 - 6:09
    In a silo,
  • 6:09 - 6:12
    it might work for the one agency
    that has control over it,
  • 6:12 - 6:17
    but not connecting this data means
    we don't know how many are unique.
  • 6:17 - 6:20
    We don't know which ones represent
    children who have already been rescued
  • 6:20 - 6:23
    or need to be identified still.
  • 6:23 - 6:27
    So our first, most basic premise
    is that all of this data
  • 6:27 - 6:30
    must be connected.
  • 6:30 - 6:36
    There are two ways in which this data,
    combined with software on a global scale,
  • 6:37 - 6:40
    can have transformative
    impact in this space.
  • 6:40 - 6:43
    The first is with law enforcement:
  • 6:43 - 6:47
    helping them identify new victims faster,
  • 6:47 - 6:48
    stopping abuse
  • 6:48 - 6:51
    and stopping those producing this content.
  • 6:51 - 6:54
    The second is with companies:
  • 6:54 - 6:58
    using it as clues to identify
    the hundreds of millions of files
  • 6:58 - 6:59
    in circulation today,
  • 6:59 - 7:01
    pulling it down
  • 7:01 - 7:07
    and then stopping the upload
    of new material before it ever goes viral.
  • 7:10 - 7:11
    Four years ago,
  • 7:11 - 7:13
    when that case ended,
  • 7:14 - 7:18
    our team sat there,
    and we just felt this, um ...
  • 7:20 - 7:23
    ... deep sense of failure,
    is the way I can put it,
  • 7:23 - 7:27
    because we watched that whole year
  • 7:27 - 7:28
    while they looked for her.
  • 7:28 - 7:32
    And we saw every place
    in the investigation
  • 7:32 - 7:34
    where, if the technology
    would have existed,
  • 7:34 - 7:37
    they would have found her faster.
  • 7:38 - 7:40
    And so we walked away from that
  • 7:40 - 7:43
    and we went and we did
    the only thing we knew how to do:
  • 7:43 - 7:45
    we began to build software.
  • 7:46 - 7:48
    So we've started with law enforcement.
  • 7:48 - 7:52
    Our dream was an alarm bell on the desks
    of officers all around the world
  • 7:52 - 7:57
    so that if anyone dared to post
    a new victim online,
  • 7:57 - 8:00
    someone would start
    looking for them immediately.
  • 8:01 - 8:04
    I obviously can't talk about
    the details of that software,
  • 8:04 - 8:07
    but today it's at work in 38 countries,
  • 8:07 - 8:10
    having reduced the time it takes
    to get to a child
  • 8:10 - 8:12
    by more than 65 percent.
  • 8:12 - 8:17
    (Applause)
  • 8:21 - 8:24
    And now we're embarking
    on that second horizon:
  • 8:24 - 8:30
    building the software to help companies
    identify and remove this content.
  • 8:31 - 8:34
    Let's talk for a minute
    about these companies.
  • 8:34 - 8:40
    So, I told you -- 45 million images
    and videos in the US alone last year.
  • 8:40 - 8:44
    Those come from just 12 companies.
  • 8:46 - 8:52
    Twelve companies, 45 million files
    of child sexual abuse material.
  • 8:52 - 8:55
    These come from those companies
    that have the money
  • 8:55 - 9:00
    to build the infrastructure that it takes
    to pull this content down.
  • 9:00 - 9:02
    But there are hundreds of other companies,
  • 9:02 - 9:05
    small- to medium-size companies
    around the world,
  • 9:05 - 9:07
    that need to do this work,
  • 9:07 - 9:12
    but they either: 1) can't imagine that
    their platform would be used for abuse,
  • 9:12 - 9:18
    or 2) don't have the money to spend
    on something that is not driving revenue.
  • 9:19 - 9:22
    So we went ahead and built it for them,
  • 9:22 - 9:27
    and this system now gets smarter
    as more companies participate.
  • 9:28 - 9:30
    Let me give you an example.
  • 9:30 - 9:34
    Our first partner, Imgur --
    if you haven't heard of this company,
  • 9:34 - 9:38
    it's one of the most visited
    websites in the US --
  • 9:38 - 9:43
    millions of pieces of user-generated
    content uploaded every single day,
  • 9:43 - 9:45
    on a mission to make the internet
    a more fun place.
  • 9:46 - 9:48
    They partnered with us first.
  • 9:48 - 9:51
    Within 20 minutes
    of going live on our system,
  • 9:51 - 9:55
    someone tried to upload
    a known piece of abuse material.
  • 9:55 - 9:57
    They were able to stop it,
    pull it down,
  • 9:57 - 10:00
    and report it to the National Center
    for Missing and Exploited Children.
  • 10:00 - 10:02
    But they went a step further,
  • 10:02 - 10:07
    and they went and inspected the account
    of the person who had uploaded it.
  • 10:07 - 10:12
    Hundreds more pieces
    of child sexual abuse material
  • 10:12 - 10:14
    that we had never seen.
  • 10:14 - 10:18
    And this is where we start
    to see exponential impact.
  • 10:18 - 10:19
    We pull that material down,
  • 10:20 - 10:23
    it gets reported to the National Center
    for Missing and Exploited Children
  • 10:23 - 10:26
    and then those hashes
    go back into the system
  • 10:26 - 10:28
    and benefit every other company on it.
  • 10:28 - 10:33
    And when the millions of hashes we have
    lead to millions more and, in real time,
  • 10:33 - 10:37
    companies around the world are identifying
    and pulling this content down,
  • 10:37 - 10:42
    we will have dramatically increased
    the speed at which we are removing
  • 10:42 - 10:46
    child sexual abuse material
    from the internet around the world.
  • 10:46 - 10:52
    (Applause)
  • 10:54 - 10:57
    But this is why it can't just be
    about software and data,
  • 10:57 - 10:59
    it has to be about scale.
  • 10:59 - 11:03
    We have to activate thousands of officers,
  • 11:03 - 11:05
    hundreds of companies around the world
  • 11:05 - 11:09
    if technology is to allow us
    to outrun the perpetrators
  • 11:09 - 11:13
    and dismantle the communities
    that are normalizing child sexual abuse
  • 11:13 - 11:15
    around the world today.
  • 11:15 - 11:18
    And the time to do this is now.
  • 11:18 - 11:24
    We can no longer say we don't know
    the impact this is having on our children.
  • 11:25 - 11:29
    The first generation of children
    whose abuse has gone viral
  • 11:29 - 11:31
    are now young adults.
  • 11:31 - 11:34
    The Canadian Centre for Child Protection
  • 11:34 - 11:37
    just did a recent study
    of these young adults
  • 11:37 - 11:41
    to understand the unique trauma
    they try to recover from,
  • 11:41 - 11:44
    knowing that their abuse lives on.
  • 11:45 - 11:50
    Eighty percent of these young adults
    have thought about suicide.
  • 11:51 - 11:55
    More than 60 percent
    have attempted suicide.
  • 11:56 - 12:01
    And most of them live
    with the fear every single day
  • 12:01 - 12:05
    that as they walk down the street
    or they interview for a job
  • 12:05 - 12:08
    or they go to school
  • 12:08 - 12:10
    or they meet someone online,
  • 12:10 - 12:14
    that that person has seen their abuse.
  • 12:15 - 12:19
    And that fear came true
    for more than 30 percent of them.
  • 12:20 - 12:25
    They had been recognized
    from their abuse material online.
  • 12:26 - 12:29
    This is not going to be easy,
  • 12:29 - 12:32
    but it is not impossible.
  • 12:32 - 12:35
    Now it's going to take the will,
  • 12:35 - 12:36
    the will of our society
  • 12:37 - 12:40
    to look at something
    that is really hard to look at,
  • 12:40 - 12:42
    to take something out of the darkness
  • 12:42 - 12:45
    so these kids have a voice;
  • 12:46 - 12:51
    the will of companies to take action
    and make sure that their platforms
  • 12:51 - 12:54
    are not complicit in the abuse of a child;
  • 12:55 - 12:59
    the will of governments to invest
    in their law enforcement
  • 12:59 - 13:04
    and the tools they need to investigate
    a digital-first crime,
  • 13:04 - 13:08
    even when the victims
    cannot speak for themselves.
  • 13:10 - 13:13
    This audacious commitment
    is part of that will.
  • 13:14 - 13:20
    It's a declaration of war
    against one of humanity's darkest evils.
  • 13:20 - 13:22
    But what I hang on to
  • 13:22 - 13:26
    is that it's actually
    an investment in a future
  • 13:26 - 13:29
    where every child can simply be a kid.
  • 13:29 - 13:31
    Thank you.
  • 13:31 - 13:37
    (Applause)
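
A note on the hashing step described around the 5:35 mark: the talk explains it only at a high level. Below is a minimal illustrative sketch, in Python, of how a platform might fingerprint an uploaded file and check it against a shared list of known hashes. This is not Thorn's actual system; the KNOWN_HASHES set and the fingerprint and screen_upload functions are hypothetical names, and a plain cryptographic hash (SHA-256) stands in for the fingerprinting the talk describes.

    import hashlib
    from pathlib import Path

    # Hypothetical shared set of hashes of known abuse material. In a real
    # deployment these would come from a vetted clearinghouse (for example,
    # the National Center for Missing and Exploited Children), not be
    # hard-coded by each platform.
    KNOWN_HASHES: set[str] = set()

    def fingerprint(path: Path) -> str:
        """Compute a SHA-256 hex digest for a file -- the 'unique string of
        numbers' described in the talk, matching exact copies only."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def screen_upload(path: Path) -> bool:
        """Return True if an uploaded file matches known material and should
        be blocked, reported, and its hash shared back into the pool."""
        return fingerprint(path) in KNOWN_HASHES

A cryptographic hash like SHA-256 only catches exact copies; production systems rely on perceptual hashes (such as PhotoDNA) so that re-encoded or slightly altered copies still match, which is one reason pooling hashes across companies matters so much.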
Title:
How we can eliminate child sexual abuse material from the internet
Speaker:
Julie Cordua
Description:

Social entrepreneur Julie Cordua works on a problem that isn't easy to talk about: the sexual abuse of children in images and videos on the internet. At Thorn, she's building technology to connect the dots between the tech industry, law enforcement and government -- so we can swiftly end the viral distribution of abuse material and rescue children faster. Learn more about how this scalable solution could help dismantle the communities normalizing child sexual abuse around the world today. (This ambitious plan is part of the Audacious Project, TED's initiative to inspire and fund global change.)

Video Language:
English
Duration:
13:50
