How we can eliminate child sexual abuse material from the internet
-
0:01 - 0:03[This talk contains mature content]
-
0:06 - 0:07Five years ago,
-
0:07 - 0:11I received a phone call
that would change my life. -
0:12 - 0:14I remember so vividly that day.
-
0:15 - 0:17It was about this time of year,
-
0:17 - 0:19and I was sitting in my office.
-
0:20 - 0:23I remember the sun
streaming through the window. -
0:24 - 0:25And my phone rang.
-
0:26 - 0:27And I picked it up,
-
0:28 - 0:32and it was two federal agents,
asking for my help -
0:32 - 0:34in identifying a little girl
-
0:34 - 0:40featured in hundreds of child
sexual abuse images they had found online. -
0:41 - 0:44They had just started working the case,
-
0:44 - 0:47but what they knew
-
0:47 - 0:51was that her abuse had been broadcast
to the world for years -
0:51 - 0:57on dark web sites dedicated
to the sexual abuse of children. -
0:58 - 1:02And her abuser was incredibly
technologically sophisticated: -
1:02 - 1:07new images and new videos every few weeks,
-
1:07 - 1:11but very few clues as to who she was
-
1:11 - 1:12or where she was.
-
1:13 - 1:15And so they called us,
-
1:15 - 1:17because they had heard
we were a new nonprofit -
1:17 - 1:21building technology
to fight child sexual abuse. -
1:22 - 1:24But we were only two years old,
-
1:24 - 1:27and we had only worked
on child sex trafficking. -
1:28 - 1:30And I had to tell them
-
1:30 - 1:31we had nothing.
-
1:32 - 1:36We had nothing that could
help them stop this abuse. -
1:37 - 1:41It took those agents another year
-
1:41 - 1:44to ultimately find that child.
-
1:45 - 1:47And by the time she was rescued,
-
1:47 - 1:54hundreds of images and videos
documenting her rape had gone viral, -
1:54 - 1:55from the dark web
-
1:55 - 1:58to peer-to-peer networks,
private chat rooms -
1:58 - 2:02and to the websites you and I use
-
2:02 - 2:04every single day.
-
2:05 - 2:09And today, as she struggles to recover,
-
2:09 - 2:13she lives with the fact
that thousands around the world -
2:13 - 2:16continue to watch her abuse.
-
2:18 - 2:20I have come to learn
in the last five years -
2:20 - 2:23that this case is far from unique.
-
2:24 - 2:28How did we get here as a society?
-
2:29 - 2:33In the late 1980s, child pornography --
-
2:33 - 2:39or what it actually is,
child sexual abuse material -- -
2:39 - 2:40was nearly eliminated.
-
2:41 - 2:46New laws and increased prosecutions
made it simply too risky -
2:46 - 2:47to trade it through the mail.
-
2:48 - 2:52And then came the internet,
and the market exploded. -
2:53 - 2:57The amount of content in circulation today
-
2:57 - 3:00is massive and growing.
-
3:00 - 3:04This is a truly global problem,
-
3:04 - 3:06but if we just look at the US:
-
3:06 - 3:08in the US alone last year,
-
3:08 - 3:14more than 45 million images and videos
of child sexual abuse material -
3:14 - 3:17were reported to the National Center
for Missing and Exploited Children, -
3:17 - 3:22and that is nearly double
the amount the year prior. -
3:23 - 3:28And the details behind these numbers
are hard to contemplate, -
3:28 - 3:34with more than 60 percent of the images
featuring children younger than 12, -
3:34 - 3:38and most of them including
extreme acts of sexual violence. -
3:39 - 3:44Abusers are cheered on in chat rooms
dedicated to the abuse of children, -
3:44 - 3:47where they gain rank and notoriety
-
3:47 - 3:50with more abuse and more victims.
-
3:50 - 3:53In this market,
-
3:53 - 3:57the currency has become
the content itself. -
3:58 - 4:02It's clear that abusers have been quick
to leverage new technologies, -
4:02 - 4:05but our response as a society has not.
-
4:06 - 4:10These abusers don't read
user agreements of websites, -
4:10 - 4:14and the content doesn't honor
geographic boundaries. -
4:15 - 4:21And they win when we look
at one piece of the puzzle at a time, -
4:21 - 4:25which is exactly how
our response today is designed. -
4:25 - 4:28Law enforcement works in one jurisdiction.
-
4:28 - 4:32Companies look at just their platform.
-
4:32 - 4:34And whatever data they learn along the way
-
4:35 - 4:37is rarely shared.
-
4:37 - 4:43It is so clear that this
disconnected approach is not working. -
4:44 - 4:48We have to redesign
our response to this epidemic -
4:48 - 4:49for the digital age.
-
4:50 - 4:53And that's exactly
what we're doing at Thorn. -
4:53 - 4:57We're building the technology
to connect these dots, -
4:57 - 4:59to arm everyone on the front lines --
-
4:59 - 5:02law enforcement, NGOs and companies --
-
5:02 - 5:06with the tools they need
to ultimately eliminate -
5:06 - 5:08child sexual abuse material
from the internet. -
5:10 - 5:11Let's talk for a minute --
-
5:11 - 5:12(Applause)
-
5:12 - 5:14Thank you.
-
5:14 - 5:16(Applause)
-
5:18 - 5:20Let's talk for a minute
about what those dots are. -
5:21 - 5:25As you can imagine,
this content is horrific. -
5:25 - 5:28If you don't have to look at it,
you don't want to look at it. -
5:28 - 5:33And so, most companies
or law enforcement agencies -
5:33 - 5:35that have this content
-
5:35 - 5:39can translate every file
into a unique string of numbers. -
5:39 - 5:40This is called a "hash."
-
5:40 - 5:42It's essentially a fingerprint
-
5:42 - 5:45for each file or each video.
-
5:45 - 5:49And what this allows them to do
is use the information in investigations -
5:49 - 5:52or for a company to remove
the content from their platform, -
5:52 - 5:58without having to relook
at every image and every video each time. -
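The hashing step described above can be sketched in a few lines. This is an illustrative example only, using a standard cryptographic hash (SHA-256) to fingerprint a file; real content-matching systems typically use perceptual hashes (such as PhotoDNA, which the talk does not name) so that near-duplicates also match. The `known_hashes` set and `check_upload` helper are hypothetical stand-ins for the shared databases the talk describes.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest -- a 'fingerprint' -- of a file,
    reading in chunks so large videos don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical stand-in for a shared database of fingerprints
# of known abuse material.
known_hashes: set[str] = set()

def check_upload(path: str) -> bool:
    """Return True if an uploaded file matches known material,
    so it can be blocked without anyone re-viewing the content."""
    return file_hash(path) in known_hashes
```

Because only the fingerprint is compared, an investigator or platform can act on a match without having to view the image or video itself, which is the point the talk is making.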
5:58 - 6:00The problem today, though,
-
6:00 - 6:04is that there are hundreds
of millions of these hashes -
6:04 - 6:08sitting in siloed databases
all around the world. -
6:08 - 6:09In a silo,
-
6:09 - 6:12it might work for the one agency
that has control over it, -
6:12 - 6:17but not connecting this data means
we don't know how many are unique. -
6:17 - 6:20We don't know which ones represent
children who have already been rescued -
6:20 - 6:23or need to be identified still.
-
6:23 - 6:27So our first, most basic premise
is that all of this data -
6:27 - 6:30must be connected.
-
6:30 - 6:36There are two ways in which this data,
combined with software on a global scale, -
6:37 - 6:40can have transformative
impact in this space. -
6:40 - 6:43The first is with law enforcement:
-
6:43 - 6:47helping them identify new victims faster,
-
6:47 - 6:48stopping abuse
-
6:48 - 6:51and stopping those producing this content.
-
6:51 - 6:54The second is with companies:
-
6:54 - 6:58using it as clues to identify
the hundreds of millions of files -
6:58 - 6:59in circulation today,
-
6:59 - 7:01pulling it down
-
7:01 - 7:07and then stopping the upload
of new material before it ever goes viral. -
7:10 - 7:11Four years ago,
-
7:11 - 7:13when that case ended,
-
7:14 - 7:18our team sat there,
and we just felt this, um ... -
7:20 - 7:23... deep sense of failure,
is the way I can put it, -
7:23 - 7:27because we watched that whole year
-
7:27 - 7:28while they looked for her.
-
7:28 - 7:32And we saw every place
in the investigation -
7:32 - 7:34where, if the technology
would have existed, -
7:34 - 7:37they would have found her faster.
-
7:38 - 7:40And so we walked away from that
-
7:40 - 7:43and we went and we did
the only thing we knew how to do: -
7:43 - 7:45we began to build software.
-
7:46 - 7:48So we've started with law enforcement.
-
7:48 - 7:52Our dream was an alarm bell on the desks
of officers all around the world -
7:52 - 7:57so that if anyone dare post
a new victim online, -
7:57 - 8:00someone would start
looking for them immediately. -
8:01 - 8:04I obviously can't talk about
the details of that software, -
8:04 - 8:07but today it's at work in 38 countries,
-
8:07 - 8:10having reduced the time it takes
to get to a child -
8:10 - 8:12by more than 65 percent.
-
8:12 - 8:17(Applause)
-
8:21 - 8:24And now we're embarking
on that second horizon: -
8:24 - 8:30building the software to help companies
identify and remove this content. -
8:31 - 8:34Let's talk for a minute
about these companies. -
8:34 - 8:40So, I told you -- 45 million images
and videos in the US alone last year. -
8:40 - 8:44Those come from just 12 companies.
-
8:46 - 8:52Twelve companies, 45 million files
of child sexual abuse material. -
8:52 - 8:55These come from those companies
that have the money -
8:55 - 9:00to build the infrastructure that it takes
to pull this content down. -
9:00 - 9:02But there are hundreds of other companies,
-
9:02 - 9:05small- to medium-size companies
around the world, -
9:05 - 9:07that need to do this work,
-
9:07 - 9:12but they either: 1) can't imagine that
their platform would be used for abuse, -
9:12 - 9:18or 2) don't have the money to spend
on something that is not driving revenue. -
9:19 - 9:22So we went ahead and built it for them,
-
9:22 - 9:27and this system now gets smarter
as more companies participate. -
9:28 - 9:30Let me give you an example.
-
9:30 - 9:34Our first partner, Imgur --
if you haven't heard of this company, -
9:34 - 9:38it's one of the most visited
websites in the US -- -
9:38 - 9:43millions of pieces of user-generated
content uploaded every single day, -
9:43 - 9:45in a mission to make the internet
a more fun place. -
9:46 - 9:48They partnered with us first.
-
9:48 - 9:51Within 20 minutes
of going live on our system, -
9:51 - 9:55someone tried to upload
a known piece of abuse material. -
9:55 - 9:57They were able to stop it,
they pulled it down, -
9:57 - 10:00they reported it to the National Center
for Missing and Exploited Children. -
10:00 - 10:02But they went a step further,
-
10:02 - 10:07and they went and inspected the account
of the person who had uploaded it. -
10:07 - 10:12Hundreds more pieces
of child sexual abuse material -
10:12 - 10:14that we had never seen.
-
10:14 - 10:18And this is where we start
to see exponential impact. -
10:18 - 10:19We pull that material down,
-
10:20 - 10:23it gets reported to the National Center
for Missing and Exploited Children -
10:23 - 10:26and then those hashes
go back into the system -
10:26 - 10:28and benefit every other company on it.
-
10:28 - 10:33And when the millions of hashes we have
lead to millions more and, in real time, -
10:33 - 10:37companies around the world are identifying
and pulling this content down, -
10:37 - 10:42we will have dramatically increased
the speed at which we are removing -
10:42 - 10:46child sexual abuse material
from the internet around the world. -
10:46 - 10:52(Applause)
-
10:54 - 10:57But this is why it can't just be
about software and data, -
10:57 - 10:59it has to be about scale.
-
10:59 - 11:03We have to activate thousands of officers,
-
11:03 - 11:05hundreds of companies around the world
-
11:05 - 11:09if technology will allow us
to outrun the perpetrators -
11:09 - 11:13and dismantle the communities
that are normalizing child sexual abuse -
11:13 - 11:15around the world today.
-
11:15 - 11:18And the time to do this is now.
-
11:18 - 11:24We can no longer say we don't know
the impact this is having on our children. -
11:25 - 11:29The first generation of children
whose abuse has gone viral -
11:29 - 11:31are now young adults.
-
11:31 - 11:34The Canadian Centre for Child Protection
-
11:34 - 11:37just did a recent study
of these young adults -
11:37 - 11:41to understand the unique trauma
they try to recover from, -
11:41 - 11:44knowing that their abuse lives on.
-
11:45 - 11:50Eighty percent of these young adults
have thought about suicide. -
11:51 - 11:55More than 60 percent
have attempted suicide. -
11:56 - 12:01And most of them live
with the fear every single day -
12:01 - 12:05that as they walk down the street
or they interview for a job -
12:05 - 12:08or they go to school
-
12:08 - 12:10or they meet someone online,
-
12:10 - 12:14that that person has seen their abuse.
-
12:15 - 12:19And that fear became reality
for more than 30 percent of them. -
12:20 - 12:25They had been recognized
from their abuse material online. -
12:26 - 12:29This is not going to be easy,
-
12:29 - 12:32but it is not impossible.
-
12:32 - 12:35Now it's going to take the will,
-
12:35 - 12:36the will of our society
-
12:37 - 12:40to look at something
that is really hard to look at, -
12:40 - 12:42to take something out of the darkness
-
12:42 - 12:45so these kids have a voice;
-
12:46 - 12:51the will of companies to take action
and make sure that their platforms -
12:51 - 12:54are not complicit in the abuse of a child;
-
12:55 - 12:59the will of governments to invest
with their law enforcement -
12:59 - 13:04for the tools they need to investigate
a digital-first crime, -
13:04 - 13:08even when the victims
cannot speak for themselves. -
13:10 - 13:13This audacious commitment
is part of that will. -
13:14 - 13:20It's a declaration of war
against one of humanity's darkest evils. -
13:20 - 13:22But what I hang on to
-
13:22 - 13:26is that it's actually
an investment in a future -
13:26 - 13:29where every child can simply be a kid.
-
13:29 - 13:31Thank you.
-
13:31 - 13:37(Applause)
- Title:
- How we can eliminate child sexual abuse material from the internet
- Speaker:
- Julie Cordua
- Description:
-
Social entrepreneur Julie Cordua works on a problem that isn't easy to talk about: the sexual abuse of children in images and videos on the internet. At Thorn, she's building technology to connect the dots between the tech industry, law enforcement and government -- so we can swiftly end the viral distribution of abuse material and rescue children faster. Learn more about how this scalable solution could help dismantle the communities normalizing child sexual abuse around the world today. (This ambitious plan is part of the Audacious Project, TED's initiative to inspire and fund global change.)
- Video Language:
- English
- Team:
- closed TED
- Project:
- TEDTalks
- Duration:
- 13:50