How deepfakes undermine truth and threaten democracy

0:01 - 0:03  [This talk contains mature content]
0:06 - 0:09  Rana Ayyub is a journalist in India
0:09 - 0:11  whose work has exposed government corruption
0:12 - 0:15  and human rights violations.
0:15 - 0:16  And over the years,
0:16 - 0:19  she's gotten used to vitriol and controversy around her work.
0:20 - 0:25  But none of it could have prepared her for what she faced in April 2018.
0:26 - 0:30  She was sitting in a café with a friend when she first saw it:
0:30 - 0:35  a two-minute, 20-second video of her engaged in a sex act.
0:35 - 0:38  And she couldn't believe her eyes.
0:38 - 0:40  She had never made a sex video.
0:41 - 0:44  But unfortunately, thousands upon thousands of people
0:44 - 0:46  would believe it was her.
0:47 - 0:50  I interviewed Ms. Ayyub about three months ago,
0:50 - 0:52  in connection with my book on sexual privacy.
0:53 - 0:56  I'm a law professor, lawyer and civil rights advocate.
0:56 - 1:01  So it's incredibly frustrating knowing that right now,
1:01 - 1:03  law could do very little to help her.
1:03 - 1:05  And as we talked,
1:05 - 1:10  she explained that she should have seen the fake sex video coming.
1:10 - 1:16  She said, "After all, sex is so often used to demean and to shame women,
1:16 - 1:18  especially minority women,
1:18 - 1:22  and especially minority women who dare to challenge powerful men,"
1:22 - 1:24  as she had in her work.
1:25 - 1:29  The fake sex video went viral in 48 hours.
1:30 - 1:35  All of her online accounts were flooded with screenshots of the video,
1:35 - 1:38  with graphic rape and death threats
1:38 - 1:41  and with slurs about her Muslim faith.
1:41 - 1:46  Online posts suggested that she was "available" for sex.
1:46 - 1:48  And she was doxed,
1:48 - 1:50  which means that her home address and her cell phone number
1:50 - 1:52  were spread across the internet.
1:53 - 1:57  The video was shared more than 40,000 times.
1:58 - 2:02  Now, when someone is targeted with this kind of cybermob attack,
2:02 - 2:04  the harm is profound.
2:04 - 2:08  Rana Ayyub's life was turned upside down.
2:08 - 2:12  For weeks, she could hardly eat or speak.
2:12 - 2:16  She stopped writing and closed all of her social media accounts,
2:16 - 2:19  which is, you know, a tough thing to do when you're a journalist.
2:19 - 2:23  And she was afraid to go outside her family's home.
2:23 - 2:26  What if the posters made good on their threats?
2:26 - 2:31  The UN Council on Human Rights confirmed that she wasn't being crazy.
2:31 - 2:35  It issued a public statement saying that they were worried about her safety.
2:37 - 2:41  What Rana Ayyub faced was a deepfake:
2:41 - 2:44  machine-learning technology
2:44 - 2:48  that manipulates or fabricates audio and video recordings
2:48 - 2:50  to show people doing and saying things
2:50 - 2:52  that they never did or said.
2:53 - 2:56  Deepfakes appear authentic and realistic, but they're not;
2:56 - 2:58  they're total falsehoods.
2:59 - 3:03  Although the technology is still developing in its sophistication,
3:03 - 3:05  it is widely available.
3:05 - 3:08  Now, the most recent attention to deepfakes arose,
3:08 - 3:11  as so many things do online,
3:11 - 3:12  with pornography.
3:12 - 3:15  In early 2018,
3:15 - 3:17  someone posted a tool on Reddit
3:17 - 3:22  to allow users to insert faces into porn videos.
3:22 - 3:25  And what followed was a cascade of fake porn videos
3:25 - 3:28  featuring people's favorite female celebrities.
3:29 - 3:32  And today, you can go on YouTube and pull up countless tutorials
3:32 - 3:34  with step-by-step instructions
3:35 - 3:38  on how to make a deepfake on your desktop application.
3:38 - 3:42  And soon we may even be able to make them on our cell phones.
3:43 - 3:48  Now, it's the interaction of some of our most basic human frailties
3:48 - 3:50  and network tools
3:50 - 3:53  that can turn deepfakes into weapons.
3:53 - 3:54  So let me explain.
3:55 - 3:59  As human beings, we have a visceral reaction to audio and video.
4:00 - 4:01  We believe they're true,
4:01 - 4:03  on the notion that of course you can believe
4:03 - 4:06  what your eyes and ears are telling you.
4:06 - 4:08  And it's that mechanism
4:08 - 4:12  that might undermine our shared sense of reality.
4:12 - 4:15  Although we believe deepfakes to be true, they're not.
4:16 - 4:20  And we're attracted to the salacious, the provocative.
4:20 - 4:23  We tend to believe and to share information
4:23 - 4:25  that's negative and novel.
4:26 - 4:31  And researchers have found that online hoaxes spread 10 times faster
4:31 - 4:32  than accurate stories.
4:34 - 4:38  Now, we're also drawn to information
4:38 - 4:40  that aligns with our viewpoints.
4:41 - 4:45  Psychologists call that tendency "confirmation bias."
4:45 - 4:50  And social media platforms supercharge that tendency,
4:50 - 4:54  by allowing us to instantly and widely share information
4:54 - 4:55  that accords with our viewpoints.
4:57 - 5:02  Now, deepfakes have the potential to cause grave individual and societal harm.
5:03 - 5:05  So, imagine a deepfake
5:05 - 5:09  that shows American soldiers in Afghanistan burning a Koran.
5:11 - 5:14  You can imagine that that deepfake would provoke violence
5:14 - 5:15  against those soldiers.
5:16 - 5:19  And what if the very next day
5:19 - 5:21  there's another deepfake that drops,
5:21 - 5:24  that shows a well-known imam based in London
5:24 - 5:27  praising the attack on those soldiers?
5:28 - 5:31  We might see violence and civil unrest,
5:31 - 5:34  not only in Afghanistan and the United Kingdom,
5:34 - 5:36  but across the globe.
5:36 - 5:37  And you might say to me,
5:37 - 5:40  "Come on, Danielle, that's far-fetched."
5:40 - 5:41  But it's not.
5:41 - 5:43  We've seen falsehoods spread
5:44 - 5:46  on WhatsApp and other online message services
5:46 - 5:49  lead to violence against ethnic minorities.
5:49 - 5:51  And that was just text --
5:51 - 5:53  imagine if it were video.
5:55 - 6:00  Now, deepfakes have the potential to corrode the trust that we have
6:00 - 6:02  in democratic institutions.
6:03 - 6:06  So, imagine the night before an election.
6:06 - 6:09  There's a deepfake showing one of the major party candidates
6:09 - 6:10  gravely sick.
6:11 - 6:14  The deepfake could tip the election
6:14 - 6:17  and shake our sense that elections are legitimate.
6:19 - 6:22  Imagine if the night before an initial public offering
6:22 - 6:24  of a major global bank,
6:24 - 6:27  there was a deepfake showing the bank's CEO
6:27 - 6:30  drunkenly spouting conspiracy theories.
6:31 - 6:34  The deepfake could tank the IPO,
6:34 - 6:38  and worse, shake our sense that financial markets are stable.
6:39 - 6:46  So deepfakes can exploit and magnify the deep distrust that we already have
6:46 - 6:51  in politicians, business leaders and other influential leaders.
6:51 - 6:54  They find an audience primed to believe them.
6:55 - 6:58  And the pursuit of truth is on the line as well.
6:59 - 7:03  Technologists expect that with advances in AI,
7:03 - 7:06  soon it may be difficult if not impossible
7:06 - 7:10  to tell the difference between a real video and a fake one.
7:11 - 7:16  So how can the truth emerge in a deepfake-ridden marketplace of ideas?
7:17 - 7:20  Will we just proceed along the path of least resistance
7:20 - 7:23  and believe what we want to believe,
7:23 - 7:24  truth be damned?
7:25 - 7:28  And not only might we believe the fakery,
7:28 - 7:31  we might start disbelieving the truth.
7:32 - 7:36  We've already seen people invoke the phenomenon of deepfakes
7:36 - 7:40  to cast doubt on real evidence of their wrongdoing.
7:40 - 7:46  We've heard politicians say of audio of their disturbing comments,
7:46 - 7:48  "Come on, that's fake news.
7:48 - 7:52  You can't believe what your eyes and ears are telling you."
7:52 - 7:54  And it's that risk
7:54 - 8:00  that Professor Robert Chesney and I call the "liar's dividend":
8:00 - 8:03  the risk that liars will invoke deepfakes
8:03 - 8:06  to escape accountability for their wrongdoing.
8:07 - 8:10  So we've got our work cut out for us, there's no doubt about it.
8:11 - 8:14  And we're going to need a proactive solution
8:14 - 8:17  from tech companies, from lawmakers,
8:17 - 8:19  law enforcers and the media.
8:20 - 8:24  And we're going to need a healthy dose of societal resilience.
8:26 - 8:29  Right now, we're engaged in a very public conversation
8:29 - 8:32  about the responsibility of tech companies.
8:33 - 8:36  And my advice to social media platforms
8:36 - 8:40  has been to change their terms of service and community guidelines
8:40 - 8:42  to ban deepfakes that cause harm.
8:43 - 8:47  That determination, that's going to require human judgment,
8:47 - 8:48  and it's expensive.
8:49 - 8:51  But we need human beings
8:51 - 8:55  to look at the content and context of a deepfake
8:55 - 8:59  to figure out if it is a harmful impersonation
8:59 - 9:03  or instead, if it's valuable satire, art or education.
9:04 - 9:06  So now, what about the law?
9:07 - 9:09  Law is our educator.
9:10 - 9:14  It teaches us about what's harmful and what's wrong.
9:14 - 9:18  And it shapes behavior: it deters by punishing perpetrators
9:18 - 9:20  and secures remedies for victims.
9:21 - 9:25  Right now, law is not up to the challenge of deepfakes.
9:26 - 9:28  Across the globe,
9:28 - 9:30  we lack well-tailored laws
9:30 - 9:34  that would be designed to tackle digital impersonations
9:34 - 9:36  that invade sexual privacy,
9:36 - 9:37  that damage reputations
9:37 - 9:39  and that cause emotional distress.
9:40 - 9:44  What happened to Rana Ayyub is increasingly commonplace.
9:44 - 9:46  Yet, when she went to law enforcement in Delhi,
9:46 - 9:48  she was told nothing could be done.
9:49 - 9:52  And the sad truth is that the same would be true
9:52 - 9:55  in the United States and in Europe.
9:55 - 10:00  So we have a legal vacuum that needs to be filled.
10:00 - 10:04  My colleague Dr. Mary Anne Franks and I are working with US lawmakers
10:04 - 10:09  to devise legislation that would ban harmful digital impersonations
10:09 - 10:12  that are tantamount to identity theft.
10:12 - 10:14  And we've seen similar moves
10:14 - 10:18  in Iceland, the UK and Australia.
10:18 - 10:21  But of course, that's just a small piece of the regulatory puzzle.
10:23 - 10:26  Now, I know law is not a cure-all. Right?
10:26 - 10:28  It's a blunt instrument.
10:28 - 10:30  And we've got to use it wisely.
10:30 - 10:33  It also has some practical impediments.
10:34 - 10:39  You can't leverage law against people you can't identify and find.
10:39 - 10:43  And if a perpetrator lives outside the country
10:43 - 10:45  where a victim lives,
10:45 - 10:46  then you may not be able to insist
10:46 - 10:49  that the perpetrator come into local courts
10:49 - 10:50  to face justice.
10:50 - 10:54  And so we're going to need a coordinated international response.
10:56 - 10:59  Education has to be part of our response as well.
11:00 - 11:04  Law enforcers are not going to enforce laws
11:04 - 11:05  they don't know about
11:05 - 11:08  or address problems they don't understand.
11:08 - 11:11  In my research on cyberstalking,
11:11 - 11:14  I found that law enforcement lacked the training
11:14 - 11:17  to understand the laws available to them
11:17 - 11:19  and the problem of online abuse.
11:19 - 11:22  And so often they told victims,
11:22 - 11:26  "Just turn your computer off. Ignore it. It'll go away."
11:26 - 11:29  And we saw that in Rana Ayyub's case.
11:29 - 11:33  She was told, "Come on, you're making such a big deal about this.
11:33 - 11:34  It's boys being boys."
11:35 - 11:41  And so we need to pair new legislation with efforts at training.
11:42 - 11:45  And education has to be aimed at the media as well.
11:46 - 11:50  Journalists need educating about the phenomenon of deepfakes
11:50 - 11:54  so they don't amplify and spread them.
11:55 - 11:57  And this is the part where we're all involved.
11:57 - 12:01  Each and every one of us needs educating.
12:01 - 12:05  We click, we share, we like, and we don't even think about it.
12:06 - 12:07  We need to do better.
12:08 - 12:11  We need far better radar for fakery.
12:14 - 12:18  So as we're working through these solutions,
12:18 - 12:20  there's going to be a lot of suffering to go around.
12:21 - 12:24  Rana Ayyub is still wrestling with the fallout.
12:25 - 12:29  She still doesn't feel free to express herself on- and offline.
12:30 - 12:31  And as she told me,
12:31 - 12:36  she still feels like there are thousands of eyes on her naked body,
12:36 - 12:40  even though, intellectually, she knows it wasn't her body.
12:40 - 12:43  And she has frequent panic attacks,
12:43 - 12:47  especially when someone she doesn't know tries to take her picture.
12:47 - 12:50  "What if they're going to make another deepfake?" she thinks to herself.
12:51 - 12:55  And so for the sake of individuals like Rana Ayyub
12:55 - 12:57  and the sake of our democracy,
12:57 - 13:00  we need to do something right now.
13:00 - 13:01  Thank you.
13:01 - 13:03  (Applause)

Title: How deepfakes undermine truth and threaten democracy
Speaker: Danielle Citron
Description: The use of deepfake technology to manipulate video and audio for malicious purposes -- whether it's to stoke violence or defame politicians and journalists -- is becoming a real threat. As these tools become more accessible and their products more realistic, how will they shape what we believe about the world? In a portentous talk, law professor Danielle Citron reveals how deepfakes magnify our distrust -- and suggests approaches to safeguarding the truth.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 13:16