WEBVTT

00:00:00.535 --> 00:00:03.302
[This talk contains mature content]

NOTE Paragraph

00:00:05.762 --> 00:00:08.754
Rana Ayyub is a journalist in India

00:00:08.778 --> 00:00:11.380
whose work has exposed government corruption

00:00:12.411 --> 00:00:14.568
and human rights violations.

00:00:14.990 --> 00:00:16.157
And over the years,

00:00:16.181 --> 00:00:19.484
she's gotten used to vitriol and controversy around her work.

00:00:20.149 --> 00:00:25.258
But none of it could have prepared her for what she faced in April 2018.

NOTE Paragraph

00:00:26.125 --> 00:00:29.776
She was sitting in a café with a friend when she first saw it:

00:00:29.800 --> 00:00:34.743
a two-minute, 20-second video of her engaged in a sex act.

00:00:35.188 --> 00:00:37.537
And she couldn't believe her eyes.

00:00:37.561 --> 00:00:39.834
She had never made a sex video.

00:00:40.506 --> 00:00:43.971
But unfortunately, thousands upon thousands of people

00:00:43.995 --> 00:00:45.661
would believe it was her.

NOTE Paragraph

00:00:46.673 --> 00:00:49.617
I interviewed Ms. Ayyub about three months ago,

00:00:49.641 --> 00:00:52.145
in connection with my book on sexual privacy.

00:00:52.681 --> 00:00:55.879
I'm a law professor, lawyer and civil rights advocate.

00:00:56.204 --> 00:01:00.815
So it's incredibly frustrating knowing that right now,

00:01:00.839 --> 00:01:03.077
law could do very little to help her.

00:01:03.458 --> 00:01:05.005
And as we talked,

00:01:05.029 --> 00:01:09.541
she explained that she should have seen the fake sex video coming.

00:01:10.038 --> 00:01:15.634
She said, "After all, sex is so often used to demean and to shame women,

00:01:15.658 --> 00:01:18.086
especially minority women,

00:01:18.110 --> 00:01:22.422
and especially minority women who dare to challenge powerful men,"

00:01:22.446 --> 00:01:23.979
as she had in her work.

00:01:25.191 --> 00:01:29.167
The fake sex video went viral in 48 hours.

00:01:30.064 --> 00:01:35.371
All of her online accounts were flooded with screenshots of the video,

00:01:35.395 --> 00:01:38.022
with graphic rape and death threats

00:01:38.046 --> 00:01:40.579
and with slurs about her Muslim faith.

00:01:41.426 --> 00:01:45.990
Online posts suggested that she was "available" for sex.

00:01:46.014 --> 00:01:47.624
And she was doxed,

00:01:47.648 --> 00:01:50.426
which means that her home address and her cell phone number

00:01:50.450 --> 00:01:52.196
were spread across the internet.

00:01:52.879 --> 00:01:56.963
The video was shared more than 40,000 times.

NOTE Paragraph

00:01:57.760 --> 00:02:01.696
Now, when someone is targeted with this kind of cybermob attack,

00:02:01.720 --> 00:02:03.783
the harm is profound.

00:02:04.482 --> 00:02:07.521
Rana Ayyub's life was turned upside down.

00:02:08.211 --> 00:02:11.545
For weeks, she could hardly eat or speak.

00:02:11.919 --> 00:02:15.608
She stopped writing and closed all of her social media accounts,

00:02:15.632 --> 00:02:18.790
which is, you know, a tough thing to do when you're a journalist.

00:02:19.188 --> 00:02:22.672
And she was afraid to go outside her family's home.

00:02:22.696 --> 00:02:25.718
What if the posters made good on their threats?

00:02:26.395 --> 00:02:30.760
The UN Council on Human Rights confirmed that she wasn't being crazy.

00:02:30.784 --> 00:02:35.421
It issued a public statement saying that they were worried about her safety.
NOTE Paragraph

00:02:36.776 --> 00:02:41.005
What Rana Ayyub faced was a deepfake:

00:02:41.029 --> 00:02:43.569
machine-learning technology

00:02:43.593 --> 00:02:47.704
that manipulates or fabricates audio and video recordings

00:02:47.728 --> 00:02:50.451
to show people doing and saying things

00:02:50.475 --> 00:02:52.341
that they never did or said.

00:02:52.807 --> 00:02:56.168
Deepfakes appear authentic and realistic, but they're not;

00:02:56.192 --> 00:02:57.964
they're total falsehoods.

00:02:59.228 --> 00:03:03.022
Although the technology is still developing in its sophistication,

00:03:03.046 --> 00:03:04.660
it is widely available.

NOTE Paragraph

00:03:05.371 --> 00:03:08.443
Now, the most recent attention to deepfakes arose,

00:03:08.467 --> 00:03:10.628
as so many things do online,

00:03:10.652 --> 00:03:11.907
with pornography.

NOTE Paragraph

00:03:12.498 --> 00:03:14.609
In early 2018,

00:03:14.633 --> 00:03:17.101
someone posted a tool on Reddit

00:03:17.125 --> 00:03:21.537
to allow users to insert faces into porn videos.

00:03:21.561 --> 00:03:25.001
And what followed was a cascade of fake porn videos

00:03:25.025 --> 00:03:27.822
featuring people's favorite female celebrities.

00:03:28.712 --> 00:03:32.189
And today, you can go on YouTube and pull up countless tutorials

00:03:32.213 --> 00:03:34.499
with step-by-step instructions

00:03:34.523 --> 00:03:37.686
on how to make a deepfake on your desktop application.

00:03:38.260 --> 00:03:41.966
And soon we may even be able to make them on our cell phones.

00:03:43.072 --> 00:03:48.454
Now, it's the interaction of some of our most basic human frailties

00:03:48.478 --> 00:03:50.160
and network tools

00:03:50.184 --> 00:03:52.850
that can turn deepfakes into weapons.

00:03:52.874 --> 00:03:54.074
So let me explain.

NOTE Paragraph

00:03:54.875 --> 00:03:59.441
As human beings, we have a visceral reaction to audio and video.

00:03:59.860 --> 00:04:01.348
We believe they're true,

00:04:01.372 --> 00:04:03.450
on the notion that of course you can believe

00:04:03.474 --> 00:04:05.952
what your eyes and ears are telling you.

00:04:06.476 --> 00:04:08.175
And it's that mechanism

00:04:08.199 --> 00:04:11.897
that might undermine our shared sense of reality.

00:04:11.921 --> 00:04:15.068
Although we believe deepfakes to be true, they're not.

00:04:15.604 --> 00:04:19.761
And we're attracted to the salacious, the provocative.

00:04:20.365 --> 00:04:23.412
We tend to believe and to share information

00:04:23.436 --> 00:04:25.459
that's negative and novel.

00:04:25.809 --> 00:04:30.828
And researchers have found that online hoaxes spread 10 times faster

00:04:30.852 --> 00:04:32.479
than accurate stories.

00:04:34.015 --> 00:04:38.395
Now, we're also drawn to information

00:04:38.419 --> 00:04:40.311
that aligns with our viewpoints.

00:04:40.950 --> 00:04:44.511
Psychologists call that tendency "confirmation bias."

00:04:45.300 --> 00:04:49.687
And social media platforms supercharge that tendency,

00:04:49.711 --> 00:04:53.592
by allowing us to instantly and widely share information

00:04:53.616 --> 00:04:55.408
that accords with our viewpoints.

NOTE Paragraph

00:04:56.735 --> 00:05:02.303
Now, deepfakes have the potential to cause grave individual and societal harm.

00:05:03.204 --> 00:05:05.228
So, imagine a deepfake

00:05:05.252 --> 00:05:09.434
that shows American soldiers in Afghanistan burning a Koran.
00:05:10.807 --> 00:05:13.831
You can imagine that that deepfake would provoke violence

00:05:13.855 --> 00:05:15.388
against those soldiers.

00:05:15.847 --> 00:05:18.720
And what if the very next day

00:05:18.744 --> 00:05:20.998
there's another deepfake that drops,

00:05:21.022 --> 00:05:24.339
that shows a well-known imam based in London

00:05:24.363 --> 00:05:26.830
praising the attack on those soldiers?

00:05:27.617 --> 00:05:30.780
We might see violence and civil unrest,

00:05:30.804 --> 00:05:34.053
not only in Afghanistan and the United Kingdom,

00:05:34.077 --> 00:05:35.592
but across the globe.

NOTE Paragraph

00:05:36.251 --> 00:05:37.409
And you might say to me,

00:05:37.433 --> 00:05:39.680
"Come on, Danielle, that's far-fetched."

00:05:39.704 --> 00:05:40.854
But it's not.

00:05:41.293 --> 00:05:43.484
We've seen falsehoods spread

00:05:43.508 --> 00:05:46.230
on WhatsApp and other online message services

00:05:46.254 --> 00:05:49.015
lead to violence against ethnic minorities.

00:05:49.039 --> 00:05:50.926
And that was just text --

00:05:50.950 --> 00:05:52.974
imagine if it were video.

NOTE Paragraph

00:05:54.593 --> 00:05:59.950
Now, deepfakes have the potential to corrode the trust that we have

00:05:59.974 --> 00:06:01.966
in democratic institutions.

00:06:03.006 --> 00:06:05.673
So, imagine the night before an election.

00:06:05.996 --> 00:06:09.234
There's a deepfake showing one of the major party candidates

00:06:09.258 --> 00:06:10.408
gravely sick.

00:06:11.202 --> 00:06:13.535
The deepfake could tip the election

00:06:13.559 --> 00:06:16.934
and shake our sense that elections are legitimate.

00:06:18.515 --> 00:06:21.841
Imagine if the night before an initial public offering

00:06:21.865 --> 00:06:24.198
of a major global bank,

00:06:24.222 --> 00:06:27.371
there was a deepfake showing the bank's CEO

00:06:27.395 --> 00:06:30.092
drunkenly spouting conspiracy theories.

00:06:30.887 --> 00:06:33.934
The deepfake could tank the IPO,

00:06:33.958 --> 00:06:38.073
and worse, shake our sense that financial markets are stable.

NOTE Paragraph

00:06:39.385 --> 00:06:46.374
So deepfakes can exploit and magnify the deep distrust that we already have

00:06:46.398 --> 00:06:50.612
in politicians, business leaders and other influential leaders.

00:06:50.945 --> 00:06:54.229
They find an audience primed to believe them.

00:06:55.287 --> 00:06:58.052
And the pursuit of truth is on the line as well.

00:06:59.077 --> 00:07:02.641
Technologists expect that with advances in AI,

00:07:02.665 --> 00:07:06.347
soon it may be difficult if not impossible

00:07:06.371 --> 00:07:10.140
to tell the difference between a real video and a fake one.

NOTE Paragraph

00:07:11.022 --> 00:07:16.363
So how can the truth emerge in a deepfake-ridden marketplace of ideas?

00:07:16.752 --> 00:07:20.172
Will we just proceed along the path of least resistance

00:07:20.196 --> 00:07:22.633
and believe what we want to believe,

00:07:22.657 --> 00:07:23.807
truth be damned?

00:07:24.831 --> 00:07:28.006
And not only might we believe the fakery,

00:07:28.030 --> 00:07:31.356
we might start disbelieving the truth.

00:07:31.887 --> 00:07:35.966
We've already seen people invoke the phenomenon of deepfakes

00:07:35.990 --> 00:07:39.910
to cast doubt on real evidence of their wrongdoing.

00:07:39.934 --> 00:07:45.903
We've heard politicians say of audio of their disturbing comments,

00:07:45.927 --> 00:07:47.673
"Come on, that's fake news.
00:07:47.697 --> 00:07:51.617
You can't believe what your eyes and ears are telling you."

00:07:52.402 --> 00:07:54.133
And it's that risk

00:07:54.157 --> 00:07:59.593
that Professor Robert Chesney and I call the "liar's dividend":

00:07:59.617 --> 00:08:02.974
the risk that liars will invoke deepfakes

00:08:02.998 --> 00:08:05.903
to escape accountability for their wrongdoing.

NOTE Paragraph

00:08:06.963 --> 00:08:10.034
So we've got our work cut out for us, there's no doubt about it.

00:08:10.606 --> 00:08:13.931
And we're going to need a proactive solution

00:08:13.955 --> 00:08:17.466
from tech companies, from lawmakers,

00:08:17.490 --> 00:08:19.474
law enforcers and the media.

00:08:20.093 --> 00:08:24.109
And we're going to need a healthy dose of societal resilience.

00:08:25.506 --> 00:08:29.402
So right now, we're engaged in a very public conversation

00:08:29.426 --> 00:08:32.339
about the responsibility of tech companies.

00:08:32.926 --> 00:08:35.958
And my advice to social media platforms

00:08:35.982 --> 00:08:39.855
has been to change their terms of service and community guidelines

00:08:39.879 --> 00:08:42.215
to ban deepfakes that cause harm.

00:08:42.712 --> 00:08:46.672
That determination, that's going to require human judgment,

00:08:46.696 --> 00:08:48.267
and it's expensive.

00:08:48.673 --> 00:08:50.958
But we need human beings

00:08:50.982 --> 00:08:54.855
to look at the content and context of a deepfake

00:08:54.879 --> 00:08:58.561
to figure out if it is a harmful impersonation

00:08:58.585 --> 00:09:02.967
or instead, if it's valuable satire, art or education.

NOTE Paragraph

00:09:04.118 --> 00:09:05.613
So now, what about the law?

00:09:06.666 --> 00:09:09.015
Law is our educator.

00:09:09.515 --> 00:09:13.553
It teaches us about what's harmful and what's wrong.

00:09:13.577 --> 00:09:18.132
And it shapes behavior: it deters by punishing perpetrators

00:09:18.156 --> 00:09:20.423
and securing remedies for victims.

00:09:21.148 --> 00:09:25.428
Right now, law is not up to the challenge of deepfakes.

00:09:26.116 --> 00:09:27.506
Across the globe,

00:09:27.530 --> 00:09:29.974
we lack well-tailored laws

00:09:29.998 --> 00:09:33.568
that would be designed to tackle digital impersonations

00:09:33.592 --> 00:09:35.823
that invade sexual privacy,

00:09:35.847 --> 00:09:37.234
that damage reputations

00:09:37.258 --> 00:09:39.209
and that cause emotional distress.

00:09:39.725 --> 00:09:43.598
What happened to Rana Ayyub is increasingly commonplace.

00:09:44.074 --> 00:09:46.288
Yet, when she went to law enforcement in Delhi,

00:09:46.312 --> 00:09:48.447
she was told nothing could be done.

00:09:49.101 --> 00:09:52.284
And the sad truth is that the same would be true

00:09:52.308 --> 00:09:54.574
in the United States and in Europe.

NOTE Paragraph

00:09:55.300 --> 00:09:59.656
So we have a legal vacuum that needs to be filled.

00:10:00.292 --> 00:10:04.384
My colleague Dr. Mary Anne Franks and I are working with US lawmakers

00:10:04.408 --> 00:10:09.212
to devise legislation that would ban harmful digital impersonations

00:10:09.236 --> 00:10:11.769
that are tantamount to identity theft.

00:10:12.252 --> 00:10:14.378
And we've seen similar moves

00:10:14.402 --> 00:10:17.703
in Iceland, the UK and Australia.

00:10:18.157 --> 00:10:21.416
But of course, that's just a small piece of the regulatory puzzle.

NOTE Paragraph

00:10:22.911 --> 00:10:26.080
Now, I know law is not a cure-all. Right?

00:10:26.104 --> 00:10:27.704
It's a blunt instrument.
00:10:28.346 --> 00:10:29.885
And we've got to use it wisely.

00:10:30.411 --> 00:10:33.223
It also has some practical impediments.

00:10:33.657 --> 00:10:38.701
You can't leverage law against people you can't identify and find.

00:10:39.463 --> 00:10:42.749
And if a perpetrator lives outside the country

00:10:42.773 --> 00:10:44.527
where a victim lives,

00:10:44.551 --> 00:10:46.180
then you may not be able to insist

00:10:46.204 --> 00:10:48.553
that the perpetrator come into local courts

00:10:48.577 --> 00:10:49.727
to face justice.

00:10:50.236 --> 00:10:54.299
And so we're going to need a coordinated international response.

00:10:55.819 --> 00:10:59.152
Education has to be part of our response as well.

00:10:59.803 --> 00:11:03.534
Law enforcers are not going to enforce laws

00:11:03.558 --> 00:11:05.016
they don't know about

00:11:05.040 --> 00:11:07.636
or address problems they don't understand.

00:11:08.376 --> 00:11:10.567
In my research on cyberstalking,

00:11:10.591 --> 00:11:14.090
I found that law enforcement lacked the training

00:11:14.114 --> 00:11:16.696
to understand the laws available to them

00:11:16.720 --> 00:11:19.069
and the problem of online abuse.

00:11:19.093 --> 00:11:21.775
And so often they told victims,

00:11:21.799 --> 00:11:25.770
"Just turn your computer off. Ignore it. It'll go away."

00:11:26.261 --> 00:11:28.727
And we saw that in Rana Ayyub's case.

00:11:29.102 --> 00:11:32.570
She was told, "Come on, you're making such a big deal about this.

00:11:32.594 --> 00:11:34.337
It's boys being boys."

00:11:35.268 --> 00:11:40.520
And so we need to pair new legislation with efforts at training.

NOTE Paragraph

00:11:42.053 --> 00:11:45.482
And education has to be aimed at the media as well.

00:11:46.180 --> 00:11:50.440
Journalists need educating about the phenomenon of deepfakes

00:11:50.464 --> 00:11:53.503
so they don't amplify and spread them.

00:11:54.583 --> 00:11:56.751
And this is the part where we're all involved.

00:11:56.775 --> 00:12:00.630
Each and every one of us needs educating.

00:12:01.375 --> 00:12:05.050
We click, we share, we like, and we don't even think about it.

00:12:05.551 --> 00:12:07.098
We need to do better.

00:12:07.726 --> 00:12:10.535
We need far better radar for fakery.

NOTE Paragraph

00:12:13.744 --> 00:12:17.585
So as we're working through these solutions,

00:12:17.609 --> 00:12:20.172
there's going to be a lot of suffering to go around.

00:12:21.093 --> 00:12:23.839
Rana Ayyub is still wrestling with the fallout.

00:12:24.669 --> 00:12:28.858
She still doesn't feel free to express herself on- and offline.

00:12:29.566 --> 00:12:30.931
And as she told me,

00:12:30.955 --> 00:12:36.029
she still feels like there are thousands of eyes on her naked body,

00:12:36.053 --> 00:12:39.714
even though, intellectually, she knows it wasn't her body.

00:12:40.371 --> 00:12:42.720
And she has frequent panic attacks,

00:12:42.744 --> 00:12:46.844
especially when someone she doesn't know tries to take her picture.

00:12:46.868 --> 00:12:50.379
"What if they're going to make another deepfake?" she thinks to herself.

00:12:51.082 --> 00:12:55.003
And so for the sake of individuals like Rana Ayyub

00:12:55.027 --> 00:12:57.333
and the sake of our democracy,

00:12:57.357 --> 00:12:59.539
we need to do something right now.

NOTE Paragraph

00:12:59.563 --> 00:13:00.714
Thank you.

NOTE Paragraph

00:13:00.738 --> 00:13:03.246
(Applause)