[This talk contains mature content]

Rana Ayyub is a journalist in India whose work has exposed government corruption and human rights violations. And over the years, she's gotten used to vitriol and controversy around her work. But none of it could have prepared her for what she faced in April 2018.

She was sitting in a café with a friend when she first saw it: a two-minute, 20-second video of her engaged in a sex act. And she couldn't believe her eyes. She had never made a sex video. But unfortunately, thousands upon thousands of people would believe it was her.

I interviewed Ms. Ayyub about three months ago, in connection with my book on sexual privacy. I'm a law professor, lawyer and civil rights advocate. So it's incredibly frustrating knowing that right now, law could do very little to help her.

And as we talked, she explained that she should have seen the fake sex video coming. She said, "After all, sex is so often used to demean and to shame women, especially minority women, and especially minority women who dare to challenge powerful men," as she had in her work.

The fake sex video went viral in 48 hours. All of her online accounts were flooded with screenshots of the video, with graphic rape and death threats and with slurs about her Muslim faith. Online posts suggested that she was "available" for sex. And she was doxed, which means that her home address and her cell phone number were spread across the internet. The video was shared more than 40,000 times.

Now, when someone is targeted with this kind of cybermob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks, she could hardly eat or speak. She stopped writing and closed all of her social media accounts, which is, you know, a tough thing to do when you're a journalist. And she was afraid to go outside her family's home. What if the posters made good on their threats?

The UN Human Rights Council confirmed that she wasn't being crazy. It issued a public statement saying that it was worried about her safety.
What Rana Ayyub faced was a deepfake: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things that they never did or said. Deepfakes appear authentic and realistic, but they're not; they're total falsehoods. Although the technology is still developing in its sophistication, it is widely available.

Now, the most recent attention to deepfakes arose, as so many things do online, with pornography. In early 2018, someone posted a tool on Reddit to allow users to insert faces into porn videos. And what followed was a cascade of fake porn videos featuring people's favorite female celebrities. And today, you can go on YouTube and pull up countless tutorials with step-by-step instructions on how to make a deepfake with a desktop application. And soon we may even be able to make them on our cell phones.

Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain.

As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that of course you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not.

And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories.

Now, we're also drawn to information that aligns with our viewpoints. Psychologists call that tendency "confirmation bias." And social media platforms supercharge that tendency, by allowing us to instantly and widely share information that accords with our viewpoints.

Now, deepfakes have the potential to cause grave individual and societal harm. So, imagine a deepfake that shows American soldiers in Afghanistan burning a Koran.
You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe.

And you might say to me, "Come on, Danielle, that's far-fetched." But it's not. We've seen falsehoods spread on WhatsApp and other online message services lead to violence against ethnic minorities. And that was just text -- imagine if it were video.

Now, deepfakes have the potential to corrode the trust that we have in democratic institutions. So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate.

Imagine if the night before an initial public offering of a major global bank, there was a deepfake showing the bank's CEO drunkenly spouting conspiracy theories. The deepfake could tank the IPO, and worse, shake our sense that financial markets are stable.

So deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them.

And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one. So how can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned?

And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing.
We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you." And it's that risk that professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.

So we've got our work cut out for us, there's no doubt about it. And we're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience.

Right now, we're engaged in a very public conversation about the responsibility of tech companies. And my advice to social media platforms has been to change their terms of service and community guidelines to ban deepfakes that cause harm. That determination, that's going to require human judgment, and it's expensive. But we need human beings to look at the content and context of a deepfake to figure out if it is a harmful impersonation or instead, if it's valuable satire, art or education.

So now, what about the law? Law is our educator. It teaches us about what's harmful and what's wrong. And it shapes behavior: it deters by punishing perpetrators and securing remedies for victims.

Right now, law is not up to the challenge of deepfakes. Across the globe, we lack well-tailored laws that would be designed to tackle digital impersonations that invade sexual privacy, that damage reputations and that cause emotional distress.

What happened to Rana Ayyub is increasingly commonplace. Yet, when she went to law enforcement in Delhi, she was told nothing could be done. And the sad truth is that the same would be true in the United States and in Europe. So we have a legal vacuum that needs to be filled.

My colleague Dr. Mary Anne Franks and I are working with US lawmakers to devise legislation that would ban harmful digital impersonations that are tantamount to identity theft.
And we've seen similar moves in Iceland, the UK and Australia. But of course, that's just a small piece of the regulatory puzzle.

Now, I know law is not a cure-all. Right? It's a blunt instrument. And we've got to use it wisely. It also has some practical impediments. You can't leverage law against people you can't identify and find. And if a perpetrator lives outside the country where a victim lives, then you may not be able to insist that the perpetrator come into local courts to face justice. And so we're going to need a coordinated international response.

Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about or tackle problems they don't understand. In my research on cyberstalking, I found that law enforcement lacked the training to understand the laws available to them and the problem of online abuse. And so often they told victims, "Just turn your computer off. Ignore it. It'll go away."

And we saw that in Rana Ayyub's case. She was told, "Come on, you're making such a big deal about this. It's boys being boys." And so we need to pair new legislation with efforts at training.

And education has to be aimed at the media as well. Journalists need educating about the phenomenon of deepfakes so they don't amplify and spread them.

And this is the part where we're all involved. Each and every one of us needs educating. We click, we share, we like, and we don't even think about it. We need to do better. We need far better radar for fakery.

So as we're working through these solutions, there's going to be a lot of suffering to go around. Rana Ayyub is still wrestling with the fallout. She still doesn't feel free to express herself on- and offline. And as she told me, she still feels like there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take her picture.
"What if they're going to make another deepfake?" she thinks to herself.

And so for the sake of individuals like Rana Ayyub and the sake of our democracy, we need to do something right now.

Thank you.

(Applause)