How deepfakes undermine truth and threaten democracy


  1. [This talk contains mature content]
  2. Rana Ayyub is a journalist in India

  3. whose work has exposed
    government corruption
  4. and human rights violations.
  5. And over the years,
  6. she's gotten used to vitriol
    and controversy around her work.
  7. But none of it could have prepared her
    for what she faced in April 2018.
  8. She was sitting in a café with a friend
    when she first saw it:

  9. a two-minute, 20-second video
    of her engaged in a sex act.
  10. And she couldn't believe her eyes.
  11. She had never made a sex video.
  12. But unfortunately, thousands
    upon thousands of people
  13. would believe it was her.
  14. I interviewed Ms. Ayyub
    about three months ago,

  15. in connection with my book
    on sexual privacy.
  16. I'm a law professor, lawyer
    and civil rights advocate.
  17. So it's incredibly frustrating
    knowing that right now,
  18. law could do very little to help her.
  19. And as we talked,
  20. she explained that she should have seen
    the fake sex video coming.
  21. She said, "After all, sex is so often used
    to demean and to shame women,
  22. especially minority women,
  23. and especially minority women
    who dare to challenge powerful men,"
  24. as she had in her work.
  25. The fake sex video went viral in 48 hours.
  26. All of her online accounts were flooded
    with screenshots of the video,
  27. with graphic rape and death threats
  28. and with slurs about her Muslim faith.
  29. Online posts suggested that
    she was "available" for sex.
  30. And she was doxed,
  31. which means that her home address
    and her cell phone number
  32. were spread across the internet.
  33. The video was shared
    more than 40,000 times.
  34. Now, when someone is targeted
    with this kind of cybermob attack,

  35. the harm is profound.
  36. Rana Ayyub's life was turned upside down.
  37. For weeks, she could hardly eat or speak.
  38. She stopped writing and closed
    all of her social media accounts,
  39. which is, you know, a tough thing to do
    when you're a journalist.
  40. And she was afraid to go outside
    her family's home.
  41. What if the posters
    made good on their threats?
  42. The UN Human Rights Council
    confirmed that she wasn't being crazy.
  43. It issued a public statement saying
    that it was worried about her safety.
  44. What Rana Ayyub faced was a deepfake:

  45. machine-learning technology
  46. that manipulates or fabricates
    audio and video recordings
  47. to show people doing and saying things
  48. that they never did or said.
  49. Deepfakes appear authentic
    and realistic, but they're not;
  50. they're total falsehoods.
  51. Although the technology
    is still developing in its sophistication,
  52. it is widely available.
  53. Now, the most recent attention
    to deepfakes arose,

  54. as so many things do online,
  55. with pornography.
  56. In early 2018,

  57. someone posted a tool on Reddit
  58. to allow users to insert faces
    into porn videos.
  59. And what followed was a cascade
    of fake porn videos
  60. featuring people's favorite
    female celebrities.
  61. And today, you can go on YouTube
    and pull up countless tutorials
  62. with step-by-step instructions
  63. on how to make a deepfake
    with a desktop application.
  64. And soon we may even be able
    to make them on our cell phones.
  65. Now, it's the interaction
    of some of our most basic human frailties
  66. and network tools
  67. that can turn deepfakes into weapons.
  68. So let me explain.
  69. As human beings, we have
    a visceral reaction to audio and video.

  70. We believe they're true,
  71. on the notion that
    of course you can believe
  72. what your eyes and ears are telling you.
  73. And it's that mechanism
  74. that might undermine our shared
    sense of reality.
  75. Although we believe deepfakes
    to be true, they're not.
  76. And we're attracted
    to the salacious, the provocative.
  77. We tend to believe
    and to share information
  78. that's negative and novel.
  79. And researchers have found that online
    hoaxes spread 10 times faster
  80. than accurate stories.
  81. Now, we're also drawn to information
  82. that aligns with our viewpoints.
  83. Psychologists call that tendency
    "confirmation bias."
  84. And social media platforms
    supercharge that tendency,
  85. by allowing us to instantly
    and widely share information
  86. that accords with our viewpoints.
  87. Now, deepfakes have the potential to cause
    grave individual and societal harm.

  88. So, imagine a deepfake
  89. that shows American soldiers
    in Afghanistan burning a Koran.
  90. You can imagine that that deepfake
    would provoke violence
  91. against those soldiers.
  92. And what if the very next day
  93. there's another deepfake that drops,
  94. that shows a well-known imam
    based in London
  95. praising the attack on those soldiers?
  96. We might see violence and civil unrest,
  97. not only in Afghanistan
    and the United Kingdom,
  98. but across the globe.
  99. And you might say to me,

  100. "Come on, Danielle, that's far-fetched."
  101. But it's not.
  102. We've seen falsehoods spread
  103. on WhatsApp and other
    online message services
  104. lead to violence
    against ethnic minorities.
  105. And that was just text --
  106. imagine if it were video.
  107. Now, deepfakes have the potential
    to corrode the trust that we have

  108. in democratic institutions.
  109. So, imagine the night before an election.
  110. There's a deepfake showing
    one of the major party candidates
  111. gravely sick.
  112. The deepfake could tip the election
  113. and shake our sense
    that elections are legitimate.
  114. Imagine if the night before
    an initial public offering
  115. of a major global bank,
  116. there was a deepfake
    showing the bank's CEO
  117. drunkenly spouting conspiracy theories.
  118. The deepfake could tank the IPO,
  119. and worse, shake our sense
    that financial markets are stable.
  120. So deepfakes can exploit and magnify
    the deep distrust that we already have

  121. in politicians, business leaders
    and other influential leaders.
  122. They find an audience
    primed to believe them.
  123. And the pursuit of truth
    is on the line as well.
  124. Technologists expect
    that with advances in AI,
  125. soon it may be difficult if not impossible
  126. to tell the difference between
    a real video and a fake one.
  127. So how can the truth emerge
    in a deepfake-ridden marketplace of ideas?

  128. Will we just proceed along
    the path of least resistance
  129. and believe what we want to believe,
  130. truth be damned?
  131. And not only might we believe the fakery,
  132. we might start disbelieving the truth.
  133. We've already seen people invoke
    the phenomenon of deepfakes
  134. to cast doubt on real evidence
    of their wrongdoing.
  135. We've heard politicians say of audio
    of their disturbing comments,
  136. "Come on, that's fake news.
  137. You can't believe what your eyes
    and ears are telling you."
  138. And it's that risk
  139. that professor Robert Chesney and I
    call the "liar's dividend":
  140. the risk that liars will invoke deepfakes
  141. to escape accountability
    for their wrongdoing.
  142. So we've got our work cut out for us,
    there's no doubt about it.

  143. And we're going to need
    a proactive solution
  144. from tech companies, from lawmakers,
  145. law enforcers and the media.
  146. And we're going to need
    a healthy dose of societal resilience.
  147. Right now, we're engaged
    in a very public conversation
  148. about the responsibility
    of tech companies.
  149. And my advice to social media platforms
  150. has been to change their terms of service
    and community guidelines
  151. to ban deepfakes that cause harm.
  152. That determination,
    that's going to require human judgment,
  153. and it's expensive.
  154. But we need human beings
  155. to look at the content
    and context of a deepfake
  156. to figure out if it is
    a harmful impersonation
  157. or instead, if it's valuable
    satire, art or education.
  158. So now, what about the law?

  159. Law is our educator.
  160. It teaches us about
    what's harmful and what's wrong.
  161. And it shapes behavior: it deters
    by punishing perpetrators
  162. and it secures remedies for victims.
  163. Right now, law is not up to
    the challenge of deepfakes.
  164. Across the globe,
  165. we lack well-tailored laws
  166. that would be designed to tackle
    digital impersonations
  167. that invade sexual privacy,
  168. that damage reputations
  169. and that cause emotional distress.
  170. What happened to Rana Ayyub
    is increasingly commonplace.
  171. Yet, when she went
    to law enforcement in Delhi,
  172. she was told nothing could be done.
  173. And the sad truth is
    that the same would be true
  174. in the United States and in Europe.
  175. So we have a legal vacuum
    that needs to be filled.

  176. My colleague Dr. Mary Anne Franks and I
    are working with US lawmakers
  177. to devise legislation that would ban
    harmful digital impersonations
  178. that are tantamount to identity theft.
  179. And we've seen similar moves
  180. in Iceland, the UK and Australia.
  181. But of course, that's just a small piece
    of the regulatory puzzle.
  182. Now, I know law is not a cure-all. Right?

  183. It's a blunt instrument.
  184. And we've got to use it wisely.
  185. It also has some practical impediments.
  186. You can't leverage law against people
    you can't identify and find.
  187. And if a perpetrator lives
    outside the country
  188. where a victim lives,
  189. then you may not be able to insist
  190. that the perpetrator
    come into local courts
  191. to face justice.
  192. And so we're going to need
    a coordinated international response.
  193. Education has to be part
    of our response as well.
  194. Law enforcers are not
    going to enforce laws
  195. they don't know about
  196. or tackle problems
    they don't understand.
  197. In my research on cyberstalking,
  198. I found that law enforcement
    lacked the training
  199. to understand the laws available to them
  200. and the problem of online abuse.
  201. And so often they told victims,
  202. "Just turn your computer off.
    Ignore it. It'll go away."
  203. And we saw that in Rana Ayyub's case.
  204. She was told, "Come on,
    you're making such a big deal about this.
  205. It's boys being boys."
  206. And so we need to pair new legislation
    with efforts at training.
  207. And education has to be aimed
    at the media as well.

  208. Journalists need educating
    about the phenomenon of deepfakes
  209. so they don't amplify and spread them.
  210. And this is the part
    where we're all involved.
  211. Each and every one of us needs educating.
  212. We click, we share, we like,
    and we don't even think about it.
  213. We need to do better.
  214. We need far better radar for fakery.
  215. So as we're working
    through these solutions,

  216. there's going to be
    a lot of suffering to go around.
  217. Rana Ayyub is still wrestling
    with the fallout.
  218. She still doesn't feel free
    to express herself on- and offline.
  219. And as she told me,
  220. she still feels like there are thousands
    of eyes on her naked body,
  221. even though, intellectually,
    she knows it wasn't her body.
  222. And she has frequent panic attacks,
  223. especially when someone she doesn't know
    tries to take her picture.
  224. "What if they're going to make
    another deepfake?" she thinks to herself.
  225. And so for the sake of
    individuals like Rana Ayyub
  226. and the sake of our democracy,
  227. we need to do something right now.
  228. Thank you.

  229. (Applause)