No matter who you are or where you live, I'm guessing that you have at least one relative who likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos. And you've probably already muted them on Facebook for sharing social posts like this one. It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with the HIV virus. And the social share message above it simply says, "Please forward to save lives."

Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor. And, of course, it's entirely false.

It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.

Every day, across the world, we see scores of new memes on Instagram encouraging parents not to vaccinate their children. We see new videos on YouTube explaining that climate change is a hoax. And across all platforms, we see endless posts designed to demonize others on the basis of their race, religion or sexuality.

Welcome to one of the central challenges of our time. How can we maintain an internet with freedom of expression at the core, while also ensuring that the content that's being disseminated doesn't cause irreparable harms to our democracies, our communities and to our physical and mental well-being?

Because we live in the information age, yet the central currency upon which we all depend -- information -- is no longer deemed entirely trustworthy and, at times, can appear downright dangerous. This is thanks in part to the runaway growth of social sharing platforms that allow us to scroll through, where lies and facts sit side by side, but with none of the traditional signals of trustworthiness. And goodness -- our language around this is horribly muddled.
People are still obsessed with the phrase "fake news," despite the fact that it's extraordinarily unhelpful and used to describe a number of things that are actually very different: lies, rumors, hoaxes, conspiracies, propaganda. And I really wish we could stop using a phrase that's been co-opted by politicians right around the world, from the left and the right, used as a weapon to attack a free and independent press.

(Applause)

Because we need our professional news media now more than ever. And besides, most of this content doesn't even masquerade as news. It's memes, videos, social posts. And most of it is not fake; it's misleading. We tend to fixate on what's true or false. But the biggest concern is actually the weaponization of context. Because the most effective disinformation has always been that which has a kernel of truth to it.

Let's take this example from London, from March 2017, a tweet that circulated widely in the aftermath of a terrorist incident on Westminster Bridge. This is a genuine image, not fake. The woman who appears in the photograph was interviewed afterwards, and she explained that she was utterly traumatized. She was on the phone to a loved one, and she wasn't looking at the victim out of respect. But it was still circulated widely with this Islamophobic framing, with multiple hashtags, including: #BanIslam.

Now, if you worked at Twitter, what would you do? Would you take that down, or would you leave it up? My gut reaction, my emotional reaction, is to take this down. I hate the framing of this image. But freedom of expression is a human right, and if we start taking down speech that makes us feel uncomfortable, we're in trouble. And this might look like a clear-cut case, but, actually, most speech isn't. These lines are incredibly difficult to draw. What's a well-meaning decision by one person is outright censorship to the next.

What we now know is that this account, Texas Lone Star, was part of a wider Russian disinformation campaign, one that has since been taken down. Would that change your view? It would mine, because now it's a case of a coordinated campaign to sow discord.
And for those of you who'd like to think that artificial intelligence will solve all of our problems, I think we can agree that we're a long way away from AI that's able to make sense of posts like this.

So I'd like to explain three interlocking issues that make this so complex and then think about some ways we can consider these challenges.

First, we just don't have a rational relationship to information; we have an emotional one. It's just not true that more facts will make everything OK, because the algorithms that determine what content we see, well, they're designed to reward our emotional responses. And when we're fearful, oversimplified narratives, conspiratorial explanations and language that demonizes others are far more effective. And besides, many of these companies, their business model is attached to attention, which means these algorithms will always be skewed towards emotion.

Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie. But people keep talking about taking down "problematic" or "harmful" content, with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.

And even if we could decide which speech to take up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.

And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to.
As can we, when we mindlessly forward divisive or misleading content without checking. We're adding to the pollution.

I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems? They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that.

And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

(Applause)

And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.

So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.

So the missing link is us. It's those people who use these technologies every day. Can we design a new infrastructure to support quality information? Well, I believe we can, and I've got a few ideas about what we might be able to actually do.

So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people.
Because we have to find a way to tap into the collective wisdom and experience of all users. This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.

So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.

Second, people's experiences with information are personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing. So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in? Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science. Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And what we do know, most of that has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.

And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here. Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms. A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs.
And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible. And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.

All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems. How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?

So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can. Together, let's rebuild our information commons.

Thank you.

(Applause)