Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so, just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home. As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led to my being hired at Facebook and ultimately brought me here today: to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square.

I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.

The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want. So what I see happening online today is especially heartbreaking, and a much harder problem to tackle.

We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible.

And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.

When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often seem far harder to break free of their ideological mindsets than those vulnerable communities I worked with ever were.

So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes.

I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.

I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in election interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.

Now, I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something and question their own assumptions before engaging.

The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.

And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on its platform and for polarizing its users.

But keeping us engaged is how they make their money. The modern information environment is crystallized around profiling us and then segmenting us into narrower and narrower categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of influencing their behavior.

Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.

But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place. But as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, it will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society, and agreeing to alter the entire product and profit model.

So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility for the way its platform is amplifying harmful content and pushing some users towards extreme views.

And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.

But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any other social media company, to protect our public square, our democracy or even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies.

Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking?

I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now. And it bears emphasizing: I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society.

It is time for our governments to step up and do their job of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society.

And they could do so in part by insisting on actual transparency around how these recommendation engines are working: around how the curation, amplification and targeting are happening. You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.

I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and, hopefully, to inspire more people to demand this accountability.

My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.

And my message to the leaders of my former employer, Facebook, is this: right now, people are using your tools exactly as they were designed to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we head into our election and, even more concerning, face our biggest potential crisis yet if the results aren't trusted and violence breaks out.

So when, in 2021, you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers and your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy.

You own your decisions, but you can no longer say that you couldn't have seen it coming.

Thank you.