So I'm starting us out today with a historical mystery.

In 1957, there were two young women, both in their 20s, both living in the same city, both members of the same political group. That year, both decided to commit violent attacks. One girl took a gun and approached a soldier at a checkpoint. The other girl took a bomb and went to a crowded café.

But here's the thing: one of those girls followed through with the attack, but the other turned back.

So what made the difference?

I'm a behavioral historian, and I study aggression, moral cognition and decision-making in social movements. That's a mouthful. (Laughs) So, the translation of that is: I study the moment an individual decides to pull the trigger, the day-to-day decisions that led up to that moment and the stories that they tell themselves about why that behavior is justified.

Now, this topic -- it's not just scholarly for me. It's actually a bit personal. I grew up in Kootenai County, Idaho, and this is very important. This is not the part of Idaho with potatoes. We have no potatoes. And if you ask me about potatoes, I will find you.
(Laughter)

This part of Idaho is known for mountain lakes, horseback riding, skiing. Unfortunately, starting in the 1980s, it also became known as the worldwide headquarters for the Aryan Nations. Every year, members of the local neo-Nazi compound would turn out and march through our town, and every year, members of our town would turn out and protest them.

Now, in 2001, I graduated from high school, and I went to college in New York City. I arrived in August 2001. As many of you probably are aware, three weeks later, the Twin Towers went down. Now, I was shocked. I was incredibly angry. I wanted to do something, but the only thing that I could think of doing at that time was to study Arabic.

I will admit, I was that girl in class that wanted to know why "they" hate "us." I started studying Arabic for very wrong reasons. But something unexpected happened. I got a scholarship to go study in Israel. So the Idaho girl went to the Middle East. And while I was there, I met Palestinian Muslims, Palestinian Christians, Israeli settlers, Israeli peace activists. And what I learned is that every act has an ecology. It has a context.
Now, since then, I have gone around the world, I have studied violent movements, I have worked with NGOs and ex-combatants in Iraq, Syria, Vietnam, the Balkans, Cuba. I earned my PhD in History, and now what I do is I go to different archives and I dig through documents, looking for police confessions, court cases, diaries and manifestos of individuals involved in violent attacks.

Now, you gather all these documents -- what do they tell you? Our brains love causal mysteries, it turns out. So any time we see an attack on the news, we tend to ask one question: Why? Why did that happen? Well, I can tell you I've read thousands of manifestos, and what you find out is that they are actually imitative. They imitate the political movement that they're drawing from. So they actually don't tell us a lot about decision-making in that particular case.

So we have to teach ourselves to ask a totally different question. Instead of "Why?" we have to ask "How?" How did individuals produce these attacks, and how did their decision-making ecology contribute to violent behavior? There's a couple things I've learned from asking this kind of question.
The most important thing is that political violence is not culturally endemic. We create it. And whether we realize it or not, our day-to-day habits contribute to the creation of violence in our environment. So here's a couple of habits that I've learned contribute to violence.

One of the first things that attackers did when preparing themselves for a violent event was they enclosed themselves in an information bubble. We've heard of fake news, yeah? Well, this shocked me: every group that I studied had some kind of a fake news slogan. French communists called it the "putrid press." French ultranationalists called it the "sellout press" and the "treasonous press." Islamists in Egypt called it the "depraved news." And Egyptian communists called it ... "fake news."

So why do groups spend all this time trying to make these information bubbles? The answer is actually really simple. We make decisions based on the information we trust, yeah? So if we trust bad information, we're going to make bad decisions.

Another interesting habit that individuals used when they wanted to produce a violent attack was that they looked at their victim not as an individual but just as a member of an opposing team. Now this gets really weird.
There's some fun brain science behind why that kind of thinking is effective. Say I divide all of you guys into two teams: blue team, red team. And then I ask you to compete in a game against each other. Well, the funny thing is, within milliseconds, you will actually start experiencing pleasure -- pleasure -- when something bad happens to members of the other team.

The funny thing about that is if I ask one of you blue team members to go and join the red team, your brain recalibrates, and within milliseconds, you will now start experiencing pleasure when bad things happen to members of your old team. This is a really good example of why us-them thinking is so dangerous in our political environment.

Another habit that attackers used to kind of rev themselves up for an attack was they focused on differences. In other words, they looked at their victims, and they thought, "I share nothing in common with that person. They are totally different than me."

Again, this might sound like a really simple concept, but there's some fascinating science behind why this works. Say I show you guys videos of different-colored hands and sharp pins being driven into these different-colored hands, OK?
If you're white, the chances are you will experience the most sympathetic activation, or the most pain, when you see a pin going into the white hand. If you are Latin American, Arab, Black, you will probably experience the most sympathetic activation watching a pin going into the hand that looks most like yours.

The good news is, that's not biologically fixed. That is learned behavior. Which means the more we spend time with other ethnic communities and the more we see them as similar to us and part of our team, the more we feel their pain.

The last habit that I'm going to talk about is when attackers prepared themselves to go out and do one of these events, they focused on certain emotional cues. For months, they geared themselves up by focusing on anger cues, for instance. I bring this up because it's really popular right now. If you read blogs or the news, you see talk of two concepts from laboratory science: amygdala hijacking and emotional hijacking.

Now, amygdala hijacking: it's the concept that I show you a cue -- say, a gun -- and your brain reacts with an automatic threat response to that cue. Emotional hijacking -- it's a very similar concept.
It's the idea that I show you an anger cue, for instance, and your brain will react with an automatic anger response to that cue. I think women usually get this more than men. (Laughs)

(Laughter)

That kind of a hijacking narrative grabs our attention. Just the word "hijacking" grabs our attention.

The thing is, most of the time, that's not really how cues work in real life. If you study history, what you find is that we are bombarded with hundreds of thousands of cues every day. And so what we do is we learn to filter. We ignore some cues, we pay attention to other cues.

For political violence, this becomes really important, because what it meant is that attackers usually didn't just see an anger cue and suddenly snap. Instead, politicians, social activists spent weeks, months, years flooding the environment with anger cues, for instance, and attackers, they paid attention to those cues, they trusted those cues, they focused on them, they even memorized those cues.

All of this just really goes to show how important it is to study history. It's one thing to see how cues operate in a laboratory setting. And those laboratory experiments are incredibly important.
They give us a lot of new data about how our bodies work. But it's also very important to see how those cues operate in real life.

So what does all this tell us about political violence? Political violence is not culturally endemic. It is not an automatic, predetermined response to environmental stimuli. We produce it. Our everyday habits produce it.

Let's go back, actually, to those two women that I mentioned at the start. The first woman had been paying attention to those outrage campaigns, so she took a gun and approached a soldier at a checkpoint. But in that moment, something really interesting happened. She looked at that soldier, and she thought to herself, "He's the same age as me. He looks like me." And she put down the gun, and she walked away. Just from that little bit of similarity.

The second girl had a totally different outcome. She also listened to the outrage campaigns, but she surrounded herself with individuals who were supportive of violence, with peers who supported her violence. She enclosed herself in an information bubble. She focused on certain emotional cues for months. She taught herself to bypass certain cultural inhibitions against violence.
She practiced her plan, she taught herself new habits, and when the time came, she took her bomb to the café, and she followed through with that attack.

This was not impulse. This was learning.

Polarization in our society is not impulse, it's learning. Every day we are teaching ourselves: the news we click on, the emotions that we focus on, the thoughts that we entertain about the red team or the blue team. All of this contributes to learning, whether we realize it or not.

The good news is that while the individuals I study already made their decisions, we can still change our trajectory. We might never make the decisions that they made, but we can stop contributing to violent ecologies. We can get out of whatever news bubble we're in, we can be more mindful about the emotional cues that we focus on, the outrage bait that we click on.

But most importantly, we can stop seeing each other as just members of the red team or the blue team. Because whether we are Christian, Muslim, Jewish, atheist, Democrat or Republican, we're human. We're human beings. And we often share really similar habits. We have differences.
Those differences are beautiful, and those differences are very important. But our future depends on us being able to find common ground with the other side. And that's why it is so, so important for us to retrain our brains and stop contributing to violent ecologies.

Thank you.

(Applause)