I study rumors. Not tabloid gossip, or the kind of rumors that make stock markets crash -- or soar -- but the kind of rumors that affect your health ... and the world's health. Like: eating a lot of garlic or drinking a lot of water is going to help protect us from coronavirus. If only.

Rumors have a bad reputation. They're seen as not fact, wrong, or "just a rumor." But I've studied rumors for years, and one thing I've learned is that they all have a story, and often an important story.

One of the most moving, and alarming, rumor episodes that I investigated was in northern Nigeria. I was working with UNICEF's Global Immunization Programme. And it wasn't the rumors themselves that I found so alarming; it was the global impact of those rumors.

The rumors were suspicions that the polio vaccine was actually a contraceptive, that it was controlling populations, or maybe that it caused AIDS. No, no -- maybe it was the CIA spying on them or counting them. I mean, why else would they have people knocking on their door again and again with the same polio vaccine? When children were dying of measles, no one was coming with measles vaccines.

This wasn't about getting the facts right. This was about trust. It was about broken trust. Why so much distrust? It wasn't the mothers who were particularly distrusting, actually. It was the local leaders: the religious leaders, the local political leaders. It was the governor of the state of Kano who decided to boycott the entire polio eradication effort in that state ... for 11 months.

Why such distrust? Well, it was 2003, two years after 9/11. And they were convinced that the West, and particularly the United States, was at war with Muslims. And they knew that the West, and particularly the United States, was a huge supporter -- and funder -- of the global polio eradication initiative. They had their reasoning.
That lack of trust, that "just a rumor or two," cost the polio eradication program 500 million dollars to reset the clock -- to regain the progress lost during those 11 months and beyond. The Nigerian strain of the polio virus traveled to over 20 countries, as far away as Indonesia. The cost of a rumor.

The Nigeria episode was one of many episodes that I investigated when I was with UNICEF, where I earned the title of "director of UNICEF's fire department."

(Laughter)

At that point I realized I never really had enough time. I was too busy putting out the fires, and didn't have enough time to understand what was driving not just the individual episodes, but why there was an epidemic of these happening around the world.

I left UNICEF and went back to research -- applied research -- and in 2010 I set up what I called the Vaccine Confidence Project at the London School of Hygiene and Tropical Medicine. I convened anthropologists, epidemiologists, psychologists, digital media specialists and mathematical modelers. We set ourselves the task of investigating historic episodes of rumors and their impacts: what were the early signals, what were the amplifying factors, what were the impacts, how did they get traction -- so that we could start to understand what we should be looking for, and how we could help governments and immunization programs be more alert and responsive to early signals of problems. It was an early warning system.

In 2015, we developed a vaccine confidence index. It's a survey that investigates to what extent people agree or disagree that vaccines are important, that they're safe, that they're effective -- they work -- and, somehow, that they're compatible with their religious beliefs. We've run it with hundreds of thousands of people around the world, trying to keep our finger on the pulse of confidence and trust, but also, more importantly, looking at when that trust goes up or down -- because when it starts to decline, that's the time to jump in, to get there before there's a crisis like the Nigerian one.
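To make the idea of an index-based early warning concrete, here is a minimal sketch in Python. It is purely illustrative and rests on assumptions the talk does not specify: a 1-to-5 agree/disagree scale, equal weighting of the four statements, a simple drop threshold, and invented function and field names. It is not the Vaccine Confidence Project's actual methodology.

# Illustrative sketch only -- not the Vaccine Confidence Project's real methodology.
# Assumes each respondent rates four statements (vaccines are important, safe,
# effective, compatible with my religious beliefs) on a 1-5 agree/disagree scale.
from statistics import mean

STATEMENTS = ("important", "safe", "effective", "compatible_with_beliefs")

def confidence_score(responses):
    """Share of respondents who agree (4) or strongly agree (5) with each
    statement, averaged across the four statements, for one survey wave."""
    per_statement = []
    for s in STATEMENTS:
        answers = [r[s] for r in responses]
        per_statement.append(sum(a >= 4 for a in answers) / len(answers))
    return mean(per_statement)

def flag_decline(score_by_wave, drop_threshold=0.05):
    """Return the waves where confidence fell by more than the threshold
    since the previous wave -- the signal that it's time to jump in."""
    waves = sorted(score_by_wave)
    return [curr for prev, curr in zip(waves, waves[1:])
            if score_by_wave[prev] - score_by_wave[curr] > drop_threshold]

# Two made-up survey waves, two respondents each.
wave_2015 = [{"important": 5, "safe": 4, "effective": 5, "compatible_with_beliefs": 4},
             {"important": 4, "safe": 4, "effective": 4, "compatible_with_beliefs": 5}]
wave_2016 = [{"important": 3, "safe": 2, "effective": 3, "compatible_with_beliefs": 4},
             {"important": 4, "safe": 3, "effective": 3, "compatible_with_beliefs": 3}]

scores = {"2015": confidence_score(wave_2015), "2016": confidence_score(wave_2016)}
print(scores)                # per-wave confidence scores
print(flag_decline(scores))  # ['2016'] -- confidence dropped, time to engage

Run on the made-up data, it prints the per-wave scores and flags 2016 as the wave where confidence fell enough to warrant stepping in, which is the kind of signal the early warning system is meant to surface.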
We also set up 24/7 media and social media monitoring around the world, in multiple languages, listening for what's going on in vaccine conversations and trying to pick up early concerns or changes in sentiment that we should be paying attention to. We've created an ecosystem of different types of information to try to understand what the public is thinking and how we can engage.

We look for early signals. When we find one, we have a global network of collaborators in a number of countries who have more local intelligence in that setting, to try to understand: is this signal misinformation, or is something brewing that we should know about?

In London, we have the bigger picture. We watch the swarms of rumors not just traveling locally but jumping countries. We've seen them jump from Japan over to Colombia, through Europe and around. They move. We live in a hyper-connected environment.

One of the things we found fascinating -- and we've learned a lot in the last 10 years; this is our 10th anniversary, so this rumor problem didn't start yesterday -- is that, in our global monitoring, Europe is the most skeptical region in the world. France won the prize, actually.

(Laughter)

By far. And some of those rumors have traveled to other parts of the world. But we were trying to understand Europe. Hmm. Why Europe? I thought the US had some of the most skepticism, but boy, was I wrong.

A political scientist we work with, a colleague, Jon Kennedy, took our data from 28 European countries and correlated it with political opinion polling. And what did he find? He found that the people most likely to vote for a populist party were also the ones most likely to strongly disagree that vaccines are important, safe or effective.

What did we learn? Vaccines cannot escape the political and social turbulence that surrounds them. Scientists were unprepared for this tsunami of doubt and questions and distrust. Why are vaccines so ripe for resistance?
Well, we identified a number of things, but one: vaccines are highly mediated by government, which requires, regulates and sometimes recommends them -- or often recommends and sometimes requires. Big business makes vaccines, and neither institution, government or big business, ranks high in trust these days. And then there are the scientists who discover and develop vaccines; they're pretty elite, and not very accessible to the general public, at least in the language they speak.

Third, we're in a hyper-connected environment with social media these days, and people can share their unfettered views, concerns, anxieties and worries, find a lot of people who think the way they do, and come to think that maybe their worries are worth paying attention to.

And finally, vaccines touch every single life on the planet. What other health intervention, besides water, touches every single life? So if you're looking for something to disrupt, it's a perfect stage. Perhaps that's one of the reasons we need to pay more attention and rebuild trust around these issues.

People are asking all kinds of questions. They're asking -- and these are the kinds of things we're hearing in our social media -- why can't my child have a personalized vaccination schedule? What's the wisdom of so many vaccines? What about all those ingredients and preservatives? These are not crazy people, and they're not uneducated; they're actually worried mothers. But some of them have come to me and said: we feel ignored, we feel judged if we ask a question, and we even feel demonized, as if we're part of some antivaccine group. So we have some listening to do.

And maybe that's why, last year, there was research that looked at six months of 2019 online, across 100 million different users on social media. It found that although the individuals who expressed positive views in their online groups were greater in number, as groups, the most negative ones were recruiting the undecided conversations in the middle -- the people who hadn't decided whether they wanted to get vaccinated.
The highly negative -- what we might call the antivaccine groups -- were recruiting the undecided at a rate 500 percent faster than the pro-vaccine groups. 500 percent faster. They were more nimble, they were responsive and they were listening.

Most people believe that vaccines are good, and they believe in their importance. But that belief is under attack. We need to build in more opportunities for conversation, and there are ways to do it. It's not easy for some health professionals to have conversations where their authority is questioned. It's uncomfortable. And they're just too busy to listen to all these questions. But we need to do something about that, because we're losing a lot of concerned parents who just want a conversation.

We should get volunteers trained to sit in waiting rooms, to be on hotlines, to host online chat forums and chat boxes. With younger kids in school, teach them about immune systems, and teach them that, actually -- you know that vaccine your little brother got? Well, it just inspired your natural immune system. It's a great thing, and this is why. We need to build that confidence; we need to listen.

Despite all this questioning -- and there's a lot of it; I probably hear more of it than a lot of people -- I am an optimist. And my optimism is with a younger generation: a younger generation who are now becoming very aware of the risks of social media, the false news, the false identities, and who are starting to embrace science. Some of them are a group of children whose mothers refused to vaccinate them.

In the spring of 2019, 18-year-old Ethan Lindenberger went on Reddit and put out a post: "My mother doesn't believe in vaccines. She's really worried they cause autism. In fact, she strongly believes that. But I'm 18. I'm a senior in high school. I can drive a car, I can vote, and I could go get my own vaccine. Can someone tell me where to get it?"

That post went viral. It started to get a whole younger movement going.
I saw Ethan speak at a conference, the Global Vaccine Summit at the EU last fall. He spoke eloquently, in front of a whole forum, and I was impressed. He told his personal story, and then he said to the group: "You know, everybody talks about misinformation, but I want to tell you about a different kind of misinformation, and that's the misinformation that says that someone like my mother, who is a loving mother, is a bad person because she doesn't give me vaccines. Well, I want to tell all of you that she didn't give me a vaccine because she loves me, and because she believed that was the best thing for me. I think differently, and I will never change her mind, but she's not a bad person."

That was the message from a teenager: empathy, kindness and understanding.

We have an abundance of scientific information to debunk false rumors. That's not our problem. We have a relationship problem, not a misinformation problem. Misinformation is the symptom, not the cause. If people trust, they'll put up with a little risk to avert a much bigger one.

The one thing that I want, and hope for, is that we as a medical and health community have the moral courage and humility to productively engage, like Ethan, with those who disagree with us. I hope so.

Thank you.

(Applause and cheers)