1
00:00:09,073 --> 00:00:11,160
Today, in the palm of your hand,

2
00:00:11,356 --> 00:00:14,226
you have access to a world of information.

3
00:00:15,055 --> 00:00:17,802
You can reach a multitude of new sources

4
00:00:18,318 --> 00:00:20,950
and exchange your views with a wide range of people

5
00:00:20,950 --> 00:00:22,186
all over the world.

6
00:00:23,146 --> 00:00:27,056
This new reality should allow us to share our wisdom,

7
00:00:27,401 --> 00:00:30,216
to communicate, to understand each other

8
00:00:30,593 --> 00:00:34,044
and to become more understanding of our differences,

9
00:00:34,051 --> 00:00:35,751
more tolerant of our differences.

10
00:00:36,582 --> 00:00:41,073
But 2016 seems to have replaced the information age

11
00:00:41,073 --> 00:00:43,204
with the post-truth era.

12
00:00:44,013 --> 00:00:46,494
With Brexit and the Trump election,

13
00:00:46,494 --> 00:00:48,458
we have found, we have discovered,

14
00:00:48,458 --> 00:00:52,315
that more communication doesn't mean more information.

15
00:00:54,262 --> 00:00:56,322
Let me show you what I mean.

16
00:00:58,937 --> 00:01:03,762
Here is a picture of the Trump inauguration ceremony in 2017,

17
00:01:04,338 --> 00:01:08,499
and the same picture of the Obama inauguration ceremony in 2009.

18
00:01:09,174 --> 00:01:11,751
The White House declared that the Trump ceremony

19
00:01:11,751 --> 00:01:13,911
had been the largest in history.

20
00:01:17,259 --> 00:01:18,851
Did you know

21
00:01:19,273 --> 00:01:22,295
that 28% of Trump supporters

22
00:01:22,295 --> 00:01:25,876
said they believed there were as many people, if not more,

23
00:01:25,892 --> 00:01:27,502
at the Trump ceremony?

24
00:01:27,882 --> 00:01:30,479
30% said they didn't know. They were not sure.

25
00:01:30,891 --> 00:01:33,961
And 41%, that is less than half,

26
00:01:34,467 --> 00:01:37,555
said that they disagreed with the White House statement.

27
00:01:38,636 --> 00:01:43,128
When you see these pictures, these views may seem crazy.

28
00:01:44,101 --> 00:01:47,496
So how is it that people can believe something so clearly untrue?

29
00:01:48,441 --> 00:01:49,471
But wait a minute.

30
00:01:49,471 --> 00:01:53,097
Are you sure that you would never believe something so clearly untrue?

31
00:01:54,143 --> 00:01:55,366
Think about it.

32
00:01:55,475 --> 00:01:57,974
These pictures are not proof.

33
00:01:58,274 --> 00:01:59,968
They could have been photoshopped.

34
00:02:00,421 --> 00:02:01,988
They could have been swapped.

35
00:02:02,721 --> 00:02:05,106
So when somebody raises these doubts to you,

36
00:02:05,315 --> 00:02:07,625
you will have to use your own judgment

37
00:02:07,625 --> 00:02:10,232
to weigh the evidence and make up your mind.

38
00:02:11,120 --> 00:02:13,776
So now suppose that you were a Trump supporter.

39
00:02:14,300 --> 00:02:15,745
Are you sure

40
00:02:15,745 --> 00:02:17,660
that you would not give any credence

41
00:02:17,660 --> 00:02:19,343
to somebody raising these doubts?

42
00:02:20,129 --> 00:02:22,270
Are you sure that you would never entertain

43
00:02:22,270 --> 00:02:25,130
that perhaps there were more people at the Trump ceremony,

44
00:02:25,130 --> 00:02:27,137
and that these pictures are not proof?

45
00:02:28,112 --> 00:02:29,992
Today, I want to put to you

46
00:02:29,992 --> 00:02:33,802
that the success of fake news lies primarily with us.

47
00:02:34,345 --> 00:02:35,693
Fake news works

48
00:02:35,693 --> 00:02:37,607
because we are willing to believe it.
49
00:02:38,103 --> 00:02:41,163
And because we are willing to believe it, we lie to ourselves.

50
00:02:42,725 --> 00:02:45,209
If we want a sound public debate,

51
00:02:45,209 --> 00:02:48,934
we need to stop lying to ourselves when we engage with the news.

52
00:02:50,733 --> 00:02:53,905
Let's consider an ideal public debate,

53
00:02:54,155 --> 00:02:56,694
and think about it as a battle of ideas.

54
00:02:57,575 --> 00:02:59,965
All ideas are voiced and debated.

55
00:03:00,965 --> 00:03:04,863
A good idea is convincing, and it wins over less convincing ideas.

56
00:03:06,005 --> 00:03:08,076
The philosopher Karl Popper

57
00:03:08,076 --> 00:03:11,087
said that this process should lead public debate

58
00:03:11,488 --> 00:03:13,466
to select the best ideas.

59
00:03:13,980 --> 00:03:18,292
Unconvincing ideas disappear, and only the good ones survive.

60
00:03:19,087 --> 00:03:22,505
And science seems like a perfect example

61
00:03:22,528 --> 00:03:25,917
where public debate leads to selection of the best ideas.

62
00:03:27,265 --> 00:03:28,823
With this view in mind,

63
00:03:28,823 --> 00:03:32,654
the Internet should have a positive effect on public debate.

64
00:03:33,342 --> 00:03:37,342
On the Internet, ideas are free to be voiced and criticized.

65
00:03:37,836 --> 00:03:42,136
Good ideas should be able to convince more people and spread around,

66
00:03:42,604 --> 00:03:44,434
and bad ideas are abandoned.

67
00:03:45,288 --> 00:03:46,698
But when you look,

68
00:03:46,698 --> 00:03:50,169
that's not necessarily what is happening on the Internet at the moment.

69
00:03:50,553 --> 00:03:51,810
So why is that?

70
00:03:52,430 --> 00:03:58,184
Well, perhaps this view of public debate is a bit unrealistic.

71
00:03:59,206 --> 00:04:03,986
In 1949, Max Planck famously joked about this vision of Popper's.

72
00:04:04,444 --> 00:04:08,827
Max Planck was a theoretical physicist who eventually won a Nobel Prize.

73
00:04:09,755 --> 00:04:10,972
He said,

74
00:04:10,972 --> 00:04:15,625
"A new scientific truth doesn't triumph because it convinces its opponents;

75
00:04:16,500 --> 00:04:19,588
rather, these opponents get old,

76
00:04:19,870 --> 00:04:21,364
and eventually they die,

77
00:04:21,364 --> 00:04:24,211
and they are replaced by a new generation of scientists."

78
00:04:24,849 --> 00:04:28,066
And what Max Planck was alluding to with irony

79
00:04:28,470 --> 00:04:32,900
was the reality of debates among humans - us.

80
00:04:33,502 --> 00:04:37,522
We humans are not designed as perfect rational thinkers

81
00:04:37,522 --> 00:04:39,746
only looking for the truth.

82
00:04:39,962 --> 00:04:44,063
The fact is that often, we are attached to our ideas.

83
00:04:45,470 --> 00:04:47,161
We can be attached to ideas

84
00:04:47,161 --> 00:04:49,599
because some ideas may be convenient for us -

85
00:04:49,858 --> 00:04:51,037
they serve our interests.

86
00:04:51,949 --> 00:04:55,573
So, going back to Max Planck's example of the scientists,

87
00:04:55,573 --> 00:04:57,513
these scientists may have become famous

88
00:04:57,513 --> 00:05:00,832
because of the ideas that they proposed in the past,

89
00:05:00,832 --> 00:05:02,514
and which are now old ideas.

90
00:05:03,094 --> 00:05:06,914
Abandoning these ideas would mean losing part of the credit they got

91
00:05:06,914 --> 00:05:09,716
for proposing these ideas in the first place.

92
00:05:10,748 --> 00:05:12,315
But it's not just in science.
93
00:05:12,315 --> 00:05:13,680
If you think of politics,

94
00:05:13,680 --> 00:05:16,925
if a government proposes to extend social welfare,

95
00:05:17,853 --> 00:05:19,793
those who would receive social welfare

96
00:05:19,793 --> 00:05:22,499
have an interest in believing it's good for the country.

97
00:05:23,302 --> 00:05:25,907
And those who would have to pay for the social welfare

98
00:05:25,907 --> 00:05:28,574
have an interest in thinking it's bad for the country,

99
00:05:28,574 --> 00:05:30,515
that it's a bad policy for the country.

100
00:05:31,739 --> 00:05:34,991
But we are not just attached to ideas for material interests;

101
00:05:34,991 --> 00:05:36,320
it's much more complex.

102
00:05:36,320 --> 00:05:39,196
Often, we can be emotionally attached to our ideas.

103
00:05:39,719 --> 00:05:41,951
They may be part of our identity.

104
00:05:42,375 --> 00:05:45,965
So if I'm a Christian conservative or a left-wing liberal,

105
00:05:46,585 --> 00:05:49,158
these ideas may be part of who I am.

106
00:05:49,812 --> 00:05:54,319
Abandoning these ideas could be losing part of my identity.

107
00:05:55,517 --> 00:05:59,001
As a consequence, we are attached to our ideas.

108
00:05:59,492 --> 00:06:02,413
So we're not neutral judges

109
00:06:02,660 --> 00:06:05,410
when we're considering the evidence for or against them.

110
00:06:06,306 --> 00:06:10,771
On the contrary, the science of behavior

111
00:06:10,771 --> 00:06:14,148
shows that we typically engage in self-deception,

112
00:06:14,420 --> 00:06:18,090
which means that we build beliefs that are compatible

113
00:06:18,553 --> 00:06:21,397
with our interests and with our other beliefs.

114
00:06:22,110 --> 00:06:24,976
Self-deception is subtle;

115
00:06:24,976 --> 00:06:27,843
it takes place all the time in your everyday life.

116
00:06:27,843 --> 00:06:29,655
So I'm going to give you two ways

117
00:06:29,655 --> 00:06:32,215
in which self-deception can change what we believe in

118
00:06:32,215 --> 00:06:34,372
and produce convenient views for ourselves.

119
00:06:35,277 --> 00:06:37,212
First, when you receive some news,

120
00:06:37,999 --> 00:06:41,313
you have some flexibility in how you consider it.

121
00:06:41,996 --> 00:06:46,148
If it's positive news which is compatible with your beliefs,

122
00:06:46,148 --> 00:06:49,040
you can accept it as positive evidence.

123
00:06:49,804 --> 00:06:51,061
And if it's negative news,

124
00:06:51,061 --> 00:06:53,443
you can, on the contrary, choose to discount it

125
00:06:53,443 --> 00:06:55,740
and not consider it.

126
00:06:55,798 --> 00:06:57,607
Let me tell you about a study.

127
00:06:58,336 --> 00:07:01,312
A group of people were asked about their beliefs,

128
00:07:01,312 --> 00:07:04,024
political beliefs and non-political beliefs.

129
00:07:04,560 --> 00:07:06,612
For their political beliefs, they had to say

130
00:07:06,612 --> 00:07:10,684
whether they believed in statements such as "Abortion should be legal" -

131
00:07:10,684 --> 00:07:14,300
very loaded statements, typical of political debate.

132
00:07:14,584 --> 00:07:18,785
And non-political beliefs were statements such as

133
00:07:18,883 --> 00:07:22,173
"Second-hand smoking is dangerous to your health."

134
00:07:23,787 --> 00:07:25,282
So what happened is

135
00:07:25,282 --> 00:07:28,651
that these people were confronted with contradictions to these beliefs.

136
00:07:28,651 --> 00:07:31,054
What do you think people did?
137
00:07:31,054 --> 00:07:34,554
How did they react when they were confronted with these contradictions?

138
00:07:35,326 --> 00:07:40,146
Well, here's how people reacted with their non-political beliefs.

139
00:07:40,704 --> 00:07:45,015
So you have the strength of their beliefs before the contradiction and after it.

140
00:07:45,630 --> 00:07:47,883
So when faced with a series of contradictions,

141
00:07:47,883 --> 00:07:49,670
they updated their beliefs

142
00:07:49,670 --> 00:07:53,502
and the strength of their beliefs was lower after facing contradictions.

143
00:07:54,205 --> 00:07:55,915
But now look at what happened

144
00:07:55,915 --> 00:07:59,210
when they were faced with contradictions to their political beliefs.

145
00:07:59,602 --> 00:08:03,702
Here you can see that people resisted the contradiction much more

146
00:08:03,702 --> 00:08:06,640
and held to their political beliefs.

147
00:08:07,725 --> 00:08:11,568
That's one way we selectively interpret the news.

148
00:08:11,865 --> 00:08:13,128
But there is another way.

149
00:08:13,128 --> 00:08:16,575
We're not just receiving the news; we're looking out for it.

150
00:08:16,690 --> 00:08:19,467
We are selecting where we want to look for information.

151
00:08:19,658 --> 00:08:22,095
And typically, we look for confirming information.

152
00:08:22,592 --> 00:08:23,939
If you are a conservative,

153
00:08:23,939 --> 00:08:26,522
you are more likely to read a conservative newspaper,

154
00:08:26,615 --> 00:08:28,732
to watch a conservative news channel,

155
00:08:28,780 --> 00:08:31,810
and perhaps even to turn off the TV or the radio

156
00:08:31,810 --> 00:08:34,193
when a left-wing politician is being interviewed.

157
00:08:35,318 --> 00:08:39,318
Let me show you a hypothetical scenario.

158
00:08:39,718 --> 00:08:43,315
Let's say that you wake up in the morning and you open your newspaper,

159
00:08:43,783 --> 00:08:45,261
and in one scenario,

160
00:08:45,261 --> 00:08:47,423
you've got some news which is not good -

161
00:08:47,423 --> 00:08:49,240
it's a contradiction to your beliefs.

162
00:08:49,721 --> 00:08:53,016
Let's say that this news suggests that your favorite politician

163
00:08:53,016 --> 00:08:54,907
is involved in a political scandal.

164
00:08:56,739 --> 00:08:59,794
And consider the other situation, where, on the contrary,

165
00:09:00,390 --> 00:09:05,299
the news in the newspaper is positive - it fits with your usual beliefs.

166
00:09:05,927 --> 00:09:09,586
Perhaps it's another politician, a politician you do not like,

167
00:09:10,048 --> 00:09:12,138
who is involved in a political scandal.

168
00:09:13,246 --> 00:09:15,440
Do you think, do you feel,

169
00:09:15,737 --> 00:09:19,051
that you'd react in the same way to these two situations?

170
00:09:19,894 --> 00:09:21,805
Well, research shows that you would not.

171
00:09:21,805 --> 00:09:26,081
Most likely, what happens is that if you find a contradiction,

172
00:09:26,333 --> 00:09:28,718
you tend to look at all the news sources.

173
00:09:28,900 --> 00:09:30,949
You give yourself a chance to find something

174
00:09:30,949 --> 00:09:32,979
which will contradict this negative news.

175
00:09:33,813 --> 00:09:35,140
Perhaps another newspaper,

176
00:09:35,140 --> 00:09:37,636
perhaps you will read the fine print in the newspaper

177
00:09:37,636 --> 00:09:42,023
to make sure that the headline really reflected the information.
178
00:09:42,460 --> 00:09:44,810
On the contrary, if you have the positive news,

179
00:09:44,810 --> 00:09:46,545
you're more likely to stop there,

180
00:09:46,972 --> 00:09:50,369
you're more likely to be happy to consider that this piece of evidence

181
00:09:50,369 --> 00:09:53,122
is enough for you to make up your mind on this issue.

182
00:09:54,554 --> 00:09:59,353
So as much as we would like to think of ourselves as rational thinkers,

183
00:10:00,043 --> 00:10:04,043
it is a fact that this tendency to look for confirming news

184
00:10:04,043 --> 00:10:07,900
and to reject negative information is ingrained in us.

185
00:10:09,563 --> 00:10:11,308
So now before you panic

186
00:10:11,308 --> 00:10:16,371
and you think that our irrationality is making public debate impossible,

187
00:10:17,019 --> 00:10:18,759
you can relax.

188
00:10:19,393 --> 00:10:23,671
These biases have existed forever, way before social media.

189
00:10:24,330 --> 00:10:26,700
So what's happening with social media

190
00:10:26,715 --> 00:10:30,130
is that they are exacerbating some of the effects of these biases.

191
00:10:30,946 --> 00:10:33,775
Let me give you two ways in which they are doing so.

192
00:10:34,314 --> 00:10:37,485
First, on the Internet you have much more freedom

193
00:10:37,485 --> 00:10:39,869
to look for the information which is convenient

194
00:10:39,869 --> 00:10:41,618
for whatever beliefs you have.

195
00:10:42,616 --> 00:10:44,943
The Internet is like a giant supermarket.

196
00:10:45,764 --> 00:10:46,984
For any kind of idea,

197
00:10:46,984 --> 00:10:50,449
you'll be able to find arguments and evidence supporting this idea.

198
00:10:51,344 --> 00:10:54,322
Let's consider a crazy idea as an example.

199
00:10:54,447 --> 00:10:57,912
Let's consider that you believe that the Earth is flat.

200
00:10:58,891 --> 00:11:01,542
Well, 30 years ago, you would have been a bit alone

201
00:11:01,542 --> 00:11:05,388
and maybe struggling to find people to give you evidence for this.

202
00:11:05,529 --> 00:11:07,799
Today, you can just connect to the Internet,

203
00:11:07,799 --> 00:11:09,850
contact the Flat Earth Society,

204
00:11:09,850 --> 00:11:13,580
and this society is going to provide you with pieces of evidence,

205
00:11:13,645 --> 00:11:15,844
arguments in favor of your belief.

206
00:11:17,809 --> 00:11:19,122
Here's another way

207
00:11:19,122 --> 00:11:22,015
in which the Internet helps you engage in self-deception:

208
00:11:22,015 --> 00:11:23,861
you can do it collectively.

209
00:11:24,422 --> 00:11:27,232
You now have a multitude of communities on the Internet

210
00:11:27,232 --> 00:11:30,402
that have created informational bubbles of their own.

211
00:11:31,412 --> 00:11:34,254
In these bubbles, people select information,

212
00:11:34,498 --> 00:11:36,691
interpret information, and repackage it

213
00:11:36,691 --> 00:11:39,525
in a way that is compatible with the community's views.

214
00:11:40,159 --> 00:11:42,784
When a new fact is discussed in this community,

215
00:11:42,784 --> 00:11:46,755
the images which are positive for the community are reinforced,

216
00:11:46,815 --> 00:11:49,732
and those which are not are dropped,

217
00:11:49,792 --> 00:11:51,688
and the nuances are lost.

218
00:11:52,724 --> 00:11:55,859
These communities build simple views

219
00:11:55,859 --> 00:11:58,561
compatible with the beliefs of the community.
220
00:11:59,416 --> 00:12:03,136
So social media have not created a unified public space

221
00:12:03,160 --> 00:12:04,979
where ideas are debated;

222
00:12:04,999 --> 00:12:10,862
instead, social media have increased our ability to connect specifically

223
00:12:10,862 --> 00:12:14,424
with the people whose views mostly match our own.

224
00:12:15,606 --> 00:12:17,132
And whatever your views,

225
00:12:17,132 --> 00:12:19,940
you can find a community of like-minded people

226
00:12:20,147 --> 00:12:24,667
with whom you can share arguments supporting your existing beliefs

227
00:12:24,667 --> 00:12:26,210
and not challenging them.

228
00:12:27,903 --> 00:12:29,466
But what do you want?

229
00:12:30,466 --> 00:12:35,896
Do you want a unified public space where ideas are debated, discussed,

230
00:12:35,896 --> 00:12:37,880
and where maybe the best ideas can win,

231
00:12:37,880 --> 00:12:39,928
or a compartmentalized public space

232
00:12:40,215 --> 00:12:44,235
where different visions of the world can coexist unchallenged?

233
00:12:44,996 --> 00:12:49,939
Well, if we want to defend and support

234
00:12:50,075 --> 00:12:52,336
the existence of an open public space,

235
00:12:52,336 --> 00:12:55,801
we need to realize that the problem starts with us,

236
00:12:56,567 --> 00:12:58,634
with how we form opinions

237
00:12:58,634 --> 00:13:00,862
and how we share them on social media.

238
00:13:01,579 --> 00:13:06,379
So perhaps, in the post-truth era, we need some guidelines

239
00:13:06,379 --> 00:13:08,962
about how to interact on social media.

240
00:13:09,862 --> 00:13:11,471
Here are three steps.

241
00:13:12,277 --> 00:13:13,784
Step 1:

242
00:13:14,448 --> 00:13:18,749
Avoid narrowly selecting information which closely matches your views.

243
00:13:19,912 --> 00:13:21,502
Make sure that in your timeline

244
00:13:21,502 --> 00:13:24,123
you've got some news sources which challenge your views.

245
00:13:25,339 --> 00:13:28,522
Try to find people with whom you disagree

246
00:13:28,703 --> 00:13:30,783
and exchange views with them.

247
00:13:31,230 --> 00:13:33,205
Listen to what they have to say.

248
00:13:34,026 --> 00:13:36,544
Try to appreciate the points that they may have.

249
00:13:37,556 --> 00:13:39,726
Give them a chance to change your mind.

250
00:13:42,182 --> 00:13:43,409
Step 2:

251
00:13:44,044 --> 00:13:45,293
Question your views.

252
00:13:46,162 --> 00:13:48,673
The more you'd like an idea to be true,

253
00:13:48,874 --> 00:13:52,664
the more you need to distrust the way you made up your mind about it.

254
00:13:53,386 --> 00:13:55,796
Are you sure that you considered

255
00:13:55,796 --> 00:13:58,778
all the best counterarguments to the views you have,

256
00:13:58,978 --> 00:14:02,457
that you've considered the possibility of weak points in your reasoning?

257
00:14:02,468 --> 00:14:04,212
Try to find them.

258
00:14:04,986 --> 00:14:07,134
Whenever you really want an idea to be true,

259
00:14:07,134 --> 00:14:09,494
you may remember that at some point in our lives,

260
00:14:09,494 --> 00:14:12,092
we all liked to believe in Santa.

261
00:14:14,169 --> 00:14:16,660
And even though we really wanted Santa to be real,

262
00:14:17,503 --> 00:14:19,839
it didn't make it more real in the end.

263
00:14:20,156 --> 00:14:22,380
So the more you'd like an idea to be true,

264
00:14:22,380 --> 00:14:26,199
the more you need to wonder: Is it too good to be true?
265
00:14:26,324 --> 00:14:29,010
Do I believe this because I want to believe it

266
00:14:29,010 --> 00:14:30,954
or because of the evidence?

267
00:14:31,499 --> 00:14:33,305
Could it be another Santa for me?

268
00:14:34,632 --> 00:14:36,008
Step 3:

269
00:14:36,008 --> 00:14:38,512
Avoid contributing to the distortion of facts.

270
00:14:39,173 --> 00:14:41,793
When you want to forward information on social media,

271
00:14:42,046 --> 00:14:44,116
make sure you've read it, you understand it,

272
00:14:44,271 --> 00:14:46,186
and avoid simplifying it in a way

273
00:14:46,186 --> 00:14:49,237
which is going to conform to the views of the community.

274
00:14:49,237 --> 00:14:51,768
Try not to contribute to an echo chamber effect

275
00:14:51,768 --> 00:14:54,596
within the community you are participating in.

276
00:14:55,681 --> 00:15:00,181
If you follow these guidelines, you are going to do yourself a favor:

277
00:15:01,362 --> 00:15:04,949
you're going to stop building a convenient alternative reality;

278
00:15:05,365 --> 00:15:09,057
you're going to avoid spreading half-truths and distorted facts;

279
00:15:09,516 --> 00:15:15,231
and you will limit the self-reinforcing groups of confirming exchanges,

280
00:15:15,573 --> 00:15:19,136
which are pushing people into different informational bubbles.

281
00:15:19,794 --> 00:15:22,914
You will help ideas to be questioned and challenged,

282
00:15:22,914 --> 00:15:26,576
and you will contribute to making the public space an open space

283
00:15:26,576 --> 00:15:28,429
where the best ideas can win.

284
00:15:29,406 --> 00:15:32,434
Will you allow yourself to change your views

285
00:15:32,863 --> 00:15:34,647
and to abandon old ones?

286
00:15:35,892 --> 00:15:37,867
Smart people change their mind.

287
00:15:38,305 --> 00:15:39,594
Choose to be one.

288
00:15:40,885 --> 00:15:43,198
(Applause)