Today, in the palm of your hand, you have access to a world of information. You can reach a multitude of news sources and exchange your views with a wide range of people all over the world.

This new reality should allow us to share wisdom, to communicate, to understand each other, and to become more understanding of our differences, more tolerant of our differences.

But 2016 seems to have replaced the information age with the post-truth era. With Brexit and the Trump election, we have discovered that more communication doesn't mean more information.

Let me show you what I mean.

Here is a picture of the Trump inauguration ceremony in 2017, and the same picture of the Obama inauguration ceremony in 2009. The White House declared that the Trump ceremony had been the largest in history.

Did you know that 28% of Trump supporters said they believed there were as many people, if not more, at the Trump ceremony? 30% said they didn't know; they were not sure. And 41%, that is less than half, said that they disagreed with the White House statement.

When you see these pictures, these views may seem crazy. So how is it that people can believe something so clearly untrue?

But wait a minute. Are you sure that you would never believe something so clearly untrue? Think about it. These pictures are not proof. They could have been photoshopped. They could have been swapped. So when somebody raises these doubts to you, you will have to use your own judgment to weigh the evidence and make up your mind.

So now suppose that you were a Trump supporter. Are you sure that you would not give any credence to somebody raising these doubts? Are you sure that you would never entertain the idea that perhaps there were more people at the Trump ceremony, and that these pictures are not proof?

Today, I want to put to you that the cause of fake news's success lies primarily with us. Fake news works because we are willing to believe it. And because we are willing to believe it, we lie to ourselves. If we want a sound public debate, we need to stop lying to ourselves when we engage with the news.

Let's consider an ideal public debate, and think about it as a battle of ideas.
All ideas are voiced and debated. A good idea is convincing, and it wins over less convincing ideas. The philosopher Karl Popper said that this process should lead public debate to select the best ideas. Unconvincing ideas disappear, and only the good ones survive. Science seems like a perfect example where public debate leads to the selection of the best ideas.

With this view in mind, the Internet should have a positive effect on public debate. On the Internet, ideas are free to be voiced and criticized. Good ideas should be able to convince more people and spread around, and bad ideas are abandoned. But when you look, that's not necessarily what is happening on the Internet at the moment. So why is that?

Well, perhaps this view of the public debate is a bit unrealistic. In 1949, Max Planck famously joked about this vision from Popper. Max Planck was a theoretical physicist who eventually won a Nobel Prize. He said, "A new scientific truth doesn't triumph because it convinces its opponents; rather, these opponents get old, and eventually they die, and they are replaced by a new generation of scientists."

What Max Planck was alluding to, with irony, was the reality of debates among humans, among us. We humans are not designed as perfect rational thinkers only looking for the truth. The fact is that often, we are attached to our ideas.

We can be attached to ideas because some ideas may be convenient for us, out of self-interest. Going back to Max Planck's example, those scientists may have become famous because of the ideas they proposed in the past, which are now old ideas. Abandoning these ideas would mean losing part of the credit they got for proposing them in the first place.

But it's not just in science. If you think of politics, if a government proposes to extend social welfare, those who would receive social welfare have an interest in believing it's good for the country. And those who would have to pay for the social welfare have an interest in thinking it's a bad policy for the country.

But we are not just attached to ideas for material interests; it's much more complex. Often, we can be emotionally attached to our ideas. They may be part of our identity.
So if I'm a Christian conservative or a left-wing liberal, these ideas may be part of who I am. Abandoning these ideas could mean losing part of my identity.

As a consequence, we are attached to our ideas, so we're not neutral judges when we're considering the evidence for or against them. On the contrary, behavioral science shows that we typically engage in self-deception, which means that we build beliefs which are compatible with our interests and with our other beliefs.

Self-deception is subtle; it takes place all the time in your everyday life. So I'm going to give you two ways in which self-deception can change what we believe and produce convenient views for ourselves.

First, when you receive some news, you have some flexibility in how you consider it. If it's positive news which is compatible with your beliefs, you can accept it as positive evidence. And if it's negative news, you can, on the contrary, choose to discount it and not consider it.

Let me tell you about a study. A group of people were asked about their beliefs, political beliefs and non-political beliefs. For their political beliefs, they had to say whether they believed in statements such as "Abortion should be legal," very loaded statements, typical of political statements. The non-political beliefs were statements such as "Second-hand smoking is dangerous to your health."

These people were then confronted with contradictions to these beliefs. What do you think people did? How did they react when they were confronted with these contradictions?

Well, here's how people reacted with their non-political beliefs. You have the strength of their beliefs before the contradiction and after it. When faced with a series of contradictions, they updated their beliefs, and the strength of their beliefs was lower after facing the contradictions. But now look at what happened when they were faced with contradictions to their political beliefs. Here you can see that people resisted the contradiction much more and held on to their political beliefs.

That's one way we selectively interpret the news. But there is another way. We're not just receiving the news; we're looking out for it. We are selecting where we want to look for information.
And typically, we look for confirming information. If you are a conservative, you are more likely to read a conservative newspaper, to watch a conservative news channel, and perhaps even to turn off the TV or the radio when a left-wing politician is being interviewed.

Let me show you a hypothetical scenario. Let's say that you wake up in the morning and you open your newspaper. In one scenario, you get some news which is not good; it contradicts your beliefs. Let's say that this news suggests that your favorite politician is involved in a political scandal. And consider the other situation, where, on the contrary, the news in the newspaper is positive; it goes with your usual beliefs. Perhaps it's another politician, a politician you do not like, who is involved in the political scandal.

Do you think, do you feel, that you'd react in the same way to these two situations? Well, research shows that you would not. Most likely, if you find a contradiction, you tend to look for other news sources. You give yourself a chance to find something which will contradict this negative news. Perhaps another newspaper; perhaps you will read the fine print in the newspaper to make sure that the headline really reflected the information. On the contrary, if you have the positive news, you're more likely to stop there, more likely to be happy to consider that this piece of evidence is enough for you to make up your mind on this issue.

So as much as we would like to think of ourselves as rational thinkers, it is a fact that this tendency to look for confirming news and to reject negative information is ingrained in us.

Now, before you panic and think that our irrationality is making public debate impossible, you can relax. These biases have existed forever, way before social media. What's happening with social media is that they are exacerbating some of the effects of these biases. Let me give you two ways in which they are doing so.

First, on the Internet you have much more freedom to look for the information which is convenient to whatever beliefs you have. The Internet is like a giant supermarket: for any kind of idea, you'll be able to find arguments supporting this idea.

Let's consider a crazy idea as an example.
Let's consider that you believe that the Earth is flat. Well, 30 years ago, you would have been a bit alone, and maybe struggling to find people to give you evidence for this. Today, you can just connect to the Internet, contact the Flat Earth Society, and this society is going to provide you with elements of evidence, arguments in favor of your belief.

There's another way in which the Internet helps you engage in self-deception: you can do it collectively. You now have a multitude of communities on the Internet that have created informational bubbles of their own. In these bubbles, people select information, interpret information, and repackage it in a way that is compatible with the community's views. When a new fact is discussed in such a community, the images which are positive for the community are reinforced, those which are not are dropped, and the nuances are lost. These communities build simple views compatible with the beliefs of the community.

So social media have not created a unified public space where ideas are debated; instead, social media have increased our ability to connect specifically with the people whose views mostly match our own. And whatever your views, you can find a community of like-minded people with whom you can share arguments supporting your existing beliefs rather than challenging them.

But what do you want? Do you want a unified public space where ideas are debated, discussed, and where maybe the best ideas can win, or a compartmentalized public space where different visions of the world can coexist unchallenged?

Well, if we want to defend and support the existence of an open public space, we need to realize that the problem starts with us, with how we form opinions and how we share them on social media. So perhaps, in the post-truth era, we need some guidelines about how to interact on social media. Here are three steps.

Step 1: Avoid narrowly selecting information which closely matches your views. Make sure that in your timeline you've got some news sources which challenge your views. Try to find people with whom you disagree and exchange views with them. Listen to what they have to say. Try to appreciate the points that they may have.
Give them a chance to change your mind.

Step 2: Question your views. The more you'd like an idea to be true, the more you need to distrust the way you made up your mind about it. Are you sure that you considered all the best counterarguments to the views you hold? That you've considered the possibility of weak points in your reasoning? Try to find them.

Whenever you really want an idea to be true, remember that at some point in our lives, we all liked to believe in Santa. And even though we really wanted Santa to be real, it didn't make him any more real in the end. So the more you'd like an idea to be true, the more you need to wonder: Is it too good to be true? Do I believe this because I want to believe it, or because of the evidence? Could it be another Santa for me?

Step 3: Avoid contributing to the distortion of facts. When you want to forward information on social media, make sure you've read it and you understand it, and avoid simplifying it in a way which is going to conform to the views of the community. Try not to contribute to an echo chamber effect within the community you are participating in.

If you follow these guidelines, you are going to do yourself a favor: you're going to stop building a convenient alternative reality; you're going to avoid spreading half-truths and distorted facts; and you will limit the self-reinforcing groups of confirming exchanges which are pushing people into different informational bubbles. You will help ideas to be questioned and challenged, and you will contribute to making the public space an open space where the best ideas can win.

Will you allow yourself to change your views and to abandon old ones? Smart people change their mind. Choose to be one.

(Applause)