0:00:09.156,0:00:11.327 A few years ago, after an interview, 0:00:11.327,0:00:13.396 the cameraman beside the journalist 0:00:13.396,0:00:14.839 hung around to talk to me. 0:00:15.516,0:00:17.996 He was amazed at the things I said, 0:00:17.996,0:00:21.676 because he had always heard[br]something else on that subject, 0:00:21.676,0:00:23.486 and he wanted some clarification. 0:00:24.236,0:00:26.916 So I brought him proof of what I said, 0:00:27.596,0:00:31.776 showed him how his beliefs[br]had no scientific ground, 0:00:31.776,0:00:35.033 and we chatted pleasantly[br]for half an hour. 0:00:36.076,0:00:40.116 Eventually, he said,[br]"I understand, but I don't trust you." 0:00:41.756,0:00:45.503 I understand, but I don't trust you. 0:00:46.436,0:00:49.959 Even today, I'm grateful to that person[br]for the lesson he taught me. 0:00:49.959,0:00:53.116 That day I realized[br]that I hadn't understood anything, 0:00:53.116,0:00:54.236 I had it all wrong. 0:00:54.750,0:00:57.396 I don't know if I've learned[br]how to communicate, 0:00:57.396,0:01:00.636 but I've certainly learned how not to. 0:01:00.636,0:01:03.211 How did I feel? Was that frustrating? 0:01:03.211,0:01:07.796 Yes, a lot: but also challenging. 0:01:07.796,0:01:10.476 If the facts are here, and they're proven, 0:01:11.236,0:01:15.116 I still have to find a way[br]to communicate them effectively. 0:01:15.116,0:01:19.026 I mean, in such a way[br]that they help inform our decisions. 0:01:19.876,0:01:24.636 So I started studying: I wanted[br]to understand better, I was curious. 0:01:25.316,0:01:30.036 The fact is, we all carry[br]our preconceptions with us; 0:01:30.617,0:01:31.676 that's how we evolved. 0:01:32.321,0:01:33.716 They have their uses, too: 0:01:33.716,0:01:38.676 let's say that preconceptions[br]are prefabricated, ready-to-use ideas 0:01:38.676,0:01:40.138 that we carry with us, 0:01:40.138,0:01:44.807 ready to be used quickly[br]when we need a quick decision. 
0:01:46.036,0:01:48.876 However, they also have[br]a rather serious side effect: 0:01:49.436,0:01:53.876 they prevent us from considering[br]new information, 0:01:53.876,0:01:56.796 especially when it is at odds 0:01:56.796,0:01:58.756 with the beliefs we already have. 0:01:58.756,0:02:02.916 They make us less agile; it's harder[br]for us to change our minds. 0:02:03.436,0:02:05.676 Yeah, because changing[br]your mind is painful, 0:02:05.676,0:02:07.716 it's admitting that we were wrong. 0:02:08.561,0:02:09.956 That's how our minds are. 0:02:10.516,0:02:13.863 This phenomenon is called[br]"cognitive dissonance": 0:02:13.863,0:02:15.436 it makes us suffer, 0:02:15.436,0:02:18.299 and the mind will do anything to avoid it. 0:02:18.299,0:02:19.309 We need to be aware of it. 0:02:19.849,0:02:21.876 But this is a problem: 0:02:21.876,0:02:24.276 we need to make the right decision, 0:02:24.276,0:02:28.396 or at least the best one[br]with the information at our disposal. 0:02:28.396,0:02:30.286 How can we do that? 0:02:31.196,0:02:35.236 Well, there are tricks to outwit the mind 0:02:35.236,0:02:37.116 that respect its dynamics 0:02:37.116,0:02:41.036 but at the same time allow us[br]to take in new information 0:02:41.036,0:02:43.849 and change our minds, if need be. 0:02:43.849,0:02:44.956 For example, one strategy 0:02:44.956,0:02:50.266 is to give the mind[br]a head start, to flatter it: 0:02:50.266,0:02:52.116 "You did well! 0:02:52.116,0:02:54.956 The information you had,[br]the beliefs you had, 0:02:54.956,0:02:58.556 were correct, acceptable,[br]given what you knew. 0:02:59.276,0:03:01.036 But now there is new information. 0:03:01.036,0:03:04.324 Show how agile you are,[br]how quick you are to adjust." 0:03:05.036,0:03:11.012 Things like this: we can afford[br]to be indulgent with ourselves. 
0:03:12.436,0:03:16.236 At this point, you must be wondering[br]what I was talking about that day, 0:03:16.236,0:03:20.196 what could ever be so thorny, so scary, 0:03:20.196,0:03:23.756 that it had to be rejected outright. 0:03:23.756,0:03:26.596 Well, I was talking[br]about agriculture, food, 0:03:26.596,0:03:30.196 innovations related[br]to the food production system, 0:03:30.716,0:03:32.756 genetics, biotechnology. 0:03:33.996,0:03:35.892 I won't discuss it here today: 0:03:35.876,0:03:40.756 a TEDx talk is a cup of tea[br]that only refreshes for a few minutes, 0:03:41.396,0:03:43.716 a mouthful of ideas and stimuli. 0:03:45.076,0:03:47.216 But I want to tell you that I'm worried, 0:03:47.216,0:03:50.431 because I see how our society is facing 0:03:50.431,0:03:53.559 decisions related to food production. 0:03:54.956,0:03:58.636 Since it's a delicate subject,[br]because it touches our belly, 0:03:58.636,0:04:01.116 our culinary traditions, 0:04:01.116,0:04:04.636 the environment and its protection,[br]the landscape, health, 0:04:05.756,0:04:10.796 it is a subject riddled with ideologies,[br]and therefore with preconceptions. 0:04:10.796,0:04:15.825 We all think we have the right recipe[br]for sustainable agriculture. 0:04:15.825,0:04:18.036 We are all so convinced! 0:04:18.036,0:04:20.456 And so, when a different idea comes along, 0:04:20.456,0:04:24.636 we lock ourselves in defense,[br]ready to strike back and demolish it. 0:04:25.516,0:04:29.716 The mainstream is really powerful: 0:04:29.716,0:04:33.236 the dominant narrative[br]we've heard for years 0:04:33.236,0:04:35.116 in certain supermarket ads, 0:04:35.116,0:04:36.361 for example, 0:04:36.361,0:04:40.316 or have seen on certain[br]television broadcasts, 0:04:40.316,0:04:43.236 or have read in the stories and articles 0:04:43.236,0:04:45.756 of certain opinion columnists, and so on. 0:04:45.756,0:04:51.476 That dominant narrative[br]has shaped our thoughts. 0:04:51.996,0:04:57.036 And so, today, by instinct,[br]it seems good, clean, right to us. 
0:04:57.036,0:04:58.815 We would never question it. 0:04:59.676,0:05:04.356 This applies to everyone, mind you:[br]it is a risk we all run. 0:05:04.356,0:05:06.907 Faced with a verified fact that shows us 0:05:06.907,0:05:09.997 that our position has limits, 0:05:09.997,0:05:11.956 we won't change our minds: 0:05:12.413,0:05:14.196 rather, we'll find a different reason 0:05:14.196,0:05:16.516 to keep thinking the same way, 0:05:17.036,0:05:20.556 maintaining the status quo. 0:05:20.556,0:05:23.916 This is a proven fact:[br]there are many years of study, 0:05:23.916,0:05:27.288 and many scientific works that prove it. 0:05:27.288,0:05:29.735 That's who we are,[br]and it's better to know it. 0:05:30.396,0:05:33.036 Democracy also has[br]to come to terms with it: 0:05:33.036,0:05:37.076 after all, we come[br]from a very instructive period. 0:05:38.236,0:05:40.990 You've heard of the "infodemic," 0:05:40.990,0:05:44.596 that cacophony of information 0:05:44.596,0:05:47.236 we have been subjected to[br]in the last year. 0:05:48.036,0:05:50.716 It caused us such discomfort 0:05:50.716,0:05:53.076 that at some protests 0:05:53.076,0:05:57.436 signs with the phrase[br]"Enough with science" showed up. 0:05:57.928,0:05:59.516 Which, of course, is absurd. 0:06:00.108,0:06:03.476 I mean, we can take a stand[br]against another stand; 0:06:03.476,0:06:07.356 but science doesn't take a stand,[br]so what's the point of attacking it? 0:06:08.196,0:06:12.276 And yet it is a signal[br]of the frustration and fear 0:06:12.276,0:06:15.477 that some people feel;[br]it's understandable. 0:06:16.316,0:06:18.047 Sometimes people ask me: 0:06:18.047,0:06:21.036 "How do you choose who to trust?" 0:06:21.840,0:06:24.986 Which journalist, which communicator? 0:06:25.926,0:06:30.116 I choose those who care[br]about their reputation. 
0:06:30.836,0:06:32.996 Those who constantly[br]verify their sources, 0:06:32.996,0:06:37.956 because they fear the shame[br]of writing unverified things. 0:06:39.436,0:06:42.836 I choose those who can't afford[br]to lose their reputation, 0:06:42.836,0:06:45.456 because it's their most precious asset, 0:06:45.456,0:06:48.456 and who therefore communicate responsibly. 0:06:50.636,0:06:54.436 Today's theme is "Second Chance." 0:06:54.872,0:06:57.721 If I met that cameraman today,[br]what would I tell him? 0:06:57.721,0:07:01.196 Well, I don't think I'd bother[br]taking his fears away. 0:07:01.196,0:07:02.236 I'd leave them to him. 0:07:02.236,0:07:05.396 I understand; we get attached to our fears. 0:07:06.676,0:07:11.916 No, I think I'd rather show him[br]what opportunities we're missing, 0:07:12.596,0:07:15.436 what chances for sustainable[br]food production 0:07:15.436,0:07:17.306 we have said no to. 0:07:17.306,0:07:19.196 Lost benefits. 0:07:19.676,0:07:21.036 Yeah, I think I'd do that, 0:07:21.036,0:07:27.196 because I have learned that each of us[br]has fears to be respected, 0:07:28.135,0:07:30.496 fears we alone are responsible for. 0:07:30.996,0:07:32.796 So, today, when I need to talk 0:07:32.796,0:07:36.316 about complex, polarizing,[br]scary themes, what do I do? 0:07:37.356,0:07:42.356 Well, I think of Mrs. Paola,[br]a lady who follows me 0:07:42.356,0:07:44.836 and who wrote to me[br]after reading my article 0:07:44.836,0:07:48.316 where I was talking about agriculture,[br]innovation, environmentalism. 0:07:49.316,0:07:53.836 That environmentalism that often[br]forgets its original mission, 0:07:53.836,0:07:57.476 and rejects innovations that are also[br]beneficial to the environment, 0:07:57.476,0:07:58.876 in the name of ideology. 0:08:00.036,0:08:01.716 Mrs. Paola wrote to me: 0:08:02.479,0:08:06.821 "I became interested in agriculture[br]at the tender age of 73, 0:08:06.821,0:08:08.476 for the love of my grandchildren, 
0:08:08.476,0:08:09.486 because I understood 0:08:09.486,0:08:13.556 that much of their future[br]depends on this branch of science." 0:08:14.396,0:08:16.596 And speaking of fear marketing, 0:08:16.596,0:08:19.956 marketing that exploits consumers' fears, 0:08:19.956,0:08:22.396 she asked me, "Who benefits? 0:08:23.156,0:08:27.966 I don't want to be a useful idiot[br]in someone else's hands anymore." 0:08:30.276,0:08:32.806 I told her: you're not a useful idiot; 0:08:32.806,0:08:37.156 you are living proof that democracy[br]is an immense value. 0:08:38.316,0:08:42.076 You are proof of how valuable[br]a vote in an election is, 0:08:42.076,0:08:44.996 a purchasing choice,[br]a freedom of thought, 0:08:45.676,0:08:48.436 a choice to share content[br]and reflections. 0:08:50.396,0:08:52.276 What drives Mrs. Paola? 0:08:53.026,0:08:54.036 Curiosity. 0:08:54.796,0:08:58.356 Which has no age, and is strong enough 0:08:58.356,0:09:02.016 to push us to look beyond our beliefs. 0:09:03.276,0:09:05.116 Curiosity is a precious ally 0:09:05.116,0:09:09.236 for taking our fears, our mind,[br]by the hand, 0:09:09.236,0:09:11.556 and starting to walk again. 0:09:11.556,0:09:14.476 It's the engine of discovery, of the new. 0:09:16.756,0:09:20.756 This is not a story with a happy ending:[br]it is still an unfinished story. 0:09:21.716,0:09:24.436 Yes, we can't live without ideologies. 0:09:25.356,0:09:27.996 But we can keep looking them in the face, 0:09:27.996,0:09:30.596 with genuine curiosity.