So we humans have an extraordinary potential for goodness, but also an immense power to do harm. Any tool can be used to build or to destroy; it all depends on our motivation. Therefore, it is all the more important to foster an altruistic motivation rather than a selfish one.

We are indeed facing many challenges in our times. Some are personal challenges: our own mind can be our best friend or our worst enemy. There are also societal challenges: poverty in the midst of plenty, inequality, conflict, injustice. And then there are the new challenges, which we don't expect.

Ten thousand years ago, there were about five million human beings on Earth. Whatever they could do, the Earth's resilience would soon heal their activities. After the Industrial and Technological Revolutions, that is no longer true. We are now the major agent of impact on our Earth. We have entered the Anthropocene, the era of human beings.

So, in a way, saying "we need to continue this endless growth, this endless use of material resources" is like what I heard a former head of state (I won't mention who) say: "Five years ago, we were at the edge of the precipice. Today we have made a big step forward."

This edge is the same one that scientists have defined as the planetary boundaries. Within those boundaries, the Earth can carry a number of factors, and humanity can still prosper for 150,000 years if we keep the same stability of climate as in the Holocene for the last 10,000 years. But that depends on choosing voluntary simplicity, growing qualitatively rather than quantitatively.

In 1900, as you can see, we were well within the limits of safety. Then, in 1950, came the Great Acceleration. Now hold your breath, not too long, to imagine what comes next: we have now vastly overrun some of the planetary boundaries. Just to take biodiversity: at the current rate, by 2050, 30 percent of all species on Earth will have disappeared.
Even if we keep their DNA in some fridge, that is not going to be reversible.

Here I am, sitting in front of a 7,000-meter (21,000-foot) glacier in Bhutan. At the Third Pole, two thousand glaciers are melting fast, faster than the Arctic. So what can we do in that situation?

Well, however complex the question of the environment is politically, economically, and scientifically, it simply boils down to a question of altruism versus selfishness.

I'm a Marxist of the Groucho tendency. (Laughter) Groucho Marx said, "Why should I care for future generations? What did they ever do for me?" (Laughter) Unfortunately, I heard the billionaire Steve Forbes, on Fox News, saying exactly the same thing, but seriously. When he was asked about the rise of the oceans, he said, "I find it absurd to change my behavior today for something that will happen in a hundred years." So if you don't care for future generations, just go for it.

One of the main challenges of our times is to reconcile three time scales: the short term of the economy, the ups and downs of the stock market, the end-of-year accounts; the midterm of quality of life, meaning the quality of every moment of our life over ten and twenty years; and the long term of the environment. When environmentalists speak with economists, it is like a schizophrenic dialogue, completely incoherent. They don't speak the same language.

For the last ten years, I went around the world meeting economists, scientists, neuroscientists, environmentalists, philosophers, and thinkers in the Himalayas, all over the place. It seems to me there is only one concept that can reconcile those three time scales: simply having more consideration for others. If you have more consideration for others, you will have a caring economics, where finance is at the service of society and not society at the service of finance. You will not play at the casino with the resources that people have entrusted to you.
If you have more consideration for others, you will make sure that you remedy inequality, that you bring some kind of well-being within society, in education, at the workplace. Otherwise, what is the point of being the most powerful and richest nation if everyone in it is miserable? And if you have more consideration for others, you are not going to ransack the one planet we have; at the current rate, we don't have three planets to continue that way.

So the question is: okay, altruism is the answer, and it is not just a noble ideal, but can it be a real, pragmatic solution? And first of all, does true altruism exist, or are we hopelessly selfish? Some philosophers thought we were irredeemably selfish. But are we really all just rascals? That's good news, isn't it? Many philosophers, like Hobbes, have said so. But not everyone looks like a rascal. Is man a wolf to man? This guy doesn't seem too bad. He's one of my friends in Tibet, and he's very kind.

Now, we love cooperation. There is no greater joy than working together, is there? And not only humans.

Then, of course, there is the struggle for life, the survival of the fittest, Social Darwinism. But in evolution, though competition exists, of course, cooperation is much more creative at bringing about increased levels of complexity. We are super-cooperators, and we should even go further.

On top of that, there is the quality of human relationships. The OECD did a survey of ten factors, including income. The factor people ranked first for their happiness was the quality of their social relationships. And not only in humans: look at those great-grandmothers.

Now, this idea that deep within we are irredeemably selfish is armchair science. There is not a single sociological or psychological study that has ever shown that.
Rather, the opposite. My friend Daniel Batson spent a whole career putting people in the lab in very complex situations. Of course we are sometimes selfish, and some people more than others. But he found that, systematically, no matter what, a significant number of people do behave altruistically. Now, if you see someone deeply wounded, in great suffering, you might help simply out of empathic distress: you can't stand it, so it is better to help than to keep on looking at that person. We tested all that, and in the end he concluded, "Clearly, people can be altruistic." So that's good news.

And even further, we should look at the banality of goodness. Look at this: when we come out, we are not going to say, "That's so nice, there was no fistfight while this mob was thinking about altruism." No, we expect that, don't we? If there had been a fistfight, we would speak of it for months. The banality of goodness is something that doesn't attract your attention, but it exists. Now, look at this. Look at this. Okay.

Some psychologists, when I tell them that I run 140 humanitarian projects in the Himalayas that give me so much joy, say, "Oh, I see, you work for the warm glow. That is not altruistic. You just feel good." Do you think this guy, when he jumped in front of the train, thought, "I'm going to feel so good when this is over"? (Laughter) But that's not the end of it. They say, "Well, but when you interviewed him, he said, 'I had no choice, I had to jump, of course.'" He had no choice: automatic behavior, neither selfish nor altruistic. Well, of course this guy is not going to think for half an hour, "Should I give my hand or not?" He just does it. There is a choice, but it is obvious and immediate. And then, here too, they have a choice. (Laughter)

Now, there are people who did have a choice, like Pastor André Trocmé and his wife, and the whole village of Chambon-sur-Lignon in France.
Throughout the Second World War, they saved 3,500 Jews, gave them shelter, and brought them to Switzerland, against all odds, at the risk of their lives and those of their families. So altruism does exist.

What is altruism? It is the wish: may others be happy and find the causes of happiness. Empathy is the affective or cognitive resonance that tells you this person is joyful, this person suffers. But empathy alone is not sufficient. If you keep on being confronted with suffering, you may fall into empathic distress and burnout, so you need the greater sphere of loving-kindness. With Tania Singer at the Max Planck Institute in Leipzig, we showed that the brain networks for empathy and for loving-kindness are different.

Now, that is all very well. We got this from evolution, from maternal care and parental love, but we need to extend it. Can we extend it to other species?

If we want a more altruistic society, we need two things: individual change and societal change. Is individual change possible? Two thousand years of contemplative study say yes, it is. Fifteen years of collaboration with neuroscientists and epigeneticists say yes, our brains change when we train in altruism. I spent 120 hours in an MRI machine; this is me the first time I went in, after two and a half hours. The results, published in many scientific papers, show without ambiguity that there are structural and functional changes in the brain when you train in altruistic love.

Just to give you an idea: on the left is the meditator at rest, then in compassion meditation, where you see all the activity; then the control group at rest and in meditation, where nothing happened. They had not been trained.

So, do you need 50,000 hours of meditation? No, you don't. Four weeks of caring, mindfulness meditation, 20 minutes a day, already brings a structural change in the brain compared to a control group. That's only 20 minutes a day for four weeks.
Even with preschoolers: in Madison, we ran an eight-week program of gratitude, loving-kindness, cooperation, and mindful breathing. You might say, "Oh, they're just preschoolers," but look after eight weeks. Pro-social behavior, that's the blue line. And then comes the ultimate scientific test, the stickers test. Beforehand, you determine for each child who their best friend in the class is, their least favorite child, an unknown child, and a sick child, and then they have to give stickers away. Before the intervention, they give most of them to their best friend. Four- and five-year-olds, 20 minutes, three times a week. After the intervention, no more discrimination: the same amount of stickers to their best friend and to the least favorite child. That is something we should do in all the schools in the world. Now, where do we go from there? (Applause)

When the Dalai Lama heard that, his solution was, "You go to 10 schools, 100 schools, the U.N., the whole world." So where do we go from there? Individual change is possible. Now, do we have to wait for an altruistic gene to spread through the human race? That would take 50,000 years, too long for the environment. Fortunately, there is the evolution of culture. Cultures, as specialists have shown, change faster than genes. That's the good news. Look at how attitudes towards war have changed dramatically over the years.

So individual change and cultural change mutually fashion each other, and yes, we can achieve a more altruistic society. So where do we go from there? Myself, I will go back to the East. We treat 100,000 patients a year in our projects, we have 25,000 kids in school, and overhead is four percent. Some people say, "Well, your stuff works in practice, but does it work in theory?" There is always the positive deviance. I will also go back to my hermitage to find the inner resources to better serve others. But on the more global level, what can we do? We need three things.
Enhancing cooperation: cooperative learning in schools instead of competitive learning, and unconditional cooperation within corporations. There can be some competition between corporations, but not within.

We need sustainable harmony. I love this term: not sustainable growth anymore. Sustainable harmony means that now we reduce inequality, and in the future we do more with less, and we continue to grow qualitatively, not quantitatively.

We need caring economics. The old economics cannot deal with poverty in the midst of plenty, and it cannot deal with the problem of the common goods of the atmosphere and the oceans. We need a caring economics. If you say economics should be compassionate, economists say, "That's not our job." But if you say they don't care, that looks bad.

We need local commitment and global responsibility. We need to extend altruism to the other 1.6 million species, sentient beings who are our co-citizens in this world. And we need to dare altruism.

So long live the altruistic revolution. ¡Viva la revolución del altruismo!

(Applause)

Thank you.

(Applause)