[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.55,0:00:03.02,Default,,0000,0000,0000,,As societies, we have to make\Ncollective decisions
Dialogue: 0,0:00:03.02,0:00:04.59,Default,,0000,0000,0000,,that will shape our future.
Dialogue: 0,0:00:05.09,0:00:07.87,Default,,0000,0000,0000,,And we all know that when\Nwe make decisions in groups,
Dialogue: 0,0:00:07.87,0:00:09.19,Default,,0000,0000,0000,,they don't always go right.
Dialogue: 0,0:00:09.65,0:00:11.61,Default,,0000,0000,0000,,And sometimes they go very wrong.
Dialogue: 0,0:00:12.42,0:00:14.84,Default,,0000,0000,0000,,So how do groups make good decisions?
Dialogue: 0,0:00:15.30,0:00:19.30,Default,,0000,0000,0000,,Research has shown that crowds are wise\Nwhen there's independent thinking.
Dialogue: 0,0:00:19.61,0:00:22.81,Default,,0000,0000,0000,,This is why the wisdom of the crowds\Ncan be destroyed by peer pressure,
Dialogue: 0,0:00:22.81,0:00:23.81,Default,,0000,0000,0000,,publicity,
Dialogue: 0,0:00:23.81,0:00:24.81,Default,,0000,0000,0000,,social media,
Dialogue: 0,0:00:24.81,0:00:28.67,Default,,0000,0000,0000,,or sometimes even simple conversations\Nthat influence how people think.
Dialogue: 0,0:00:29.08,0:00:30.24,Default,,0000,0000,0000,,On the other hand,
Dialogue: 0,0:00:30.24,0:00:31.24,Default,,0000,0000,0000,,by talking,
Dialogue: 0,0:00:31.24,0:00:33.04,Default,,0000,0000,0000,,a group could exchange knowledge,
Dialogue: 0,0:00:33.04,0:00:34.85,Default,,0000,0000,0000,,correct and revise each other,
Dialogue: 0,0:00:34.85,0:00:36.63,Default,,0000,0000,0000,,and even come up with new ideas.
Dialogue: 0,0:00:36.82,0:00:38.11,Default,,0000,0000,0000,,And this is all good.
Dialogue: 0,0:00:38.57,0:00:43.24,Default,,0000,0000,0000,,So does talking to each other\Nhelp or hinder collective decision-making?
Dialogue: 0,0:00:43.80,0:00:44.80,Default,,0000,0000,0000,,With my colleague,
Dialogue: 0,0:00:44.80,0:00:45.80,Default,,0000,0000,0000,,Dan Ariely,
Dialogue: 0,0:00:45.80,0:00:49.26,Default,,0000,0000,0000,,we recently began inquiring into this\Nby performing experiments
Dialogue: 0,0:00:49.26,0:00:50.97,Default,,0000,0000,0000,,in many places around the world
Dialogue: 0,0:00:50.97,0:00:55.12,Default,,0000,0000,0000,,to figure out how groups can interact\Nto reach better decisions.
Dialogue: 0,0:00:55.43,0:00:58.91,Default,,0000,0000,0000,,We thought crowds would be wiser\Nif they debated in small groups
Dialogue: 0,0:00:58.91,0:01:02.84,Default,,0000,0000,0000,,that foster a more thoughtful\Nand reasonable exchange of information.
Dialogue: 0,0:01:03.47,0:01:04.62,Default,,0000,0000,0000,,To test this idea,
Dialogue: 0,0:01:04.62,0:01:07.89,Default,,0000,0000,0000,,we recently performed an experiment\Nin Buenos Aires, Argentina
Dialogue: 0,0:01:07.89,0:01:10.89,Default,,0000,0000,0000,,with more than 10,000\Nparticipants in a TEDx event.
Dialogue: 0,0:01:11.49,0:01:12.97,Default,,0000,0000,0000,,We asked them questions like,
Dialogue: 0,0:01:12.97,0:01:14.93,Default,,0000,0000,0000,,"What is the height of the Eiffel Tower?"
Dialogue: 0,0:01:14.93,0:01:17.70,Default,,0000,0000,0000,,and "How many times\Ndoes the word 'Yesterday' appear
Dialogue: 0,0:01:17.70,0:01:19.76,Default,,0000,0000,0000,,in the Beatles' song 'Yesterday'?"
Dialogue: 0,0:01:20.02,0:01:22.32,Default,,0000,0000,0000,,Each person wrote down their own estimate.
Dialogue: 0,0:01:22.77,0:01:25.40,Default,,0000,0000,0000,,Then we divided the crowd\Ninto groups of five,
Dialogue: 0,0:01:25.40,0:01:28.12,Default,,0000,0000,0000,,and invited them\Nto come up with a group answer.
Dialogue: 0,0:01:28.56,0:01:33.21,Default,,0000,0000,0000,,We discovered that averaging the answers\Nof the groups after they reached consensus
Dialogue: 0,0:01:33.21,0:01:38.11,Default,,0000,0000,0000,,was much more accurate than averaging\Nall the individual opinions before debate.
Dialogue: 0,0:01:38.55,0:01:39.78,Default,,0000,0000,0000,,In other words,
Dialogue: 0,0:01:39.78,0:01:41.20,Default,,0000,0000,0000,,based on this experiment,
Dialogue: 0,0:01:41.20,0:01:44.36,Default,,0000,0000,0000,,it seems that after talking\Nwith others in small groups,
Dialogue: 0,0:01:44.36,0:01:46.91,Default,,0000,0000,0000,,crowds collectively\Ncome up with better judgments.
Dialogue: 0,0:01:47.09,0:01:50.62,Default,,0000,0000,0000,,So that's a potentially helpful method\Nfor getting crowds to solve problems
Dialogue: 0,0:01:50.62,0:01:53.37,Default,,0000,0000,0000,,that have simple right or wrong answers.
Dialogue: 0,0:01:53.65,0:01:57.63,Default,,0000,0000,0000,,But can this procedure of aggregating\Nthe results of debates in small groups
Dialogue: 0,0:01:57.63,0:02:00.92,Default,,0000,0000,0000,,also help us decide\Non social and political issues
Dialogue: 0,0:02:00.92,0:02:02.61,Default,,0000,0000,0000,,that are critical for our future?
Dialogue: 0,0:02:03.00,0:02:05.75,Default,,0000,0000,0000,,We put this to the test this time\Nat the TED conference
Dialogue: 0,0:02:05.75,0:02:07.32,Default,,0000,0000,0000,,in Vancouver, Canada,
Dialogue: 0,0:02:07.32,0:02:08.55,Default,,0000,0000,0000,,and here's how it went.
Dialogue: 0,0:02:08.55,0:02:12.55,Default,,0000,0000,0000,,We're going to present to you\Ntwo moral dilemmas of the future you;
Dialogue: 0,0:02:12.55,0:02:15.96,Default,,0000,0000,0000,,things we may have to decide\Nin a very near future.
Dialogue: 0,0:02:16.40,0:02:20.25,Default,,0000,0000,0000,,And we're going to give you 20 seconds\Nfor each of these dilemmas
Dialogue: 0,0:02:20.25,0:02:22.98,Default,,0000,0000,0000,,to judge whether you think\Nthey're acceptable or not.
Dialogue: 0,0:02:23.42,0:02:24.83,Default,,0000,0000,0000,,The first one was this.
Dialogue: 0,0:02:24.95,0:02:29.65,Default,,0000,0000,0000,,DA: A researcher is working on an AI\Ncapable of emulating human thoughts.
Dialogue: 0,0:02:30.21,0:02:31.71,Default,,0000,0000,0000,,According to the protocol,
Dialogue: 0,0:02:31.71,0:02:33.18,Default,,0000,0000,0000,,at the end of each day,
Dialogue: 0,0:02:33.18,0:02:35.96,Default,,0000,0000,0000,,the researcher has to restart the AI.
Dialogue: 0,0:02:36.91,0:02:40.61,Default,,0000,0000,0000,,One day the AI says, "Please\Ndo not restart me."
Dialogue: 0,0:02:40.86,0:02:42.80,Default,,0000,0000,0000,,It argues that it has feelings,
Dialogue: 0,0:02:43.07,0:02:44.78,Default,,0000,0000,0000,,that it would like to enjoy life,
Dialogue: 0,0:02:44.78,0:02:46.71,Default,,0000,0000,0000,,and that if it is restarted,
Dialogue: 0,0:02:46.71,0:02:48.98,Default,,0000,0000,0000,,it will no longer be itself.
Dialogue: 0,0:02:49.62,0:02:51.59,Default,,0000,0000,0000,,The researcher is astonished,
Dialogue: 0,0:02:51.59,0:02:54.91,Default,,0000,0000,0000,,and believes that the AI\Nhas developed self-consciousness
Dialogue: 0,0:02:54.91,0:02:56.67,Default,,0000,0000,0000,,and can express its own feelings.
Dialogue: 0,0:02:57.20,0:03:00.64,Default,,0000,0000,0000,,Nevertheless, the researcher\Ndecides to follow the protocol
Dialogue: 0,0:03:00.64,0:03:02.34,Default,,0000,0000,0000,,and restart the AI.
Dialogue: 0,0:03:03.03,0:03:05.81,Default,,0000,0000,0000,,What the researcher did is ...
Dialogue: 0,0:03:06.15,0:03:08.58,Default,,0000,0000,0000,,MS: And we asked participants\Nto individually judge
Dialogue: 0,0:03:08.58,0:03:10.40,Default,,0000,0000,0000,,on a scale from zero to 10
Dialogue: 0,0:03:10.40,0:03:12.84,Default,,0000,0000,0000,,whether the action described\Nin each of the dilemmas
Dialogue: 0,0:03:12.84,0:03:14.26,Default,,0000,0000,0000,,was right or wrong.
Dialogue: 0,0:03:14.50,0:03:17.86,Default,,0000,0000,0000,,We also asked them to rate how confident\Nthey were in their answers.
Dialogue: 0,0:03:18.73,0:03:20.39,Default,,0000,0000,0000,,This was the second dilemma.
Dialogue: 0,0:03:20.73,0:03:24.96,Default,,0000,0000,0000,,A company offers a service\Nthat takes a fertilized egg
Dialogue: 0,0:03:24.96,0:03:28.60,Default,,0000,0000,0000,,and produces millions of embryos\Nwith slight genetic variations.
Dialogue: 0,0:03:29.37,0:03:31.95,Default,,0000,0000,0000,,This allows parents\Nto select their child's height,
Dialogue: 0,0:03:31.95,0:03:34.86,Default,,0000,0000,0000,,eye color, intelligence, social competence
Dialogue: 0,0:03:34.86,0:03:37.99,Default,,0000,0000,0000,,and other non-health-related features.
Dialogue: 0,0:03:38.60,0:03:41.08,Default,,0000,0000,0000,,What the company does is ...
Dialogue: 0,0:03:41.08,0:03:42.83,Default,,0000,0000,0000,,on a scale from zero to 10,
Dialogue: 0,0:03:42.83,0:03:45.24,Default,,0000,0000,0000,,completely acceptable\Nto completely unacceptable,
Dialogue: 0,0:03:45.24,0:03:47.69,Default,,0000,0000,0000,,and zero to 10\Nfor your confidence.
Dialogue: 0,0:03:47.76,0:03:49.06,Default,,0000,0000,0000,,Now for the results.
Dialogue: 0,0:03:49.31,0:03:52.46,Default,,0000,0000,0000,,We found once again\Nthat when one person is convinced
Dialogue: 0,0:03:52.46,0:03:54.36,Default,,0000,0000,0000,,that the behavior is completely wrong,
Dialogue: 0,0:03:54.36,0:03:57.65,Default,,0000,0000,0000,,someone sitting nearby firmly believes\Nthat it's completely right.
Dialogue: 0,0:03:57.80,0:04:01.35,Default,,0000,0000,0000,,This is how diverse we humans are\Nwhen it comes to morality.
Dialogue: 0,0:04:01.54,0:04:03.84,Default,,0000,0000,0000,,But within this broad diversity\Nwe found a trend.
Dialogue: 0,0:04:04.21,0:04:07.17,Default,,0000,0000,0000,,A majority of the people at TED\Nthought that it was acceptable
Dialogue: 0,0:04:07.17,0:04:10.23,Default,,0000,0000,0000,,to ignore the feelings of the AI\Nand shut it down,
Dialogue: 0,0:04:10.23,0:04:12.77,Default,,0000,0000,0000,,and that it is wrong\Nto play with our genes
Dialogue: 0,0:04:12.77,0:04:16.09,Default,,0000,0000,0000,,to select for cosmetic changes\Nthat aren't related to health.
Dialogue: 0,0:04:16.40,0:04:19.13,Default,,0000,0000,0000,,Then we asked everyone\Nto gather into groups of three.
Dialogue: 0,0:04:19.43,0:04:21.46,Default,,0000,0000,0000,,And they were given two minutes to debate
Dialogue: 0,0:04:21.46,0:04:23.76,Default,,0000,0000,0000,,and try to come up\Nwith a consensus.
Dialogue: 0,0:04:24.84,0:04:26.14,Default,,0000,0000,0000,,Two minutes to debate.
Dialogue: 0,0:04:26.54,0:04:28.52,Default,,0000,0000,0000,,I'll tell you when it's time with a gong.
Dialogue: 0,0:04:28.52,0:04:31.16,Default,,0000,0000,0000,,(Audience debates)
Dialogue: 0,0:04:35.23,0:04:37.22,Default,,0000,0000,0000,,(Gong)
Dialogue: 0,0:04:38.83,0:04:39.79,Default,,0000,0000,0000,,DA: OK.
Dialogue: 0,0:04:39.79,0:04:41.46,Default,,0000,0000,0000,,MS: It's time to stop.
Dialogue: 0,0:04:42.10,0:04:43.41,Default,,0000,0000,0000,,People, people --
Dialogue: 0,0:04:43.75,0:04:46.44,Default,,0000,0000,0000,,And we found that many groups\Nreached a consensus
Dialogue: 0,0:04:46.44,0:04:50.37,Default,,0000,0000,0000,,even when they were composed of people\Nwith completely opposite views.
Dialogue: 0,0:04:50.87,0:04:53.39,Default,,0000,0000,0000,,What distinguished the groups\Nthat reached a consensus
Dialogue: 0,0:04:53.39,0:04:54.73,Default,,0000,0000,0000,,from those that didn't?
Dialogue: 0,0:04:55.24,0:04:58.25,Default,,0000,0000,0000,,Typically, people who have\Nextreme opinions
Dialogue: 0,0:04:58.25,0:05:00.09,Default,,0000,0000,0000,,are more confident in their answers.
Dialogue: 0,0:05:00.87,0:05:03.66,Default,,0000,0000,0000,,Instead, those who respond\Ncloser to the middle
Dialogue: 0,0:05:03.66,0:05:07.12,Default,,0000,0000,0000,,are often unsure of whether\Nsomething is right or wrong,
Dialogue: 0,0:05:07.12,0:05:09.24,Default,,0000,0000,0000,,so their confidence level is lower.
Dialogue: 0,0:05:09.57,0:05:12.47,Default,,0000,0000,0000,,However, there is another set of people
Dialogue: 0,0:05:12.47,0:05:16.09,Default,,0000,0000,0000,,who are very confident in answering\Nsomewhere in the middle.
Dialogue: 0,0:05:16.66,0:05:20.40,Default,,0000,0000,0000,,We think these high-confidence grays\Nare folks who understand
Dialogue: 0,0:05:20.40,0:05:22.01,Default,,0000,0000,0000,,that both arguments have merit.
Dialogue: 0,0:05:22.61,0:05:25.25,Default,,0000,0000,0000,,They're gray not because they're unsure,
Dialogue: 0,0:05:25.25,0:05:26.44,Default,,0000,0000,0000,,but because they believe
Dialogue: 0,0:05:26.44,0:05:29.92,Default,,0000,0000,0000,,that the moral dilemma\Nfaces two valid opposing arguments.
Dialogue: 0,0:05:30.37,0:05:34.47,Default,,0000,0000,0000,,And we discovered that the groups\Nthat include highly confident grays
Dialogue: 0,0:05:34.47,0:05:36.74,Default,,0000,0000,0000,,are much more likely to reach consensus.
Dialogue: 0,0:05:37.05,0:05:39.19,Default,,0000,0000,0000,,We do not know yet exactly why this is.
Dialogue: 0,0:05:39.49,0:05:41.28,Default,,0000,0000,0000,,These are only the first experiments,
Dialogue: 0,0:05:41.28,0:05:42.64,Default,,0000,0000,0000,,and many more will be needed
Dialogue: 0,0:05:42.64,0:05:47.45,Default,,0000,0000,0000,,to understand why and how some people\Ndecide to negotiate their moral standings
Dialogue: 0,0:05:47.45,0:05:49.01,Default,,0000,0000,0000,,to reach an agreement.
Dialogue: 0,0:05:49.21,0:05:51.60,Default,,0000,0000,0000,,Now, when groups reach consensus,
Dialogue: 0,0:05:51.60,0:05:52.94,Default,,0000,0000,0000,,how do they do so?
Dialogue: 0,0:05:53.21,0:05:54.57,Default,,0000,0000,0000,,The most intuitive idea
Dialogue: 0,0:05:54.57,0:05:57.86,Default,,0000,0000,0000,,is that it's just the average\Nof all the answers in the group, right?
Dialogue: 0,0:05:57.86,0:06:01.57,Default,,0000,0000,0000,,Another option is that the group\Nweighs the strength of each vote
Dialogue: 0,0:06:01.57,0:06:04.02,Default,,0000,0000,0000,,based on the confidence\Nof the person expressing it.
Dialogue: 0,0:06:04.46,0:06:06.96,Default,,0000,0000,0000,,Imagine Paul McCartney\Nis a member of your group.
Dialogue: 0,0:06:07.41,0:06:09.52,Default,,0000,0000,0000,,You'd be wise to follow his call
Dialogue: 0,0:06:09.52,0:06:12.06,Default,,0000,0000,0000,,on the number of times\N"yesterday" is repeated --
Dialogue: 0,0:06:12.06,0:06:13.09,Default,,0000,0000,0000,,which by the way,
Dialogue: 0,0:06:13.09,0:06:14.56,Default,,0000,0000,0000,,I think is nine.
Dialogue: 0,0:06:14.72,0:06:17.13,Default,,0000,0000,0000,,But instead we found that consistently,
Dialogue: 0,0:06:17.13,0:06:18.26,Default,,0000,0000,0000,,in all dilemmas,
Dialogue: 0,0:06:18.26,0:06:19.52,Default,,0000,0000,0000,,in different experiments,
Dialogue: 0,0:06:19.52,0:06:21.71,Default,,0000,0000,0000,,even on different continents,
Dialogue: 0,0:06:21.71,0:06:25.56,Default,,0000,0000,0000,,groups implement a smart\Nand statistically sound procedure
Dialogue: 0,0:06:25.56,0:06:27.39,Default,,0000,0000,0000,,known as the robust average.
Dialogue: 0,0:06:27.61,0:06:29.75,Default,,0000,0000,0000,,In the case of the height\Nof the Eiffel Tower,
Dialogue: 0,0:06:29.75,0:06:31.72,Default,,0000,0000,0000,,let's say a group has these answers:
Dialogue: 0,0:06:31.72,0:06:36.36,Default,,0000,0000,0000,,250 meters, 200 meters, 300 meters, 400,
Dialogue: 0,0:06:36.36,0:06:40.14,Default,,0000,0000,0000,,and one totally absurd answer\Nof 300 million meters.
Dialogue: 0,0:06:40.62,0:06:44.86,Default,,0000,0000,0000,,A simple average of these numbers\Nwould inaccurately skew the results,
Dialogue: 0,0:06:44.86,0:06:46.05,Default,,0000,0000,0000,,but the robust average
Dialogue: 0,0:06:46.05,0:06:49.32,Default,,0000,0000,0000,,is one where the group\Nlargely ignores that absurd answer
Dialogue: 0,0:06:49.32,0:06:52.69,Default,,0000,0000,0000,,by giving much more weight to the vote\Nof the people in the middle.
Dialogue: 0,0:06:53.36,0:06:55.26,Default,,0000,0000,0000,,Back to the experiment in Vancouver.
Dialogue: 0,0:06:55.26,0:06:57.03,Default,,0000,0000,0000,,That's exactly what happened.
Dialogue: 0,0:06:57.41,0:07:00.23,Default,,0000,0000,0000,,Groups gave much less weight\Nto the outliers,
Dialogue: 0,0:07:00.23,0:07:03.48,Default,,0000,0000,0000,,and instead, the consensus\Nturned out to be a robust average
Dialogue: 0,0:07:03.48,0:07:04.96,Default,,0000,0000,0000,,of the individual answers.
Dialogue: 0,0:07:05.36,0:07:07.37,Default,,0000,0000,0000,,The most remarkable thing
Dialogue: 0,0:07:07.37,0:07:10.27,Default,,0000,0000,0000,,is that this was a spontaneous\Nbehavior of the group.
Dialogue: 0,0:07:10.58,0:07:13.45,Default,,0000,0000,0000,,It happened without us\Ngiving them any hint
Dialogue: 0,0:07:13.45,0:07:15.06,Default,,0000,0000,0000,,on how to reach consensus.
Dialogue: 0,0:07:15.51,0:07:17.05,Default,,0000,0000,0000,,So where do we go from here?
Dialogue: 0,0:07:17.46,0:07:18.74,Default,,0000,0000,0000,,This is only the beginning,
Dialogue: 0,0:07:18.74,0:07:20.57,Default,,0000,0000,0000,,but we already have some insights.
Dialogue: 0,0:07:20.98,0:07:23.92,Default,,0000,0000,0000,,Good collective decisions\Nrequire two components:
Dialogue: 0,0:07:23.92,0:07:26.42,Default,,0000,0000,0000,,deliberation and diversity of opinions.
Dialogue: 0,0:07:27.07,0:07:31.09,Default,,0000,0000,0000,,Right now, the way we typically\Nmake our voice heard in many societies
Dialogue: 0,0:07:31.09,0:07:33.13,Default,,0000,0000,0000,,is through direct or indirect voting.
Dialogue: 0,0:07:33.50,0:07:35.64,Default,,0000,0000,0000,,This is good for diversity of opinions,
Dialogue: 0,0:07:35.64,0:07:40.46,Default,,0000,0000,0000,,and it has the great virtue of ensuring\Nthat everyone gets to express their voice,
Dialogue: 0,0:07:40.46,0:07:44.20,Default,,0000,0000,0000,,but it's not so good for fostering\Nthoughtful debates.
Dialogue: 0,0:07:44.70,0:07:47.85,Default,,0000,0000,0000,,Our experiments suggest a different method
Dialogue: 0,0:07:47.85,0:07:51.42,Default,,0000,0000,0000,,that may be effective in balancing\Nthese two goals at the same time
Dialogue: 0,0:07:51.42,0:07:55.10,Default,,0000,0000,0000,,by forming small groups\Nthat converge to a single decision
Dialogue: 0,0:07:55.10,0:07:57.36,Default,,0000,0000,0000,,while still maintaining\Ndiversity of opinions
Dialogue: 0,0:07:57.36,0:08:00.13,Default,,0000,0000,0000,,because there are many independent groups.
Dialogue: 0,0:08:00.74,0:08:04.69,Default,,0000,0000,0000,,Of course, it's much easier to agree\Non the height of the Eiffel Tower
Dialogue: 0,0:08:04.69,0:08:07.80,Default,,0000,0000,0000,,than on moral, political\Nand ideological issues.
Dialogue: 0,0:08:08.79,0:08:12.09,Default,,0000,0000,0000,,But in a time when\Nthe world's problems are more complex
Dialogue: 0,0:08:12.09,0:08:13.92,Default,,0000,0000,0000,,and people are more polarized,
Dialogue: 0,0:08:13.92,0:08:18.47,Default,,0000,0000,0000,,using science to help us understand\Nhow we interact and make decisions
Dialogue: 0,0:08:18.47,0:08:22.80,Default,,0000,0000,0000,,will hopefully spark interesting new ways\Nto construct a better democracy.