Is it just me, or are there other people here that are a little bit disappointed with democracy?

(Applause)

So let's look at a few numbers. If we look across the world, the median turnout in presidential elections over the last 30 years has been just 67 percent. Now, if we go to Europe and we look at people that participated in EU parliamentary elections, the median turnout in those elections is just 42 percent. Now let's go to New York, and let's see how many people voted in the last election for mayor. We will find that only 24 percent of people showed up to vote. What that means is that, if "Friends" was still running, Joey and maybe Phoebe would have shown up to vote.

(Laughter)

And you cannot blame them, because people are tired of politicians. And people are tired of other people using the data that they have generated to communicate with their friends and family, to target political propaganda at them. But the thing about this is that this is not new. Nowadays, people use likes to target propaganda at you before they use your zip code or your gender or your age, because the idea of targeting people with propaganda for political purposes is as old as politics.

And the reason why that idea is there is because democracy has a basic vulnerability. This is the idea of a representative. In principle, democracy is the ability of people to exert power. But in practice, we have to delegate that power to a representative that can exert that power for us. That representative is a bottleneck, or a weak spot. It is the place that you want to target if you want to attack democracy, because you can capture democracy by either capturing that representative or capturing the way that people choose it.

So the big question is: Is this the end of history? Is this the best that we can do, or, actually, are there alternatives?

Some people have been thinking about alternatives, and one of the ideas that is out there is the idea of direct democracy.
This is the idea of bypassing politicians completely and having people vote directly on issues, having people vote directly on bills.

But this idea is naive, because there are too many things that we would need to choose. If you look at the 114th US Congress, you will have seen that the House of Representatives considered more than 6,000 bills, the Senate considered more than 3,000 bills, and they approved more than 300 laws. That would be many decisions that each person would have to make every week, on topics that they know little about. So there's a big cognitive bandwidth problem if we're going to try to think about direct democracy as a viable alternative.

So some people think about the idea of liquid democracy, or fluid democracy, which is the idea that you delegate your political power to someone, who can delegate it to someone else, and, eventually, you create a large follower network in which, at the end, there are a few people making decisions on behalf of all of their followers and their followers. But this idea also doesn't solve the problem of cognitive bandwidth and, to be honest, it's also quite similar to the idea of having a representative.

So what I'm going to do today is be a little bit provocative, and I'm going to ask you: What if, instead of trying to bypass politicians, we tried to automate them?

The idea of automation is not new. It started more than 300 years ago, when French weavers decided to automate the loom. The winner of that industrial war was Joseph-Marie Jacquard. He was a French weaver and merchant who married the loom with the steam engine to create autonomous looms. And in those autonomous looms, he gained control. He could now make fabrics that were more complex and more sophisticated than the ones that were made by hand. But also, by winning that industrial war, he laid out what has become the blueprint of automation.
The way that we have automated things for the last 300 years has always been the same: we first identify a need, then we create a tool to satisfy that need, like the loom in this case, and then we study how people use that tool in order to automate that user. That's how we came from the mechanical loom to the autonomous loom, and that took us a thousand years. Now, it's taken us only a hundred years to use the same script to automate the car.

But the thing is that, this time around, automation is kind of for real. This is a video that a colleague of mine from Toshiba shared with me that shows the factory that manufactures solid state drives. The entire factory is a robot. There are no humans in that factory. And the robots are soon to leave the factories and become part of our world, become part of our workforce.

So what I do in my day job is actually create tools that integrate data for entire countries, so that we can ultimately have the foundations that we need for a future in which we also need to manage those machines. But today, I'm not here to talk to you about these tools that integrate data for countries. I'm here to talk to you about another idea that might help us think about how to use artificial intelligence in democracy. Because the tools that I build are designed for executive decisions. These are decisions that can be cast in some sort of objective terms: public investment decisions. But there are decisions that are legislative, and these legislative decisions require communication among people that have different points of view, require participation, require debate, require deliberation.

And for a long time, we have thought that what we need to improve democracy is actually more communication. So all of the technologies that we have advanced in the context of democracy, whether they are newspapers or social media, have tried to provide us with more communication. But we've been down that rabbit hole, and we know that's not what's going to solve the problem.
Because it's not a communication problem, it's a cognitive bandwidth problem. So if the problem is one of cognitive bandwidth, well, adding more communication to people is not going to be what's going to solve it. What we are going to need instead is to have other technologies that help us deal with some of the communication that we are overloaded with. Think of, like, a little avatar, a software agent, a digital Jiminy Cricket --

(Laughter)

that basically is able to answer things on your behalf. And if we had that technology, we would be able to offload some of the communication and help, maybe, make better decisions or decisions at a larger scale.

And the thing is that the idea of software agents is also not new. We already use them all the time. We use software agents to choose the way that we're going to drive to a certain location, the music that we're going to listen to, or to get suggestions for the next books that we should read.

So there is an obvious idea in the 21st century that was as obvious as the idea of putting together a steam engine with a loom at the time of Jacquard. And that idea is combining direct democracy with software agents.

Imagine, for a second, a world in which, instead of having a representative that represents you and millions of other people, you can have a representative that represents only you, with your nuanced political views -- that weird combination of libertarian and liberal, and maybe a little bit conservative on some issues and maybe very progressive on others. Politicians nowadays are packages, and they're full of compromises. But you might have someone that can represent only you, if you are willing to give up the idea that that representative is a human. If that representative is a software agent, we could have a senate that has as many senators as we have citizens. And those senators are going to be able to read every bill, and they're going to be able to vote on each one of them.

So there's an obvious idea that maybe we want to consider.
But I understand that in this day and age, this idea might be quite scary. In fact, thinking of a robot coming from the future to help us run our governments sounds terrifying.

But we've been there before.

(Laughter)

And actually he was quite a nice guy.

So what would the Jacquard loom version of this idea look like? It would be a very simple system. Imagine a system where you log in and create your avatar, and then you start training that avatar. You can provide your avatar with your reading habits, or connect it to your social media, or you can connect it to other data, for example by taking psychological tests. And the nice thing about this is that there's no deception. You are not providing data to communicate with your friends and family that then gets used in a political system. You are providing data to a system that is designed to be used to make political decisions on your behalf.

Then you take that data and you choose a training algorithm, because it's an open marketplace in which different people can submit different algorithms to predict how you're going to vote, based on the data you have provided. And the system is open, so nobody controls the algorithms; there are algorithms that become more popular and others that become less popular. Eventually, you can audit the system. You can see how your avatar is working. If you like it, you can leave it on autopilot. If you want to be a little more controlling, you can choose to have it ask you every time it's going to make a decision, or you can be anywhere in between.

One of the reasons why we use democracy so little may be because democracy has a very bad user interface. And if we improve the user interface of democracy, we might be able to use it more.

Of course, there are a lot of questions that you might have. Well, how do you train these avatars? How do you keep the data secure? How do you keep the systems distributed and auditable?
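[To make that workflow a little more concrete, here is a minimal sketch of the avatar loop just described: a citizen supplies their own data, picks a prediction algorithm from an open marketplace, sets how much autonomy the avatar gets, and every decision is logged for auditing. This is purely illustrative; the talk specifies no implementation, and every name here (VotingAvatar, keyword_algorithm, and so on) is hypothetical.]

```python
# Illustrative sketch only; all names are hypothetical, not from the talk.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

# A marketplace "training algorithm": takes the citizen's data and returns
# a predictor mapping a bill's text to a suggested vote ("yes" or "no").
TrainingAlgorithm = Callable[[dict], Callable[[str], str]]

@dataclass
class VotingAvatar:
    personal_data: dict                       # reading habits, survey answers, etc.
    autonomy: str = "ask"                     # "autopilot" or "ask" (or anything in between)
    predictor: Optional[Callable[[str], str]] = None
    audit_log: List[Tuple[str, str, str]] = field(default_factory=list)

    def train(self, algorithm: TrainingAlgorithm) -> None:
        # The citizen chooses which marketplace algorithm learns their preferences.
        self.predictor = algorithm(self.personal_data)

    def decide(self, bill_text: str,
               ask_user: Optional[Callable[[str, str], str]] = None) -> str:
        suggestion = self.predictor(bill_text)
        # On autopilot the avatar votes by itself; otherwise it asks the citizen,
        # showing its suggestion. Every decision is recorded so it can be audited.
        vote = suggestion if self.autonomy == "autopilot" else ask_user(bill_text, suggestion)
        self.audit_log.append((bill_text, suggestion, vote))
        return vote

# A toy marketplace entry: vote "yes" on any bill mentioning one of the
# citizen's stated priorities. Real algorithms would be far richer.
def keyword_algorithm(data: dict) -> Callable[[str], str]:
    priorities = [p.lower() for p in data.get("priorities", [])]
    return lambda bill: "yes" if any(p in bill.lower() for p in priorities) else "no"

if __name__ == "__main__":
    avatar = VotingAvatar(personal_data={"priorities": ["public transit"]},
                          autonomy="autopilot")
    avatar.train(keyword_algorithm)
    print(avatar.decide("A bill to fund public transit expansion"))  # prints "yes"
```

[The autonomy switch and the audit log correspond to the "autopilot versus ask me every time" and "you can audit the system" options mentioned above.]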
How about my grandmother, who's 80 years old and doesn't know how to use the internet?

Trust me, I've heard them all. So when you think about an idea like this, you have to beware of pessimists, because they are known to have a problem for every solution.

(Laughter)

So I want to invite you to think about the bigger ideas. The questions I just showed you are little ideas, because they are questions about how this would not work. The big ideas are: What else could you do with this if it did happen to work? And one of those ideas is, well, who writes the laws?

In the beginning, we could have the avatars that we already have voting on laws that are written by the senators or politicians that we already have. But if this were to work, you could write an algorithm that could try to write a law that would get a certain percentage of approval, and you could reverse the process. Now, you might think that this idea is ludicrous and we should not do it, but you cannot deny that it's an idea that is only possible in a world in which direct democracy and software agents are a viable form of participation.

So how do we start the revolution?

We don't start this revolution with picket fences or protests, or by demanding that our current politicians be changed into robots. That's not going to work. This is much simpler, much slower and much more humble. We start this revolution by creating simple systems like this in grad schools, in libraries, in nonprofits. And we try to figure out all of those little questions and those little problems that we're going to have to figure out to make this idea something viable, to make this idea something that we can trust. And as we create those systems that have a hundred people, a thousand people, a hundred thousand people voting in ways that are not politically binding, we're going to develop trust in this idea, the world is going to change, and those that are as little as my daughter is right now are going to grow up.
And by the time my daughter is my age, maybe this idea, which I know today is very crazy, might not be crazy to her and to her friends. And at that point, we will be at the end of our history, but they will be at the beginning of theirs.

Thank you.

(Applause)