[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:01.37,0:00:03.12,Default,,0000,0000,0000,,Is it just me,
Dialogue: 0,0:00:03.14,0:00:05.47,Default,,0000,0000,0000,,or are there other people here
Dialogue: 0,0:00:05.50,0:00:07.84,Default,,0000,0000,0000,,that are a little bit\Ndisappointed with democracy?
Dialogue: 0,0:00:08.99,0:00:11.32,Default,,0000,0000,0000,,(Applause)
Dialogue: 0,0:00:12.14,0:00:14.21,Default,,0000,0000,0000,,So let's look at a few numbers.
Dialogue: 0,0:00:14.93,0:00:17.11,Default,,0000,0000,0000,,If we look across the world,
Dialogue: 0,0:00:17.13,0:00:21.02,Default,,0000,0000,0000,,the median turnout\Nin presidential elections
Dialogue: 0,0:00:21.05,0:00:22.70,Default,,0000,0000,0000,,over the last 30 years
Dialogue: 0,0:00:22.72,0:00:25.34,Default,,0000,0000,0000,,has been just 67 percent.
Dialogue: 0,0:00:26.33,0:00:28.30,Default,,0000,0000,0000,,Now, if we go to Europe
Dialogue: 0,0:00:28.33,0:00:32.75,Default,,0000,0000,0000,,and we look at people that participated\Nin EU parliamentary elections,
Dialogue: 0,0:00:32.78,0:00:34.85,Default,,0000,0000,0000,,the median turnout in those elections
Dialogue: 0,0:00:34.87,0:00:36.98,Default,,0000,0000,0000,,is just 42 percent.
Dialogue: 0,0:00:38.12,0:00:39.79,Default,,0000,0000,0000,,Now let's go to New York,
Dialogue: 0,0:00:39.82,0:00:44.50,Default,,0000,0000,0000,,and let's see how many people voted\Nin the last election for mayor.
Dialogue: 0,0:00:44.52,0:00:48.34,Default,,0000,0000,0000,,We will find that only\N24 percent of people showed up to vote.
Dialogue: 0,0:00:49.06,0:00:52.16,Default,,0000,0000,0000,,What that means is that,\Nif "Friends" was still running,
Dialogue: 0,0:00:52.18,0:00:55.53,Default,,0000,0000,0000,,Joey and maybe Phoebe\Nwould have shown up to vote.
Dialogue: 0,0:00:55.55,0:00:56.84,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:57.43,0:01:01.86,Default,,0000,0000,0000,,And you cannot blame them\Nbecause people are tired of politicians.
Dialogue: 0,0:01:01.88,0:01:05.77,Default,,0000,0000,0000,,And people are tired of other people\Nusing the data that they have generated
Dialogue: 0,0:01:05.80,0:01:07.99,Default,,0000,0000,0000,,to communicate with\Ntheir friends and family,
Dialogue: 0,0:01:08.02,0:01:10.11,Default,,0000,0000,0000,,to target political propaganda at them.
Dialogue: 0,0:01:10.52,0:01:13.25,Default,,0000,0000,0000,,But the thing about this\Nis that this is not new.
Dialogue: 0,0:01:13.27,0:01:16.50,Default,,0000,0000,0000,,Nowadays, people use likes\Nto target propaganda at you
Dialogue: 0,0:01:16.52,0:01:19.89,Default,,0000,0000,0000,,before they use your zip code\Nor your gender or your age,
Dialogue: 0,0:01:19.92,0:01:23.49,Default,,0000,0000,0000,,because the idea of targeting people\Nwith propaganda for political purposes
Dialogue: 0,0:01:23.51,0:01:25.10,Default,,0000,0000,0000,,is as old as politics.
Dialogue: 0,0:01:25.53,0:01:27.81,Default,,0000,0000,0000,,And the reason why that idea is there
Dialogue: 0,0:01:27.83,0:01:31.30,Default,,0000,0000,0000,,is because democracy\Nhas a basic vulnerability.
Dialogue: 0,0:01:31.71,0:01:33.64,Default,,0000,0000,0000,,This is the idea of a representative.
Dialogue: 0,0:01:34.05,0:01:37.99,Default,,0000,0000,0000,,In principle, democracy is the ability\Nof people to exert power.
Dialogue: 0,0:01:38.02,0:01:41.84,Default,,0000,0000,0000,,But in practice, we have to delegate\Nthat power to a representative
Dialogue: 0,0:01:41.86,0:01:44.11,Default,,0000,0000,0000,,that can exert that power for us.
Dialogue: 0,0:01:44.56,0:01:46.39,Default,,0000,0000,0000,,That representative is a bottleneck,
Dialogue: 0,0:01:46.41,0:01:47.71,Default,,0000,0000,0000,,or a weak spot.
Dialogue: 0,0:01:47.73,0:01:51.68,Default,,0000,0000,0000,,It is the place that you want to target\Nif you want to attack democracy
Dialogue: 0,0:01:51.70,0:01:55.19,Default,,0000,0000,0000,,because you can capture democracy\Nby either capturing that representative
Dialogue: 0,0:01:55.22,0:01:57.36,Default,,0000,0000,0000,,or capturing the way\Nthat people choose it.
Dialogue: 0,0:01:58.06,0:01:59.48,Default,,0000,0000,0000,,So the big question is:
Dialogue: 0,0:01:59.50,0:02:01.21,Default,,0000,0000,0000,,Is this the end of history?
Dialogue: 0,0:02:01.99,0:02:05.09,Default,,0000,0000,0000,,Is this the best that we can do
Dialogue: 0,0:02:05.88,0:02:08.94,Default,,0000,0000,0000,,or, actually, are there alternatives?
Dialogue: 0,0:02:10.13,0:02:12.48,Default,,0000,0000,0000,,Some people have been thinking\Nabout alternatives,
Dialogue: 0,0:02:12.51,0:02:16.19,Default,,0000,0000,0000,,and one of the ideas that is out there\Nis the idea of direct democracy.
Dialogue: 0,0:02:16.79,0:02:19.27,Default,,0000,0000,0000,,This is the idea of bypassing\Npoliticians completely
Dialogue: 0,0:02:19.29,0:02:21.70,Default,,0000,0000,0000,,and having people vote directly on issues,
Dialogue: 0,0:02:21.72,0:02:23.98,Default,,0000,0000,0000,,having people vote directly on bills.
Dialogue: 0,0:02:24.42,0:02:25.75,Default,,0000,0000,0000,,But this idea is naive
Dialogue: 0,0:02:25.78,0:02:28.95,Default,,0000,0000,0000,,because there's too many things\Nthat we would need to choose.
Dialogue: 0,0:02:28.97,0:02:31.75,Default,,0000,0000,0000,,If you look at the 114th US Congress,
Dialogue: 0,0:02:31.78,0:02:34.26,Default,,0000,0000,0000,,you will have seen that\Nthe House of Representatives
Dialogue: 0,0:02:34.29,0:02:37.18,Default,,0000,0000,0000,,considered more than 6,000 bills,
Dialogue: 0,0:02:37.20,0:02:39.86,Default,,0000,0000,0000,,the Senate considered\Nmore than 3,000 bills
Dialogue: 0,0:02:39.88,0:02:42.69,Default,,0000,0000,0000,,and they approved more than 300 laws.
Dialogue: 0,0:02:42.71,0:02:44.29,Default,,0000,0000,0000,,Those would be many decisions
Dialogue: 0,0:02:44.32,0:02:46.50,Default,,0000,0000,0000,,that each person would have to make a week
Dialogue: 0,0:02:46.53,0:02:48.67,Default,,0000,0000,0000,,on topics that they know little about.
Dialogue: 0,0:02:49.23,0:02:51.51,Default,,0000,0000,0000,,So there's a big cognitive\Nbandwidth problem
Dialogue: 0,0:02:51.53,0:02:55.53,Default,,0000,0000,0000,,if we're going to try to think about\Ndirect democracy as a viable alternative.
Dialogue: 0,0:02:56.20,0:03:00.64,Default,,0000,0000,0000,,So some people think about the idea\Nof liquid democracy, or fluid democracy,
Dialogue: 0,0:03:00.66,0:03:04.44,Default,,0000,0000,0000,,which is the idea that you endorse\Nyour political power to someone,
Dialogue: 0,0:03:04.46,0:03:06.16,Default,,0000,0000,0000,,who can endorse it to someone else,
Dialogue: 0,0:03:06.19,0:03:08.73,Default,,0000,0000,0000,,and, eventually, you create\Na large follower network
Dialogue: 0,0:03:08.75,0:03:12.05,Default,,0000,0000,0000,,in which, at the end, there's a few people\Nthat are making decisions
Dialogue: 0,0:03:12.07,0:03:15.21,Default,,0000,0000,0000,,on behalf of all of their followers\Nand their followers.
Dialogue: 0,0:03:16.33,0:03:20.46,Default,,0000,0000,0000,,But this idea also doesn't solve\Nthe problem of the cognitive bandwidth
Dialogue: 0,0:03:20.48,0:03:24.36,Default,,0000,0000,0000,,and, to be honest, it's also quite similar\Nto the idea of having a representative.
Dialogue: 0,0:03:24.80,0:03:28.26,Default,,0000,0000,0000,,So what I'm going to do today is\NI'm going to be a little bit provocative,
Dialogue: 0,0:03:28.28,0:03:30.58,Default,,0000,0000,0000,,and I'm going to ask you, well:
Dialogue: 0,0:03:30.60,0:03:37.16,Default,,0000,0000,0000,,What if, instead of trying\Nto bypass politicians,
Dialogue: 0,0:03:37.19,0:03:39.39,Default,,0000,0000,0000,,we tried to automate them?
Dialogue: 0,0:03:45.87,0:03:48.80,Default,,0000,0000,0000,,The idea of automation is not new.
Dialogue: 0,0:03:48.82,0:03:50.90,Default,,0000,0000,0000,,It was started more than 300 years ago,
Dialogue: 0,0:03:50.92,0:03:53.99,Default,,0000,0000,0000,,when French weavers decided\Nto automate the loom.
Dialogue: 0,0:03:54.82,0:03:59.18,Default,,0000,0000,0000,,The winner of that industrial war\Nwas Joseph-Marie Jacquard.
Dialogue: 0,0:03:59.20,0:04:00.96,Default,,0000,0000,0000,,He was a French weaver and merchant
Dialogue: 0,0:04:00.98,0:04:03.42,Default,,0000,0000,0000,,that married the loom\Nwith the steam engine
Dialogue: 0,0:04:03.44,0:04:05.63,Default,,0000,0000,0000,,to create autonomous looms.
Dialogue: 0,0:04:05.66,0:04:08.41,Default,,0000,0000,0000,,And in those autonomous looms,\Nhe gained control.
Dialogue: 0,0:04:08.43,0:04:12.32,Default,,0000,0000,0000,,He could now make fabrics that were\Nmore complex and more sophisticated
Dialogue: 0,0:04:12.34,0:04:14.47,Default,,0000,0000,0000,,than the ones they\Nwere able to do by hand.
Dialogue: 0,0:04:15.19,0:04:17.82,Default,,0000,0000,0000,,But also, by winning that industrial war,
Dialogue: 0,0:04:17.85,0:04:21.37,Default,,0000,0000,0000,,he laid out what has become\Nthe blueprint of automation.
Dialogue: 0,0:04:22.14,0:04:25.00,Default,,0000,0000,0000,,The way that we automate things\Nfor the last 300 years
Dialogue: 0,0:04:25.03,0:04:26.41,Default,,0000,0000,0000,,has always been the same:
Dialogue: 0,0:04:27.01,0:04:29.52,Default,,0000,0000,0000,,we first identify a need,
Dialogue: 0,0:04:29.54,0:04:32.72,Default,,0000,0000,0000,,then we create a tool\Nto satisfy that need,
Dialogue: 0,0:04:32.75,0:04:34.79,Default,,0000,0000,0000,,like the loom, in this case,
Dialogue: 0,0:04:34.81,0:04:37.20,Default,,0000,0000,0000,,and then we study how people use that tool
Dialogue: 0,0:04:37.23,0:04:38.71,Default,,0000,0000,0000,,to automate that user.
Dialogue: 0,0:04:39.24,0:04:42.30,Default,,0000,0000,0000,,That's how we came\Nfrom the mechanical loom
Dialogue: 0,0:04:42.33,0:04:44.22,Default,,0000,0000,0000,,to the autonomous loom,
Dialogue: 0,0:04:44.25,0:04:46.37,Default,,0000,0000,0000,,and that took us a thousand years.
Dialogue: 0,0:04:46.39,0:04:48.46,Default,,0000,0000,0000,,Now, it's taken us only a hundred years
Dialogue: 0,0:04:48.49,0:04:51.70,Default,,0000,0000,0000,,to use the same script\Nto automate the car.
Dialogue: 0,0:04:53.29,0:04:55.74,Default,,0000,0000,0000,,But the thing is that, this time around,
Dialogue: 0,0:04:55.76,0:04:57.89,Default,,0000,0000,0000,,automation is kind of for real.
Dialogue: 0,0:04:57.92,0:05:01.24,Default,,0000,0000,0000,,This is a video that a colleague of mine\Nfrom Toshiba shared with me
Dialogue: 0,0:05:01.26,0:05:04.52,Default,,0000,0000,0000,,that shows the factory\Nthat manufactures solid state drives.
Dialogue: 0,0:05:04.54,0:05:06.56,Default,,0000,0000,0000,,The entire factory is a robot.
Dialogue: 0,0:05:06.58,0:05:08.51,Default,,0000,0000,0000,,There are no humans in that factory.
Dialogue: 0,0:05:09.03,0:05:11.25,Default,,0000,0000,0000,,And the robots are soon\Nto leave the factories
Dialogue: 0,0:05:11.28,0:05:13.30,Default,,0000,0000,0000,,and become part of our world,
Dialogue: 0,0:05:13.32,0:05:15.16,Default,,0000,0000,0000,,become part of our workforce.
Dialogue: 0,0:05:15.18,0:05:16.96,Default,,0000,0000,0000,,So what I do in my day job
Dialogue: 0,0:05:16.98,0:05:20.97,Default,,0000,0000,0000,,is actually create tools that integrate\Ndata for entire countries
Dialogue: 0,0:05:21.00,0:05:24.46,Default,,0000,0000,0000,,so that we can ultimately have\Nthe foundations that we need
Dialogue: 0,0:05:24.49,0:05:28.17,Default,,0000,0000,0000,,for a future in which we need\Nto also manage those machines.
Dialogue: 0,0:05:29.20,0:05:32.10,Default,,0000,0000,0000,,But today, I'm not here\Nto talk to you about these tools
Dialogue: 0,0:05:32.12,0:05:33.95,Default,,0000,0000,0000,,that integrate data for countries.
Dialogue: 0,0:05:34.46,0:05:37.08,Default,,0000,0000,0000,,But I'm here to talk to you\Nabout another idea
Dialogue: 0,0:05:37.11,0:05:41.97,Default,,0000,0000,0000,,that might help us think about how to use\Nartificial intelligence in democracy.
Dialogue: 0,0:05:42.00,0:05:46.73,Default,,0000,0000,0000,,Because the tools that I build\Nare designed for executive decisions.
Dialogue: 0,0:05:46.76,0:05:50.60,Default,,0000,0000,0000,,These are decisions that can be cast\Nin some sort of term of objectivity --
Dialogue: 0,0:05:50.62,0:05:52.37,Default,,0000,0000,0000,,public investment decisions.
Dialogue: 0,0:05:52.88,0:05:55.52,Default,,0000,0000,0000,,But there are decisions\Nthat are legislative,
Dialogue: 0,0:05:55.54,0:05:59.33,Default,,0000,0000,0000,,and these decisions that are legislative\Nrequire communication among people
Dialogue: 0,0:05:59.35,0:06:01.05,Default,,0000,0000,0000,,that have different points of view,
Dialogue: 0,0:06:01.08,0:06:03.69,Default,,0000,0000,0000,,require participation, require debate,
Dialogue: 0,0:06:03.71,0:06:05.19,Default,,0000,0000,0000,,require deliberation.
Dialogue: 0,0:06:06.24,0:06:09.04,Default,,0000,0000,0000,,And for a long time,\Nwe have thought that, well,
Dialogue: 0,0:06:09.07,0:06:12.53,Default,,0000,0000,0000,,what we need to improve democracy\Nis actually more communication.
Dialogue: 0,0:06:12.55,0:06:16.26,Default,,0000,0000,0000,,So all of the technologies that we have\Nadvanced in the context of democracy,
Dialogue: 0,0:06:16.29,0:06:19.06,Default,,0000,0000,0000,,whether they are newspapers\Nor whether it is social media,
Dialogue: 0,0:06:19.09,0:06:21.47,Default,,0000,0000,0000,,have tried to provide us\Nwith more communication.
Dialogue: 0,0:06:22.10,0:06:23.92,Default,,0000,0000,0000,,But we've been down that rabbit hole,
Dialogue: 0,0:06:23.95,0:06:26.70,Default,,0000,0000,0000,,and we know that's not\Nwhat's going to solve the problem.
Dialogue: 0,0:06:26.72,0:06:28.72,Default,,0000,0000,0000,,Because it's not a communication problem,
Dialogue: 0,0:06:28.74,0:06:30.49,Default,,0000,0000,0000,,it's a cognitive bandwidth problem.
Dialogue: 0,0:06:30.51,0:06:32.88,Default,,0000,0000,0000,,So if the problem is one\Nof cognitive bandwidth,
Dialogue: 0,0:06:32.90,0:06:35.49,Default,,0000,0000,0000,,well, adding more communication to people
Dialogue: 0,0:06:35.51,0:06:38.26,Default,,0000,0000,0000,,is not going to be\Nwhat's going to solve it.
Dialogue: 0,0:06:38.28,0:06:41.40,Default,,0000,0000,0000,,What we are going to need instead\Nis to have other technologies
Dialogue: 0,0:06:41.42,0:06:44.46,Default,,0000,0000,0000,,that help us deal with\Nsome of the communication
Dialogue: 0,0:06:44.49,0:06:46.73,Default,,0000,0000,0000,,that we are overloaded with.
Dialogue: 0,0:06:46.76,0:06:48.45,Default,,0000,0000,0000,,Think of, like, a little avatar,
Dialogue: 0,0:06:48.48,0:06:49.82,Default,,0000,0000,0000,,a software agent,
Dialogue: 0,0:06:49.84,0:06:51.72,Default,,0000,0000,0000,,a digital Jiminy Cricket --
Dialogue: 0,0:06:51.74,0:06:52.98,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:06:53.00,0:06:57.02,Default,,0000,0000,0000,,that basically is able\Nto answer things on your behalf.
Dialogue: 0,0:06:57.76,0:06:59.55,Default,,0000,0000,0000,,And if we had that technology,
Dialogue: 0,0:06:59.57,0:07:02.05,Default,,0000,0000,0000,,we would be able to offload\Nsome of the communication
Dialogue: 0,0:07:02.07,0:07:06.22,Default,,0000,0000,0000,,and help, maybe, make better decisions\Nor decisions at a larger scale.
Dialogue: 0,0:07:06.86,0:07:10.58,Default,,0000,0000,0000,,And the thing is that the idea\Nof software agents is also not new.
Dialogue: 0,0:07:10.60,0:07:12.71,Default,,0000,0000,0000,,We already use them all the time.
Dialogue: 0,0:07:13.22,0:07:14.74,Default,,0000,0000,0000,,We use software agents
Dialogue: 0,0:07:14.76,0:07:18.44,Default,,0000,0000,0000,,to choose the way that we're going\Nto drive to a certain location,
Dialogue: 0,0:07:19.07,0:07:21.17,Default,,0000,0000,0000,,the music that we're going to listen to
Dialogue: 0,0:07:21.76,0:07:24.78,Default,,0000,0000,0000,,or to get suggestions\Nfor the next books that we should read.
Dialogue: 0,0:07:25.99,0:07:28.57,Default,,0000,0000,0000,,So there is an obvious idea\Nin the 21st century
Dialogue: 0,0:07:28.59,0:07:31.24,Default,,0000,0000,0000,,that was as obvious as the idea
Dialogue: 0,0:07:31.26,0:07:36.84,Default,,0000,0000,0000,,of putting together a steam engine\Nwith a loom at the time of Jacquard.
Dialogue: 0,0:07:37.54,0:07:41.99,Default,,0000,0000,0000,,And that idea is combining\Ndirect democracy with software agents.
Dialogue: 0,0:07:42.85,0:07:44.97,Default,,0000,0000,0000,,Imagine, for a second, a world
Dialogue: 0,0:07:44.99,0:07:48.16,Default,,0000,0000,0000,,in which, instead of having\Na representative that represents you
Dialogue: 0,0:07:48.18,0:07:49.76,Default,,0000,0000,0000,,and millions of other people,
Dialogue: 0,0:07:49.78,0:07:52.82,Default,,0000,0000,0000,,you can have a representative\Nthat represents only you,
Dialogue: 0,0:07:53.50,0:07:55.76,Default,,0000,0000,0000,,with your nuanced political views --
Dialogue: 0,0:07:55.78,0:07:59.13,Default,,0000,0000,0000,,that weird combination\Nof libertarian and liberal
Dialogue: 0,0:07:59.15,0:08:01.54,Default,,0000,0000,0000,,and maybe a little bit\Nconservative on some issues
Dialogue: 0,0:08:01.57,0:08:03.67,Default,,0000,0000,0000,,and maybe very progressive on others.
Dialogue: 0,0:08:03.70,0:08:06.96,Default,,0000,0000,0000,,Politicians nowadays are packages,\Nand they're full of compromises.
Dialogue: 0,0:08:06.99,0:08:10.62,Default,,0000,0000,0000,,But you might have someone\Nthat can represent only you,
Dialogue: 0,0:08:10.65,0:08:12.50,Default,,0000,0000,0000,,if you are willing to give up the idea
Dialogue: 0,0:08:12.52,0:08:14.77,Default,,0000,0000,0000,,that that representative is a human.
Dialogue: 0,0:08:15.23,0:08:17.31,Default,,0000,0000,0000,,If that representative\Nis a software agent,
Dialogue: 0,0:08:17.34,0:08:21.50,Default,,0000,0000,0000,,we could have a senate that has\Nas many senators as we have citizens.
Dialogue: 0,0:08:21.53,0:08:24.39,Default,,0000,0000,0000,,And those senators are going to be able\Nto read every bill
Dialogue: 0,0:08:24.41,0:08:27.16,Default,,0000,0000,0000,,and they're going to be able\Nto vote on each one of them.
Dialogue: 0,0:08:27.82,0:08:30.78,Default,,0000,0000,0000,,So there's an obvious idea\Nthat maybe we want to consider.
Dialogue: 0,0:08:30.80,0:08:33.22,Default,,0000,0000,0000,,But I understand that in this day and age,
Dialogue: 0,0:08:33.25,0:08:35.14,Default,,0000,0000,0000,,this idea might be quite scary.
Dialogue: 0,0:08:36.39,0:08:39.83,Default,,0000,0000,0000,,In fact, thinking of a robot\Ncoming from the future
Dialogue: 0,0:08:39.86,0:08:41.53,Default,,0000,0000,0000,,to help us run our governments
Dialogue: 0,0:08:41.55,0:08:43.18,Default,,0000,0000,0000,,sounds terrifying.
Dialogue: 0,0:08:44.22,0:08:45.87,Default,,0000,0000,0000,,But we've been there before.
Dialogue: 0,0:08:45.90,0:08:47.17,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:08:47.20,0:08:49.69,Default,,0000,0000,0000,,And actually he was quite a nice guy.
Dialogue: 0,0:08:51.68,0:08:58.11,Default,,0000,0000,0000,,So what would the Jacquard loom\Nversion of this idea look like?
Dialogue: 0,0:08:58.14,0:09:00.04,Default,,0000,0000,0000,,It would be a very simple system.
Dialogue: 0,0:09:00.06,0:09:03.52,Default,,0000,0000,0000,,Imagine a system that you log in\Nand you create your avatar,
Dialogue: 0,0:09:03.54,0:09:06.00,Default,,0000,0000,0000,,and then you're going\Nto start training your avatar.
Dialogue: 0,0:09:06.02,0:09:08.70,Default,,0000,0000,0000,,So you can provide your avatar\Nwith your reading habits,
Dialogue: 0,0:09:08.73,0:09:10.59,Default,,0000,0000,0000,,or connect it to your social media,
Dialogue: 0,0:09:10.61,0:09:13.02,Default,,0000,0000,0000,,or you can connect it to other data,
Dialogue: 0,0:09:13.04,0:09:15.32,Default,,0000,0000,0000,,for example by taking\Npsychological tests.
Dialogue: 0,0:09:15.34,0:09:18.31,Default,,0000,0000,0000,,And the nice thing about this\Nis that there's no deception.
Dialogue: 0,0:09:18.33,0:09:21.67,Default,,0000,0000,0000,,You are not providing data to communicate\Nwith your friends and family
Dialogue: 0,0:09:21.70,0:09:24.85,Default,,0000,0000,0000,,that then gets used in a political system.
Dialogue: 0,0:09:24.87,0:09:28.58,Default,,0000,0000,0000,,You are providing data to a system\Nthat is designed to be used
Dialogue: 0,0:09:28.60,0:09:30.72,Default,,0000,0000,0000,,to make political decisions\Non your behalf.
Dialogue: 0,0:09:31.26,0:09:35.24,Default,,0000,0000,0000,,Then you take that data and you choose\Na training algorithm,
Dialogue: 0,0:09:35.27,0:09:36.83,Default,,0000,0000,0000,,because it's an open marketplace
Dialogue: 0,0:09:36.86,0:09:39.64,Default,,0000,0000,0000,,in which different people\Ncan submit different algorithms
Dialogue: 0,0:09:39.66,0:09:44.06,Default,,0000,0000,0000,,to predict how you're going to vote,\Nbased on the data you have provided.
Dialogue: 0,0:09:44.08,0:09:47.54,Default,,0000,0000,0000,,And the system is open,\Nso nobody controls the algorithms;
Dialogue: 0,0:09:47.56,0:09:49.67,Default,,0000,0000,0000,,there are algorithms\Nthat become more popular
Dialogue: 0,0:09:49.70,0:09:51.42,Default,,0000,0000,0000,,and others that become less popular.
Dialogue: 0,0:09:51.44,0:09:53.25,Default,,0000,0000,0000,,Eventually, you can audit the system.
Dialogue: 0,0:09:53.28,0:09:55.16,Default,,0000,0000,0000,,You can see how your avatar is working.
Dialogue: 0,0:09:55.18,0:09:57.33,Default,,0000,0000,0000,,If you like it,\Nyou can leave it on autopilot.
Dialogue: 0,0:09:57.36,0:09:59.42,Default,,0000,0000,0000,,If you want to be\Na little more controlling,
Dialogue: 0,0:09:59.44,0:10:01.41,Default,,0000,0000,0000,,you can actually choose that they ask you
Dialogue: 0,0:10:01.44,0:10:03.50,Default,,0000,0000,0000,,every time they're going\Nto make a decision,
Dialogue: 0,0:10:03.53,0:10:05.16,Default,,0000,0000,0000,,or you can be anywhere in between.
Dialogue: 0,0:10:05.19,0:10:07.59,Default,,0000,0000,0000,,One of the reasons\Nwhy we use democracy so little
Dialogue: 0,0:10:07.62,0:10:11.18,Default,,0000,0000,0000,,may be because democracy\Nhas a very bad user interface.
Dialogue: 0,0:10:11.21,0:10:13.69,Default,,0000,0000,0000,,And if we improve the user\Ninterface of democracy,
Dialogue: 0,0:10:13.71,0:10:15.84,Default,,0000,0000,0000,,we might be able to use it more.
Dialogue: 0,0:10:16.45,0:10:19.66,Default,,0000,0000,0000,,Of course, there's a lot of questions\Nthat you might have.
Dialogue: 0,0:10:20.47,0:10:22.63,Default,,0000,0000,0000,,Well, how do you train these avatars?
Dialogue: 0,0:10:22.66,0:10:24.55,Default,,0000,0000,0000,,How do you keep the data secure?
Dialogue: 0,0:10:24.58,0:10:27.82,Default,,0000,0000,0000,,How do you keep the systems\Ndistributed and auditable?
Dialogue: 0,0:10:27.85,0:10:29.91,Default,,0000,0000,0000,,How about my grandmother,\Nwho's 80 years old
Dialogue: 0,0:10:29.95,0:10:31.91,Default,,0000,0000,0000,,and doesn't know how to use the internet?
Dialogue: 0,0:10:32.26,0:10:34.48,Default,,0000,0000,0000,,Trust me, I've heard them all.
Dialogue: 0,0:10:34.51,0:10:39.07,Default,,0000,0000,0000,,So when you think about an idea like this,\Nyou have to beware of pessimists
Dialogue: 0,0:10:39.09,0:10:43.41,Default,,0000,0000,0000,,because they are known to have\Na problem for every solution.
Dialogue: 0,0:10:43.43,0:10:45.26,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:10:45.28,0:10:48.32,Default,,0000,0000,0000,,So I want to invite you to think\Nabout the bigger ideas.
Dialogue: 0,0:10:48.35,0:10:51.97,Default,,0000,0000,0000,,The questions I just showed you\Nare little ideas
Dialogue: 0,0:10:52.00,0:10:54.90,Default,,0000,0000,0000,,because they are questions\Nabout how this would not work.
Dialogue: 0,0:10:55.50,0:10:57.48,Default,,0000,0000,0000,,The big ideas are ideas of:
Dialogue: 0,0:10:57.51,0:10:59.31,Default,,0000,0000,0000,,What else can you do with this
Dialogue: 0,0:10:59.34,0:11:01.23,Default,,0000,0000,0000,,if this would happen to work?
Dialogue: 0,0:11:01.77,0:11:05.22,Default,,0000,0000,0000,,And one of those ideas is,\Nwell, who writes the laws?
Dialogue: 0,0:11:05.85,0:11:10.08,Default,,0000,0000,0000,,In the beginning, we could have\Nthe avatars that we already have,
Dialogue: 0,0:11:10.10,0:11:13.60,Default,,0000,0000,0000,,voting on laws that are written\Nby the senators or politicians
Dialogue: 0,0:11:13.62,0:11:14.97,Default,,0000,0000,0000,,that we already have.
Dialogue: 0,0:11:15.49,0:11:17.20,Default,,0000,0000,0000,,But if this were to work,
Dialogue: 0,0:11:17.90,0:11:20.25,Default,,0000,0000,0000,,you could write an algorithm
Dialogue: 0,0:11:20.28,0:11:22.43,Default,,0000,0000,0000,,that could try to write a law
Dialogue: 0,0:11:22.45,0:11:24.87,Default,,0000,0000,0000,,that would get a certain\Npercentage of approval,
Dialogue: 0,0:11:24.90,0:11:26.68,Default,,0000,0000,0000,,and you could reverse the process.
Dialogue: 0,0:11:26.70,0:11:30.21,Default,,0000,0000,0000,,Now, you might think that this idea\Nis ludicrous and we should not do it,
Dialogue: 0,0:11:30.24,0:11:33.02,Default,,0000,0000,0000,,but you cannot deny that it's an idea\Nthat is only possible
Dialogue: 0,0:11:33.05,0:11:36.07,Default,,0000,0000,0000,,in a world in which direct democracy\Nand software agents
Dialogue: 0,0:11:36.09,0:11:38.75,Default,,0000,0000,0000,,are a viable form of participation.
Dialogue: 0,0:11:40.60,0:11:43.35,Default,,0000,0000,0000,,So how do we start the revolution?
Dialogue: 0,0:11:44.24,0:11:47.55,Default,,0000,0000,0000,,We don't start this revolution\Nwith picket fences or protests
Dialogue: 0,0:11:47.57,0:11:51.76,Default,,0000,0000,0000,,or by demanding our current politicians\Nto be changed into robots.
Dialogue: 0,0:11:51.79,0:11:53.34,Default,,0000,0000,0000,,That's not going to work.
Dialogue: 0,0:11:53.36,0:11:54.97,Default,,0000,0000,0000,,This is much more simple,
Dialogue: 0,0:11:55.00,0:11:56.15,Default,,0000,0000,0000,,much slower
Dialogue: 0,0:11:56.18,0:11:57.59,Default,,0000,0000,0000,,and much more humble.
Dialogue: 0,0:11:57.62,0:12:01.96,Default,,0000,0000,0000,,We start this revolution by creating\Nsimple systems like this in grad schools,
Dialogue: 0,0:12:01.99,0:12:04.08,Default,,0000,0000,0000,,in libraries, in nonprofits.
Dialogue: 0,0:12:04.11,0:12:06.76,Default,,0000,0000,0000,,And we try to figure out\Nall of those little questions
Dialogue: 0,0:12:06.78,0:12:08.01,Default,,0000,0000,0000,,and those little problems
Dialogue: 0,0:12:08.03,0:12:11.93,Default,,0000,0000,0000,,that we're going to have to figure out\Nto make this idea something viable,
Dialogue: 0,0:12:11.96,0:12:14.31,Default,,0000,0000,0000,,to make this idea something\Nthat we can trust.
Dialogue: 0,0:12:14.33,0:12:17.96,Default,,0000,0000,0000,,And as we create those systems that have\Na hundred people, a thousand people,
Dialogue: 0,0:12:17.99,0:12:21.76,Default,,0000,0000,0000,,a hundred thousand people voting\Nin ways that are not politically binding,
Dialogue: 0,0:12:21.78,0:12:23.80,Default,,0000,0000,0000,,we're going to develop trust in this idea,
Dialogue: 0,0:12:23.82,0:12:25.34,Default,,0000,0000,0000,,the world is going to change,
Dialogue: 0,0:12:25.37,0:12:28.24,Default,,0000,0000,0000,,and those that are as little\Nas my daughter is right now
Dialogue: 0,0:12:28.27,0:12:29.60,Default,,0000,0000,0000,,are going to grow up.
Dialogue: 0,0:12:30.58,0:12:32.95,Default,,0000,0000,0000,,And by the time my daughter is my age,
Dialogue: 0,0:12:32.97,0:12:37.41,Default,,0000,0000,0000,,maybe this idea, that I know\Ntoday is very crazy,
Dialogue: 0,0:12:37.43,0:12:41.57,Default,,0000,0000,0000,,might not be crazy to her\Nand to her friends.
Dialogue: 0,0:12:41.96,0:12:43.79,Default,,0000,0000,0000,,And at that point,
Dialogue: 0,0:12:43.82,0:12:46.42,Default,,0000,0000,0000,,we will be at the end of our history,
Dialogue: 0,0:12:46.44,0:12:49.26,Default,,0000,0000,0000,,but they will be\Nat the beginning of theirs.
Dialogue: 0,0:12:49.65,0:12:50.83,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:12:50.85,0:12:53.90,Default,,0000,0000,0000,,(Applause)