The rise of the machines! Who here is scared of killer robots?

(Laughter)

I am! I used to work in UAVs - Unmanned Aerial Vehicles - and all I could think, seeing these things, is that someday somebody is going to strap a machine gun to these things, and they're going to hunt me down in swarms.

I work in robotics at Brown University, and I'm scared of robots. Actually, I'm kind of terrified, but can you blame me? Ever since I was a kid, all I've seen are movies that portrayed the ascendance of Artificial Intelligence and our inevitable conflict with it - 2001: A Space Odyssey, The Terminator, The Matrix - and the stories they tell are pretty scary: rogue bands of humans running away from superintelligent machines.

That scares me. From the show of hands, it seems like it scares you as well. I know it's scary to Elon Musk. But, you know, we have a little bit of time before the robots rise up. Robots like the PR2 that I have at my initiative, they can't even open the door yet. So in my mind, this discussion of superintelligent robots is a little bit of a distraction from something far more insidious that is going on with AI systems across the country.

You see, right now there are people - doctors, judges, accountants - who are getting information from an AI system and treating it as if it were information from a trusted colleague.

It's this trust that bothers me, not because of how often AI gets it wrong. AI researchers pride themselves on the accuracy of their results. It's how badly it gets it wrong when it makes a mistake that has me worried. These systems do not fail gracefully.

So let's take a look at what this looks like. This is a dog that has been misidentified as a wolf by an AI algorithm. The researchers wanted to know: why did this particular husky get misidentified as a wolf? So they rewrote the algorithm to explain to them the parts of the picture it was paying attention to when it made its decision.

In this picture, what do you think it paid attention to? What would you pay attention to? Maybe the eyes, maybe the ears, the snout ...

This is what it paid attention to: mostly the snow and the background of the picture. You see, there was bias in the data set that was fed to this algorithm.
Most of the pictures of wolves were in snow, so the AI algorithm conflated the presence or absence of snow with the presence or absence of a wolf.

The scary thing about this is that the researchers had no idea it was happening until they rewrote the algorithm to explain itself. And that's the thing with AI algorithms, deep learning, machine learning: even the developers who work on this stuff have no idea what it's doing.

So that might be a great example for research, but what does this mean in the real world? The COMPAS criminal sentencing algorithm is used in 13 states to determine criminal recidivism, or the risk of committing a crime again after you're released.

ProPublica found that if you're African-American, COMPAS was 77% more likely to qualify you as a potentially violent offender than if you're Caucasian. This is a real system being used in the real world by real judges to make decisions about real people's lives.

Why would the judges trust it if it seems to exhibit bias? Well, the reason they use COMPAS is that it is a model of efficiency. COMPAS lets them go through caseloads much faster in a backlogged criminal justice system.

Why would they question their own software? It's been requisitioned by the state, approved by their IT department. Why would they question it? Well, the people sentenced by COMPAS have questioned it, and their lawsuits should chill us all.

The Wisconsin State Supreme Court ruled that COMPAS did not deny a defendant due process provided it was used "properly." In the same set of rulings, it ruled that the defendant could not inspect the source code of COMPAS. It has to be used properly, but you can't inspect the source code? Taken together, this is a disturbing set of rulings for anyone facing criminal sentencing.

You may not care about this because you're not facing criminal sentencing, but what if I told you that black-box AI algorithms like this are being used to decide whether or not you can get a loan for your house, whether you get a job interview, whether you get Medicaid, and are even driving cars and trucks down the highway?
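The husky check described above, asking the classifier which parts of the image actually drove its "wolf" decision, can be sketched with a simple occlusion test. This is a minimal illustration rather than the researchers' actual method: the model.predict call, the class index wolf_class, the patch size, and the gray fill value are all hypothetical assumptions.

    import numpy as np

    def occlusion_saliency(model, image, wolf_class, patch=32, stride=16, fill=0.5):
        # Score the untouched image first (model.predict is a hypothetical API
        # that takes a batch of images and returns class probabilities).
        base = model.predict(image[None])[0, wolf_class]
        h, w, _ = image.shape
        heat = np.zeros((h, w))
        hits = np.zeros((h, w))
        # Slide a gray patch over the image and record how much the "wolf"
        # probability drops when each region is hidden.
        for y in range(0, h - patch + 1, stride):
            for x in range(0, w - patch + 1, stride):
                occluded = image.copy()
                occluded[y:y + patch, x:x + patch, :] = fill
                drop = base - model.predict(occluded[None])[0, wolf_class]
                heat[y:y + patch, x:x + patch] += drop
                hits[y:y + patch, x:x + patch] += 1
        # Average drop per pixel: large values mark the regions the model relies on.
        return heat / np.maximum(hits, 1)

If the resulting heat map lights up on the snowy background rather than on the animal, you have found the same red flag the researchers describe.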
Would you want the public to be able to inspect the algorithm that's trying to make a decision between a shopping cart and a baby carriage in a self-driving truck, in the same way the dog/wolf algorithm was trying to decide between a dog and a wolf?

Are you potentially a metaphorical dog who's been misidentified as a wolf by somebody's AI algorithm? Considering the complexity of people, it's possible. Is there anything you can do about it now? Probably not, and that's what we need to focus on. We need to demand standards of accountability, transparency, and recourse in AI systems.

ISO, the International Standards Organization, just formed a committee to make decisions about what to do for AI standards. They're about five years out from coming up with a standard. These systems are being used now - not just in loans, but in vehicles, like I was saying.

They're being used in things like Cooperative Adaptive Cruise Control. It's funny that they call that "cruise control," because the type of controller used in cruise control, a PID controller, was used for 30 years in chemical plants before it ever made it into a car. The type of controller that's used to drive a self-driving car, machine learning, has only been used in research since 2007. These are new technologies. We need to demand the standards, and we need to demand regulation, so that we don't get snake oil in the marketplace.

And we also have to have a little bit of skepticism. The experiments in authority done by Stanley Milgram after World War II showed that your average person would follow an authority figure's orders even if it meant harming their fellow citizens.

In this experiment, everyday Americans would shock an actor past the point of him complaining about heart trouble, past the point of him screaming in pain, past the point of him going silent in simulated death, all because somebody with no credentials, in a lab coat, was saying some variation of the phrase "The experiment must continue."

In AI, we have Milgram's ultimate authority figure. We have a dispassionate system that can't reflect, that can't make another decision, that there is no recourse to, that will always say, "The system must continue," or, "The process must continue."
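For contrast with the learned controllers mentioned in the cruise-control passage above, here is a minimal sketch of a PID speed controller. The gains, the setpoint, and the update interval are illustrative assumptions, not values from any real vehicle.

    class PID:
        # Classic proportional-integral-derivative controller:
        # output = Kp*error + Ki*integral(error) + Kd*d(error)/dt
        def __init__(self, kp, ki, kd, setpoint):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint = setpoint      # desired speed in m/s
            self.integral = 0.0
            self.prev_error = None

        def update(self, measured, dt):
            error = self.setpoint - measured
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Hold roughly 29 m/s (about 65 mph); call update once per control tick.
    cruise = PID(kp=0.5, ki=0.1, kd=0.05, setpoint=29.0)
    throttle = cruise.update(measured=27.0, dt=0.1)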
Now, I'm going to tell you a little story. It's about a car trip I took driving across the country.

I was coming into Salt Lake City and it started raining. As I climbed into the mountains, that rain turned into snow, and pretty soon that snow was a whiteout. I couldn't see the taillights of the car in front of me. I started skidding. I went 360 one way, I went 360 the other way. I went off the highway. Mud coated my windows; I couldn't see a thing. I was terrified some car was going to come crashing into me.

Now, I'm telling you this story to get you thinking about how something small and seemingly mundane, like a little bit of precipitation, can easily grow into something very dangerous. We are driving in the rain with AI right now, and that rain will turn to snow, and that snow could become a blizzard.

We need to pause, check the conditions, put in place safety standards, and ask ourselves how far we want to go, because the economic incentives for AI and automation to replace human labor will be beyond anything we have seen since the Industrial Revolution. Human salary demands can't compete with the base cost of electricity.

AIs and robots will replace fry cooks at fast-food joints and radiologists in hospitals. Someday, an AI will diagnose your cancer, and a robot will perform the surgery. Only a healthy skepticism of these systems is going to help keep people in the loop.

And I'm confident that if we can keep people in the loop, if we can build transparent AI systems like the dog/wolf example, where the AI explained what it was doing to people and people were able to spot-check it, we can create new jobs for people partnering with AI.

If we work together with AI, we will probably be able to solve some of our greatest challenges. But to do that, we need to lead and not follow. We need to choose to be less like robots, and we need to build the robots to be more like people, because ultimately, the only thing we need to fear is not killer robots, it's our own intellectual laziness. The only thing we need to fear is ourselves.

Thank you.

(Applause)