1 00:00:13,298 --> 00:00:17,148 The rise of the machines! 2 00:00:17,928 --> 00:00:22,748 Who here is scared of killer robots? 3 00:00:23,140 --> 00:00:25,250 (Laughter) 4 00:00:25,612 --> 00:00:27,212 I am! 5 00:00:28,226 --> 00:00:31,896 I used to work in UAVs - Unmanned Aerial Vehicles - 6 00:00:31,896 --> 00:00:36,736 and all I could think, seeing these things, was that someday, 7 00:00:37,133 --> 00:00:40,903 somebody is going to strap a machine gun to these things, 8 00:00:40,903 --> 00:00:43,853 and they're going to hunt me down in swarms. 9 00:00:44,688 --> 00:00:49,848 I work in robotics at Brown University, and I'm scared of robots. 10 00:00:50,541 --> 00:00:53,281 Actually, I'm kind of terrified, 11 00:00:53,761 --> 00:00:55,811 but can you blame me? 12 00:00:55,811 --> 00:00:59,501 Ever since I was a kid, all I've seen are movies 13 00:00:59,501 --> 00:01:03,011 that portrayed the ascendance of Artificial Intelligence 14 00:01:03,011 --> 00:01:05,811 and our inevitable conflict with it - 15 00:01:05,811 --> 00:01:11,041 2001: A Space Odyssey, The Terminator, The Matrix - 16 00:01:11,800 --> 00:01:16,200 and the stories they tell are pretty scary: 17 00:01:16,200 --> 00:01:20,918 rogue bands of humans running away from superintelligent machines. 18 00:01:21,935 --> 00:01:26,795 That scares me. From the show of hands, it seems like it scares you as well. 19 00:01:26,795 --> 00:01:30,235 I know it is scary to Elon Musk. 20 00:01:30,825 --> 00:01:35,245 But, you know, we have a little bit of time before the robots rise up. 21 00:01:35,245 --> 00:01:38,571 Robots like the PR2 that I have at my initiative, 22 00:01:38,571 --> 00:01:41,381 they can't even open the door yet. 23 00:01:42,191 --> 00:01:46,707 So in my mind, this discussion of superintelligent robots 24 00:01:46,707 --> 00:01:51,997 is a little bit of a distraction from something far more insidious 25 00:01:51,997 --> 00:01:56,217 that is going on with AI systems across the country. 26 00:01:56,917 --> 00:02:00,067 You see, right now, there are people - 27 00:02:00,067 --> 00:02:04,227 doctors, judges, accountants - 28 00:02:04,227 --> 00:02:07,957 who are getting information from an AI system 29 00:02:07,957 --> 00:02:12,717 and treating it as if it were information from a trusted colleague. 30 00:02:13,931 --> 00:02:16,901 It's this trust that bothers me, 31 00:02:17,141 --> 00:02:20,182 not because of how often AI gets it wrong. 32 00:02:20,182 --> 00:02:24,403 AI researchers pride themselves on the accuracy of their results. 33 00:02:24,869 --> 00:02:27,849 It's how badly it gets it wrong when it makes a mistake 34 00:02:27,849 --> 00:02:29,779 that has me worried. 35 00:02:29,779 --> 00:02:33,579 These systems do not fail gracefully. 36 00:02:34,240 --> 00:02:36,960 So, let's take a look at what this looks like. 37 00:02:37,120 --> 00:02:42,560 This is a dog that has been misidentified as a wolf by an AI algorithm. 38 00:02:43,233 --> 00:02:45,239 The researchers wanted to know: 39 00:02:45,239 --> 00:02:49,509 why did this particular husky get misidentified as a wolf? 40 00:02:49,751 --> 00:02:52,721 So they rewrote the algorithm to explain to them 41 00:02:52,721 --> 00:02:55,651 the parts of the picture it was paying attention to 42 00:02:55,651 --> 00:02:58,501 when the AI algorithm made its decision. 43 00:02:59,039 --> 00:03:02,749 In this picture, what do you think it paid attention to? 44 00:03:02,869 --> 00:03:05,099 What would you pay attention to? 45 00:03:05,359 --> 00:03:10,479 Maybe the eyes, maybe the ears, the snout ...
46 00:03:13,041 --> 00:03:16,531 This is what it paid attention to: 47 00:03:16,981 --> 00:03:20,391 mostly the snow and the background of the picture. 48 00:03:21,003 --> 00:03:25,853 You see, there was bias in the data set that was fed to this algorithm. 49 00:03:26,293 --> 00:03:30,373 Most of the pictures of wolves were in snow, 50 00:03:30,573 --> 00:03:34,783 so the AI algorithm conflated the presence or absence of snow 51 00:03:34,783 --> 00:03:38,373 with the presence or absence of a wolf. 52 00:03:39,912 --> 00:03:42,027 The scary thing about this 53 00:03:42,027 --> 00:03:46,287 is that the researchers had no idea this was happening 54 00:03:46,287 --> 00:03:50,107 until they rewrote the algorithm to explain itself. 55 00:03:50,836 --> 00:03:55,326 And that's the thing with AI algorithms - deep learning, machine learning: 56 00:03:55,326 --> 00:03:59,346 even the developers who work on this stuff 57 00:03:59,346 --> 00:04:02,396 have no idea what it's doing. 58 00:04:03,001 --> 00:04:07,591 So, that might be a great example for research, 59 00:04:07,591 --> 00:04:10,281 but what does this mean in the real world? 60 00:04:10,611 --> 00:04:15,841 The COMPAS criminal sentencing algorithm is used in 13 states 61 00:04:15,841 --> 00:04:17,991 to determine criminal recidivism, 62 00:04:17,991 --> 00:04:22,471 or the risk of committing a crime again after you're released. 63 00:04:23,199 --> 00:04:26,959 ProPublica found that if you're African-American, 64 00:04:26,959 --> 00:04:32,023 COMPAS was 77% more likely to qualify you as a potentially violent offender 65 00:04:32,023 --> 00:04:34,123 than if you're Caucasian. 66 00:04:34,784 --> 00:04:39,404 This is a real system being used in the real world by real judges 67 00:04:39,404 --> 00:04:42,434 to make decisions about real people's lives. 68 00:04:44,115 --> 00:04:48,815 Why would the judges trust it if it seems to exhibit bias? 69 00:04:49,866 --> 00:04:55,176 Well, the reason they use COMPAS is that it is a model of efficiency. 70 00:04:55,622 --> 00:05:00,072 COMPAS lets them go through caseloads much faster 71 00:05:00,072 --> 00:05:02,992 in a backlogged criminal justice system. 72 00:05:04,877 --> 00:05:07,297 Why would they question their own software? 73 00:05:07,297 --> 00:05:10,957 It's been requisitioned by the state, approved by their IT department. 74 00:05:10,957 --> 00:05:13,357 Why would they question it? 75 00:05:13,357 --> 00:05:16,513 Well, the people sentenced by COMPAS have questioned it, 76 00:05:16,513 --> 00:05:18,853 and their lawsuits should chill us all. 77 00:05:19,243 --> 00:05:22,123 The Wisconsin State Supreme Court ruled 78 00:05:22,123 --> 00:05:25,643 that COMPAS did not deny a defendant due process, 79 00:05:25,643 --> 00:05:28,433 provided it was used "properly." 80 00:05:28,963 --> 00:05:30,688 In the same set of rulings, the court held 81 00:05:30,688 --> 00:05:34,758 that the defendant could not inspect the source code of COMPAS. 82 00:05:35,700 --> 00:05:39,990 It has to be used properly, but you can't inspect the source code? 83 00:05:40,425 --> 00:05:43,425 This is a disturbing set of rulings when taken together 84 00:05:43,425 --> 00:05:46,175 for anyone facing criminal sentencing.
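[Editor's note on the dog/wolf example above: the published husky/wolf study used the LIME explainer, but a much simpler occlusion check captures the same idea. The Python sketch below is purely illustrative and hedged - in particular, predict_wolf_prob is a hypothetical stand-in for whatever image classifier is being audited, not the researchers' actual model. It hides one patch of the image at a time and records how much the "wolf" score drops; if the biggest drops sit over the snowy background rather than the animal, you have found exactly the bias described above.

import numpy as np

def occlusion_map(image, predict_wolf_prob, patch=16, fill=0.5):
    """Slide a gray patch across the image and record how much the 'wolf'
    score drops when each region is hidden; bigger drops mean the model
    relied more on that region."""
    h, w, _ = image.shape
    baseline = predict_wolf_prob(image)          # score on the untouched image
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch, :] = fill   # hide this patch
            heat[i // patch, j // patch] = baseline - predict_wolf_prob(occluded)
    return heat

# predict_wolf_prob is hypothetical: any function mapping an HxWx3 float array
# to a probability would do. Hot cells over the snow rather than the dog would
# reveal the "snow means wolf" shortcut the researchers uncovered.

This is the kind of audit that defendants are denied when they cannot inspect a sentencing tool like COMPAS.]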
85 00:05:46,625 --> 00:05:50,705 You may not care about this because you're not facing criminal sentencing, 86 00:05:51,056 --> 00:05:55,056 but what if I told you that black-box AI algorithms like this 87 00:05:55,056 --> 00:05:59,376 are being used to decide whether or not you can get a loan for your house, 88 00:06:00,144 --> 00:06:02,844 whether you get a job interview, 89 00:06:03,364 --> 00:06:05,863 whether you get Medicaid, 90 00:06:05,954 --> 00:06:10,434 and are even driving cars and trucks down the highway? 91 00:06:10,831 --> 00:06:14,531 Would you want the public to be able to inspect the algorithm 92 00:06:14,531 --> 00:06:17,239 that's trying to make a decision between a shopping cart 93 00:06:17,239 --> 00:06:20,899 and a baby carriage in a self-driving truck, 94 00:06:20,899 --> 00:06:23,679 in the same way the dog/wolf algorithm was trying to decide 95 00:06:23,679 --> 00:06:26,069 between a dog and a wolf? 96 00:06:26,282 --> 00:06:31,462 Are you potentially a metaphorical dog who's been misidentified as a wolf 97 00:06:31,462 --> 00:06:34,262 by somebody's AI algorithm? 98 00:06:34,868 --> 00:06:38,718 Considering the complexity of people, it's possible. 99 00:06:38,811 --> 00:06:42,031 Is there anything you can do about it now? 100 00:06:42,031 --> 00:06:46,841 Probably not, and that's what we need to focus on. 101 00:06:47,487 --> 00:06:50,567 We need to demand standards of accountability, 102 00:06:50,567 --> 00:06:55,397 transparency, and recourse in AI systems. 103 00:06:56,456 --> 00:07:01,034 ISO, the International Organization for Standardization, just formed a committee 104 00:07:01,034 --> 00:07:04,504 to make decisions about what to do for AI standards. 105 00:07:04,923 --> 00:07:08,739 They're about five years out from coming up with a standard. 106 00:07:08,989 --> 00:07:12,479 These systems are being used now, 107 00:07:13,671 --> 00:07:19,361 not just in loans, but in vehicles, like I was saying. 108 00:07:20,841 --> 00:07:25,273 They're being used in things like Cooperative Adaptive Cruise Control. 109 00:07:25,273 --> 00:07:27,973 It's funny that they call that "cruise control," 110 00:07:27,973 --> 00:07:32,703 because the type of controller used in cruise control, a PID controller, 111 00:07:32,703 --> 00:07:38,323 was used for 30 years in chemical plants before it ever made it into a car. 112 00:07:39,139 --> 00:07:41,138 The type of controller that's used 113 00:07:41,138 --> 00:07:44,628 to drive a self-driving car, a machine-learning controller, 114 00:07:44,628 --> 00:07:48,878 has only been used in research since 2007. 115 00:07:49,680 --> 00:07:52,230 These are new technologies. 116 00:07:52,470 --> 00:07:56,430 We need to demand the standards, and we need to demand regulation, 117 00:07:56,430 --> 00:08:00,340 so that we don't get snake oil in the marketplace. 118 00:08:00,819 --> 00:08:05,059 And we also have to have a little bit of skepticism. 119 00:08:05,861 --> 00:08:07,871 The experiments in authority 120 00:08:07,871 --> 00:08:11,121 done by Stanley Milgram after World War II 121 00:08:11,121 --> 00:08:16,031 showed that your average person would follow an authority figure's orders 122 00:08:16,031 --> 00:08:19,741 even if it meant harming their fellow citizen.
123 00:08:20,461 --> 00:08:22,850 In this experiment, 124 00:08:22,850 --> 00:08:27,130 everyday Americans would shock an actor 125 00:08:27,689 --> 00:08:31,269 past the point of him complaining about heart trouble, 126 00:08:31,577 --> 00:08:35,427 past the point of him screaming in pain, 127 00:08:35,934 --> 00:08:40,894 past the point of him going silent in simulated death, 128 00:08:41,613 --> 00:08:44,099 all because somebody 129 00:08:44,099 --> 00:08:47,969 with no credentials, in a lab coat, 130 00:08:47,969 --> 00:08:50,795 was saying some variation of the phrase 131 00:08:50,795 --> 00:08:54,475 "The experiment must continue." 132 00:08:56,945 --> 00:09:02,398 In AI, we have Milgram's ultimate authority figure. 133 00:09:03,656 --> 00:09:08,366 We have a dispassionate system that can't reflect, 134 00:09:09,370 --> 00:09:12,640 that can't make another decision, 135 00:09:12,920 --> 00:09:14,902 that there is no recourse to, 136 00:09:15,074 --> 00:09:20,454 that will always say "The system" or "The process" must continue. 137 00:09:23,313 --> 00:09:25,883 Now, I'm going to tell you a little story. 138 00:09:25,883 --> 00:09:29,723 It's about a car trip I took driving across the country. 139 00:09:30,790 --> 00:09:34,690 I was coming into Salt Lake City and it started raining. 140 00:09:35,211 --> 00:09:39,900 As I climbed into the mountains, that rain turned into snow, 141 00:09:40,380 --> 00:09:42,580 and pretty soon that snow was a whiteout. 142 00:09:42,580 --> 00:09:45,720 I couldn't see the taillights of the car in front of me. 143 00:09:46,153 --> 00:09:48,023 I started skidding. 144 00:09:48,023 --> 00:09:51,033 I went 360 one way, I went 360 the other way. 145 00:09:51,033 --> 00:09:52,773 I went off the highway. 146 00:09:52,773 --> 00:09:54,953 Mud coated my windows; I couldn't see a thing. 147 00:09:54,953 --> 00:09:59,103 I was terrified some car was going to come crashing into me. 148 00:09:59,924 --> 00:10:03,864 Now, I'm telling you this story to get you thinking 149 00:10:03,864 --> 00:10:07,324 about how something small and seemingly mundane, 150 00:10:07,324 --> 00:10:10,094 like a little bit of precipitation, 151 00:10:10,094 --> 00:10:14,584 can easily grow into something very dangerous. 152 00:10:15,409 --> 00:10:19,789 We are driving in the rain with AI right now, 153 00:10:20,442 --> 00:10:23,402 and that rain will turn to snow, 154 00:10:23,887 --> 00:10:27,457 and that snow could become a blizzard. 155 00:10:28,097 --> 00:10:30,347 We need to pause, 156 00:10:30,537 --> 00:10:32,857 check the conditions, 157 00:10:33,002 --> 00:10:35,502 put in place safety standards, 158 00:10:35,642 --> 00:10:41,232 and ask ourselves how far we want to go, 159 00:10:42,267 --> 00:10:46,437 because the economic incentives for AI and automation 160 00:10:46,437 --> 00:10:48,363 to replace human labor 161 00:10:48,363 --> 00:10:53,313 will be beyond anything we have seen since the Industrial Revolution. 162 00:10:54,043 --> 00:10:58,040 Human salary demands can't compete 163 00:10:58,040 --> 00:11:01,650 with the base cost of electricity. 164 00:11:02,480 --> 00:11:07,915 AIs and robots will replace fry cooks at fast-food joints 165 00:11:07,915 --> 00:11:10,415 and radiologists in hospitals. 166 00:11:11,135 --> 00:11:14,330 Someday, an AI will diagnose your cancer, 167 00:11:14,330 --> 00:11:17,430 and a robot will perform the surgery.
168 00:11:17,960 --> 00:11:22,420 Only a healthy skepticism of these systems 169 00:11:22,420 --> 00:11:25,981 is going to help keep people in the loop. 170 00:11:26,291 --> 00:11:31,473 And I'm confident that, if we can keep people in the loop, 171 00:11:31,473 --> 00:11:36,393 if we can build transparent AI systems like the dog/wolf example, 172 00:11:36,393 --> 00:11:39,658 where the AI explained to people what it was doing 173 00:11:39,658 --> 00:11:42,588 and people were able to spot-check it, 174 00:11:42,588 --> 00:11:47,028 we can create new jobs for people partnering with AI. 175 00:11:48,573 --> 00:11:51,043 If we work together with AI, 176 00:11:51,043 --> 00:11:55,563 we will probably be able to solve some of our greatest challenges. 177 00:11:56,764 --> 00:12:01,354 But to do that, we need to lead and not follow. 178 00:12:01,801 --> 00:12:05,491 We need to choose to be less like robots, 179 00:12:05,491 --> 00:12:10,131 and we need to build the robots to be more like people, 180 00:12:11,020 --> 00:12:13,350 because ultimately, 181 00:12:13,350 --> 00:12:18,160 the only thing we need to fear is not killer robots - 182 00:12:18,700 --> 00:12:21,641 it's our own intellectual laziness. 183 00:12:22,215 --> 00:12:26,655 The only thing we need to fear is ourselves. 184 00:12:26,989 --> 00:12:28,441 Thank you. 185 00:12:28,441 --> 00:12:29,831 (Applause)