Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I call "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face.
I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives.
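The training-set idea described above ("this is a face, this is a face, this is not a face") can be made concrete with a minimal sketch. This is a toy illustration, not the generic facial recognition software mentioned in the talk: it assumes two hypothetical folders of small image patches, faces/ and not_faces/, and uses Pillow and scikit-learn with a plain logistic regression as a stand-in for a real detector.

```python
# Minimal sketch of training a face / not-face classifier from labeled examples.
# Assumptions (not from the talk): patches live in "faces/" and "not_faces/",
# and a logistic regression on raw pixels stands in for a real detector.
import glob
import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def load_patches(pattern, label, size=(24, 24)):
    """Load image patches, convert to grayscale, resize, and flatten to vectors."""
    X, y = [], []
    for path in glob.glob(pattern):
        img = Image.open(path).convert("L").resize(size)
        X.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
        y.append(label)
    return X, y

# "This is a face. This is a face. This is not a face."
faces, face_labels = load_patches("faces/*.png", 1)        # positive examples
others, other_labels = load_patches("not_faces/*.png", 0)  # negative examples

X = np.array(faces + others)
y = np.array(face_labels + other_labels)

# Hold out a slice of the data to see how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
# The model can only learn from the examples it is shown: if the "faces/"
# folder under-represents some groups, faces unlike those examples will be
# missed even when the overall accuracy number looks healthy.
```

The point the sketch illustrates is the one made in the talk: whatever the model learns is bounded by its training set, so an aggregate accuracy score can hide systematic failures on under-represented faces, which is why full-spectrum training sets and per-group audits matter.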
So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)