Hello, I'm Joy, a poet of code on a mission to stop an unseen force that's rising, a force that I call "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Camera. I've got a face. Can you see my face? No-glasses face. You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found that it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo.
The demo worked on everybody until it got to me, and you can probably guess it: it couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces: this is a face, this is a face, this is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. But don't worry, there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now, you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias, but algorithmic bias can also lead to discriminatory practices. Across the U.S., police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the U.S. -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision.
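[Editor's illustration] The detection failures described above come down to what the underlying model was trained on. The talk does not name the software involved, so the following is only a minimal sketch, assuming Python with OpenCV (cv2) and its bundled Haar cascade; the file name "frame.jpg" is a placeholder for a single webcam frame.

```python
# Illustrative sketch only: invoking a generic, pre-trained face detector.
# Assumes OpenCV (cv2) with its bundled Haar cascade; this is not the
# specific software referenced in the talk.
import cv2

# Load the frontal-face Haar cascade that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# "frame.jpg" is a placeholder for a captured webcam frame.
image = cv2.imread("frame.jpg")
if image is None:
    raise FileNotFoundError("frame.jpg not found")

# The cascade operates on grayscale images.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns a bounding box for every region the model
# classifies as a face. A face unlike those in the detector's training
# data is more likely to yield no box at all -- a silent failure.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
```

The point of the sketch is the failure mode: a call like this returns an empty result rather than an error when a face falls outside the detector's training distribution, which is why the full-spectrum training sets described above matter.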
In her book "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious, and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. And so these are the three tenets that will make up the "incoding" movement: who codes matters, how we code matters, and why we code matters.

So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester, and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change.

Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)