Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face.
I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. (A toy code sketch of this training-set idea follows at the end of this passage.) But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how social robots were where I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives.
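The talk doesn't name the software or model behind any of these examples, so what follows is only a toy sketch of the training-set idea described above, assuming Python with numpy and scikit-learn. A "face vs. not face" classifier is fit to labeled examples, and because its training set contains faces from only one synthetic group, faces from an unseen group are mostly missed. The groups, the pixel statistics, and the fake_faces helper are hypothetical stand-ins, not data from the talk.

# Illustrative toy sketch only -- not the software from the talk. A classifier
# learns "face vs. not face" from labeled examples; faces unlike anything in
# its training set tend to be missed. Random arrays stand in for flattened
# image crops so the sketch stays self-contained; real systems train on large
# photo datasets, typically with convolutional networks.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N_FEATURES = 32 * 32  # pretend each example is a flattened 32x32 grayscale crop

def fake_faces(n, mean_intensity):
    # Hypothetical "faces" from one demographic group: a tight cluster of
    # crops around a group-specific mean pixel intensity.
    return rng.normal(loc=mean_intensity, scale=0.1, size=(n, N_FEATURES))

# Training set: plenty of group-A faces, no group-B faces (a non-diverse set),
# plus random clutter labeled as "not a face."
faces_a = fake_faces(800, mean_intensity=0.8)
not_faces = rng.uniform(0.0, 1.0, size=(800, N_FEATURES))
X_train = np.vstack([faces_a, not_faces])
y_train = np.array([1] * 800 + [0] * 800)  # 1 = face, 0 = not a face

detector = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The detector does well on faces like the ones it was trained on...
print("group A detection rate:", detector.predict(fake_faces(200, 0.8)).mean())
# ...but faces that deviate from that norm are mostly missed.
print("group B detection rate:", detector.predict(fake_faces(200, 0.3)).mean())

The point of the toy is only the mechanism: whatever a training set under-represents, the resulting detector tends to miss, which is what the full-spectrum training sets mentioned above are meant to address.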
So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)