Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I call "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face?

(Laughter)

I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.
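To make the setup concrete, a webcam-driven face detector built on off-the-shelf software typically looks something like the minimal sketch below. This is only an illustration, not the talk's actual code: it assumes OpenCV and uses its bundled Haar-cascade frontal-face model as a stand-in for the "generic facial recognition software" mentioned above. A face that the detector's training data does not cover well simply produces no bounding box, which is the failure the white mask works around.

```python
# Minimal sketch of a webcam face-detection loop (illustrative assumption:
# OpenCV's bundled Haar cascade stands in for the off-the-shelf detector).
import cv2

# Pretrained frontal-face detector shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)  # the "cheap webcam"
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns a bounding box for each face it finds;
    # a face outside the detector's training distribution may return nothing.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```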
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess what happened: it couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.
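As a rough illustration of that point, the sketch below trains a toy face / not-a-face classifier on a deliberately narrow training set and compares how confident it is on a typical example versus one that deviates from the training norm. It is a hedged sketch only: the scikit-learn SVM, the synthetic feature vectors, and all the numbers are assumptions chosen for illustration, not the systems described in the talk.

```python
# Toy sketch of the training-set idea: a classifier only learns the patterns
# its examples contain. Feature vectors here are random stand-ins for real
# photos; all names and numbers are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# "This is a face. This is a face. This is not a face."
# A narrow training set: face examples drawn from a single tight cluster.
faces = rng.normal(loc=1.0, scale=0.2, size=(200, 64))       # label 1
non_faces = rng.normal(loc=-1.0, scale=0.2, size=(200, 64))  # label 0
X = np.vstack([faces, non_faces])
y = np.array([1] * 200 + [0] * 200)

model = SVC(kernel="rbf").fit(X, y)

# A face close to the training distribution gets a large, confident margin;
# a face that deviates from that norm sits far from everything the model has
# seen, so its margin shrinks and detection becomes unreliable.
typical = rng.normal(loc=1.0, scale=0.2, size=(1, 64))
atypical = rng.normal(loc=0.2, scale=0.2, size=(1, 64))  # under-represented
print("typical  margin:", model.decision_function(typical))
print("atypical margin:", model.decision_function(atypical))
```

The point is not the particular model: whatever the detector fails on is determined by what the training set left out, which is why full-spectrum training sets matter.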
Now, you've seen in my examples how it was through social robots that I found out about exclusion caused by algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. And so these are the three tenets that will make up the "incoding" movement: who codes matters, how we code matters and why we code matters.
So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences, like the ones I shared, and also by auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)