How many decisions have been made about you today, or this week, or this year, by artificial intelligence?

I build AI for a living, so full disclosure -- I'm kind of a nerd. And because I'm kind of a nerd, whenever a new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message freaking out about the future.

We see this everywhere: this media panic that our robot overlords are taking over. We could blame Hollywood for that. But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first.

So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times, or you've scrolled past adverts for fertility clinics on your Facebook feed. Or, in my case, Indian marriage bureaus.

(Laughter)

But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge-watch next. I wonder how you'd feel about someone who thought things like this: a black or Latino person is less likely than a white person to pay off their loan on time. A person called John makes a better programmer than a person called Mary. A black man is more likely to be a repeat offender than a white man.

You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right? These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans.
AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review. But these decisions are all being filtered through the AI's assumptions about our identity: our race, our gender, our age.

How is that happening? Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates.

Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision.
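To make that hiring example concrete, here is a minimal sketch in Python of how the feedback loop can play out. It is not the speaker's system or any real hiring product; the candidates, the scores and the naive rule of learning a hire rate per gender are all invented for illustration, but the pattern is the one just described: skewed history goes in, skewed screening comes out.

```python
# A hypothetical sketch, not any real hiring system: it shows how a "model"
# trained on a manager's past decisions absorbs the manager's bias.
# All of the data below is synthetic and invented for illustration.

history = [
    {"gender": "male",   "test_score": 82, "hired": True},
    {"gender": "male",   "test_score": 74, "hired": True},
    {"gender": "male",   "test_score": 68, "hired": True},
    {"gender": "male",   "test_score": 90, "hired": True},
    {"gender": "male",   "test_score": 65, "hired": False},
    {"gender": "female", "test_score": 88, "hired": False},
    {"gender": "female", "test_score": 79, "hired": False},
    {"gender": "female", "test_score": 91, "hired": True},
]

def hire_rate(records, gender):
    """Fraction of past candidates of this gender whom the manager hired."""
    group = [r for r in records if r["gender"] == gender]
    return sum(r["hired"] for r in group) / len(group)

# The "learning" step: the model turns the skewed history into a prior.
learned_prior = {g: hire_rate(history, g) for g in ("male", "female")}

def screen(candidate, threshold=0.5):
    """Pass a candidate to interview only if the learned prior clears the bar."""
    return learned_prior[candidate["gender"]] >= threshold

mary = {"name": "Mary", "gender": "female", "test_score": 93}
john = {"name": "John", "gender": "male",   "test_score": 70}

print(learned_prior)   # {'male': 0.8, 'female': 0.3333...}
print(screen(mary))    # False: screened out despite the higher score
print(screen(john))    # True
```

Swap in a real classifier and a real resume pipeline and the arithmetic gets more sophisticated, but the lesson stays the same: a model can only repeat the history it is shown, unless we deliberately correct for it.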
That's not it. We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female. They are designed to be our obedient servants, turning your lights on and off, ordering your shopping.

You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, or Salesforce Einstein, or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.

(Laughter)

Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of "CEO." The algorithm shows them results of mostly men. And now they Google "personal assistant." As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now they are barking orders at an obedient female voice assistant.

Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!

But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics, to AI.

So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from.
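On the first of those three points, being aware of the bias in the machines around us, here is one hypothetical way a team could start: measure how often a model selects people from different groups and flag large gaps. The model outputs, group labels and threshold below are invented for illustration; real audits go much deeper, but even a crude selection-rate comparison like this can surface a problem before it ships.

```python
# A hypothetical bias check, for illustration only: compare a model's
# selection rates across groups. The predictions below are synthetic.

from collections import defaultdict

# Pretend output of some screening model: (group, model_said_yes)
predictions = [
    ("men", True), ("men", True), ("men", False), ("men", True),
    ("women", False), ("women", True), ("women", False), ("women", False),
]

def selection_rates(preds):
    """Fraction of candidates in each group that the model selected."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, said_yes in preds:
        total[group] += 1
        selected[group] += said_yes
    return {g: selected[g] / total[g] for g in total}

rates = selection_rates(predictions)
print(rates)  # {'men': 0.75, 'women': 0.25}

# Flag any group whose rate falls below 80% of the best-treated group's rate
# (the "four-fifths" rule of thumb used in US hiring audits).
highest = max(rates.values())
flagged = {g: r for g, r in rates.items() if r < 0.8 * highest}
print(flagged)  # {'women': 0.25} -> worth investigating before this ships
```

The four-fifths threshold here is only a rule of thumb; the point is simply that bias gets fixed a lot faster once someone is actually measuring it.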
I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or an Elon Musk, your life is a little bit difficult, and your ability gets questioned. Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found that when I log on as myself, with my own photo and my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?"

So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender. You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done.

And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet I had to hide my gender in order for my work to be taken seriously. So, what's going on here? Are men just better at technology than women?

Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more often than men's. So this is not about the talent. This is about an elitism in AI that says a programmer needs to look like a certain person.

What we really need to do to make AI better is bring in people from all kinds of backgrounds. We need people who can write and tell stories, to help us create the personalities of AI. We need people who can solve problems. We need people who face different challenges, and we need people who can tell us what the real issues are that need fixing and help us find ways that technology can actually fix them.

Because when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless. And that's what I want to end by talking to you about: less about racist robots, less about machines that are going to take our jobs, and more about what technology can actually achieve.

So, yes, some of the energy in the world of AI, in the world of technology, is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better.

Think about a pregnant woman in the Democratic Republic of Congo who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get a diagnosis on her phone instead? Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise the alarm, and get financial and legal advice. These are all real examples of projects that people, including myself, are working on right now, using AI.

So I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.

(Laughter)

And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology. This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get-go. We need people of different genders, races, sexualities and backgrounds.
We need women to be the makers, and not just the machines who do the makers' bidding. We need to think very carefully about what we teach machines and what data we give them, so they don't just repeat our own past mistakes.

So I hope I leave you thinking about two things. First, I hope you leave thinking about bias today, and that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, you think and remember that the same technology is assuming that a black man will reoffend, or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg; you can look like me. And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases, and for all of us to get educated about this phenomenal technology in the future.

Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

Thank you.

(Applause)