0:00:00.881,0:00:04.635 How many decisions[br]have been made about you today,
0:00:04.659,0:00:07.278 or this week or this year,
0:00:07.302,0:00:09.238 by artificial intelligence?
0:00:10.943,0:00:12.641 I build AI for a living,
0:00:12.665,0:00:15.236 so full disclosure -- I'm kind of a nerd.
0:00:15.760,0:00:18.093 And because I'm kind of a nerd,
0:00:18.117,0:00:20.471 whenever a new news story comes out
0:00:20.495,0:00:23.918 about artificial intelligence[br]stealing all our jobs,
0:00:23.942,0:00:28.172 or robots getting citizenship[br]of an actual country,
0:00:28.196,0:00:31.292 I'm the person my friends[br]and followers message
0:00:31.316,0:00:33.173 freaking out about the future.
0:00:33.839,0:00:35.474 We see this everywhere.
0:00:35.957,0:00:40.395 This media panic that[br]our robot overlords are taking over.
0:00:40.871,0:00:43.137 We could blame Hollywood for that.
0:00:44.109,0:00:48.267 But in reality, that's not the problem[br]we should be focusing on.
0:00:49.259,0:00:52.910 There is a more pressing danger,[br]a bigger risk with AI,
0:00:52.934,0:00:54.667 that we need to fix first.
0:00:55.394,0:00:57.744 So we are back to this question:
0:00:57.768,0:01:02.466 How many decisions have been made[br]about you today by AI?
0:01:03.800,0:01:05.752 And how many of these
0:01:05.776,0:01:10.275 were based on your gender,[br]your race or your background?
0:01:12.513,0:01:15.247 Algorithms are being used all the time
0:01:15.271,0:01:19.275 to make decisions about who we are[br]and what we want.
0:01:20.204,0:01:23.847 Some of the women in this room[br]will know what I'm talking about
0:01:23.871,0:01:27.609 if you've been made to sit through[br]those pregnancy test adverts on YouTube
0:01:27.633,0:01:29.289 like 1,000 times.
0:01:29.736,0:01:32.611 Or you've scrolled past adverts[br]for fertility clinics
0:01:32.635,0:01:34.974 on your Facebook feed.
0:01:35.641,0:01:38.030 Or in my case, Indian marriage bureaus.
0:01:38.054,0:01:39.307 (Laughter)
0:01:39.331,0:01:42.323 But AI isn't just being used[br]to make decisions
0:01:42.347,0:01:44.950 about what products we want to buy
0:01:44.974,0:01:47.774 or which show we want to binge-watch next.
0:01:49.030,0:01:53.567 I wonder how you'd feel about someone[br]who thought things like this:
0:01:54.266,0:01:56.170 a black or Latino person
0:01:56.194,0:02:00.291 is less likely than a white person[br]to pay off their loan on time.
0:02:01.528,0:02:04.366 A person called John[br]makes a better programmer
0:02:04.390,0:02:06.055 than a person called Mary.
0:02:07.230,0:02:12.341 A black man is more likely to be[br]a repeat offender than a white man.
0:02:14.950,0:02:16.144 You're probably thinking,
0:02:16.168,0:02:19.989 "Wow, that sounds like a pretty sexist,[br]racist person," right?
0:02:20.990,0:02:25.859 These are some real decisions[br]that AI has made very recently,
0:02:25.883,0:02:28.812 based on the biases[br]it has learned from us,
0:02:28.836,0:02:29.986 from the humans.
0:02:31.741,0:02:36.556 AI is being used to help decide[br]whether or not you get that job interview;
0:02:36.580,0:02:39.000 how much you pay for your car insurance;
0:02:39.024,0:02:40.889 how good your credit score is;
0:02:40.913,0:02:44.611 and even what rating you get[br]in your annual performance review.
0:02:45.006,0:02:48.219 But these decisions[br]are all being filtered through
0:02:48.243,0:02:54.138 its assumptions about our identity,[br]our race, our gender, our age.
0:02:56.269,0:02:57.736 How is that happening?
0:02:58.538,0:03:02.102 Now, imagine an AI is helping[br]a hiring manager
0:03:02.126,0:03:04.934 find the next tech leader in the company.
0:03:04.958,0:03:08.077 So far, the manager[br]has been hiring mostly men.
0:03:08.101,0:03:12.729 So the AI learns men are more likely[br]to be programmers than women.
0:03:13.526,0:03:16.454 And it's a very short leap from there to:
0:03:16.478,0:03:18.884 men make better programmers than women.
0:03:19.336,0:03:22.784 We have reinforced[br]our own bias into the AI.
0:03:23.165,0:03:26.791 And now, it's screening out[br]female candidates.
0:03:28.903,0:03:31.926 Hang on, if a human[br]hiring manager did that,
0:03:31.950,0:03:34.307 we'd be outraged, we wouldn't allow it.
0:03:34.331,0:03:37.398 This kind of gender[br]discrimination is not OK.
0:03:37.815,0:03:42.338 And yet somehow,[br]AI has become above the law,
0:03:42.362,0:03:44.489 because a machine made the decision.
0:03:45.838,0:03:46.988 And that's not all.
0:03:47.377,0:03:52.053 We are also reinforcing our bias[br]in how we interact with AI.
0:03:52.902,0:03:58.059 How often do you use a voice assistant[br]like Siri, Alexa or even Cortana?
0:03:58.850,0:04:01.445 They all have two things in common:
0:04:01.469,0:04:04.596 one, they can never get my name right,
0:04:04.620,0:04:07.309 and two, they are all female.
0:04:08.419,0:04:11.188 They are designed to be[br]our obedient servants,
0:04:11.212,0:04:14.238 turning your lights on and off,[br]ordering your shopping.
0:04:15.117,0:04:18.426 You get male AIs too,[br]but they tend to be more high-powered,
0:04:18.450,0:04:21.526 like IBM Watson,[br]making business decisions,
0:04:21.550,0:04:25.558 Salesforce Einstein[br]or ROSS, the robot lawyer.
0:04:26.228,0:04:30.267 So poor robots, even they suffer[br]from sexism in the workplace.
0:04:30.291,0:04:31.441 (Laughter)
0:04:32.530,0:04:35.378 Think about how these two things combine
0:04:35.402,0:04:39.984 and affect a kid growing up[br]around AI in today's world.
0:04:40.744,0:04:43.664 So they're doing some research[br]for a school project
0:04:43.688,0:04:46.728 and they google images of "CEO."
0:04:46.752,0:04:49.649 The algorithm shows them[br]results of mostly men.
0:04:49.673,0:04:52.221 And now, they google "personal assistant."
0:04:52.245,0:04:55.275 As you can guess,[br]it shows them mostly females.
0:04:55.704,0:04:59.323 And then they want to put on some music,[br]and maybe order some food,
0:04:59.347,0:05:05.545 and now, they are barking orders[br]at an obedient female voice assistant.
0:05:07.554,0:05:12.279 Some of our brightest minds[br]are creating this technology today.
0:05:12.848,0:05:16.541 Technology that they could have created[br]in any way they wanted.
0:05:17.085,0:05:22.752 And yet, they have chosen to create it[br]in the style of a 1950s "Mad Men" secretary.
0:05:22.776,0:05:23.926 Yay!
0:05:24.958,0:05:26.252 But OK, don't worry,
0:05:26.276,0:05:28.331 this is not going to end[br]with me telling you
0:05:28.355,0:05:31.847 that we are all heading towards[br]sexist, racist machines running the world.
0:05:32.783,0:05:38.457 The good news about AI[br]is that it is entirely within our control.
0:05:39.346,0:05:43.053 We get to teach the right values,[br]the right ethics to AI.
0:05:44.172,0:05:46.354 So there are three things we can do.
0:05:46.378,0:05:49.735 One, we can be aware of our own biases
0:05:49.759,0:05:52.093 and the bias in machines around us.
0:05:52.498,0:05:57.037 Two, we can make sure that we have[br]diverse teams building this technology.
0:05:57.061,0:06:01.566 And three, we have to give it[br]diverse experiences to learn from.
0:06:02.864,0:06:06.181 I can talk about the first two[br]from personal experience.
0:06:06.205,0:06:07.659 When you work in technology
0:06:07.683,0:06:11.070 and you don't look like[br]a Mark Zuckerberg or Elon Musk,
0:06:11.094,0:06:14.537 your life is a little bit difficult,[br]your ability gets questioned.
0:06:15.910,0:06:17.263 Here's just one example.
0:06:17.287,0:06:21.038 Like most developers,[br]I often join online tech forums
0:06:21.062,0:06:23.595 and share my knowledge to help others.
0:06:24.299,0:06:25.617 And I've found,
0:06:25.641,0:06:29.585 when I log on as myself,[br]with my own photo, my own name,
0:06:29.609,0:06:34.212 I tend to get questions[br]or comments like this:
0:06:34.236,0:06:37.562 "What makes you think[br]you're qualified to talk about AI?"
0:06:38.443,0:06:41.514 "What makes you think[br]you know about machine learning?"
0:06:41.974,0:06:45.379 So, as you do, I made a new profile,
0:06:45.403,0:06:49.576 and this time, instead of my own picture,[br]I chose a cat with a jet pack on it.
0:06:50.292,0:06:53.152 And I chose a name[br]that did not reveal my gender.
0:06:53.927,0:06:56.633 You can probably guess[br]where this is going, right?
0:06:56.657,0:07:03.077 So, this time, I didn't get any of those[br]patronizing comments about my ability
0:07:03.101,0:07:06.172 and I was able to actually[br]get some work done.
0:07:07.482,0:07:08.966 And it sucks, guys.
0:07:09.371,0:07:11.871 I've been building robots since I was 15,
0:07:11.895,0:07:14.156 I have a few degrees in computer science,
0:07:14.180,0:07:16.593 and yet, I had to hide my gender
0:07:16.617,0:07:19.312 in order for my work[br]to be taken seriously.
0:07:19.887,0:07:21.752 So, what's going on here?
0:07:21.776,0:07:25.212 Are men just better[br]at technology than women?
0:07:25.887,0:07:27.490 Another study found
0:07:27.514,0:07:32.414 that when women coders on one platform[br]hid their gender, like myself,
0:07:32.438,0:07:35.680 their code was accepted[br]four percent more often than men's.
0:07:36.557,0:07:39.064 So this is not about the talent.
0:07:39.973,0:07:42.847 This is about an elitism in AI
0:07:42.871,0:07:46.162 that says a programmer[br]needs to look like a certain person.
0:07:47.363,0:07:50.490 What we really need to do[br]to make AI better
0:07:50.514,0:07:53.647 is bring in people from[br]all kinds of backgrounds.
0:07:54.537,0:07:57.109 We need people who can[br]write and tell stories
0:07:57.133,0:07:59.666 to help us create the personalities of AI.
0:08:00.220,0:08:02.753 We need people who can solve problems.
0:08:03.109,0:08:06.886 We need people who face[br]different challenges
0:08:06.910,0:08:12.275 and we need people who can tell us[br]what the real issues are that need fixing,
0:08:12.299,0:08:15.735 and help us find ways[br]that technology can actually fix them.
0:08:17.830,0:08:21.576 Because, when people[br]from diverse backgrounds come together,
0:08:21.600,0:08:23.726 when we build things in the right way,
0:08:23.750,0:08:25.884 the possibilities are limitless.
0:08:26.720,0:08:29.548 And that's what I want to end[br]by talking to you about.
0:08:30.101,0:08:34.322 Less about racist robots, less about machines[br]that are going to take our jobs,
0:08:34.346,0:08:37.712 and more about what technology[br]can actually achieve.
0:08:38.284,0:08:41.714 So, yes, some of the energy[br]in the world of AI,
0:08:41.738,0:08:43.140 in the world of technology
0:08:43.164,0:08:46.844 is going to be about[br]what ads you see on your stream.
0:08:47.458,0:08:52.271 But a lot of it is going towards[br]making the world so much better.
0:08:53.486,0:08:57.279 Think about a pregnant woman[br]in the Democratic Republic of Congo,
0:08:57.303,0:09:01.482 who has to walk 17 hours[br]to her nearest rural prenatal clinic
0:09:01.506,0:09:02.656 to get a checkup.
0:09:03.355,0:09:06.712 What if she could get a diagnosis[br]on her phone instead?
0:09:07.783,0:09:09.577 Or think about what AI could do
0:09:09.601,0:09:12.300 for those one in three women[br]in South Africa
0:09:12.324,0:09:14.124 who face domestic violence.
0:09:15.101,0:09:17.798 If it weren't safe to talk out loud,
0:09:17.822,0:09:20.314 they could get an AI service[br]to raise the alarm,
0:09:20.338,0:09:22.405 and get financial and legal advice.
0:09:23.950,0:09:28.974 These are all real examples of projects[br]that people, including myself,
0:09:28.998,0:09:31.505 are working on right now, using AI.
0:09:33.553,0:09:37.140 So, I'm sure in the next couple of days[br]there will be yet another news story
0:09:37.164,0:09:39.846 about the existential risk,
0:09:39.870,0:09:42.292 robots taking over[br]and coming for your jobs.
0:09:42.316,0:09:43.339 (Laughter)
0:09:43.363,0:09:45.696 And when something like that happens,
0:09:45.720,0:09:49.052 I know I'll get the same messages[br]worrying about the future.
0:09:49.353,0:09:53.019 But I feel incredibly positive[br]about this technology.
0:09:55.456,0:10:01.111 This is our chance to remake the world[br]into a much more equal place.
0:10:02.461,0:10:06.455 But to do that, we need to build it[br]the right way from the get-go.
0:10:07.651,0:10:12.546 We need people of different genders,[br]races, sexualities and backgrounds.
0:10:14.451,0:10:16.943 We need women to be the makers
0:10:16.967,0:10:20.500 and not just the machines[br]who do the makers' bidding.
0:10:21.895,0:10:25.625 We need to think very carefully about[br]what we teach machines,
0:10:25.649,0:10:27.300 what data we give them,
0:10:27.324,0:10:30.591 so they don't just repeat[br]our own past mistakes.
0:10:32.118,0:10:35.663 So I hope I leave you[br]thinking about two things.
0:10:36.567,0:10:40.614 First, I hope you leave[br]thinking about bias today.
0:10:41.114,0:10:44.289 And that the next time[br]you scroll past an advert
0:10:44.313,0:10:47.127 that assumes you are interested[br]in fertility clinics
0:10:47.151,0:10:49.998 or online betting websites,
0:10:50.022,0:10:52.045 that you think and remember
0:10:52.069,0:10:56.399 that the same technology is assuming[br]that a black man will reoffend.
0:10:57.831,0:11:02.204 Or that a woman is more likely[br]to be a personal assistant than a CEO.
0:11:02.956,0:11:06.661 And I hope that reminds you[br]that we need to do something about it.
0:11:08.908,0:11:10.773 And second,
0:11:10.797,0:11:12.688 I hope you think about the fact
0:11:12.712,0:11:16.133 that you don't need to look a certain way[br]or have a certain background
0:11:16.157,0:11:19.649 in engineering or technology to create AI,
0:11:19.673,0:11:22.731 which is going to be a phenomenal[br]force for our future.
0:11:24.157,0:11:26.315 You don't need to look[br]like a Mark Zuckerberg,
0:11:26.339,0:11:27.739 you can look like me.
0:11:29.266,0:11:32.163 And it is up to all of us in this room
0:11:32.187,0:11:34.879 to convince the governments[br]and the corporations
0:11:34.903,0:11:37.806 to build AI technology for everyone,
0:11:37.830,0:11:39.559 including the edge cases.
0:11:40.225,0:11:42.297 And for all of us to get an education
0:11:42.321,0:11:45.017 about this phenomenal[br]technology in the future.
0:11:46.130,0:11:48.193 Because if we do that,
0:11:48.217,0:11:52.633 then we've only just scratched the surface[br]of what we can achieve with AI.
0:11:53.143,0:11:54.301 Thank you.
0:11:54.325,0:11:57.128 (Applause)