WEBVTT

00:00:00.875 --> 00:00:04.643
How many decisions have been made about you today,

00:00:04.667 --> 00:00:07.268
or this week or this year,

00:00:07.292 --> 00:00:09.250
by artificial intelligence?

00:00:10.958 --> 00:00:12.643
I build AI for a living,

00:00:12.667 --> 00:00:15.684
so full disclosure -- I'm kind of a nerd.

00:00:15.708 --> 00:00:18.101
And because I'm kind of a nerd,

00:00:18.125 --> 00:00:20.476
whenever some new news story comes out

00:00:20.500 --> 00:00:23.934
about artificial intelligence stealing all our jobs,

00:00:23.958 --> 00:00:28.143
or robots getting citizenship of an actual country,

00:00:28.167 --> 00:00:31.309
I'm the person my friends and followers message

00:00:31.333 --> 00:00:32.875
freaking out about the future.

00:00:33.833 --> 00:00:35.934
We see this everywhere.

00:00:35.958 --> 00:00:40.851
This media panic that our robot overlords are taking over.

00:00:40.875 --> 00:00:42.792
We could blame Hollywood for that.

00:00:44.125 --> 00:00:48.250
But in reality, that's not the problem we should be focusing on.

00:00:49.250 --> 00:00:52.893
There is a more pressing danger, a bigger risk with AI,

00:00:52.917 --> 00:00:54.500
that we need to fix first.

00:00:55.417 --> 00:00:57.726
So we are back to this question:

00:00:57.750 --> 00:01:02.458
How many decisions have been made about you today by AI?

00:01:03.792 --> 00:01:05.768
And how many of these

00:01:05.792 --> 00:01:10.292
were based on your gender, your race or your background?

00:01:12.500 --> 00:01:15.268
Algorithms are being used all the time

00:01:15.292 --> 00:01:19.125
to make decisions about who we are and what we want.

00:01:20.208 --> 00:01:23.851
Some of the women in this room will know what I'm talking about

00:01:23.875 --> 00:01:27.643
if you've been made to sit through those pregnancy test adverts on YouTube

00:01:27.667 --> 00:01:29.726
like 1,000 times.

00:01:29.750 --> 00:01:32.601
Or you've scrolled past adverts of fertility clinics

00:01:32.625 --> 00:01:34.667
on your Facebook feed.

00:01:35.625 --> 00:01:38.018
Or, in my case, Indian marriage bureaus.

00:01:38.042 --> 00:01:39.309
(Laughter)

00:01:39.333 --> 00:01:42.309
But AI isn't just being used to make decisions

00:01:42.333 --> 00:01:44.934
about what products we want to buy

00:01:44.958 --> 00:01:47.458
or which show we want to binge-watch next.

00:01:49.042 --> 00:01:54.226
I wonder how you'd feel about someone who thought things like this:

00:01:54.250 --> 00:01:56.184
a black or Latino person

00:01:56.208 --> 00:02:00.333
is less likely than a white person to pay off their loan on time.

00:02:01.542 --> 00:02:04.351
A person called John makes a better programmer

00:02:04.375 --> 00:02:06.042
than a person called Mary.

00:02:07.250 --> 00:02:12.333
A black man is more likely to be a repeat offender than a white man.

00:02:14.958 --> 00:02:16.226
You're probably thinking,

00:02:16.250 --> 00:02:20.000
"Wow, that sounds like a pretty sexist, racist person," right?

00:02:21.000 --> 00:02:25.851
These are some real decisions that AI has made very recently,

00:02:25.875 --> 00:02:28.809
based on the biases it has learned from us,

00:02:28.833 --> 00:02:30.083
from the humans.

00:02:31.750 --> 00:02:36.559
AI is being used to help decide whether or not you get that job interview;

00:02:36.583 --> 00:02:38.976
how much you pay for your car insurance;

00:02:39.000 --> 00:02:40.893
how good your credit score is;

00:02:40.917 --> 00:02:44.042
and even what rating you get in your annual performance review.
00:02:45.083 --> 00:02:48.226
But these decisions are all being filtered through

00:02:48.250 --> 00:02:54.125
its assumptions about our identity, our race, our gender, our age.

00:02:56.250 --> 00:02:58.518
How is that happening?

00:02:58.542 --> 00:03:02.059
Now, imagine an AI is helping a hiring manager

00:03:02.083 --> 00:03:04.934
find the next tech leader in the company.

00:03:04.958 --> 00:03:08.059
So far, the manager has been hiring mostly men.

00:03:08.083 --> 00:03:12.833
So the AI learns men are more likely to be programmers than women.

00:03:13.542 --> 00:03:16.434
And it's a very short leap from there to:

00:03:16.458 --> 00:03:18.500
men make better programmers than women.

00:03:19.417 --> 00:03:23.143
We have reinforced our own bias into the AI.

00:03:23.167 --> 00:03:26.792
And now, it's screening out female candidates.

00:03:28.917 --> 00:03:31.934
Hang on, if a human hiring manager did that,

00:03:31.958 --> 00:03:34.309
we'd be outraged, we wouldn't allow it.

00:03:34.333 --> 00:03:37.809
This kind of gender discrimination is not OK.

00:03:37.833 --> 00:03:42.351
And yet somehow, AI has become above the law,

00:03:42.375 --> 00:03:44.458
because a machine made the decision.

00:03:45.833 --> 00:03:47.351
That's not it.

00:03:47.375 --> 00:03:52.250
We are also reinforcing our bias in how we interact with AI.

00:03:52.917 --> 00:03:58.893
How often do you use a voice assistant like Siri, Alexa or even Cortana?

00:03:58.917 --> 00:04:01.476
They all have two things in common:

00:04:01.500 --> 00:04:04.601
one, they can never get my name right,

00:04:04.625 --> 00:04:07.292
and two, they are all female.

00:04:08.417 --> 00:04:11.184
They are designed to be our obedient servants,

00:04:11.208 --> 00:04:14.458
turning your lights on and off, ordering your shopping.

00:04:15.125 --> 00:04:18.434
You get male AIs too, but they tend to be more high-powered,

00:04:18.458 --> 00:04:21.517
like IBM Watson, making business decisions,

00:04:21.541 --> 00:04:25.333
Salesforce Einstein or ROSS, the robot lawyer.

00:04:26.208 --> 00:04:30.268
So poor robots, even they suffer from sexism in the workplace.

00:04:30.292 --> 00:04:31.417
(Laughter)

00:04:32.542 --> 00:04:35.393
Think about how these two things combine

00:04:35.417 --> 00:04:40.726
and affect a kid growing up in today's world around AI.

00:04:40.750 --> 00:04:43.684
So they're doing some research for a school project

00:04:43.708 --> 00:04:46.726
and they Google images of "CEO."

00:04:46.750 --> 00:04:49.643
The algorithm shows them results of mostly men.

00:04:49.667 --> 00:04:52.226
And now, they Google "personal assistant."

00:04:52.250 --> 00:04:55.684
As you can guess, it shows them mostly females.

00:04:55.708 --> 00:04:59.309
And then they want to put on some music, and maybe order some food,

00:04:59.333 --> 00:05:05.917
and now, they are barking orders at an obedient female voice assistant.

00:05:07.542 --> 00:05:12.851
Some of our brightest minds are creating this technology today.

00:05:12.875 --> 00:05:17.059
Technology that they could have created in any way they wanted.

00:05:17.083 --> 00:05:22.768
And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary.

00:05:22.792 --> 00:05:24.292
Yay!

00:05:24.958 --> 00:05:26.268
But OK, don't worry,

00:05:26.292 --> 00:05:28.351
this is not going to end with me telling you

00:05:28.375 --> 00:05:31.852
that we are all heading towards sexist, racist machines running the world.
00:05:32.792 --> 00:05:38.583
The good news about AI is that it is entirely within our control.

00:05:39.333 --> 00:05:43.333
We get to teach the right values, the right ethics to AI.

00:05:44.167 --> 00:05:46.351
So there are three things we can do.

00:05:46.375 --> 00:05:49.726
One, we can be aware of our own biases

00:05:49.750 --> 00:05:52.476
and the bias in machines around us.

00:05:52.500 --> 00:05:57.018
Two, we can make sure that diverse teams are building this technology.

00:05:57.042 --> 00:06:01.958
And three, we have to give it diverse experiences to learn from.

00:06:02.875 --> 00:06:06.184
I can talk about the first two from personal experience.

00:06:06.208 --> 00:06:07.643
When you work in technology

00:06:07.667 --> 00:06:11.059
and you don't look like a Mark Zuckerberg or Elon Musk,

00:06:11.083 --> 00:06:14.833
your life is a little bit difficult, your ability gets questioned.

00:06:15.875 --> 00:06:17.268
Here's just one example.

00:06:17.292 --> 00:06:21.018
Like most developers, I often join online tech forums

00:06:21.042 --> 00:06:24.268
and share my knowledge to help others.

00:06:24.292 --> 00:06:25.601
And I've found,

00:06:25.625 --> 00:06:29.601
when I log on as myself, with my own photo, my own name,

00:06:29.625 --> 00:06:34.226
I tend to get questions or comments like this:

00:06:34.250 --> 00:06:37.250
"What makes you think you're qualified to talk about AI?"

00:06:38.458 --> 00:06:41.934
"What makes you think you know about machine learning?"

00:06:41.958 --> 00:06:45.393
So, as you do, I made a new profile,

00:06:45.417 --> 00:06:50.268
and this time, instead of my own picture, I chose a cat with a jet pack on it.

00:06:50.292 --> 00:06:52.750
And I chose a name that did not reveal my gender.

00:06:53.917 --> 00:06:56.643
You can probably guess where this is going, right?

00:06:56.667 --> 00:07:03.059
So, this time, I didn't get any of those patronizing comments about my ability

00:07:03.083 --> 00:07:06.417
and I was able to actually get some work done.

00:07:07.500 --> 00:07:09.351
And it sucks, guys.

00:07:09.375 --> 00:07:11.851
I've been building robots since I was 15,

00:07:11.875 --> 00:07:14.143
I have a few degrees in computer science,

00:07:14.167 --> 00:07:16.601
and yet, I had to hide my gender

00:07:16.625 --> 00:07:18.875
in order for my work to be taken seriously.

00:07:19.875 --> 00:07:21.768
So, what's going on here?

00:07:21.792 --> 00:07:25.000
Are men just better at technology than women?

00:07:25.917 --> 00:07:27.476
Another study found

00:07:27.500 --> 00:07:32.434
that when women coders on one platform hid their gender, like myself,

00:07:32.458 --> 00:07:35.708
their code was accepted four percent more often than men's.

00:07:36.542 --> 00:07:39.458
So this is not about the talent.

00:07:39.958 --> 00:07:42.851
This is about an elitism in AI

00:07:42.875 --> 00:07:45.667
that says a programmer needs to look like a certain person.

00:07:47.375 --> 00:07:50.476
What we really need to do to make AI better

00:07:50.500 --> 00:07:53.542
is to bring in people from all kinds of backgrounds.

00:07:54.542 --> 00:07:57.101
We need people who can write and tell stories

00:07:57.125 --> 00:07:59.292
to help us create personalities of AI.

00:08:00.208 --> 00:08:02.250
We need people who can solve problems.
00:08:03.125 --> 00:08:06.893
We need people who face different challenges

00:08:06.917 --> 00:08:12.268
and we need people who can tell us what the real issues are that need fixing,

00:08:12.292 --> 00:08:15.333
and help us find ways that technology can actually fix them.

00:08:17.833 --> 00:08:21.559
Because, when people from diverse backgrounds come together,

00:08:21.583 --> 00:08:23.726
when we build things in the right way,

00:08:23.750 --> 00:08:25.792
the possibilities are limitless.

00:08:26.750 --> 00:08:30.059
And that's what I want to end by talking to you about.

00:08:30.083 --> 00:08:34.308
Less about racist robots, less about machines that are going to take our jobs

00:08:34.332 --> 00:08:37.457
and more about what technology can actually achieve.

00:08:38.292 --> 00:08:41.726
So, yes, some of the energy in the world of AI,

00:08:41.750 --> 00:08:43.143
in the world of technology

00:08:43.167 --> 00:08:47.434
is going to be about what ads you see on your stream.

00:08:47.458 --> 00:08:52.667
But a lot of it is going towards making the world so much better.

00:08:53.500 --> 00:08:57.268
Think about a pregnant woman in the Democratic Republic of Congo,

00:08:57.292 --> 00:09:01.476
who has to walk 17 hours to her nearest rural prenatal clinic

00:09:01.500 --> 00:09:03.351
to get a checkup.

00:09:03.375 --> 00:09:06.292
What if she could get a diagnosis on her phone instead?

00:09:07.750 --> 00:09:09.559
Or think about what AI could do

00:09:09.583 --> 00:09:12.309
for those one in three women in South Africa

00:09:12.333 --> 00:09:14.458
who face domestic violence.

00:09:15.083 --> 00:09:17.809
If it wasn't safe to talk out loud,

00:09:17.833 --> 00:09:20.309
they could get an AI service to raise the alarm,

00:09:20.333 --> 00:09:22.792
get financial and legal advice.

00:09:23.958 --> 00:09:28.976
These are all real examples of projects that people, including myself,

00:09:29.000 --> 00:09:31.500
are working on right now, using AI.

00:09:33.542 --> 00:09:37.143
So, I'm sure in the next couple of days there will be yet another news story

00:09:37.167 --> 00:09:39.851
about the existential risk,

00:09:39.875 --> 00:09:42.309
robots taking over and coming for your jobs.

00:09:42.333 --> 00:09:43.351
(Laughter)

00:09:43.375 --> 00:09:45.684
And when something like that happens,

00:09:45.708 --> 00:09:49.309
I know I'll get the same messages worrying about the future.

00:09:49.333 --> 00:09:53.000
But I feel incredibly positive about this technology.

00:09:55.458 --> 00:10:01.417
This is our chance to remake the world into a much more equal place.

00:10:02.458 --> 00:10:06.458
But to do that, we need to build it the right way from the get-go.

00:10:07.667 --> 00:10:12.750
We need people of different genders, races, sexualities and backgrounds.

00:10:14.458 --> 00:10:16.934
We need women to be the makers

00:10:16.958 --> 00:10:19.958
and not just the machines who do the makers' bidding.

00:10:21.875 --> 00:10:25.643
We need to think very carefully about what we teach machines,

00:10:25.667 --> 00:10:27.309
what data we give them,

00:10:27.333 --> 00:10:30.458
so they don't just repeat our own past mistakes.

00:10:32.125 --> 00:10:35.667
So I hope I leave you thinking about two things.

00:10:36.542 --> 00:10:41.101
First, I hope you leave thinking about bias today.
00:10:41.125 --> 00:10:44.309
And that the next time you scroll past an advert

00:10:44.333 --> 00:10:47.143
that assumes you are interested in fertility clinics

00:10:47.167 --> 00:10:50.018
or online betting websites,

00:10:50.042 --> 00:10:52.059
that you think and remember

00:10:52.083 --> 00:10:56.708
that the same technology is assuming that a black man will reoffend.

00:10:57.833 --> 00:11:02.000
Or that a woman is more likely to be a personal assistant than a CEO.

00:11:02.958 --> 00:11:06.667
And I hope that reminds you that we need to do something about it.

00:11:08.917 --> 00:11:10.768
And second,

00:11:10.792 --> 00:11:12.684
I hope you think about the fact

00:11:12.708 --> 00:11:14.684
that you don't need to look a certain way

00:11:14.708 --> 00:11:18.559
or have a certain background in engineering or technology

00:11:18.583 --> 00:11:19.851
to create AI,

00:11:19.875 --> 00:11:22.750
which is going to be a phenomenal force for our future.

00:11:24.166 --> 00:11:26.309
You don't need to look like a Mark Zuckerberg,

00:11:26.333 --> 00:11:27.583
you can look like me.

00:11:29.250 --> 00:11:32.143
And it is up to all of us in this room

00:11:32.167 --> 00:11:34.893
to convince the governments and the corporations

00:11:34.917 --> 00:11:37.809
to build AI technology for everyone,

00:11:37.833 --> 00:11:40.226
including the edge cases.

00:11:40.250 --> 00:11:42.309
And for us all to get educated

00:11:42.333 --> 00:11:44.708
about this phenomenal technology in the future.

00:11:46.167 --> 00:11:48.184
Because if we do that,

00:11:48.208 --> 00:11:53.101
then we've only just scratched the surface of what we can achieve with AI.

00:11:53.125 --> 00:11:54.393
Thank you.

00:11:54.417 --> 00:11:57.125
(Applause)