WEBVTT

00:00:00.000 --> 00:00:03.204
(Toby Walsh) I wanna talk about artificial intelligence: it's --

00:00:03.674 --> 00:00:05.089
I'm a professor of artificial intelligence

00:00:05.089 --> 00:00:09.271
and it's a great time, 2015, to be working in AI.

00:00:09.544 --> 00:00:10.544
We're making real, palpable progress

00:00:10.544 --> 00:00:12.820
and there's loads of money being thrown at us.

00:00:12.820 --> 00:00:17.000
Google just spent five hundred million dollars --

00:00:17.000 --> 00:00:21.989
pounds, buying an AI startup called DeepMind a couple of weeks ago,

00:00:21.989 --> 00:00:25.172
today they announced that they were going to spend a billion dollars

00:00:25.671 --> 00:00:28.469
setting up an AI lab in Silicon Valley.

00:00:29.001 --> 00:00:31.591
IBM is betting about a third of the company

00:00:31.830 --> 00:00:34.858
on their cognitive AI computing effort.

00:00:35.084 --> 00:00:38.216
So it's a really interesting time to be working in AI.

00:00:38.631 --> 00:00:41.782
But the first thing I wanted to help inform you about

00:00:42.250 --> 00:00:46.792
is what is the state of the art, what progress have we made in AI,

00:00:46.792 --> 00:00:51.187
because Hollywood paints all these pictures,

00:00:51.187 --> 00:00:53.488
these mostly dystopian pictures of AI.

00:00:53.495 --> 00:00:58.672
Whenever the next science fiction movie comes out, I put my head in my hands

00:00:58.672 --> 00:01:01.755
and think, "Oh my God, what do people think that we're doing?"

00:01:01.755 --> 00:01:08.729
So, I wanted to start by just giving you a feel for what AI is actually capable of.

00:01:09.081 --> 00:01:13.872
So a couple of years ago, IBM's Watson won the game show Jeopardy!,

00:01:13.872 --> 00:01:19.737
the million-dollar prize in the game show, beating the reigning human champions.

00:01:19.737 --> 00:01:23.357
Now, you might think, well, that's just a party trick, isn't it?

00:01:23.357 --> 00:01:27.625
It's a -- pour enough of Wikipedia and the internet into a computer,

00:01:27.625 --> 00:01:30.543
and it can answer general knowledge questions.

00:01:30.543 --> 00:01:33.204
Well, that's being a bit unfair to IBM Watson:

00:01:33.204 --> 00:01:35.881
they are cryptic questions it is answering.

00:01:35.881 --> 00:01:40.878
But just to give you a real feel for what is technically possible today,

00:01:40.878 --> 00:01:43.586
something that was announced just two days ago:

00:01:43.834 --> 00:01:51.869
some colleagues of mine at NII in Japan passed the university entrance exam

00:01:51.869 --> 00:01:53.433
with an AI program.

00:01:53.433 --> 00:01:56.326
Now, I thought long and hard about putting up a page of maths.

00:01:56.326 --> 00:01:58.007
I thought, well, I'm going to get --

00:01:58.007 --> 00:02:00.007
half of the audience is going to leave immediately

00:02:00.007 --> 00:02:01.354
if I put up a page of maths.

00:02:01.354 --> 00:02:03.133
But I wanted you to see, just to feel,

00:02:03.962 --> 00:02:07.403
the depth of questions that they were answering.

00:02:07.403 --> 00:02:11.507
So this is from the maths paper, you know, a non-trivial, sort of,

00:02:11.507 --> 00:02:15.148
if you come from the UK, A-level-like maths question

00:02:15.148 --> 00:02:16.468
that they were able to answer.

00:02:17.272 --> 00:02:21.889
Here is a physics question about Newtonian dynamics

00:02:21.889 --> 00:02:23.552
that they were able to answer.

00:02:23.552 --> 00:02:28.545
Now, they got 511 points, out of a maximum of 950.
00:02:28.939 --> 00:02:32.391
That's more than the average score of Japanese students

00:02:32.391 --> 00:02:34.007
sitting the entrance exam.

00:02:34.007 --> 00:02:38.542
They would have got into most Japanese universities with a score of 511.

00:02:38.908 --> 00:02:40.621
That's what's possible today.

00:02:40.621 --> 00:02:44.134
Their ambition, in 10 years' time, is to get into Tokyo -- the University of Tokyo --

00:02:44.134 --> 00:02:46.434
which is one of the best universities in the world.

00:02:48.488 --> 00:02:50.475
So, this is why I put up a picture of Terminator,

00:02:50.475 --> 00:02:53.468
because whenever I talk to the media about what we do in AI,

00:02:53.476 --> 00:02:55.769
they put up a picture of Terminator, right?

00:02:55.769 --> 00:02:58.098
So I don't want you to worry about Terminator:

00:02:58.098 --> 00:03:02.117
Terminator is at least 50 to 100 years away,

00:03:02.582 --> 00:03:05.985
and there are lots of reasons why we don't have to worry about it,

00:03:05.985 --> 00:03:07.248
about Terminator.

00:03:07.248 --> 00:03:08.979
I'm not going to spend too much time

00:03:08.979 --> 00:03:11.348
on why you don't have to worry about Terminator.

00:03:11.676 --> 00:03:13.334
But there are actually things that you should worry about,

00:03:13.334 --> 00:03:15.352
much nearer than Terminator.

00:03:17.007 --> 00:03:20.160
Many people have said -- Stephen Hawking has said --

00:03:20.160 --> 00:03:22.713
that, you know, AI is going to spell the end of the human race.

00:03:23.295 --> 00:03:27.961
Elon Musk chimed in afterwards and said, "It's our biggest existential threat."

00:03:29.793 --> 00:03:32.841
Bill Gates followed on by saying, "Elon Musk was right."

00:03:33.498 --> 00:03:39.519
Lots of people have said that -- that AI is a real existential threat to us.

00:03:39.519 --> 00:03:42.887
I don't want you to worry about the existential threat that AI

00:03:42.887 --> 00:03:44.900
or Terminator is going to bring.

00:03:45.953 --> 00:03:47.711
There's actually a very common confusion,

00:03:48.121 --> 00:03:51.139
which is that it's not the AI that's going to be the existential threat,

00:03:51.139 --> 00:03:52.139
it's autonomy, it's autonomous systems.

00:03:54.386 --> 00:03:56.042
Watson --

00:03:56.042 --> 00:03:57.972
it isn't going to wake up one morning and say:

00:03:57.972 --> 00:04:02.109
"You know what? I'm tired of playing Jeopardy!,

00:04:02.109 --> 00:04:04.173
"I want to play Who Wants to Be a Millionaire!

00:04:05.299 --> 00:04:07.468
"Or wait a second, I'm tired of playing game shows,

00:04:07.468 --> 00:04:09.393
I want to take over the universe."

00:04:09.821 --> 00:04:12.195
It's just not in its code, there is no way:

00:04:12.195 --> 00:04:15.065
it's not given any freedom to think about anything other

00:04:15.065 --> 00:04:17.000
than maximizing its Jeopardy! score.

00:04:18.219 --> 00:04:21.209
And it has no desires, no desires other

00:04:21.228 --> 00:04:23.482
than to improve its maximum score.

00:04:23.482 --> 00:04:26.256
So I don't want you to worry about Terminator,

00:04:27.350 --> 00:04:29.102
but I do want you to worry about jobs.
00:04:29.974 --> 00:04:32.166
Because lots of people, lots of very serious people,

00:04:32.166 --> 00:04:35.076
have been saying that AI is going to end jobs,

00:04:35.076 --> 00:04:38.157
and that is of very great consequence for anyone working in education,

00:04:38.157 --> 00:04:42.269
because, certainly, the jobs that are going to exist in the future

00:04:42.269 --> 00:04:44.925
are going to be different than the jobs that exist today.

00:04:46.880 --> 00:04:49.724
Now, who has an odd birthday?

00:04:51.251 --> 00:04:52.938
Well, I haven't told you what an odd birthday is yet,

00:04:52.938 --> 00:04:54.990
so: someone has an odd birthday, like me.

00:04:54.990 --> 00:04:57.858
OK. Who was born on an odd-numbered day of the month?

00:04:57.858 --> 00:04:59.995
I was born on the 11th of April, right?

00:05:00.824 --> 00:05:03.368
Come on, it's half the room, I know it's half the room.

00:05:03.368 --> 00:05:04.923
(Laughter)

00:05:04.923 --> 00:05:08.690
OK. Well, you want to have an odd birthday, by the way,

00:05:09.059 --> 00:05:12.504
because that means, in 20 years' time, you will be a person with a job,

00:05:13.426 --> 00:05:16.277
as opposed to the even people, who won't have jobs.

00:05:16.277 --> 00:05:20.662
That's a certainty -- if you believe lots of serious people.

00:05:22.141 --> 00:05:25.010
You might have missed this news on Friday the 13th --

00:05:25.010 --> 00:05:27.778
I thought this was a rather depressing news story

00:05:27.778 --> 00:05:29.748
....... (check) comparison otherwise ...... (check)

00:05:29.777 --> 00:05:31.527
but the chief economist of the Bank of England

00:05:31.922 --> 00:05:36.230
went on the record saying 50% of jobs were under threat in the UK.

00:05:36.948 --> 00:05:40.420
And he's not the first serious person, who should know what he's talking about,

00:05:40.420 --> 00:05:42.736
who has said similar things.

00:05:43.029 --> 00:05:46.255
There was a very influential Merrill Lynch report that came out a month or two ago

00:05:46.270 --> 00:05:49.430
saying very similar things about the impact of AI,

00:05:49.430 --> 00:05:51.375
robotics and automation on jobs.

00:05:53.439 --> 00:05:56.278
And some of this goes back to, I think, one of the first reports

00:05:56.278 --> 00:05:59.288
that really hit the press, that really got people's attention:

00:05:59.288 --> 00:06:02.075
a report that came out of the Oxford Martin School.

00:06:02.075 --> 00:06:05.702
They predicted that 47% of jobs in the United States

00:06:06.110 --> 00:06:08.550
were under threat of automation in the next 20 years.

00:06:10.234 --> 00:06:13.594
We followed that up with a very similar study and analysis

00:06:13.880 --> 00:06:15.961
for jobs in Australia, where I work.

00:06:16.865 --> 00:06:19.323
And because it's a slightly different profile of workers,

00:06:19.323 --> 00:06:24.329
of the workforce, in Australia, we came up with a number of around 40%.

00:06:24.329 --> 00:06:27.129
These are non-trivial numbers, right? 40-50%.

00:06:27.589 --> 00:06:32.497
Now, just an aside: 47% -- I don't know why they didn't say 47.2%, right?

00:06:32.497 --> 00:06:35.551
You can't believe a number that's far too precise

00:06:35.551 --> 00:06:37.842
when you're predicting the future, but nevertheless,

00:06:37.842 --> 00:06:39.532
the fact that it's of this sort of scale

00:06:39.532 --> 00:06:44.601
is what you've got to take away: it wasn't 4%, it was roughly about half the jobs.

00:06:47.784 --> 00:06:50.495
Now, let's put some context to this.
00:06:50.495 --> 00:06:52.413
I mean, is this really a credible claim?

00:06:52.413 --> 00:06:55.549
The Number One job in the United States today: truck driver.

00:06:55.962 --> 00:06:57.526
Now, you might have noticed,

00:06:57.536 --> 00:07:00.079
autonomous cars are coming to us very soon.

00:07:00.506 --> 00:07:03.579
We're going to be having -- we had the first trial of autonomous cars

00:07:03.579 --> 00:07:06.760
on the roads, the public roads, of Australia three weeks ago.

00:07:07.371 --> 00:07:09.657
The Google Car has driven over a million kilometers

00:07:09.657 --> 00:07:12.542
-- or the Google cars, rather, have driven over a million kilometers,

00:07:12.907 --> 00:07:15.050
autonomously, on the roads of California.

00:07:15.893 --> 00:07:18.553
In 20 years' time, we are going to have autonomous cars.

00:07:18.553 --> 00:07:20.255
We're also going to have autonomous trucks.

00:07:21.033 --> 00:07:24.769
So if you are in the Number One profession in the United States,

00:07:24.769 --> 00:07:28.480
you have to worry that your job is going to be automated away.

00:07:29.712 --> 00:07:32.494
The Number Two job in the United States is salesperson.

00:07:33.343 --> 00:07:36.248
Again, since we use the internet,

00:07:36.248 --> 00:07:39.618
we've actually mostly automated that process ourselves,

00:07:39.618 --> 00:07:43.382
but it's clear that a lot of those jobs are going to be disappearing.

00:07:43.382 --> 00:07:46.736
So I think these claims have a lot of credibility.

00:07:48.896 --> 00:07:53.104
There's actually a nice dinner party game that my colleagues in AI play

00:07:53.104 --> 00:07:55.736
at the end of our conferences, where we sit around

00:07:55.736 --> 00:07:58.250
and the game is, you have to name a job,

00:07:59.115 --> 00:08:02.671
and then someone has to put up some credible evidence

00:08:02.671 --> 00:08:05.748
that we're actually well on the way to automating it.

00:08:05.748 --> 00:08:08.152
And this game is almost impossible to win.

00:08:08.737 --> 00:08:10.423
If I had more time, I'd play the game with you.

00:08:11.293 --> 00:08:15.260
The only -- about the only winning answer is politician.

00:08:15.650 --> 00:08:17.931
(Laughter)

00:08:17.931 --> 00:08:20.747
They will certainly regulate so that they'll be the last to be automated.

00:08:20.747 --> 00:08:22.731
But that's about the only winning answer we have.

00:08:24.049 --> 00:08:28.836
So -- and it's not just technology that is the cause of this.

00:08:28.836 --> 00:08:33.035
There are many other, really, sort of rather unhelpful trends.

00:08:33.035 --> 00:08:35.558
If you were trying to set up the world's economy,

00:08:35.558 --> 00:08:38.842
you would not put these things all down on the table at the same time:

00:08:38.842 --> 00:08:40.825
the ongoing global financial crisis,

00:08:40.825 --> 00:08:43.840
which seems like it will never disappear, I think;

00:08:44.755 --> 00:08:48.013
the fact that we're all living longer: this is great, great news for us

00:08:48.013 --> 00:08:50.333
but bad news for employment;

00:08:50.961 --> 00:08:54.425
the impact of globalization, the fact that we can outsource our work

00:08:54.425 --> 00:08:56.191
to cheaper economies.

00:08:56.191 --> 00:09:00.191
All of these things are compounding the impact

00:09:00.191 --> 00:09:02.957
that technology is having on the nature of work.

00:09:04.749 --> 00:09:09.725
And this transformation is going to be different than the last one,

00:09:09.725 --> 00:09:11.357
the Industrial Revolution.
00:09:12.021 --> 00:09:15.591
There's no hard and fast rule of economics that says:

00:09:16.911 --> 00:09:20.039
"As many jobs need to be created by a new technology as are destroyed."

00:09:20.039 --> 00:09:22.952
Every time we have a new technology, of course, new jobs are created.

00:09:22.952 --> 00:09:25.883
There are lots of -- there are thousands, hundreds of thousands, of new jobs

00:09:26.204 --> 00:09:27.848
enabled by technology today.

00:09:28.355 --> 00:09:32.484
But there's no reason that they have to balance exactly those that are destroyed.

00:09:33.092 --> 00:09:36.199
In the last -- in the last revolution, that did happen to be the case.

00:09:36.853 --> 00:09:40.930
A third of the population was working out in the fields, in agriculture.

00:09:40.930 --> 00:09:44.552
Now, worldwide, it's 3 or 4% of the world's population

00:09:44.552 --> 00:09:45.657
working in agriculture.

00:09:45.657 --> 00:09:48.029
Those people are working in factories and offices now.

00:09:48.029 --> 00:09:51.920
We employ far more people than we did at the turn of the 19th century.

00:09:52.629 --> 00:09:55.993
But this one looks different, this information revolution looks different.

00:09:55.993 --> 00:10:00.233
It looks like it has the potential to take away more jobs, perhaps,

00:10:00.233 --> 00:10:01.529
than it creates.

00:10:01.529 --> 00:10:03.684
And one of the other things is that we used to think

00:10:03.684 --> 00:10:05.293
it was the blue-collar jobs that were under threat.

00:10:06.509 --> 00:10:10.937
And that's true: if you go to a car factory today, sure enough,

00:10:10.937 --> 00:10:12.578
there are robots that are doing the painting,

00:10:12.578 --> 00:10:14.800
there are robots that are doing the welding.

00:10:14.800 --> 00:10:16.584
But nowadays, it's white-collar jobs --

00:10:16.584 --> 00:10:20.006
it's journalists, it's lawyers, it's accountants --

00:10:20.006 --> 00:10:21.667
these are the jobs that are under threat.

00:10:21.667 --> 00:10:25.444
These graphs here show the percentage change in employment,

00:10:25.444 --> 00:10:31.244
the change in employment rates.

00:10:31.660 --> 00:10:34.672
And it's the middle, the middle-class, white-collar professions --

00:10:34.672 --> 00:10:37.923
the ones we thought you would go to university for, to make yourself safe --

00:10:37.926 --> 00:10:40.551
that seem to be the ones that are most under threat.

00:10:40.551 --> 00:10:42.646
If you are a ....... (check) it's probably --

00:10:43.277 --> 00:10:46.172
you're too cheap to be replaced by something automated.

00:10:46.172 --> 00:10:49.963
But if you're a more expensive person, and this means (check)

00:10:49.963 --> 00:10:53.427
that the rich are getting richer, and the inequalities that we are seeing in society,

00:10:53.427 --> 00:10:55.281
that are distressing our societies today,

00:10:55.281 --> 00:10:58.284
seem to be magnified by these technological changes.

00:10:59.899 --> 00:11:01.761
And there are so many frightening graphs.

00:11:01.761 --> 00:11:03.723
Go and read Thomas Piketty, I encourage you.

00:11:04.079 --> 00:11:05.980
Go and look at one of his books, and you can see here

00:11:05.980 --> 00:11:09.619
that we're seeing a constant improvement in productivity.

00:11:10.079 --> 00:11:12.860
Technology is buying us those improvements in productivity,

00:11:12.860 --> 00:11:17.888
it is increasing our wealth, but there's a leveling off of employment.
00:11:18.462 --> 00:11:20.187
And so the challenge, then, is how

00:11:20.753 --> 00:11:23.651
-- it's a question for society, not for a technologist like myself --

00:11:23.651 --> 00:11:25.777
how do we all benefit from this rising tide,

00:11:25.777 --> 00:11:30.648
so that it's not just the rich getting richer and the rest of us falling further behind?

00:11:31.613 --> 00:11:36.443
So, many parts of many jobs look likely to be automated.

00:11:36.767 --> 00:11:39.624
One confusion is this: people say these jobs are going to disappear.

00:11:39.624 --> 00:11:43.009
Actually, it seems more likely that many parts of your job will be automated.

00:11:43.009 --> 00:11:45.892
But that still means that there is perhaps less employment around.

00:11:46.201 --> 00:11:48.860
So how can you make yourself more future-proof?

00:11:49.382 --> 00:11:51.291
Well, I have two pieces of advice as a technologist,

00:11:51.291 --> 00:11:53.950
in terms of what's going to be technically possible in AI.

00:11:54.215 --> 00:11:57.886
Either you've got to embrace the future, so become like me,

00:11:58.605 --> 00:12:01.725
become someone who's working on trying to invent that future.

00:12:02.218 --> 00:12:03.671
And if you're not technically minded, that's fine:

00:12:03.671 --> 00:12:07.127
I've got the other part of the equation, the other answer to the question,

00:12:07.127 --> 00:12:10.437
which is completely at the other end of the spectrum, which is:

00:12:10.786 --> 00:12:14.011
focus on those things that AI finds the hardest:

00:12:14.011 --> 00:12:15.672
making computers more creative,

00:12:15.672 --> 00:12:18.719
making computers that can understand your emotional state --

00:12:19.269 --> 00:12:23.111
focusing on emotional intelligence and not intellectual intelligence.

00:12:24.765 --> 00:12:26.866
So, how safe is education?

00:12:26.866 --> 00:12:28.728
This room here is full of people working in education.

00:12:28.728 --> 00:12:30.104
How safe are your jobs?

00:12:30.104 --> 00:12:33.452
Well, these are the numbers from that Oxford Martin report I mentioned.

00:12:33.789 --> 00:12:35.096
So if you're a telemarketer:

00:12:35.096 --> 00:12:37.908
99% chance that you're going to be automated.

00:12:37.908 --> 00:12:39.162
Not surprising, right?

00:12:39.162 --> 00:12:41.208
Easy to automate, it's down the phone.

00:12:42.589 --> 00:12:45.322
Some of the numbers, I just don't want you to take away:

00:12:45.322 --> 00:12:48.195
they used machine learning, they used AI, to actually generate the report,

00:12:48.195 --> 00:12:50.240
and I don't believe some of the numbers.

00:12:50.240 --> 00:12:53.141
I don't believe: "Bicycle repairmen: 94%."

00:12:53.141 --> 00:12:54.550
There's no chance in hell

00:12:54.550 --> 00:12:56.804
that the bicycle repair person is going to be automated:

00:12:56.804 --> 00:12:58.916
far too cheap and intricate a job.

00:12:58.916 --> 00:13:03.840
"Parking lot attendant: 87%." I don't know why it's not 100%:

00:13:03.840 --> 00:13:06.271
well, you're not going to have parking lot attendants, for sure.

00:13:06.770 --> 00:13:08.290
But look: luckily,

00:13:08.290 --> 00:13:10.408
you and I are right at the bottom of the list,

00:13:10.743 --> 00:13:12.974
down at the 0s and 1%, right?
00:13:13.451 --> 00:13:15.381
I think those numbers probably underestimate

00:13:15.981 --> 00:13:19.203
how replaceable -- or how irreplaceable -- we are,

00:13:19.203 --> 00:13:23.315
but nevertheless, you can take some heart away

00:13:23.315 --> 00:13:25.418
from the sort of numbers you see there.

00:13:26.488 --> 00:13:28.217
And the reason being?

00:13:28.217 --> 00:13:30.873
The first reason is because we're dealing with people:

00:13:30.873 --> 00:13:32.436
we're trying to understand people,

00:13:32.436 --> 00:13:36.469
understand their motivations, what their mental blocks are.

00:13:36.469 --> 00:13:38.404
These are things that are really hard to get computers --

00:13:38.404 --> 00:13:41.445
I can tell you, really hard -- to program computers to do.