0:00:00.000,0:00:03.204 (Toby Walsh) I wanna talk [br]about artificial intelligence: it's -- 0:00:03.674,0:00:05.089 I'm a professor of artificial intelligence 0:00:05.089,0:00:09.271 and it's a great time, 2015, [br]to be working in AI. 0:00:09.544,0:00:10.544 We're making real, palpable progress 0:00:10.544,0:00:12.820 and there's loads of money [br]being thrown at us. 0:00:12.820,0:00:17.000 Google just spent [br]five hundred million dollars -- 0:00:17.000,0:00:21.989 pounds buying an AI startup called DeepMind a couple of weeks ago, 0:00:21.989,0:00:25.172 today they announced that they were [br]going to spend a billion dollars 0:00:25.671,0:00:28.469 setting up an AI lab in Silicon Valley. 0:00:29.001,0:00:31.591 IBM is betting about [br]a third of the company 0:00:31.830,0:00:34.858 on their cognitive AI computing effort. 0:00:35.084,0:00:38.216 So it's a really interesting time to be working in AI. 0:00:38.631,0:00:41.782 But the first thing I wanted to [br]help inform you about 0:00:42.250,0:00:46.792 is what is the state of the art, [br]what progress have we made in AI, 0:00:46.792,0:00:51.187 because Hollywood paints [br]all these pictures, 0:00:51.187,0:00:53.488 these mostly dystopian pictures of AI. 0:00:53.495,0:00:58.672 Whenever the next science fiction movie[br]comes out, I put my head in my hands 0:00:58.672,0:01:01.755 and think, "Oh my God, what do people think[br]that we're doing?" 0:01:01.755,0:01:08.729 So, I wanted to start by just giving you[br]a feel for what AI is actually capable of. 0:01:09.081,0:01:13.872 So a couple of years ago, IBM Watson[br]won the game show Jeopardy!, 0:01:13.872,0:01:19.737 the million-dollar prize in the game show,[br]beating the reigning human champions. 0:01:19.737,0:01:23.357 Now, you might think, well that's just[br]a party trick, isn't it? 0:01:23.357,0:01:27.625 It's a -- pour enough of Wikipedia [br]and the internet into a computer, 0:01:27.625,0:01:30.543 and it can answer general knowledge[br]questions. 
0:01:30.543,0:01:33.204 Well, you guys are being a bit unfair to[br]IBM Watson: 0:01:33.204,0:01:35.881 these are cryptic questions[br]it was answering. 0:01:35.881,0:01:40.878 But just to give you a real feel for [br]what is technically possible today, 0:01:40.878,0:01:43.586 something that was announced[br]just two days ago: 0:01:43.834,0:01:51.869 some colleagues of mine at NII in Japan[br]passed the university entrance exam 0:01:51.869,0:01:53.433 with an AI program. 0:01:53.433,0:01:56.326 Now, I thought long and hard about[br]putting up a page of maths. 0:01:56.326,0:01:58.007 I thought, well, I'm going to get -- 0:01:58.007,0:02:00.007 half of the audience is going to [br]leave immediately 0:02:00.007,0:02:01.354 if I put up a page of maths. 0:02:01.354,0:02:03.133 But I wanted you to see, just to feel, 0:02:03.962,0:02:07.403 the depth of questions [br]that they were answering. 0:02:07.403,0:02:11.507 So this is from the maths paper, you know,[br]a non-trivial, sort of, 0:02:11.507,0:02:15.148 if you come from the UK, [br]A-level-like maths question 0:02:15.148,0:02:16.468 that they were able to answer. 0:02:17.272,0:02:21.889 Here is a physics question[br]about Newtonian dynamics 0:02:21.889,0:02:23.552 that they were able to answer. 0:02:23.552,0:02:28.545 Now they got 511 points, out of[br]a maximum of 950. 0:02:28.939,0:02:32.391 That's more than the average score[br]of Japanese students 0:02:32.391,0:02:34.007 sitting the entrance exam. 0:02:34.007,0:02:38.542 They would have got into most [br]Japanese universities with a score of 511. 0:02:38.908,0:02:40.621 That's what's possible today. 0:02:40.621,0:02:44.134 Their ambition in 10 years' time is to get[br]into Tokyo, the University of Tokyo, 0:02:44.134,0:02:46.434 which is one of the best universities[br]in the world. 
0:02:48.488,0:02:50.475 So, this is why I put up [br]a picture of Terminator, 0:02:50.475,0:02:53.468 because whenever I talk to the media[br]about what we do in AI, 0:02:53.476,0:02:55.769 they put up a picture of Terminator,[br]right? 0:02:55.769,0:02:58.098 So I don't want you to worry [br]about Terminator, 0:02:58.098,0:03:02.117 Terminator is at least [br]50 to 100 years away, 0:03:02.582,0:03:05.985 and there are lots of reasons why[br]we don't have to worry about it, 0:03:05.985,0:03:07.248 about Terminator. 0:03:07.248,0:03:08.979 I'm not going to go and spend[br]too much time 0:03:08.979,0:03:11.348 on why you don't have to worry about Terminator. 0:03:11.676,0:03:13.334 But there are actually things [br]that you should worry about, 0:03:13.334,0:03:15.352 much nearer than Terminator. 0:03:17.007,0:03:20.160 Many people have said, [br]Stephen Hawking has said 0:03:20.160,0:03:22.713 that, you know, AI is going to spell[br]the end of the human race. 0:03:23.295,0:03:27.961 Elon Musk chimed in afterwards, said[br]"It's our biggest existential threat." 0:03:29.793,0:03:32.841 Bill Gates followed on by saying,[br]"Elon Musk was right." 0:03:33.498,0:03:39.519 Lots of people have said that's a --[br]AI is a real existential threat to us. 0:03:39.519,0:03:42.887 I don't want you to worry about[br]the existential threat that AI 0:03:42.887,0:03:44.900 or Terminator is going to bring. 0:03:45.953,0:03:47.711 There's actually a very common confusion, 0:03:48.121,0:03:51.139 which is that it's not the AI [br]that's going to be the existential threat, 0:03:51.139,0:03:52.139 it's autonomy, it's autonomous systems. 0:03:54.386,0:03:56.042 Watson isn't -- 0:03:56.042,0:03:57.972 it isn't going to wake up [br]one morning and say: 0:03:57.972,0:04:02.109 "You know what? I'm tired [br]of playing Jeopardy, 0:04:02.109,0:04:04.173 "I want to play Who Wants to Be a Millionaire! 
0:04:05.299,0:04:07.468 "Or wait a second, I'm tired of playing [br]game shows, 0:04:07.468,0:04:09.393 I want to take over the universe." 0:04:09.821,0:04:12.195 It's just not in its code, there is no way, 0:04:12.195,0:04:15.065 it's not given any freedom to think [br]about anything other 0:04:15.065,0:04:17.000 than maximizing its Jeopardy score. 0:04:18.219,0:04:21.209 And it has no desires, 0:04:21.228,0:04:23.482 no desires other than to improve its maximum score. 0:04:23.482,0:04:26.256 So I don't want you to worry about Terminator, 0:04:27.350,0:04:29.102 but I do want you to worry about jobs. 0:04:29.974,0:04:32.166 Because lots of people, [br]lots of very serious people, 0:04:32.166,0:04:35.076 have been saying [br]that AI is going to end jobs, 0:04:35.076,0:04:38.157 and that has very great consequences [br]for anyone working in education, 0:04:38.157,0:04:42.269 because, certainly, the jobs that are going [br]to exist in the future 0:04:42.269,0:04:44.925 are going to be different [br]than the jobs that exist today. 0:04:46.880,0:04:49.724 Now, who has an odd birthday? 0:04:51.251,0:04:52.938 Well, I haven't told you [br]what an odd birthday is yet, 0:04:52.938,0:04:54.990 so someone has an odd birthday, like me. 0:04:54.990,0:04:57.858 OK. Who was born on an odd-numbered [br]day of the month? 0:04:57.858,0:04:59.995 I was born on the 11th of April, right? 0:05:00.824,0:05:03.368 Come on, it's half the room, [br]I know it's half the room. 0:05:03.368,0:05:04.923 (Laughter) 0:05:04.923,0:05:08.690 OK. Well, you want to have [br]an odd birthday, by the way, 0:05:09.059,0:05:12.504 because that means, in 20 years' time, [br]you will be a person with a job. 0:05:13.426,0:05:16.277 As opposed to the even people, [br]who won't have jobs. 
0:05:16.277,0:05:20.662 That's a certainty -- if you believe [br]lots of serious people. 0:05:22.141,0:05:25.010 You might have missed this news [br]on Friday the 13th, 0:05:25.010,0:05:27.778 I thought this was a rather[br]depressing news story 0:05:27.778,0:05:29.748 ....... (check) comparison otherwise ...... (check) 0:05:29.777,0:05:31.527 but the chief economist of the Bank of England 0:05:31.922,0:05:36.230 went on the record saying 50% of jobs [br]were under threat in the UK. 0:05:36.948,0:05:40.420 And he's not the first serious person[br]who should know what he's talking about 0:05:40.420,0:05:42.736 who has said similar things. 0:05:43.029,0:05:46.255 There was a very influential Merrill Lynch[br]report that came out a month or two ago 0:05:46.270,0:05:49.430 saying very similar things about [br]the impact of AI, 0:05:49.430,0:05:51.375 robotics, automation on jobs. 0:05:53.439,0:05:56.278 And some of this goes back to, I think, [br]one of the first reports 0:05:56.278,0:05:59.288 that really hit the press, [br]that really got people's attention, 0:05:59.288,0:06:02.075 was a report that came out of [br]the Oxford Martin School. 0:06:02.075,0:06:05.702 They predicted that 47% of jobs [br]in the United States 0:06:06.110,0:06:08.550 were under threat of automation [br]in the next 20 years. 0:06:10.234,0:06:13.594 We followed that up [br]with a very similar study and analysis 0:06:13.880,0:06:15.961 for jobs in Australia, where I work. 0:06:16.865,0:06:19.323 And because it's a slightly different [br]profile of workers, 0:06:19.323,0:06:24.329 of the work force in Australia, we came up[br]with a number of around 40%. 0:06:24.329,0:06:27.129 These are non-trivial numbers, right? [br]40-50%. 0:06:27.589,0:06:32.497 Now, just an aside: 47%, I don't know [br]why they didn't say 47.2%, right? 
0:06:32.497,0:06:35.551 You can't believe a number [br]that's far too precise 0:06:35.551,0:06:37.842 when you're predicting the future,[br]but nevertheless, 0:06:37.842,0:06:39.532 the fact that it's of this sort of scale, 0:06:39.532,0:06:44.601 you've got to take away: it wasn't 4%, [br]it was roughly about half the jobs. 0:06:47.784,0:06:50.495 Now, let's put some context to this. 0:06:50.495,0:06:52.413 I mean, is this really a credible claim? 0:06:52.413,0:06:55.549 The Number One job [br]in the United States today: truck driver. 0:06:55.962,0:06:57.526 Now you might have noticed, 0:06:57.536,0:07:00.079 autonomous cars[br]are coming to us very soon. 0:07:00.506,0:07:03.579 We had -- we ran[br]the first trial of autonomous cars 0:07:03.579,0:07:06.760 on the roads, the public roads of Australia, [br]three weeks ago. 0:07:07.371,0:07:09.657 The Google Car has driven [br]over a million kilometers 0:07:09.657,0:07:12.542 -- or the Google cars, rather, [br]have driven over a million kilometers, 0:07:12.907,0:07:15.050 autonomously, on the roads of California. 0:07:15.893,0:07:18.553 In 20 years' time, we are going to have [br]autonomous cars. 0:07:18.553,0:07:20.255 We're also going to have [br]autonomous trucks. 0:07:21.033,0:07:24.769 So if you are in the Number One profession[br]in the United States, 0:07:24.769,0:07:28.480 you have to worry that your job [br]is going to be automated away. 0:07:29.712,0:07:32.494 The Number Two job in the United States [br]is salesperson. 0:07:33.343,0:07:36.248 Again, since we all use the internet, 0:07:36.248,0:07:39.618 we've actually mostly automated [br]that process ourselves, 0:07:39.618,0:07:43.382 but it's clear that a lot of those jobs [br]are going to be disappearing. 0:07:43.382,0:07:46.736 So I think these claims [br]have a lot of credibility. 
0:07:48.896,0:07:53.104 There's actually a nice dinner party game [br]that my colleagues in AI play 0:07:53.104,0:07:55.736 at the end of our conferences,[br]where we sit around 0:07:55.736,0:07:58.250 and the game is, you have to name a job 0:07:59.115,0:08:02.671 and then, someone has to put up [br]some credible evidence 0:08:02.671,0:08:05.748 that we're actually well on the way [br]to automating that. 0:08:05.748,0:08:08.152 And this game is almost impossible to win. 0:08:08.737,0:08:10.423 If I had more time, [br]I'd play the game with you. 0:08:11.293,0:08:15.260 The only -- about the only winning answer[br]is politician. 0:08:15.650,0:08:17.931 (Laughter) 0:08:17.931,0:08:20.747 They will certainly regulate so that[br]they'll be the last to be automated. 0:08:20.747,0:08:22.731 But that's about [br]the only winning answer we have. 0:08:24.049,0:08:28.836 So -- and it's not just technology [br]that is the cause of this. 0:08:28.836,0:08:33.035 There are many other, really, [br]sort of rather unhelpful trends. 0:08:33.035,0:08:35.558 If you were trying to set up [br]the world's economy, 0:08:35.558,0:08:38.842 you would not put these things [br]all down on the table at the same time: 0:08:38.842,0:08:40.825 the ongoing global financial crisis, 0:08:40.825,0:08:43.840 which seems like [br]it will never disappear, I think; 0:08:44.755,0:08:48.013 the fact that we're all living longer: [br]this is great, great news for us 0:08:48.013,0:08:50.333 but bad news for employment; 0:08:50.961,0:08:54.425 the impact of globalization, the fact that [br]we can outsource our work 0:08:54.425,0:08:56.191 to cheaper economies. 0:08:56.191,0:09:00.191 All of these things [br]are compounding the impact 0:09:00.191,0:09:02.957 that technology is having [br]on the nature of work. 0:09:04.749,0:09:09.725 And this transformation is going to be [br]different than the last one, 0:09:09.725,0:09:11.357 the Industrial Revolution. 
0:09:12.021,0:09:15.591 There's no hard and fast [br]rule of economics that says: 0:09:16.911,0:09:20.039 "As many jobs need to be created [br]by a new technology as destroyed." 0:09:20.039,0:09:22.952 Every time we have a new technology, [br]of course, new jobs are created. 0:09:22.952,0:09:25.883 There are lots of, there are thousands, [br]hundreds of thousands of new jobs 0:09:26.204,0:09:27.848 enabled by technology today. 0:09:28.355,0:09:32.484 But there's no reason that they have to [br]balance exactly those that are destroyed. 0:09:33.092,0:09:36.199 In the last -- in the last revolution, [br]that did happen to be the case. 0:09:36.853,0:09:40.930 A third of the population was working [br]out in the fields, in agriculture. 0:09:40.930,0:09:44.552 Now, worldwide, [br]it's 3 or 4% of the world's population 0:09:44.552,0:09:45.657 working in agriculture. 0:09:45.657,0:09:48.029 Those people are working [br]in factories and offices now. 0:09:48.029,0:09:51.920 We employ far more people than we did [br]at the turn of the 19th century. 0:09:52.629,0:09:55.993 But this one looks different, this [br]information revolution looks different. 0:09:55.993,0:10:00.233 It looks like it has the potential [br]to take away more jobs, perhaps, 0:10:00.233,0:10:01.529 than it creates. 0:10:01.529,0:10:03.684 And one of the other things is that [br]we used to think 0:10:03.684,0:10:05.293 it was the blue-collar jobs [br]that were under threat. 0:10:06.509,0:10:10.937 And that's true: if you go [br]to a car factory today, sure enough, 0:10:10.937,0:10:12.578 there are robots [br]that are doing the painting, 0:10:12.578,0:10:14.800 there are robots [br]that are doing the welding. 0:10:14.800,0:10:16.584 But nowadays, it's white-collar jobs: 0:10:16.584,0:10:20.006 it's journalists, it's lawyers, [br]it's accountants, 0:10:20.006,0:10:21.667 these jobs that are under threat. 
0:10:21.667,0:10:25.444 These graphs here show [br]the percentage change in employment 0:10:25.444,0:10:31.244 and the change in employment rates. 0:10:31.660,0:10:34.672 And it's the middle, the middle-class, [br]white-collar professions, 0:10:34.672,0:10:37.923 the ones we thought you would go [br]to university for to make yourself safe, 0:10:37.926,0:10:40.624 but they seem to be the ones [br]that are most under threat. 9:59:59.000,9:59:59.000 If you are a ....... (check) it's probably -- you're too cheap to be replaced by something automated. But if you're a more expensive person -- and this means that the rich are getting richer, and the inequalities that we are seeing in society, the ones that are distressing our societies today, seem to be magnified by these technological changes. And there are so many frightening graphs; if you're going to read Thomas Piketty, I encourage you to go and look at one of his books, and you can see here that we're seeing a constant improvement in productivity. Technology is buying us those improvements in productivity; it is increasing our wealth. But there's a leveling off of employment. And so the challenge, then, is how -- it's a question for society, not for a technologist like myself -- how do we all benefit from this rising tide, so that it's not just the rich getting richer and the rest of us getting further behind. So, many parts of many jobs look likely to be automated. One confusion is this: people say these jobs are going to disappear. Actually, it seems more likely that many parts of your job will be automated. But that still means that there is perhaps less employment around. So how can you make yourself more future-proof? Well, I have two pieces of advice as a technologist, in terms of what's going to be technically possible in AI. Either you've got to embrace the future -- so become like me, become someone who's working on trying to invent that future. 
And if you're not technically minded, that's fine: I've got the other part of the equation, the other answer to your question, which is completely at the other end of the spectrum, which is: focus on those things that AI finds the hardest -- making a future that's more creative, making a future that can understand your emotional state, focusing on emotional and not intellectual intelligence.