(Toby Walsh) I wanna talk about artificial intelligence: it's -- I'm a professor of artificial intelligence, and it's a great time, 2015, to be working in AI. We're making real, palpable progress, and there's loads of money being thrown at us. Google just spent five hundred million dollars -- pounds -- buying an AI startup called DeepMind a couple of weeks ago; today they announced that they were going to spend a billion dollars setting up an AI lab in Silicon Valley. IBM is betting about a third of the company on their cognitive AI computing effort. So it's a really interesting time to be working in AI.

But the first thing I wanted to help inform you about is what the state of the art is, what progress we have made in AI, because Hollywood paints all these pictures, these mostly dystopian pictures of AI. Whenever the next science fiction movie comes out, I put my head in my hands and think, "Oh my God, what do people think that we're doing?" So I wanted to start by just giving you a feel for what AI actually is capable of.

A couple of years ago, IBM Watson won the game show Jeopardy!, won the million-dollar prize in the game show, beating the reigning human champions. Now, you might think, well, that's just a party trick, isn't it? It's a -- pour enough of Wikipedia and the internet into a computer, and it can answer general knowledge questions. Well, that's being a bit unfair to IBM Watson: those are cryptic questions it was answering. But just to give you a real feel for what is technically possible today, something that was announced just two days ago: some colleagues of mine at NII in Japan passed the University Entrance Exam with an AI program. Now, I thought long and hard about putting up a page of maths. I thought, well, half of the audience is going to leave immediately if I put up a page of maths. But I wanted you to see, just to feel, the depth of the questions that they were answering. So this is from the maths paper: a non-trivial, sort of, if you come from the UK, A-level-like maths question that they were able to answer. Here is a physics question about Newtonian dynamics that they were able to answer.
Now, they got 511 points out of a maximum of 950. That's more than the average score of Japanese students sitting the entrance exam. They would have got into most Japanese universities with a score of 511. That's what's possible today. Their ambition, in 10 years' time, is to get into Tokyo -- the University of Tokyo -- which is one of the best universities in the world.

So, this is why I put up a picture of Terminator: because whenever I talk to the media about what we do in AI, they put up a picture of Terminator, right? So I don't want you to worry about Terminator; Terminator is at least 50 to 100 years away, and there are lots of reasons why we don't have to worry about it, about Terminator. I'm not going to spend too much time on why you don't have to worry about Terminator. But there are actually things that you should worry about, much nearer than Terminator.

Many people have said -- Stephen Hawking has said -- that, you know, AI is going to spell the end of the human race. Elon Musk chimed in afterwards and said, "It's our biggest existential threat." Bill Gates followed on by saying, "Elon Musk was right." Lots of people have said that AI is a real existential threat to us. I don't want you to worry about the existential threat that AI or Terminator is going to bring. There's actually a very common confusion, which is that it's not the AI that's going to be the existential threat, it's autonomy, it's autonomous systems.

Watson isn't going to wake up one morning and say: "You know what? I'm tired of playing Jeopardy, I want to play Who Wants to Be a Millionaire! Or wait a second, I'm tired of playing game shows, I want to take over the universe." It's just not in its code; there is no way, it's not given any freedom to think about anything other than maximizing its Jeopardy score. And it has no desires, no desires other than to improve its maximum score. So I don't want you to worry about Terminator, but I do want you to worry about jobs.
Because lots of people, lots of very serious people, have been saying that AI is going to end jobs, and that is of great consequence for anyone working in education, because, certainly, the jobs that are going to exist in the future are going to be different from the jobs that exist today.

Now, who has an odd birthday? Well, I haven't told you what an odd birthday is yet, so someone has an odd birthday, like me. OK, who was born on an odd-numbered day of the month? I was born on the 11th of April, right? Come on, it's half the room, I know it's half the room. (Laughter) OK. Well, you want to have an odd birthday, by the way, because that means, in 20 years' time, you will be a person with a job. As opposed to the even people, who won't have jobs. That's a certainty -- if you believe lots of serious people.

You might have missed this news on Friday the 13th -- I thought this was a rather depressing news story ....... (check) comparison otherwise ...... (check) -- but the chief economist of the Bank of England went on the record saying 50% of jobs were under threat in the UK. And he's not the first serious person, who should know what he's talking about, to have said similar things. There was a very influential Merrill Lynch report that came out a month or two ago saying very similar things about the impact of AI, robotics and automation on jobs.

And some of this goes back to, I think, one of the first reports that really hit the press, that really got people's attention: a report that came out of the Oxford Martin School. They predicted that 47% of jobs in the United States were under threat of automation in the next 20 years. We followed that up with a very similar study and analysis for jobs in Australia, where I work. And because it's a slightly different profile of workers, of the work force in Australia, we came up with a number of around 40%. These are non-trivial numbers, right? 40-50%. Now, just an aside: 47% -- I don't know why they didn't say 47.2%, right?
You can't believe a number that's far too precise when you're predicting the future, but nevertheless, the fact that it's of this sort of scale is what you've got to take away: it wasn't 4%, it was roughly about half the jobs.

Now, let's put some context to this. I mean, is this really a credible claim? The Number One job in the United States today: truck driver. Now, you might have noticed, autonomous cars are coming to us very soon. We had the first trial of autonomous cars on the roads, the public roads of Australia, three weeks ago. The Google car has driven over a million kilometers -- or the Google cars, rather, have driven over a million kilometers -- autonomously, on the roads of California. In 20 years' time, we are going to have autonomous cars. We're also going to have autonomous trucks. So if you are in the Number One profession in the United States, you have to worry that your job is going to be automated away.

The Number Two job in the United States is salesperson. Again, since we use the internet, we've actually mostly automated that process ourselves, but it's clear that a lot of those jobs are going to be disappearing. So I think these claims have a lot of credibility.

There's actually a nice dinner party game that my colleagues in AI play at the end of our conferences, where we sit around and the game is: you have to name a job, and then someone has to put up some credible evidence that we're actually well on the way to automating it. And this game is almost impossible to win. If I had more time, I'd play the game with you. The only -- about the only winning answer is politician. (Laughter) They will certainly regulate so that they'll be the last to be automated. But that's about the only winning answer we have.

So -- and it's not just technology that is the cause of this. There are many other, really, sort of rather unhelpful trends.
If you were trying to set up the world's economy, you would not put these things all down on the table at the same time: the ongoing global financial crisis, which seems like it will never disappear, I think; the fact that we're all living longer, which is great, great news for us but bad news for employment; the impact of globalization, the fact that we can outsource our work to cheaper economies. All of these things are compounding the impact that technology is having on the nature of work.

And this transformation is going to be different from the last one, the Industrial Revolution. There's no hard and fast rule of economics that says as many jobs need to be created by a new technology as are destroyed. Every time we have a new technology, of course, new jobs are created. There are thousands, hundreds of thousands of new jobs enabled by technology today. But there's no reason that they have to balance exactly those that are destroyed. In the last revolution, that did happen to be the case. A third of the population was working out in the fields, in agriculture. Now, worldwide, it's 3 or 4% of the world's population working in agriculture. Those people are working in factories and offices now. We employ far more people than we did at the turn of the 19th century. But this one looks different; this information revolution looks different. It looks like it has the potential to take away more jobs, perhaps, than it creates.

And one of the other things is that we used to think it was the blue-collar jobs. And that's true: if you go to a car factory today, sure enough, there are robots that are doing the painting, there are robots that are doing the welding. But nowadays, it's white-collar jobs -- it's journalists, it's lawyers, it's accountants -- these are the jobs that are under threat. These graphs here show the percentage change in employment and the change in employment rates. And it's the middle, the middle-class, white-collar professions -- the ones we thought you would go to university for, to make yourself safe -- that seem to be the ones most under threat.

If you are a .......
it's probably -- you're too cheap to be replaced by something automated. But if you're a more expensive person -- and this means that the rich are getting richer, and the inequalities that we are seeing in society, that are distressing our societies today, seem to be magnified by these technological changes. And there are so many frightening graphs; if you're going to read Thomas Piketty, I encourage you: go and look at one of his books, and you can see here that we're seeing a constant improvement in productivity. Technology is buying us those improvements in productivity, is increasing our wealth. But there's a leveling off of employment. And so the challenge, then -- and it's a question for society, not for a technologist like myself -- is how do we all benefit from this rising tide, so that it isn't just the rich getting richer and the rest of us falling further behind.

So, many parts of many jobs look likely to be automated. One confusion is this: people say these jobs are going to disappear. Actually, it seems more likely that many parts of your job will be automated. But that still means that there is perhaps less employment around. So how can you make yourself more future-proof? Well, I have two pieces of advice as a technologist, in terms of what's going to be technically possible in AI. Either you've got to embrace the future: become like me, become someone who's working on trying to invent that future. Or, if you're not technically minded, that's fine: I've got the other part of the equation, the other answer to your question, which is completely at the other end of the spectrum, which is: focus on those things that AI finds the hardest -- making a future that's more creative, making a future that can understand your emotional state, focusing on emotional intelligence and not intellectual intelligence.