1
00:00:00,151 --> 00:00:01,476
Bryn Freedman: You're a guy

2
00:00:01,500 --> 00:00:06,087
whose company funds these AI programs and invests.

3
00:00:06,111 --> 00:00:11,156
So why should we trust you to not have a bias

4
00:00:11,180 --> 00:00:14,356
and tell us something really useful for the rest of us

5
00:00:14,380 --> 00:00:16,601
about the future of work?

6
00:00:17,046 --> 00:00:18,537
Roy Bahat: Yes, I am.

7
00:00:18,561 --> 00:00:21,549
And when you wake up in the morning and you read the newspaper

8
00:00:21,573 --> 00:00:25,022
and it says, "The robots are coming, they may take all our jobs,"

9
00:00:25,046 --> 00:00:27,577
as a start-up investor, focused on the future of work,

10
00:00:27,601 --> 00:00:29,918
our fund was the first one to say

11
00:00:29,942 --> 00:00:32,252
artificial intelligence should be a focus for us.

12
00:00:32,276 --> 00:00:34,529
So I woke up one morning and read that and said,

13
00:00:34,553 --> 00:00:36,426
"Oh, my gosh, they're talking about me.

14
00:00:36,450 --> 00:00:38,000
That's me who's doing that."

15
00:00:38,839 --> 00:00:40,863
And then I thought, wait a minute.

16
00:00:40,887 --> 00:00:43,299
If things continue,

17
00:00:43,323 --> 00:00:48,577
then maybe not only will the start-ups in which we invest struggle,

18
00:00:48,601 --> 00:00:51,162
because there won't be people to have jobs

19
00:00:51,186 --> 00:00:54,048
to pay for the things that they make and buy them,

20
00:00:54,072 --> 00:00:56,989
but our economy and society might struggle, too.

21
00:00:57,013 --> 00:00:58,918
And look, I should be the guy

22
00:00:58,942 --> 00:01:01,871
who sits here and tells you, "Everything is going to be fine."

23
00:01:01,895 --> 00:01:03,498
It's all going to work out great.

24
00:01:03,522 --> 00:01:05,617
Hey, when they introduced the ATM machine,

25
00:01:05,641 --> 00:01:08,489
years later there are more tellers in banks, it's true.
26
00:01:08,513 --> 00:01:10,474
And yet, when I looked at it, I thought,

27
00:01:10,498 --> 00:01:13,093
"This is going to accelerate, and if it does accelerate

28
00:01:13,117 --> 00:01:15,133
there's a chance the center doesn't hold."

29
00:01:15,157 --> 00:01:17,602
But I figured somebody must know the answer to this,

30
00:01:17,626 --> 00:01:19,164
there's so many ideas out there.

31
00:01:19,188 --> 00:01:22,696
And I read all the books and I went to the conferences,

32
00:01:22,720 --> 00:01:26,434
and at one point we counted more than 100 efforts

33
00:01:26,458 --> 00:01:28,363
to study the future of work.

34
00:01:28,387 --> 00:01:31,244
And it was a frustrating experience,

35
00:01:31,268 --> 00:01:35,283
because I'd hear the same back-and-forth over and over again:

36
00:01:35,307 --> 00:01:37,085
"The robots are coming,"

37
00:01:37,109 --> 00:01:38,712
and then somebody else would say,

38
00:01:38,736 --> 00:01:40,086
"Oh, don't worry about that,

39
00:01:40,110 --> 00:01:42,300
they've always said that and it turns out OK."

40
00:01:42,324 --> 00:01:43,928
And then somebody else would say,

41
00:01:43,952 --> 00:01:46,624
"Well, it's really about the meaning of your job anyway."

42
00:01:46,648 --> 00:01:49,592
And then everybody would shrug and [unclear] and have a drink.

43
00:01:49,616 --> 00:01:52,676
And it felt like there was this Kabuki theater of this discussion

44
00:01:52,700 --> 00:01:54,588
where nobody was talking to each other.

45
00:01:54,612 --> 00:01:55,834
And many of the people

46
00:01:55,858 --> 00:01:58,287
that I knew and worked with in the technology world

47
00:01:58,311 --> 00:01:59,985
were not speaking to policy makers,

48
00:02:00,009 --> 00:02:02,077
the policy makers were not speaking to them,

49
00:02:02,101 --> 00:02:05,809
and so we partnered with a non-partisan think tank NGO called New America

50
00:02:05,833 --> 00:02:07,166
to study this issue.
51
00:02:07,190 --> 00:02:09,538
And we brought together a group of people

52
00:02:09,562 --> 00:02:12,992
including an AI czar at a technology company,

53
00:02:13,016 --> 00:02:16,395
and a video game designer, and a heartland conservative,

54
00:02:16,419 --> 00:02:19,617
and a Wall Street investor, and a socialist magazine editor,

55
00:02:19,641 --> 00:02:22,506
literally, all in the same room, it was occasionally awkward,

56
00:02:22,530 --> 00:02:25,244
and tried to figure out what it is that will happen here.

57
00:02:25,268 --> 00:02:28,088
The question we asked was simple.

58
00:02:28,704 --> 00:02:32,117
It was, what is the effect of technology on work going to be?

59
00:02:32,141 --> 00:02:33,768
And we looked out 10 to 20 years,

60
00:02:33,792 --> 00:02:37,323
because we wanted to look out far enough that there could be a real change,

61
00:02:37,347 --> 00:02:40,196
but soon enough that we weren't talking about teleportation

62
00:02:40,220 --> 00:02:41,387
or anything like that.

63
00:02:41,411 --> 00:02:42,935
And we recognized,

64
00:02:42,959 --> 00:02:45,744
and I think every year we're reminded of this in the world,

65
00:02:45,768 --> 00:02:47,975
that predicting what's going to happen is hard,

66
00:02:47,999 --> 00:02:50,855
so instead of predicting, there are other things you can do.

67
00:02:50,879 --> 00:02:53,720
Which is, you can try to imagine alternate possible futures,

68
00:02:53,744 --> 00:02:56,633
which is what we did, we did a scenario-planning exercise,

69
00:02:56,657 --> 00:02:59,723
and we imagined cases where no job is safe.

70
00:02:59,747 --> 00:03:02,850
We imagined cases where every job is safe.

71
00:03:02,874 --> 00:03:06,913
And we imagined every distinct possibility we could.
72
00:03:06,937 --> 00:03:10,220
And the result, which really surprised us,

73
00:03:10,244 --> 00:03:13,942
was when you think through those futures and you think what should we do,

74
00:03:13,966 --> 00:03:15,688
the answers about what we should do

75
00:03:15,712 --> 00:03:19,098
actually turn out to be the same no matter what happens.

76
00:03:19,425 --> 00:03:22,973
And the irony of looking out 10 to 20 years into the future

77
00:03:22,997 --> 00:03:25,600
is you realize that the things we want to act on

78
00:03:25,624 --> 00:03:27,585
are actually already happening right now.

79
00:03:27,609 --> 00:03:30,395
The automation is right now, the future is right now.

80
00:03:30,419 --> 00:03:33,014
BF: So what does that mean, and what does that tell us,

81
00:03:33,038 --> 00:03:35,702
if the future is now, what is it that we should be doing

82
00:03:35,726 --> 00:03:37,630
and what should we be thinking about?

83
00:03:37,654 --> 00:03:39,734
RB: We have to understand the problem first.

84
00:03:39,758 --> 00:03:43,797
And so the data are that as the economy becomes more productive,

85
00:03:43,821 --> 00:03:45,966
and individual workers become more productive,

86
00:03:45,990 --> 00:03:47,252
their wages haven't risen.

87
00:03:47,276 --> 00:03:50,466
If you look at the proportion of prime working-age men,

88
00:03:50,490 --> 00:03:51,982
in the United States at least,

89
00:03:52,006 --> 00:03:55,755
who work now versus in 1960,

90
00:03:55,779 --> 00:03:58,247
we have three times as many men not working,

91
00:03:58,271 --> 00:03:59,752
and then you hear the stories.

92
00:03:59,776 --> 00:04:01,808
I sat down with a group of Walmart workers

93
00:04:01,832 --> 00:04:04,188
and I said, "What do you think about this cashier,

94
00:04:04,212 --> 00:04:06,069
this futuristic self-checkout thing?"
95
00:04:06,093 --> 00:04:07,252
They said, "That's nice,

96
00:04:07,276 --> 00:04:09,292
but have you heard about the cash recycler,

97
00:04:09,316 --> 00:04:11,688
that's a machine that's being installed right now,

98
00:04:11,712 --> 00:04:14,291
it's eliminating two jobs at every Walmart right now."

99
00:04:14,315 --> 00:04:16,555
And we thought we didn't understand the problem,

100
00:04:16,579 --> 00:04:20,459
and we looked at the voices that were excluded.

101
00:04:20,483 --> 00:04:23,125
Which is, all of the people affected by this change.

102
00:04:23,149 --> 00:04:24,736
And we decided to listen to them,

103
00:04:24,760 --> 00:04:26,744
sort of, automation and its discontents.

104
00:04:26,768 --> 00:04:29,014
I've spent the last couple of years doing that.

105
00:04:29,038 --> 00:04:31,467
I've been to Flint, Michigan, and Youngstown, Ohio,

106
00:04:31,491 --> 00:04:32,871
talking to entrepreneurs,

107
00:04:32,895 --> 00:04:35,427
trying to make it work in a very different environment

108
00:04:35,451 --> 00:04:38,069
from New York or San Francisco or London or Tokyo.

109
00:04:38,735 --> 00:04:40,117
I've been to prisons twice,

110
00:04:40,141 --> 00:04:42,714
to talk to inmates about their jobs after they leave.

111
00:04:43,077 --> 00:04:46,824
I've sat down with truck drivers to ask them about the self-driving truck,

112
00:04:46,848 --> 00:04:49,302
with people who, in addition to their full-time job,

113
00:04:49,326 --> 00:04:51,110
care for an aging relative,

114
00:04:51,134 --> 00:04:52,698
and when you talk to people,

115
00:04:52,722 --> 00:04:55,602
there were two themes that came out loud and clear.

116
00:04:56,285 --> 00:05:01,129
The first one was that people are less looking for more money

117
00:05:01,153 --> 00:05:04,431
or to get out of the fear of the robot taking their job;

118
00:05:04,455 --> 00:05:06,351
they just want something stable.
119
00:05:06,375 --> 00:05:07,915
They want something predictable.

120
00:05:07,939 --> 00:05:11,614
So if you survey people and ask them what they want out of work,

121
00:05:11,638 --> 00:05:15,118
for everybody who makes less than 150,000 dollars a year,

122
00:05:15,142 --> 00:05:18,498
they'll take a more stable and secure income, on average,

123
00:05:18,522 --> 00:05:20,387
over earning more money.

124
00:05:20,411 --> 00:05:22,625
And if you think about the fact that

125
00:05:22,649 --> 00:05:26,037
not only for all of the people across the earth who don't earn a living,

126
00:05:26,061 --> 00:05:27,252
but for those who do,

127
00:05:27,276 --> 00:05:30,236
the vast majority earn a different amount from month to month

128
00:05:30,260 --> 00:05:31,474
and have an instability,

129
00:05:31,498 --> 00:05:32,895
all of a sudden you realize,

130
00:05:32,919 --> 00:05:35,466
"Wait a minute, we have a real problem on our hands."

131
00:05:35,490 --> 00:05:39,164
And the second thing they say, which took us a longer time to understand,

132
00:05:39,188 --> 00:05:41,522
is that they say they want dignity.

133
00:05:41,894 --> 00:05:47,006
And that concept of self-worth through work

134
00:05:47,030 --> 00:05:49,641
emerged again and again and again in our conversations.

135
00:05:49,665 --> 00:05:52,649
BF: So, I certainly appreciate this answer,

136
00:05:52,673 --> 00:05:54,057
but you can't eat dignity,

137
00:05:54,081 --> 00:05:57,053
you can't clothe your children with self-esteem.

138
00:05:57,077 --> 00:06:02,418
So, what is that, how do you reconcile what does dignity mean

139
00:06:02,442 --> 00:06:06,117
and what is the relationship between dignity and stability?

140
00:06:06,141 --> 00:06:07,394
RB: You can't eat dignity.

141
00:06:07,418 --> 00:06:08,625
You need stability first.
142
00:06:08,649 --> 00:06:09,887
And the good news is,

143
00:06:09,911 --> 00:06:12,666
many of the conversations that are happening right now,

144
00:06:12,690 --> 00:06:14,252
are about how we solve that.

145
00:06:14,276 --> 00:06:18,141
You know, I'm a proponent of studying guaranteed income,

146
00:06:18,165 --> 00:06:19,442
as one example.

147
00:06:19,466 --> 00:06:21,754
Conversations about how health care gets provided

148
00:06:21,778 --> 00:06:23,017
and other benefits.

149
00:06:23,041 --> 00:06:24,818
Those conversations are happening,

150
00:06:24,842 --> 00:06:27,238
and we're at a time where we must figure that out,

151
00:06:27,262 --> 00:06:28,903
it is the crisis of our era.

152
00:06:28,927 --> 00:06:31,839
And my point of view after talking to people

153
00:06:31,863 --> 00:06:33,903
is that we may do that,

154
00:06:33,927 --> 00:06:35,506
and it still might not be enough.

155
00:06:35,530 --> 00:06:37,712
Because what we need to do from the beginning,

156
00:06:37,736 --> 00:06:40,672
is understand what it is about work that gives people dignity,

157
00:06:40,696 --> 00:06:43,966
so that they can live the lives that they want to live.

158
00:06:43,990 --> 00:06:48,005
And so that concept of dignity is ...

159
00:06:48,029 --> 00:06:49,966
it's difficult to get your hands around.

160
00:06:49,990 --> 00:06:53,426
Because what many people hear, and especially, to be honest, rich people,

161
00:06:53,450 --> 00:06:54,651
they hear meaning.

162
00:06:54,675 --> 00:06:56,593
They hear "my work is important to me."

163
00:06:56,617 --> 00:06:58,157
And again, if you survey people,

164
00:06:58,181 --> 00:07:00,355
and you ask them,

165
00:07:00,379 --> 00:07:03,879
"How important is it to you that your work be important to you?"

166
00:07:03,903 --> 00:07:07,204
only people who make 150,000 dollars a year or more

167
00:07:07,228 --> 00:07:11,282
say that it is important to them that their work be important.
168
00:07:12,050 --> 00:07:13,275
BF: Meaning, meaningful?

169
00:07:13,744 --> 00:07:16,502
RB: Just defined as, "Is your work important to you?"

170
00:07:17,976 --> 00:07:19,696
Whatever somebody took that to mean.

171
00:07:19,720 --> 00:07:21,680
And yet, of course dignity is essential,

172
00:07:21,704 --> 00:07:23,466
we talked to truck drivers who said,

173
00:07:23,490 --> 00:07:25,052
"I saw my cousin drive,

174
00:07:25,076 --> 00:07:27,336
and I got on the open road and it was amazing,

175
00:07:27,360 --> 00:07:30,432
and I started making more money than people who went to college."

176
00:07:30,456 --> 00:07:32,661
And then they'd get to the end of their thought

177
00:07:32,685 --> 00:07:33,871
and say something like,

178
00:07:33,895 --> 00:07:36,483
"People need their fruits and vegetables in the morning,

179
00:07:36,507 --> 00:07:38,151
I'm the guy who gets it to them."

180
00:07:38,175 --> 00:07:40,719
And we talked to somebody who, in addition to his job,

181
00:07:40,743 --> 00:07:41,942
was caring for his aunt.

182
00:07:41,966 --> 00:07:44,877
He was making plenty of money, and at one point we just asked,

183
00:07:44,901 --> 00:07:48,758
"What is it about caring for your aunt, can't you pay somebody to do it?"

184
00:07:48,782 --> 00:07:52,204
He said, "My aunt doesn't want somebody we pay for, she wants me."

185
00:07:52,601 --> 00:07:56,268
And so there was this concept there of being needed.

186
00:07:56,292 --> 00:07:58,825
And if you study the word "dignity," it's fascinating,

187
00:07:58,849 --> 00:08:01,341
it's one of the oldest words in the English language,

188
00:08:01,365 --> 00:08:03,474
it's from antiquity and it has two meanings:

189
00:08:03,498 --> 00:08:04,672
one is self-worth,

190
00:08:04,696 --> 00:08:08,653
and the other is that something is suitable, it's fitting.
191
00:08:08,677 --> 00:08:11,535
Meaning that you're part of something greater than yourself,

192
00:08:11,559 --> 00:08:15,050
and it connects to some broader whole, in other words, that you're needed.

193
00:08:15,074 --> 00:08:16,947
BF: So how do you answer this question,

194
00:08:16,971 --> 00:08:21,387
this concept that we don't pay teachers and elder-care workers

195
00:08:21,411 --> 00:08:24,537
and we don't pay people who really care for people

196
00:08:24,561 --> 00:08:26,847
and are needed, enough?

197
00:08:26,871 --> 00:08:30,062
RB: Well, the good news is, people are finally asking the question,

198
00:08:30,086 --> 00:08:32,380
so as AI investors, we often get phone calls

199
00:08:32,404 --> 00:08:35,134
from foundations or CEOs and boardrooms saying,

200
00:08:35,158 --> 00:08:36,491
"What do we do about this?"

201
00:08:36,515 --> 00:08:37,823
And they used to be asking,

202
00:08:37,847 --> 00:08:39,957
"What do we do about introducing automation?"

203
00:08:39,982 --> 00:08:42,679
And now they're asking, "What do we do about self-worth?"

204
00:08:42,703 --> 00:08:45,125
And they know that the employees who work for them,

205
00:08:45,149 --> 00:08:47,371
who have a spouse who cares for somebody,

206
00:08:47,395 --> 00:08:50,918
that dignity is essential to their ability to just do their job.

207
00:08:50,942 --> 00:08:52,744
I think there's two kinds of answers:

208
00:08:52,768 --> 00:08:55,268
there's the money side of just making your life work.

209
00:08:55,292 --> 00:08:56,442
That's stability.

210
00:08:56,466 --> 00:08:57,617
You need to eat.

211
00:08:57,641 --> 00:08:59,998
And then you think about our culture more broadly,

212
00:09:00,022 --> 00:09:02,680
and you ask who do we make into heroes?

213
00:09:02,704 --> 00:09:07,164
And, you know, what I want is to see the magazine cover

214
00:09:07,188 --> 00:09:09,849
that is the person who is the heroic caregiver.
215
00:09:10,292 --> 00:09:13,022
Or the Netflix series that dramatizes the person

216
00:09:13,046 --> 00:09:16,283
who makes all of our other lives work so we can do the things we do.

217
00:09:16,307 --> 00:09:18,156
Let's make heroes out of those people,

218
00:09:18,180 --> 00:09:20,212
that's the Netflix show that I would binge.

219
00:09:20,236 --> 00:09:22,514
And we've had chroniclers of this before,

220
00:09:22,538 --> 00:09:23,728
Studs Terkel,

221
00:09:23,752 --> 00:09:27,458
the oral history of the working experience in the United States.

222
00:09:27,482 --> 00:09:30,634
And what we need is the experience of needing one another

223
00:09:30,658 --> 00:09:32,283
and being connected to each other.

224
00:09:32,307 --> 00:09:35,347
Maybe that's the answer for how we all fit as a society.

225
00:09:35,371 --> 00:09:38,593
And the thought exercise to me is if you were to go back 100 years,

226
00:09:38,617 --> 00:09:41,586
and have people, you know, my grandparents, great-grandparents,

227
00:09:41,610 --> 00:09:43,490
a tailor, worked in a mine,

228
00:09:43,514 --> 00:09:45,585
they look at what all of us do for a living,

229
00:09:45,609 --> 00:09:47,339
they say, "That's not work."

230
00:09:47,363 --> 00:09:51,132
We sit there and type and talk and there's no danger of getting hurt.

231
00:09:51,526 --> 00:09:55,017
And my guess is that if you were to imagine 100 years from now,

232
00:09:55,041 --> 00:09:57,065
we'll still be doing things for each other.

233
00:09:57,089 --> 00:09:58,549
We'll still need one another.

234
00:09:58,573 --> 00:10:00,514
And we just will think of it as work.

235
00:10:00,538 --> 00:10:02,212
The entire thing I'm trying to say

236
00:10:02,236 --> 00:10:05,109
is that dignity should not just be about having a job.
237
00:10:05,133 --> 00:10:07,990
Because if you say you need a job to have dignity,

238
00:10:08,014 --> 00:10:09,458
which many people say,

239
00:10:09,482 --> 00:10:12,299
the second you say that, you say to all the parents,

240
00:10:12,323 --> 00:10:14,768
and all the teachers and all the caregivers

241
00:10:14,792 --> 00:10:15,966
that all of a sudden,

242
00:10:15,990 --> 00:10:18,537
because they're not being paid for what they're doing,

243
00:10:18,561 --> 00:10:20,854
it somehow lacks this essential human quality.

244
00:10:20,878 --> 00:10:22,982
To me, that's the great puzzle of our time --

245
00:10:23,006 --> 00:10:26,117
can we figure out how to provide that stability throughout life,

246
00:10:26,141 --> 00:10:28,674
and then can we figure out how to create an inclusive,

247
00:10:28,698 --> 00:10:32,965
not just racially, gender, but multigenerationally inclusive,

248
00:10:32,989 --> 00:10:37,206
I mean, every different human experience

249
00:10:37,230 --> 00:10:41,095
included in this way of understanding how we can be needed by one another.

250
00:10:41,119 --> 00:10:42,485
BF: Thank you. RB: Thank you.

251
00:10:42,509 --> 00:10:44,706
BF: Thank you very much for your participation.

252
00:10:44,730 --> 00:10:45,880
(Applause)