1 00:00:00,000 --> 00:00:02,550 ♪ [music] ♪ 2 00:00:03,800 --> 00:00:05,800 - [Narrator] Welcome to Nobel Conversations. 3 00:00:07,100 --> 00:00:08,100 In this episode, 4 00:00:08,100 --> 00:00:11,570 Josh Angrist and Guido Imbens sit down with Isaiah Andrews 5 00:00:11,570 --> 00:00:14,600 to discuss how the field of econometrics is evolving. 6 00:00:16,100 --> 00:00:18,750 - [Isaiah] So Guido and Josh, you're both pioneers 7 00:00:18,750 --> 00:00:21,500 in developing tools for empirical research in economics. 8 00:00:21,500 --> 00:00:22,930 And so I'd like to explore 9 00:00:22,930 --> 00:00:25,300 sort of where you feel like the field is heading, 10 00:00:25,300 --> 00:00:28,079 sort of economics, econometrics, the whole thing. 11 00:00:28,510 --> 00:00:31,290 To start, I'd be interested to hear 12 00:00:32,200 --> 00:00:35,200 about whether you feel like sort of the way in which 13 00:00:35,200 --> 00:00:38,510 the local average treatment effects framework sort of took hold 14 00:00:38,800 --> 00:00:42,100 has any lessons for how new empirical methods in economics 15 00:00:42,100 --> 00:00:44,300 develop and spread, or how they should. 16 00:00:44,560 --> 00:00:45,960 - [Josh] That's a good question. 17 00:00:46,610 --> 00:00:47,790 You go first. 18 00:00:47,790 --> 00:00:49,460 (laughter) 19 00:00:49,700 --> 00:00:52,940 - [Guido] Yeah, so I think the important thing 20 00:00:52,940 --> 00:00:58,550 is to come up with good, convincing cases 21 00:00:58,550 --> 00:01:02,207 where the questions are clear 22 00:01:02,400 --> 00:01:05,720 and where kind of the methods apply in general. 23 00:01:05,720 --> 00:01:07,560 So one thing I -- 24 00:01:08,070 --> 00:01:12,000 Kind of looking back at the subsequent literature, 25 00:01:12,200 --> 00:01:16,700 I really like the regression discontinuity literature, 26 00:01:16,700 --> 00:01:19,670 where there were clearly a bunch of really convincing examples, 27 00:01:19,670 --> 00:01:21,319 and that allowed people to kind of 28 00:01:22,300 --> 00:01:27,200 think more clearly, look harder at the methodological questions. 29 00:01:27,400 --> 00:01:28,800 Kind of do clear applications 30 00:01:28,800 --> 00:01:30,600 that then allow you to kind of think about, 31 00:01:30,600 --> 00:01:33,600 "Wow, does this type of assumption seem reasonable here? 32 00:01:33,600 --> 00:01:38,000 What kind of things do we not like in the early papers? 33 00:01:38,500 --> 00:01:39,802 How can we improve things?" 34 00:01:39,802 --> 00:01:44,210 So having clear applications motivating these literatures, 35 00:01:44,210 --> 00:01:46,400 I think, is very helpful. 36 00:01:46,800 --> 00:01:48,050 - [Josh] I'm glad you mentioned 37 00:01:48,050 --> 00:01:49,382 regression discontinuity, Guido. 38 00:01:49,382 --> 00:01:53,300 I think there's a lot of complementarity between IV and RD, 39 00:01:54,700 --> 00:01:57,060 Instrumental Variables and Regression Discontinuity. 40 00:02:00,350 --> 00:02:03,260 And a lot of the econometric applications 41 00:02:03,260 --> 00:02:04,520 of regression discontinuity 42 00:02:04,520 --> 00:02:07,230 are what used to be called "fuzzy" RD, 43 00:02:07,230 --> 00:02:11,620 where, you know, the treatment is not discrete or deterministic at the cutoff, 44 00:02:11,620 --> 00:02:14,900 but there's just a change in rates or intensity.
45 00:02:14,900 --> 00:02:18,740 And the LATE framework helps us understand those applications 46 00:02:18,740 --> 00:02:21,140 and gives us a clear interpretation 47 00:02:21,140 --> 00:02:25,000 for, say, something like, in my paper with Victor Lavy, 48 00:02:25,000 --> 00:02:28,100 where we use Maimonides' rule, the class size cutoffs. 49 00:02:28,430 --> 00:02:30,030 What are you getting there? 50 00:02:30,290 --> 00:02:31,820 Of course, you can answer that question 51 00:02:31,820 --> 00:02:33,900 with a linear constant-effects model, 52 00:02:34,200 --> 00:02:36,310 but it turns out we're not limited to that, 53 00:02:36,310 --> 00:02:39,889 and RD is still very powerful and illuminating, 54 00:02:40,630 --> 00:02:42,100 even when, you know, 55 00:02:42,100 --> 00:02:45,670 the correlation between the cutoff and the variable of interest, 56 00:02:45,670 --> 00:02:48,980 in this case class size, is partial, 57 00:02:48,980 --> 00:02:51,000 maybe even not that strong. 58 00:02:52,000 --> 00:02:54,999 So there was definitely a kind of parallel development. 59 00:02:54,999 --> 00:02:56,400 It's also interesting, 60 00:02:56,600 --> 00:02:59,780 you know, nobody talked about regression discontinuity designs 61 00:02:59,780 --> 00:03:01,220 when we were in graduate school; 62 00:03:01,220 --> 00:03:05,300 it was something that other social scientists were interested in, 63 00:03:05,800 --> 00:03:09,507 and that kind of grew up alongside the LATE framework, 64 00:03:09,507 --> 00:03:14,787 and we've both done work on both applications and methods there, 65 00:03:14,787 --> 00:03:18,377 and it's been very exciting to see that kind of develop 66 00:03:18,377 --> 00:03:19,800 and become so important. 67 00:03:20,000 --> 00:03:23,700 It's part of a general evolution, I think, towards, you know, 68 00:03:24,000 --> 00:03:27,700 credible identification strategies, causal effects... 69 00:03:28,630 --> 00:03:30,200 and, you know, making econometrics 70 00:03:30,400 --> 00:03:33,300 more about causal questions than about models. 71 00:03:33,640 --> 00:03:34,650 In terms of the future, 72 00:03:34,650 --> 00:03:37,860 I think one thing that LATE has helped facilitate 73 00:03:37,860 --> 00:03:42,700 is a move towards more creative randomized trials, where, 74 00:03:42,700 --> 00:03:44,400 you know, there's something of interest, 75 00:03:45,500 --> 00:03:48,460 it's not possible or straightforward 76 00:03:48,460 --> 00:03:50,700 to simply turn it off or on, 77 00:03:51,000 --> 00:03:54,584 but you can encourage it or discourage it. 78 00:03:54,584 --> 00:03:58,200 So you subsidize schooling with financial aid, for example. 79 00:03:59,000 --> 00:04:02,080 So now we have a whole framework for interpreting that, 80 00:04:03,600 --> 00:04:06,900 and it kind of opens the doors to randomized trials 81 00:04:06,900 --> 00:04:10,300 of things that maybe would, you know, 82 00:04:10,300 --> 00:04:12,471 not have seemed possible before. 83 00:04:14,500 --> 00:04:17,700 We've used that a lot in the work we do on schools in our -- 84 00:04:17,700 --> 00:04:21,160 in the Blueprint Lab at MIT, 85 00:04:22,360 --> 00:04:26,600 where we're exploiting random assignment in very creative ways, I think. 86 00:04:28,100 --> 00:04:31,509 - [Isaiah] Related to that, do you see sort of particular factors 87 00:04:31,509 --> 00:04:34,300 that make for useful research in econometrics?
88 00:04:34,400 --> 00:04:38,290 You've alluded to it -- having a clear connection 89 00:04:38,290 --> 00:04:40,300 to problems that are actually coming up 90 00:04:40,300 --> 00:04:42,650 in empirical practice is often a good idea. 91 00:04:43,290 --> 00:04:45,000 - [Josh] Isn't it always a good idea? 92 00:04:45,700 --> 00:04:50,100 I often find myself sitting in an econometrics theory seminar, 93 00:04:50,700 --> 00:04:52,500 say the Harvard-MIT seminar, 94 00:04:53,400 --> 00:04:56,350 and I'm thinking, "What problem is this guy solving? 95 00:04:56,350 --> 00:04:57,960 Who has this problem?" 96 00:04:57,960 --> 00:04:59,800 And, you know, 97 00:05:01,600 --> 00:05:04,700 sometimes there's an embarrassing silence if I ask, 98 00:05:04,900 --> 00:05:08,300 or there might be a fairly contrived scenario. 99 00:05:08,800 --> 00:05:11,600 I want to see where the tool is useful. 100 00:05:12,500 --> 00:05:14,900 There are some purely foundational tools, 101 00:05:14,900 --> 00:05:17,600 I do take the point; you know, there are people who are 102 00:05:18,200 --> 00:05:22,500 working on conceptual foundations, where, you know, 103 00:05:22,600 --> 00:05:25,300 it becomes more like mathematical statistics. 104 00:05:25,800 --> 00:05:28,200 I mean, I remember an early example of that that I, 105 00:05:28,200 --> 00:05:29,920 you know, I struggled to understand, 106 00:05:29,920 --> 00:05:32,500 was the idea of stochastic equicontinuity, 107 00:05:32,500 --> 00:05:35,070 which one of my thesis advisors, Whitney Newey, 108 00:05:35,070 --> 00:05:36,900 was using to great effect, 109 00:05:37,500 --> 00:05:39,900 and I was trying to understand that, and there isn't really -- 110 00:05:40,600 --> 00:05:45,200 It's really foundational; it's not an application that's driving that, 111 00:05:45,890 --> 00:05:47,300 at least not immediately. 112 00:05:48,600 --> 00:05:53,200 But most things are not like that, and so there should be a problem. 113 00:05:53,800 --> 00:05:59,100 And I think it's on the seller of that sort of thing, 114 00:06:00,100 --> 00:06:02,250 you know, because there's an opportunity cost, 115 00:06:02,250 --> 00:06:05,170 the time and attention and effort to understand things, 116 00:06:05,170 --> 00:06:07,200 so, you know, it's on the seller to say, 117 00:06:07,400 --> 00:06:08,900 "Hey, I'm solving this problem, 118 00:06:09,400 --> 00:06:12,900 and here's a set of results that show that it's useful, 119 00:06:12,900 --> 00:06:15,200 and here's some insight that I get." 120 00:06:16,200 --> 00:06:18,280 - [Isaiah] As you said, Josh, there's been a move 121 00:06:18,280 --> 00:06:20,700 in the direction of thinking more about causality 122 00:06:20,700 --> 00:06:22,800 in economics and empirical work in economics. 123 00:06:22,900 --> 00:06:24,800 Any consequences of sort of the -- 124 00:06:24,800 --> 00:06:26,570 the spread of that view that surprised you, 125 00:06:26,570 --> 00:06:28,705 or anything that you view as downsides 126 00:06:28,705 --> 00:06:31,400 of sort of the way that empirical economics has gone? 127 00:06:31,500 --> 00:06:34,190 - [Josh] Sometimes I see somebody do IV 128 00:06:34,190 --> 00:06:38,304 and they get a result which seems implausibly large. 129 00:06:38,800 --> 00:06:40,200 That's the usual case.
130 00:06:42,500 --> 00:06:45,220 So it might be, you know, an extraordinarily large 131 00:06:45,220 --> 00:06:48,600 causal effect of some relatively minor intervention, 132 00:06:49,100 --> 00:06:52,260 which was randomized, or for which you could make a case 133 00:06:52,260 --> 00:06:54,490 that there's a good design. 134 00:06:54,900 --> 00:06:58,330 And then when I see that, you know, I think 135 00:06:58,900 --> 00:07:00,465 it's very hard for me to believe 136 00:07:00,465 --> 00:07:02,030 that this relatively minor intervention 137 00:07:02,030 --> 00:07:03,720 has such a large effect. 138 00:07:04,100 --> 00:07:06,110 The author will sometimes resort 139 00:07:06,110 --> 00:07:08,690 to the local average treatment effects theorem 140 00:07:08,690 --> 00:07:10,695 and say, "Well, these compliers, 141 00:07:10,695 --> 00:07:12,700 you know, they're special in some way. 142 00:07:13,300 --> 00:07:15,800 And, you know, they just benefit extraordinarily 143 00:07:15,800 --> 00:07:17,600 from this intervention." 144 00:07:18,100 --> 00:07:20,900 And I'm reluctant to take that at face value. 145 00:07:20,900 --> 00:07:23,750 I think, you know, often when effects are too big, 146 00:07:24,300 --> 00:07:26,780 it's because the exclusion restriction is failing, 147 00:07:26,780 --> 00:07:29,240 so you don't really have the right endogenous variable 148 00:07:29,240 --> 00:07:31,380 to scale that result. 149 00:07:32,000 --> 00:07:35,700 And so I'm not too happy to see, 150 00:07:35,700 --> 00:07:38,800 you know, just sort of a generic heterogeneity 151 00:07:38,900 --> 00:07:41,610 argument being used to excuse something 152 00:07:41,610 --> 00:07:43,800 that I think might be a deeper problem. 153 00:07:45,190 --> 00:07:47,358 - [Guido] I think it played somewhat of an unfortunate role 154 00:07:47,358 --> 00:07:49,979 in the discussions between reduced-form 155 00:07:49,979 --> 00:07:51,700 and structural approaches, 156 00:07:51,700 --> 00:07:55,510 where I feel that wasn't quite right. 157 00:07:56,090 --> 00:07:58,810 The instrumental variables assumptions 158 00:07:58,810 --> 00:08:03,622 are, at the core, structural assumptions about behavior; 159 00:08:03,622 --> 00:08:05,200 they were coming from economic -- 160 00:08:07,100 --> 00:08:09,900 thinking about the economic behavior of agents, 161 00:08:10,300 --> 00:08:15,000 and somehow it got pushed in a direction 162 00:08:15,100 --> 00:08:17,600 that I think wasn't really very helpful. 163 00:08:18,800 --> 00:08:21,700 The way, I think, initially the -- 164 00:08:22,800 --> 00:08:26,480 the way we wrote things up, it was describing what was happening: 165 00:08:26,480 --> 00:08:29,490 there were a set of methods people were using, 166 00:08:29,490 --> 00:08:32,111 and we clarified what those methods were doing, 167 00:08:32,811 --> 00:08:38,361 in a way that I think contained a fair amount of insight, 168 00:08:39,100 --> 00:08:42,050 but somehow it got pushed into a corner 169 00:08:42,050 --> 00:08:45,000 that I don't think was necessarily very helpful.
170 00:08:45,100 --> 00:08:46,850 - [Isaiah] I mean, just the language 171 00:08:46,850 --> 00:08:48,600 of reduced form versus structural 172 00:08:48,600 --> 00:08:50,820 I find kind of funny, in the sense that, 173 00:08:50,820 --> 00:08:53,100 right, the local average treatment effect model, right, 174 00:08:53,100 --> 00:08:56,110 the potential outcomes model, is a nonparametric structural model, 175 00:08:56,110 --> 00:08:58,600 if you want to think about it, as you sort of suggested, Guido. 176 00:08:58,600 --> 00:09:01,263 So there's something a little funny 177 00:09:01,263 --> 00:09:03,340 about putting these two things in opposition when -- 178 00:09:03,340 --> 00:09:04,690 - [Guido] Yes. - [Josh] Well, that language, 179 00:09:04,690 --> 00:09:08,165 of course, comes from the [simultaneous] equations framework 180 00:09:08,165 --> 00:09:09,520 that we inherited. 181 00:09:10,400 --> 00:09:11,530 It has the advantage 182 00:09:11,530 --> 00:09:14,160 that people seem to know what you mean when you use it, 183 00:09:14,160 --> 00:09:16,240 but it might be that people are hearing different -- 184 00:09:16,240 --> 00:09:18,200 different people are hearing different things. 185 00:09:18,300 --> 00:09:20,530 - [Guido] Yeah. I think ["reduced form"] has sort of become -- 186 00:09:20,530 --> 00:09:22,560 used in a little bit of a pejorative way, yeah? 187 00:09:22,560 --> 00:09:24,750 - [Josh] Sometimes. - [Guido] [The word]. 188 00:09:24,750 --> 00:09:28,250 Which is not really quite what it was originally intended for. 189 00:09:30,100 --> 00:09:33,090 - [Isaiah] I guess something else that strikes me in thinking about 190 00:09:33,090 --> 00:09:35,645 the effects of the local average treatment effect framework 191 00:09:35,645 --> 00:09:38,200 is that often folks will appeal to 192 00:09:38,200 --> 00:09:41,800 a local average treatment effects intuition for settings well beyond 193 00:09:42,000 --> 00:09:43,700 ones where any sort of formal result 194 00:09:43,700 --> 00:09:45,440 has actually been established. 195 00:09:45,440 --> 00:09:50,090 And I'm curious, given all the work that you guys did to, you know, 196 00:09:50,090 --> 00:09:52,390 establish LATE results in different settings, 197 00:09:52,390 --> 00:09:54,415 I'm curious, any thoughts on that? 198 00:09:55,360 --> 00:09:57,420 - [Guido] I think there are going to be a lot of cases 199 00:09:57,420 --> 00:10:02,200 where the intuition does get you some distance, 200 00:10:02,800 --> 00:10:05,200 but it's going to be somewhat limited, 201 00:10:05,200 --> 00:10:07,600 and establishing formal results there 202 00:10:08,400 --> 00:10:09,490 may be a little tricky 203 00:10:09,490 --> 00:10:12,700 and then maybe only work in special circumstances, 204 00:10:14,600 --> 00:10:16,540 and you end up with a lot of formality 205 00:10:16,540 --> 00:10:19,500 that may not quite capture the intuition. 206 00:10:19,900 --> 00:10:21,550 Sometimes I'm somewhat uneasy with them, 207 00:10:21,550 --> 00:10:24,438 and they are not necessarily the papers I would want to write, 208 00:10:25,148 --> 00:10:27,218 but I do think the -- 209 00:10:27,218 --> 00:10:31,217 the intuition often does capture part of the problem.
210 00:10:33,100 --> 00:10:36,300 I think, in some sense, we were kind of very fortunate there 211 00:10:36,900 --> 00:10:39,250 in the way that the LATE paper got handled at the journal, 212 00:10:39,250 --> 00:10:41,766 in that, actually, the editor made it much shorter, 213 00:10:42,100 --> 00:10:46,300 and that then allowed us to kind of focus on very clear, crisp results. 214 00:10:47,100 --> 00:10:51,770 Whereas, you know, there's this somewhat unfortunate tendency 215 00:10:51,770 --> 00:10:52,985 in the econometrics literature 216 00:10:52,985 --> 00:10:55,100 of having the papers get longer and longer. 217 00:10:55,100 --> 00:10:56,690 - [Josh] Well, you should be able to fix that, man. 218 00:10:56,690 --> 00:10:58,800 - [Guido] I'm trying to fix that. 219 00:10:59,400 --> 00:11:01,625 But I think this is an example where it's sort of very clear 220 00:11:01,625 --> 00:11:03,498 that having it be short is actually -- 221 00:11:03,498 --> 00:11:04,842 - [Josh] You should impose that no paper 222 00:11:04,842 --> 00:11:06,655 can be longer than the LATE paper. 223 00:11:06,655 --> 00:11:08,000 - [Guido] That, wow. 224 00:11:08,000 --> 00:11:09,617 That may be great. 225 00:11:09,617 --> 00:11:11,685 - [Josh] At least no theory paper. 226 00:11:11,892 --> 00:11:14,300 - [Guido] Yeah, and I think, I think... 227 00:11:14,500 --> 00:11:16,800 I'm trying very hard to get the papers to be shorter. 228 00:11:16,800 --> 00:11:18,700 And I think there's a lot of value 229 00:11:19,200 --> 00:11:21,480 there, because it's often the second part of the paper 230 00:11:21,480 --> 00:11:26,395 that doesn't actually get you much further in understanding things, 231 00:11:27,000 --> 00:11:29,870 and it does make things much harder to read. 232 00:11:30,630 --> 00:11:33,200 And, you know, it sort of goes back 233 00:11:33,200 --> 00:11:36,111 to how I think econometrics should be done: 234 00:11:36,111 --> 00:11:38,070 you should focus on -- 235 00:11:38,700 --> 00:11:41,300 it should be reasonably close to empirical problems, 236 00:11:41,500 --> 00:11:43,900 they should be very clear problems, 237 00:11:44,800 --> 00:11:48,900 but then often the theory doesn't need to be quite so long. 238 00:11:48,900 --> 00:11:50,010 - [Josh] Yeah. 239 00:11:51,100 --> 00:11:54,670 - [Guido] I think things have gone a little off track. 240 00:11:56,260 --> 00:11:57,750 - [Isaiah] A relatively recent change 241 00:11:57,750 --> 00:12:00,230 has been a seemingly big increase in demand 242 00:12:00,230 --> 00:12:02,200 for people with sort of econometrics, 243 00:12:02,200 --> 00:12:04,800 causal effect estimation skills in the tech sector. 244 00:12:05,000 --> 00:12:07,480 I'm interested, do either of you have thoughts 245 00:12:07,480 --> 00:12:09,840 on sort of how that's going to interact 246 00:12:09,840 --> 00:12:11,600 with the development of empirical methods, 247 00:12:11,600 --> 00:12:13,950 or empirical research in economics going forward? 248 00:12:14,600 --> 00:12:16,770 - [Josh] Well, there's sort of a meta point, 249 00:12:16,770 --> 00:12:21,000 which is, there's this new kind of employer, 250 00:12:21,800 --> 00:12:27,530 the Amazons and the Ubers and, you know, the TripAdvisor world, 251 00:12:28,000 --> 00:12:29,300 and I think that's great. 252 00:12:29,300 --> 00:12:32,600 And I like to tell my students about that, you know, especially -- 253 00:12:32,600 --> 00:12:35,500 at MIT we have a lot of computer science majors. 254 00:12:35,500 --> 00:12:37,000 That's our biggest major.
255 00:12:37,400 --> 00:12:42,246 And I try to seduce some of those folks into economics by saying, 256 00:12:42,246 --> 00:12:45,700 you know, you can go work for these, 257 00:12:45,800 --> 00:12:49,250 you know, companies that people are very keen to work for, 258 00:12:49,250 --> 00:12:50,800 because the work seems exciting, 259 00:12:52,000 --> 00:12:54,250 and, you know, the skills that you get in econometrics 260 00:12:54,250 --> 00:12:56,100 are as good or better 261 00:12:56,100 --> 00:12:59,736 than any competing discipline has to offer. 262 00:12:59,736 --> 00:13:01,100 So you should at least 263 00:13:01,400 --> 00:13:04,200 take some econ -- take some econometrics and some econ. 264 00:13:04,800 --> 00:13:07,000 I did a fun project with Uber 265 00:13:07,600 --> 00:13:09,770 on labor supply of Uber drivers, 266 00:13:09,920 --> 00:13:12,805 and it was very, very exciting to be part of that. 267 00:13:13,100 --> 00:13:15,400 Plus I got to drive for Uber for a while, 268 00:13:15,900 --> 00:13:17,730 and I thought that was fun too. 269 00:13:17,730 --> 00:13:20,700 I did not make enough that I was tempted to 270 00:13:21,100 --> 00:13:25,100 give up my MIT job, but I enjoyed the experience. 271 00:13:25,230 --> 00:13:30,900 I see a potential challenge to our model of graduate education here, 272 00:13:31,700 --> 00:13:37,400 which is, if we're training people to go work at Amazon, you know, 273 00:13:37,900 --> 00:13:41,190 it's not clear why, you know, we should be paying 274 00:13:41,190 --> 00:13:42,900 graduate stipends for that. 275 00:13:43,200 --> 00:13:47,280 Why should the taxpayer effectively be subsidizing that? 276 00:13:47,280 --> 00:13:51,400 Our graduate education in the US is generously subsidized; 277 00:13:51,400 --> 00:13:53,160 even in private universities, 278 00:13:53,160 --> 00:13:56,100 it's ultimately -- there's a lot of public money there, 279 00:13:56,100 --> 00:13:59,300 and I think the traditional rationale for that is, 280 00:13:59,500 --> 00:14:03,900 you know, we were training educators and scholars, and there's a great externality 281 00:14:04,300 --> 00:14:05,700 from the work that we do, 282 00:14:05,700 --> 00:14:09,600 either a research externality or a teaching externality. 283 00:14:10,100 --> 00:14:14,600 But, you know, if many of our students are going to work in the private sector, 284 00:14:16,300 --> 00:14:21,700 that's fine, but then maybe their employers should pay for that. 285 00:14:22,300 --> 00:14:25,100 - [Guido] Is that so different from people going to work in consulting? 286 00:14:26,300 --> 00:14:26,900 - [Josh] Trust me. 287 00:14:27,200 --> 00:14:33,000 It's not clear to me that the number of jobs in academia has changed. 288 00:14:33,100 --> 00:14:37,600 It's just, I feel like this is a growing sector, whereas consulting -- 289 00:14:37,700 --> 00:14:42,100 you're right to raise that -- it might be the same for consulting. 290 00:14:43,300 --> 00:14:44,400 But this, 291 00:14:44,500 --> 00:14:47,500 you know, I'm placing more and more students in these businesses, 292 00:14:47,500 --> 00:14:50,400 so it's on my mind in a way that I've sort of, 293 00:14:50,800 --> 00:14:55,500 you know, not been attentive to consulting jobs. You know, consulting was always 294 00:14:55,600 --> 00:15:00,400 important, and I think also there's some movement from consulting back 295 00:15:00,400 --> 00:15:02,600 into research. It's a little more fluid.
296 00:15:02,900 --> 00:15:03,500 The -- 297 00:15:03,900 --> 00:15:05,400 a lot of the work 298 00:15:06,400 --> 00:15:09,800 in both domains, I have to say, is not really different. But, 299 00:15:10,100 --> 00:15:13,700 you know, people who are working in the tech sector are doing things 300 00:15:13,700 --> 00:15:16,800 that are potentially of scientific interest, but mostly it's hidden. 301 00:15:17,100 --> 00:15:20,900 Then you really have to say, you know, why is the government paying for this? 302 00:15:21,800 --> 00:15:25,500 - [Isaiah] Yeah, although, I mean, to Guido's point, I guess there's a data 303 00:15:25,600 --> 00:15:30,000 question here of, has the sort of total, you know, sort of, say, 304 00:15:30,500 --> 00:15:30,900 private, 305 00:15:31,300 --> 00:15:34,100 for-profit sector employment of econ Ph.D. 306 00:15:34,100 --> 00:15:37,700 program graduates increased, or has it just been a substitution from 307 00:15:37,900 --> 00:15:40,200 finance and consulting towards tech? 308 00:15:40,300 --> 00:15:44,300 - [Josh] I may be reacting to something that's not really happening. 309 00:15:44,400 --> 00:15:48,200 - [Guido] I've actually done some work with some of these tech companies, 310 00:15:49,100 --> 00:15:52,300 so I don't disagree with Josh's point that we need to think 311 00:15:52,300 --> 00:15:55,100 a little bit about the funding model, who is in the end paying for the 312 00:15:55,600 --> 00:15:59,400 education. But from a scientific perspective, 313 00:16:00,100 --> 00:16:03,500 not only do these places have great data -- and nowadays 314 00:16:03,500 --> 00:16:07,100 they tend to be very careful with that for privacy reasons -- 315 00:16:07,500 --> 00:16:08,800 but they also have great questions. 316 00:16:10,200 --> 00:16:11,200 I find it very 317 00:16:11,600 --> 00:16:13,300 inspiring kind of to listen to 318 00:16:13,300 --> 00:16:15,800 the people there and kind of see what kind of questions they have, 319 00:16:15,900 --> 00:16:17,300 and often their questions 320 00:16:18,200 --> 00:16:20,600 also come up outside of these -- 321 00:16:20,700 --> 00:16:25,400 these companies. I have a couple of papers with Raj Chetty 322 00:16:25,800 --> 00:16:26,300 and 323 00:16:26,500 --> 00:16:31,600 Susan Athey, kind of, where we look at ways of combining experimental data 324 00:16:31,700 --> 00:16:34,100 and observational data, and, kind of, there, 325 00:16:35,500 --> 00:16:38,600 Raj Chetty was interested in what is the effect 326 00:16:38,700 --> 00:16:44,600 of early childhood programs on outcomes later in life -- not just kind of test scores, 327 00:16:44,600 --> 00:16:48,300 but on earnings and so on -- and we kind of developed methods 328 00:16:48,600 --> 00:16:51,500 that would help you shed light on that, at least 329 00:16:52,700 --> 00:16:55,000 in some settings, and the same problems 330 00:16:56,300 --> 00:17:00,500 came up kind of in these tech company settings. 331 00:17:00,800 --> 00:17:03,700 And so from my perspective, it's 332 00:17:04,400 --> 00:17:07,500 the same as kind of talking to people doing empirical work: 333 00:17:07,600 --> 00:17:11,800 I try to kind of look at these specific problems and then try to come up 334 00:17:11,900 --> 00:17:18,300 with more general problems, reformulating the problems at a higher level, 335 00:17:18,500 --> 00:17:22,900 so that I can think about solutions that work in a range of settings.
336 00:17:23,400 --> 00:17:25,500 And so from that perspective, the 337 00:17:25,700 --> 00:17:30,300 interactions with the tech companies are just very valuable and very useful. 338 00:17:30,900 --> 00:17:31,400 You know, 339 00:17:31,700 --> 00:17:33,700 we do have students now 340 00:17:33,800 --> 00:17:37,200 doing internships there and then coming back and writing 341 00:17:37,400 --> 00:17:43,400 more interesting theses as a result of their experiences there. 342 00:17:44,600 --> 00:17:47,020 - [Narrator] If you'd like to watch more Nobel Conversations, 343 00:17:47,020 --> 00:17:48,200 click here, 344 00:17:48,200 --> 00:17:50,500 or if you'd like to learn more about econometrics, 345 00:17:50,500 --> 00:17:53,100 check out Josh's "Mastering Econometrics" series. 346 00:17:53,700 --> 00:17:56,720 If you'd like to learn more about Guido, Josh, and Isaiah, 347 00:17:56,720 --> 00:17:58,300 check out the links in the description. 348 99:59:59,999 --> 99:59:59,999 ♪ [music] ♪