NARRATOR: Welcome to Nobel Conversations. In this episode, Josh Angrist and Guido Imbens sit down with Isaiah Andrews to discuss how the field of econometrics is evolving.

ISAIAH: So, Guido and Josh, you're both pioneers in developing tools for empirical research in economics, and I'd like to explore where you feel the field is heading — economics, econometrics, the whole thing. To start, I'd be interested to hear whether you feel the way the local average treatment effects framework took hold has any lessons for how new empirical methods in economics develop and spread, or how they should.

JOSH: That's a good question. You go first.

GUIDO: Yeah, so I think the important thing is to come up with good, convincing cases where the questions are clear and where the methods apply in general. One thing, looking back at the subsequent literature: I really like the regression discontinuity literature. There were clearly a bunch of really convincing examples, and that allowed people to think more clearly and look harder at the methodological questions. Clear applications let you ask: does this type of assumption seem reasonable here? What do we not like in the early papers? How can we improve things? So having clear applications motivating these literatures is, I think, very helpful.

JOSH: I'm glad you mentioned regression discontinuity, Guido. I think there's a lot of complementarity between IV and RD — instrumental variables and regression discontinuity. A lot of the econometric applications of regression discontinuity are what used to be called fuzzy RD, where treatment is not discrete or deterministic at the cutoff, just a change in rates or intensity. The LATE framework helps us understand those applications and gives us a clear interpretation — say, in my paper with Victor Lavy, where we used Maimonides' rule, the class-size cutoffs. What are we getting there? Of course, you can answer that question with a linear constant-effects model, but it turns out we're not limited to that. And RD is still very powerful and illuminating even when the correlation between the cutoff and the variable of interest — in this case, class size — is partial, maybe even not that strong.
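A minimal sketch of the fuzzy RD logic Josh describes, on simulated data (the single cutoff at 40 and all coefficients are illustrative, not taken from the Angrist-Lavy paper): crossing the enrollment cutoff shifts class size only partially, and the cutoff indicator serves as an instrument that scales the jump in test scores by the jump in class size.

```python
# Fuzzy RD as IV: the cutoff indicator instruments for class size,
# controlling for the running variable (enrollment).
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
enrollment = rng.integers(20, 61, size=n)            # cohort size
above = (enrollment > 40).astype(float)              # Maimonides-style cutoff
ability = rng.normal(0, 1, n)                        # unobserved confounder
# First stage: crossing the cutoff lowers class size, but not
# deterministically; class size also correlates with ability, biasing OLS.
class_size = (25 + 0.4 * enrollment - 8 * above
              - 2 * ability + rng.normal(0, 3, n))
test_score = 60 - 0.3 * class_size + 5 * ability + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), class_size, enrollment])  # structural eq.
Z = np.column_stack([np.ones(n), above, enrollment])       # instruments
iv = np.linalg.solve(Z.T @ X, Z.T @ test_score)            # just-identified IV
ols = np.linalg.lstsq(X, test_score, rcond=None)[0]
print(f"OLS: {ols[1]:.2f}   IV: {iv[1]:.2f}   (true effect: -0.30)")
```

OLS is pulled away from the truth by the ability confound; the IV estimate recovers it, because the cutoff indicator moves class size but, given enrollment, has no direct path to scores.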
So there was definitely a kind of parallel development. It's also interesting that nobody talked about regression discontinuity designs when we were in graduate school. It was something that other social scientists were interested in, and it grew up alongside the LATE framework. We've both done work on both applications and methods there, and it's been very exciting to see that develop and become so important. It's part of a general evolution, I think, towards credible identification strategies and causal effects — making econometrics more about causal questions and less about models.

In terms of the future, I think one thing that LATE has helped facilitate is a move towards more creative randomized trials, where there's something of interest that is not possible or straightforward to simply turn off or on, but you can encourage it or discourage it. So you subsidize schooling with financial aid, for example. Now we have a whole framework for interpreting that, and it opens the door to randomized trials of things that maybe would not have seemed possible before. We've used that a lot in the work we do on schools in our Blueprint Lab at MIT, where we're exploiting random assignment in very creative ways, I think.
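The encouragement design Josh mentions has the same IV structure as the fuzzy RD example. A minimal sketch on simulated data (the offer, the shares of compliers and always-takers, and the effect size are all made up for illustration): the intention-to-treat effect of the randomized aid offer on earnings, divided by its first-stage effect on enrollment, is the Wald estimator, which the LATE framework interprets as the average effect for compliers.

```python
# Randomized encouragement: an aid offer (Z) moves enrollment (D) only
# for compliers; the Wald ratio ITT / first stage recovers their effect.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
offer = rng.integers(0, 2, size=n)                 # randomized aid offer
always = rng.random(n) < 0.2                       # enroll with or without aid
complier = rng.random(n) < 0.4                     # enroll only if offered aid
enroll = (always | (complier & (offer == 1))).astype(float)
earnings = 30 + 8 * enroll + rng.normal(0, 10, n)  # true enrollment effect: 8

itt = earnings[offer == 1].mean() - earnings[offer == 0].mean()
first_stage = enroll[offer == 1].mean() - enroll[offer == 0].mean()
print(f"ITT: {itt:.2f}  first stage: {first_stage:.2f}  "
      f"Wald/LATE: {itt / first_stage:.2f}  (truth: 8)")
```

Nobody is forced into or barred from enrolling; the offer only changes the probability of enrollment, which is exactly the situation the LATE theorem was built for.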
ISAIAH: Related to that, do you see particular factors that make for useful research in econometrics? You've alluded to one — having a clear connection to problems that are actually coming up in empirical practice is often a good idea.

JOSH: I'd say it's always a good idea. I often find myself sitting in an econometrics theory seminar — say, the Harvard-MIT seminar — and I'm thinking: what problem is this guy solving? Who has this problem? And sometimes there's an embarrassing silence if I ask, or there might be a fairly contrived scenario. I want to see where the tool is useful.

There are some purely foundational tools — I do take that point. There are people working on conceptual foundations, where it becomes more like mathematical statistics. I remember an early example that I struggled to understand: the idea of stochastic equicontinuity, which one of my thesis advisors, Whitney Newey, was using to great effect. I was trying to understand that, and there isn't really an application driving it — it's genuinely foundational, at least not immediately applied. But most things are not like that, and so there should be a problem. And I think it's on the seller of that sort of thing — because there's an opportunity cost in the time, attention, and effort it takes to understand things — it's on the seller to say: hey, I'm solving this problem, here's a set of results that show it's useful, and here's the insight that I get.

ISAIAH: As you said, Josh, there's been a move in the direction of thinking more about causality in economics and in empirical work in economics. Have any consequences of the spread of that view surprised you? Have there been downsides to the way empirical economics has gone?

JOSH: Sometimes I see somebody do IV and get a result which seems implausibly large. That's the usual case. So it might be an extraordinarily large causal effect of some relatively minor intervention, which was randomized, or for which you could make a case that there's a good design. And when I see that, I think: it's very hard for me to believe that this relatively minor intervention has such a large effect.
The authors will sometimes resort to the local average treatment effects theorem and say, well, these compliers, they're special in some way — they just benefit extraordinarily from this intervention. And I'm reluctant to take that at face value. I think often when effects are too big, it's because the exclusion restriction is failing, so you don't really have the right endogenous variable to scale that result. And so I'm not too happy to see a generic heterogeneity argument being used to excuse something that I think might be a deeper problem.

GUIDO: I think it played a somewhat unfortunate role in the discussions between reduced-form and structural approaches, where I feel the framing wasn't quite right. The instrumental variables assumptions are at the core structural assumptions about behavior — they were coming from economic thinking about the behavior of agents. And somehow it got pushed in a direction that I think wasn't really very helpful. The way we initially wrote things up, we were describing what was happening: there was a set of methods people were using, and we clarified what those methods were doing, in a way that I think contained a fair amount of insight. But somehow it got pushed into a corner that I think was not necessarily very helpful.

ISAIAH: Even the language of reduced form versus structural I find kind of funny, in the sense that the local average treatment effect model — the potential outcomes model — is a nonparametric structural model, if you want to think about it as you sort of suggested, Guido. So there's something a little funny about putting these two things in opposition.

JOSH: Yes, well, that language of course comes from the simultaneous equations framework that we inherited. It has the advantage that people seem to know what you mean when you use it, but maybe different people are hearing different things.

GUIDO: Yeah, I think "reduced form" has become used a little bit as a pejorative, which is not really what it was originally intended for.
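For reference, the result being invoked in this exchange is the LATE theorem of Imbens and Angrist (1994). With a binary instrument $Z$ and binary treatment $D$, under independence of the instrument, the exclusion restriction, a nonzero first stage, and monotonicity, the Wald ratio identifies the average treatment effect for compliers:

$$
\frac{E[Y \mid Z = 1] - E[Y \mid Z = 0]}{E[D \mid Z = 1] - E[D \mid Z = 0]}
= E\left[\, Y_1 - Y_0 \mid D_1 > D_0 \,\right].
$$

Josh's worry above lives in the numerator: if the exclusion restriction fails, $Z$ affects $Y$ through channels other than $D$, and dividing that leakage by a small first stage inflates it into an implausibly large "effect."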
ISAIAH: Something else that strikes me, in thinking about the effects of the local average treatment effects framework, is that folks will often appeal to a LATE intuition in settings well beyond the ones where any formal result has actually been established. Given all the work you both did to establish LATE results in different settings, I'm curious whether you have any thoughts on that.

GUIDO: I think there are going to be a lot of cases where the intuition does get you some distance, but it's going to be somewhat limited, and establishing formal results there may be tricky — they may hold only in special circumstances. You can end up with a lot of formality that doesn't quite capture the intuition. Sometimes I'm somewhat uneasy with those papers; they're not necessarily the papers I would want to write. But I do think the intuition often does capture part of the problem.

In some sense we were very fortunate in the way the LATE paper got handled. The editor made it much shorter, and that allowed us to focus on very clear, crisp results — whereas there's a somewhat unfortunate tendency in the econometrics literature towards long papers.

JOSH: Well, you should be able to fix that.

GUIDO: I'm trying. It takes some time to fix that.

JOSH: I think this is an example where it's very clear that having it be short helped. You could actually impose a rule that no paper can be longer than the LATE paper.

GUIDO: Great — at least no theory paper. And I am trying very hard to get the papers to be shorter. I think there's a lot of value to that, because it's often the second part of a paper that doesn't actually get you much further in understanding things, but does make things much harder to read. It goes back to how I think econometrics should be done: it should be reasonably close to empirical problems, the problems should be very clear, and then often the theory doesn't need to be quite so long.
JOSH: Yeah, I think there things have gone a little off track.

ISAIAH: A relatively recent change has been a seemingly big increase in demand for people with econometrics and causal-effect-estimation skills in the tech sector. I'm interested in whether either of you has thoughts on how that's going to interact with the development of empirical methods, or with empirical research in economics, going forward.

JOSH: There's a meta point here, which is that there's this new kind of employer — the Amazons and the Ubers of the world — and I think that's great, and I like to tell my students about it. Especially at MIT, we have a lot of computer science majors — that's our biggest major — and I try to seduce some of those folks into economics by saying: you can go work for these companies that people are very keen to work for, because the work seems exciting, and the skills that you get in econometrics are as good as or better than anything a competing discipline has to offer. So you should at least take some econometrics and some econ.

I did a fun project with Uber on the labor supply of Uber drivers, and it was very exciting to be part of that. Plus, I got to drive for Uber for a while, and I thought that was fun. I did not make enough that I was tempted to give up my MIT job, but I enjoyed the experience.

I do see a challenge to our model of graduate education here, which is: if we're training people to go work at Amazon, it's not clear why we should be paying graduate stipends for that — why should the taxpayer effectively be subsidizing it? Our graduate education in the U.S. is generously subsidized; even in private universities, ultimately there's a lot of public money there. And I think the traditional rationale for that is that we were training educators and scholars, and there's a great externality from the work that we do — either a research externality or a teaching externality.
But if many of our students are going to work in the private sector — that's fine, but then maybe their employers should pay for it.

GUIDO: Is that so different from people going to work for a consulting firm?

JOSH: That's true — it's not clear to me that the number of jobs in academics has changed. It's just that I feel this is a growing sector, whereas consulting — you're right to raise that; it might be the same for consulting. But I'm placing more and more students in these businesses, so it's on my mind in a way that I've not been attentive to with consulting jobs. Consulting was always important, and there's some movement from consulting back into research — it's a little more fluid. A lot of the work in both domains, I have to say, is not really different. But people who are working in the tech sector are doing things that are potentially of scientific interest, and mostly it's hidden. Then you really do have to ask: why is the government paying for this?

ISAIAH: Although, to Guido's point, I guess there's a data question here: has the total private, for-profit-sector employment of econ Ph.D. program graduates increased, or has it just been a substitution from finance and consulting towards tech?

JOSH: I may be reacting to something that's not really happening.

GUIDO: I've actually done some work with some of these tech companies, so I don't disagree with Josh's point that we need to think a little about the funding model — who, in the end, is paying for the education. But from a scientific perspective, not only do these places have great data — and nowadays they tend to be very careful with it for privacy reasons — they also have great questions. I find it very inspiring to listen to the people there and see what kinds of questions they have, and often their questions also come up outside of these companies. I have a couple of papers with Raj Chetty and Susan Athey where we look at ways of combining experimental data and observational data. Raj Chetty was interested in the effect of early-childhood programs on outcomes later in life — not just test scores, but earnings and so on — and we developed methods that help shed light on that in some settings. And the same problems came up in these tech-company settings.
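A minimal sketch of the kind of combination Guido is describing, in the spirit of the surrogate-index idea from his work with Athey, Chetty, and Wager (the data, the single surrogate, and all numbers here are simulated and purely illustrative): an observational dataset links a short-run surrogate (test scores) to the long-run outcome (earnings), and that link is used to predict the long-run effect in an experiment where earnings are not yet observed.

```python
# Surrogate index: learn score -> earnings in observational data,
# then apply the fitted index to an experiment that measured only scores.
import numpy as np

rng = np.random.default_rng(2)

# Observational data: scores and adult earnings both observed.
n_obs = 20_000
score_obs = rng.normal(0, 1, n_obs)
earnings_obs = 30 + 4 * score_obs + rng.normal(0, 8, n_obs)
A = np.column_stack([np.ones(n_obs), score_obs])
gamma = np.linalg.lstsq(A, earnings_obs, rcond=None)[0]    # fitted index

# Experimental data: an early-childhood program shifts scores by 0.5,
# but the children are too young for earnings to be observed.
n_exp = 5_000
treat = rng.integers(0, 2, size=n_exp)
score_exp = rng.normal(0, 1, n_exp) + 0.5 * treat
pred_earnings = gamma[0] + gamma[1] * score_exp

effect = (pred_earnings[treat == 1].mean()
          - pred_earnings[treat == 0].mean())
print(f"Predicted long-run effect: {effect:.2f}  (truth: 4 x 0.5 = 2)")
```

The real method leans on a surrogacy assumption — the treatment affects earnings only through the measured surrogates — and uses a rich set of surrogates rather than a single score; the sketch compresses all of that into one variable.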
And so from my perspective, it's the same as talking to people doing empirical work: I try to look at these specific problems and then come up with more general formulations — restating the problems at a higher level, so that I can think about solutions that work in a range of settings. From that perspective, the discussions with the tech companies are just very valuable and very useful.

JOSH: And we do have students now spending time doing internships there and then coming back and writing more interesting theses as a result of their experiences.

NARRATOR: If you'd like to watch more Nobel Conversations, click here. Or if you'd like to learn more about econometrics, check out Josh's Mastering Econometrics series. If you'd like to learn more about Guido, Josh, and Isaiah, check out the links in the description.