♪ [music] ♪

- [Narrator] Welcome to Nobel Conversations. In this episode, Josh Angrist and Guido Imbens sit down with Isaiah Andrews to discuss how the field of econometrics is evolving.

- [Isaiah] So Guido and Josh, you're both pioneers in developing tools for empirical research in economics. And so I'd like to explore where you feel like the field is heading, sort of economics, econometrics, the whole thing. To start, I'd be interested to hear whether you feel like the way in which the local average treatment effects framework took hold has any lessons for how new empirical methods in economics develop and spread, or how they should.

- [Josh] That's a good question. You go first.

(laughter)

- [Guido] Yeah, so I think the important thing is to come up with good, convincing cases where the questions are clear and where the methods apply in general. One thing, looking back at the subsequent literature: I really like the regression discontinuity literature, where there were clearly a bunch of really convincing examples, and that allowed people to think more clearly, to look harder at the methodological questions, and to do clear applications that then allow you to ask, "Now, does this type of assumption seem reasonable here? What kinds of things do we not like in the early papers? How can we improve things?" So having clear applications motivating these literatures, I think, is very helpful.

- [Josh] I'm glad you mentioned regression discontinuity, Guido. I think there's a lot of complementarity between IV and RD, instrumental variables and regression discontinuity. A lot of the econometric applications of regression discontinuity are what used to be called "fuzzy" RD, where, you know, treatment isn't discrete or deterministic at the cutoff; there's just a change in rates or intensity. And the LATE framework helps us understand those applications and gives us a clear interpretation for, say, something like my paper with Victor Lavy, where we use Maimonides' rule, the class-size cutoffs. What are you getting there?
Of course, you can answer that question with a linear constant-effects model, but it turns out we're not limited to that, and an RD is still very powerful and illuminating even when, you know, the correlation between the cutoff and the variable of interest, in this case class size, is partial, maybe even not that strong.

So there was definitely a kind of parallel development. It's also interesting: nobody talked about regression discontinuity designs when we were in graduate school. It was something that other social scientists were interested in, and it kind of grew up alongside the LATE framework. We've both done work on both applications and methods there, and it's been very exciting to see that develop and become so important. It's part of a general evolution, I think, towards credible identification strategies for causal effects, and, you know, making econometrics more about causal questions and less about models.

In terms of the future, I think one thing that LATE has helped facilitate is a move towards more creative randomized trials, where there's something of interest that it's not possible or straightforward to simply turn off or on, but you can encourage it or discourage it. So you subsidize schooling with financial aid, for example. Now we have a whole framework for interpreting that, and it opens the doors to randomized trials of things that maybe would not have seemed possible before. We've used that a lot in the work we do on schools in the Blueprint Lab at MIT, where we're exploiting random assignment in very creative ways, I think.

- [Isaiah] Related to that, do you see particular factors that make for useful research in econometrics? You've alluded to the idea that having a clear connection to problems that are actually coming up in empirical practice is often a good idea.

- [Josh] Isn't it always a good idea? I often find myself sitting in an econometrics theory seminar, say the Harvard-MIT seminar, and I'm thinking, "What problem is this guy solving? Who has this problem?" And, you know, sometimes there's an embarrassing silence if I ask, or there might be a fairly contrived scenario.
I want to see where the tool is useful. There are some purely foundational tools, I do take the point. There are people who are working on conceptual foundations, and that becomes more like mathematical statistics. I remember an early example of that which I struggled to understand: the idea of stochastic equicontinuity, which one of my thesis advisors, Whitney Newey, was using to great effect. I was trying to understand that, and there isn't really... it's really foundational; it's not an application that's driving it, at least not immediately. But most things are not like that, and so there should be a problem. And I think the burden is on the seller of that sort of thing, because there's an opportunity cost in the time, attention, and effort it takes to understand things. It's on the seller to say, "Hey, I'm solving this problem, and here's a set of results that show that it's useful, and here's some insight that I get."

- [Isaiah] As you said, Josh, there's been a move in the direction of thinking more about causality in empirical work in economics. Are there any consequences of the spread of that view that surprised you, or anything that you view as downsides of the way that empirical economics has gone?

- [Josh] Sometimes I see somebody do IV and get a result which seems implausibly large. That's the usual case. So it might be, you know, an extraordinarily large causal effect of some relatively minor intervention, which was randomized or for which you could make a case that there's a good design. And when I see that, I think it's very hard for me to believe that this relatively minor intervention has such a large effect. The author will sometimes resort to the local average treatment effects theorem and say, "Well, these compliers, you know, they're special in some way; they just benefit extraordinarily from this intervention." And I'm reluctant to take that at face value. I think, you know, often when effects are too big, it's because the exclusion restriction is failing, so you don't really have the right endogenous variable to scale that result.
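(For readers who want the formal statement behind that "scaling" remark, here is a minimal sketch of the Wald/LATE estimand, assuming a binary instrument $Z$ and binary treatment $D$; the notation is added for illustration and is not part of the conversation.)

$$
\beta_{\mathrm{IV}}
\;=\;
\frac{E[\,Y \mid Z=1\,] - E[\,Y \mid Z=0\,]}{E[\,D \mid Z=1\,] - E[\,D \mid Z=0\,]}
\;=\;
E\big[\,Y_1 - Y_0 \,\big|\, D_1 > D_0\,\big],
$$

where the second equality (the LATE theorem) holds under independence of the instrument, the exclusion restriction, and monotonicity, and the event $D_1 > D_0$ picks out the compliers. If the exclusion restriction fails, the numerator (the reduced form) reflects channels other than the endogenous variable $D$, yet it is still divided by the same first stage, so the ratio can come out implausibly large, which is the pattern described here.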
And so I'm not too happy to see, you know, just sort of a generic heterogeneity argument being used to excuse something that I think might be a deeper problem.

- [Guido] I think it played somewhat of an unfortunate role in the discussions between reduced-form and structural approaches, where I feel that wasn't quite right. The instrumental variables assumptions are at the core structural assumptions about behavior. They were coming from economic thinking about the behavior of agents, and somehow it got pushed in a direction that I think wasn't really very helpful. The way we initially wrote things up, it was describing what was happening: there was a set of methods people were using, and we clarified what those methods were doing, in a way that I think contained a fair amount of insight. But somehow it got pushed into a corner in a way that I think was not necessarily very helpful.

- [Isaiah] Even just the language of reduced form versus structural I find kind of funny, in the sense that the local average treatment effect model, the potential outcomes model, is a nonparametric structural model, if you want to think about it as you sort of suggested, Guido. So there's something a little funny about putting these two things in opposition.

- [Josh] Yes, well, that language, of course, comes from the simultaneous equations framework that we inherited. It has the advantage that people seem to know what you mean when you use it, but it might be that different people are hearing different things.

- [Guido] Yeah, I think "reduced form" had become used a little bit as a pejorative, which is not really quite what it was originally intended for.

- [Isaiah] I guess something else that strikes me in thinking about the effects of the local average treatment effect framework is that often folks will appeal to a local average treatment effects intuition in settings well beyond ones where any sort of formal result has actually been shown. And I'm curious, given all the work that you did to establish LATE results in different settings, whether you have any thoughts on that.

- [Guido] I think there are going to be a lot of cases where the intuition does get you some distance, but it's going to be somewhat limited, and establishing formal results
there may be a little tricky, and they may only work in special circumstances. And you end up with a lot of formality that may not quite capture the intuition. Sometimes I'm somewhat uneasy with those, and they are not necessarily the papers I would want to write. But I do think the intuition often does capture part of the problem.

I think in some sense we were kind of very fortunate there in the way the LATE paper got handled. Actually, the editor made it much shorter, and that then allowed us to focus on very clear, crisp results, where, you know, there's this somewhat unfortunate tendency in the profession of having the papers...

- [Josh] Well, you should be able to fix that.

- [Guido] I'm trying; it takes some time to fix that.

- [Isaiah] I think this is an example where it's very clear that having it be short...

- [Josh] You should actually impose that no paper can be longer than the LATE paper.

- [Guido] At least no theory paper. Yeah, and I think, well, I'm trying very hard to get the papers to be shorter, and I think there's a lot of value there, because it's often the second part of the paper that doesn't actually get you much further in understanding things, but it does make things much harder to read. And, you know, it sort of goes back to how I think econometrics should be done: it should be reasonably close to empirical problems, and they should be very clear problems, but then often the theory doesn't need to be quite so long. Yeah, I think there things have gone a little off track.

- [Isaiah] A relatively recent change has been a seemingly big increase in demand for people with econometrics and causal effect estimation skills in the tech sector. I'm interested whether either of you have thoughts on how that's going to interact with the development of empirical methods, or empirical research in economics, going forward.

- [Josh] Sort of a meta point, which is that there's this new kind of employer, the Amazons and the Ubers of the world, and I think that's great. And I'd like to tell my students about that, you know, especially at MIT. We have a lot of computer science majors.
That's our biggest major, and I try to seduce some of those folks into economics by saying, you know, you can go work for these companies that people are very keen to work for, because the work seems exciting, and the skills that you get in econometrics are as good as or better than what any competing discipline has to offer. So you should at least take some econometrics and some econ.

I did a fun project with Uber on the labor supply of Uber drivers, and it was very exciting to be part of that. Plus, I got to drive for Uber for a while, and I thought that was fun. I did not make enough that I was tempted to give up my MIT job, but I enjoyed the experience.

I do see a challenge to our model of graduate education here, which is, if we're training people to go work at Amazon, you know, it's not clear why we should be paying graduate stipends for that. Why should the taxpayer effectively be subsidizing that? Our graduate education in the U.S. is generously subsidized; even in private universities, ultimately there's a lot of public money there. And I think the traditional rationale for that is that we were training educators and scholars, and there's a great externality from the work that we do, either a research externality or a teaching externality. But, you know, if many of our students are going to work in the private sector, that's fine, but then maybe their employers should pay for that.

- [Guido] Is that so different from people going to work for consulting firms?

- [Josh] It's not clear to me that the number of jobs in academics has changed. It's just, I feel like this is a growing sector, whereas consulting... you're right to raise that; it might be the same for consulting. But, you know, I'm placing more and more students in these businesses, so it's on my mind in a way that I've sort of not been attentive to with consulting jobs. Consulting was always important, and I think there's also some movement from consulting back into research; it's a little more fluid. A lot of the work in both domains, I have to say, is not really different. But, you know, people who are working in the tech sector are doing things that are potentially of scientific interest, but mostly it's hidden, and then you really have to say, you know, why is the government paying for this?
- [Isaiah] Yeah, although, to Guido's point, I guess there's a data question here: has the total, say, private for-profit sector employment of econ Ph.D. program graduates increased, or has it just been a substitution from finance and consulting towards tech?

- [Josh] I may be reacting to something that's not really happening.

- [Guido] So, I've actually done some work with some of these tech companies, and I don't disagree with Josh's point that we need to think a little bit about the funding model, about who in the end is paying for the education. But from a scientific perspective, not only do these places have great data (nowadays they tend to be very careful with that for privacy reasons), they also have great questions. I find it very inspiring to listen to the people there and see what kinds of questions they have, and often their questions also come up outside of these companies. I have a couple of papers with Raj Chetty and Susan Athey where we look at ways of combining experimental data and observational data. There, Raj Chetty was interested in the effect of early childhood programs on outcomes later in life, not just test scores but earnings and so on, and we developed methods that would help you shed light on that in some settings. And the same problems came up in these tech company settings. So from my perspective, it's the same as talking to people doing empirical work: I try to look at these specific problems and then try to come up with more general formulations of the problems at a higher level, so that I can think about solutions that work in a range of settings. From that perspective, the interactions with the tech companies are just very valuable and very useful. And, you know, we do have students now doing internships there and then coming back and writing more interesting theses as a result of their experiences there.

- [Narrator] If you'd like to watch more Nobel Conversations, click here. Or if you'd like to learn more about econometrics, check out Josh's Mastering Econometrics series. If you'd like to learn more about Guido, Josh, and Isaiah, check out the links in the description.