♪ [music] ♪

- [Narrator] Welcome to Nobel Conversations. In this episode, Josh Angrist and Guido Imbens sit down with Isaiah Andrews to discuss how the research was initially received and how they responded to criticism.

- At the time, did you feel like you were on to something? Did you feel like this was the beginning of a whole line of work that was going to be important, or...?

- Not so much that it was a whole line of work, but certainly I felt like, "Wow, this--" We proved something that people didn't know before, something that was worth knowing.

- Yeah, going back, compared to my job market paper, I felt this was actually a very clear, crisp result. But there definitely was a mixed reception, and I don't think anybody said, "Oh, wow, this is really something."

- No, which is the nightmare scenario for a researcher, where you think you've discovered something and then somebody else says, "Oh, I knew that." But there definitely was a need to convince people that this was worth knowing, that instrumental variables estimates a causal effect for compliers.

- Yeah, but even though it took a long time to convince a bigger audience, sometimes, even fairly quickly, the reception was pretty good among a small group of people. Gary clearly liked it a lot from the beginning, and I remember, because at that point Josh had left for Israel, but I remember explaining it to Don Rubin, and he was like, "Yeah, there really is something here."

- Not right away, though. Don took some convincing. By the time you got to Don, there had been some back and forth with him, in correspondence actually. But I remember at some point getting a call or email from him saying that he was sitting at the airport in Rome, looking at the paper and thinking, "Yeah, no, actually, you guys are onto something."

- We were happy about that.

- But that took longer than I think you remember.

- Yeah, it wasn't right away.

[laughter]

- Because I know that I was back in Israel by the time that happened. I'd left for Israel in the summer. I was only at Harvard for two years. We had that one year.

- It is remarkable, I mean, that one year was so fateful for us.

- [Guido] Yes.
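[Editor's note, on the result mentioned above: this is the LATE theorem of Imbens and Angrist's 1994 Econometrica note. A standard statement, in the conventional notation of the later literature rather than the speakers' own words: under instrument independence, the exclusion restriction, and monotonicity,

\[
\frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]}
= E\left[ Y(1) - Y(0) \mid D(1) > D(0) \right],
\]

where \(Z\) is a binary instrument, \(D(z)\) is a unit's treatment status when the instrument is set to \(z\), and \(Y(d)\) is the outcome under treatment \(d\). The left side is the Wald (instrumental variables) estimand; the right side is the average causal effect for compliers, the units with \(D(1) > D(0)\).]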
- I think we understood there was something good happening, but maybe we didn't think it was life-changing, only in retrospect.

♪ [music] ♪

- [Isaiah] As you said, it sounds like a small group of people were initially quite receptive, but perhaps it took some time for a broader group of people to come around to seeing the LATE framework as a valuable way to look at the world. I guess, over the course of that, were there periods where you thought maybe the people saying this wasn't a useful way to look at the world were right? Did you get discouraged? How did you think about it?

- I don't think I was discouraged, but the people who were saying that were smart people, well-informed econometricians, sophisticated readers, and I think the substance of the comment was: this is not what econometrics is about. Econometrics, as it was being transmitted at that time, was about structure. There was this idea that there's structure in the economy and it's our job to discover it, and what makes it structure is that it's essentially invariant. And we're saying, in the LATE theorem, that every instrument produces its own causal effect, which is in contradiction to that to some extent, and so that was where the tension was. People didn't want to give up that idea.

- Yeah, I remember once people started arguing more vocally against that, that never really bothered me that much. It seemed clear that we had a result there, and it was somewhat controversial, but controversial in a good way. It was clear that people felt they had to come out against it because--

- Well, I think what we think is good now, we might not have loved at the time.

- I remember being somewhat more upset. There was some dinner where someone said, "No, no, that paper with Josh, that was doing a disservice to the profession." We definitely had reactions like that. At some level, that may be indicative of the culture in general in economics at the time. I thought back later, what if that had happened now? If I was a senior person sitting in that conversation, I would call that out, because it really was not appropriate--

- [Josh] But it wasn't so bad. I think the criticism was-- It wasn't completely misguided; it was maybe wrong.
- No, no, but you can say the paper is wrong. But saying that it's a disservice to the profession, that's not really--

- Personal.

- Yes, and doing that, not to me, but in front of my senior colleagues.

- But nobody was saying the result was wrong. And I remember also, some of the comments were thought-provoking. We had some negative reviews, I think on the average causal response paper. Somebody said, "These compliers, you can't figure out who they are."

- Right.

- It's one thing to say you're estimating the effect of treatment on the treated or something like that. You can tell me who's treated, people in the CPS; you can't tell me who's a complier. So that was a legitimate challenge.

- That's certainly fair, and I can see why that part made people a little uneasy and uncomfortable.

- Yeah.

- But at the same time, because it showed that you couldn't really go beyond that, it was a very useful thing to realize. I remember, on the day we got to the key result, I was thinking, "Wow, this is as good as it gets. Here we actually have an insight." But clearly--

- And we had to sell it. For quite a few years, we had to sell it, and it's proven to be quite useful. I don't think we understood that it would be so useful at the time.

- No, I did feel early on this was a substantial insight.

- [Josh] Yeah, we learned something.

- But I did not think those goals were there. I don't think we were aiming for the Nobel.

[laughter]

- We were very happy to get that note in Econometrica.

♪ [music] ♪

- [Isaiah] Are there factors, or ways of approaching problems, that lead people to be better at recognizing the good stuff and taking the time to do it, as opposed to dismissing it?

- [Josh] Sometimes I think it's helpful, if you're trying to convince somebody that you have something useful to say and maybe they don't speak your language, you might need to learn their language.

- Yes, yes, exactly.
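[Editor's note, on the complier objection raised above: the reviewer's point, that compliers cannot be individually identified, is compatible with the fact that their share of the population is identified. Under monotonicity, a standard calculation (in the same conventional notation as before, not the speakers' own) gives

\[
P\big(D(1) > D(0)\big) = E[D \mid Z=1] - E[D \mid Z=0],
\]

so the first stage measures how large the complier group is, even though observing \((Z, D)\) for a single unit can never establish that that particular unit is a complier.]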
- That's what we did with Don. We figured out how to-- I remember we had a very hard time explaining the exclusion restriction to Don, maybe rightfully so. I think Guido and I eventually figured out that it wasn't formulated very clearly, and we came up with a way to do that in the potential outcomes framework that I think worked for the three of us.

- Yeah, well, it worked for the bigger literature. But I think what you're saying there is exactly right. You need to figure out how not to just say, "Okay, well, I've got this language and this works great, and I've got to convince someone else to use the language." You have to first figure out what language they're using, and only then can you try to say, "Wow, but here, you could think of it this way." But that's actually a pretty hard thing to do: getting someone from a different discipline, convincing them that two junior faculty in a different department actually have something to say to you, that takes a fair amount of effort.

- Yeah, I wrote to Don a number of times, in fairly long letters. I remember thinking this was worth doing, that if I could convince Don, that would validate the framework to some extent.

- I think both you and Don were a little bit more confident that you were right.

- Well, we used to argue a lot, and you would sometimes referee those.

[laughter]

- That was fun. It wasn't hurtful.

- I remember getting a little testy once. We had lunch in the Faculty Club, and we were talking about the draft lottery paper. We were talking about never-takers, people who would never serve in the military irrespective of whether they were getting drafted, and you and Don said something about shooting yourself in the foot as a way of getting out of the military, and that maybe the exclusion restriction for never-takers wasn't working. And then the other one would say, "Well, yes, you could do that, but why would you want to shoot yourself in the foot?"

[laughter]

- It got a little [out of hand] there.

- I usually go for moving to Canada, for my example, when I'm teaching that.
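[Editor's note: the potential outcomes formulation of the exclusion restriction discussed here is usually written, in the conventional notation rather than the speakers' own words, as

\[
Y(z, d) = Y(d) \qquad \text{for all } z \text{ and } d,
\]

i.e., the instrument affects the outcome only through its effect on treatment. For a never-taker, \(D(1) = D(0) = 0\), so the restriction requires the draft lottery number to have no effect on his outcome at all; injuring yourself to avoid service would be a direct effect of the lottery number on the outcome, and therefore a violation.]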
- But I think it's tricky. I get students coming from computer science, and they want to do things on causal inference, and it takes a huge amount of effort to figure out how they're actually thinking about the problem and whether there's something there. And so, over the years, I've gotten a little more appreciation for the fact that Don was actually willing to-- It took him a while, but he did engage, first with Josh and then with both of us, rather than dismissing it and saying, "Well, okay, I can't figure out what these guys are doing, and it's probably just not really interesting."

- Everybody always wants to figure things out quickly. You want to save time, and you want to save your brain cells for other things. The fastest route to that is to figure out why you should dismiss something.

- Yes. "I don't need to spend time on this."

♪ [music] ♪

- [Narrator] If you'd like to watch more Nobel Conversations, click here. Or if you'd like to learn more about econometrics, check out Josh's "Mastering Econometrics" series. If you'd like to learn more about Guido, Josh, and Isaiah, check out the links in the description.

♪ [music] ♪