0:00:00.900,0:00:01.900
♪ [music] ♪
0:00:03.800,0:00:05.700
- [Narrator] Welcome to [br]Nobel Conversations.
0:00:07.300,0:00:10.240
In this episode, Josh Angrist [br]and Guido Imbens,
0:00:10.240,0:00:12.000
sit down with Isaiah Andrews
0:00:12.000,0:00:14.303
to discuss how the research[br]was initially received
0:00:14.900,0:00:17.736
and how they responded [br]to criticism.
0:00:18.700,0:00:19.300
- [Isaiah] At the time, did you feel like[br]you were on to something,
0:00:20.400,0:00:24.000
you felt like this was [br]the beginning of a whole line of work
0:00:24.000,0:00:27.400
that was going to be important, or...?
0:00:27.600,0:00:30.100
Not so much that it was[br]a whole line of work,
0:00:30.100,0:00:32.600
but certainly I felt like, "Wow, this--"
0:00:32.600,0:00:34.700
We proved something[br]that people didn't know before,
0:00:34.700,0:00:39.000
that it was worth knowing.
0:00:39.000,0:00:40.000
Yeah, going back, compared to my
0:00:40.000,0:00:41.400
job market papers--
0:00:41.600,0:00:45.900
I felt this was actually a very clear crisp result.
0:00:46.400,0:00:48.400
But there definitely
0:00:48.900,0:00:53.200
was a mixed reception, and I don't[br]think anybody said,
0:00:53.300,0:00:56.900
"Oh, wow, this is already,[br]something."
0:00:57.100,0:00:59.600
No, which is the nightmare scenario for a researcher
0:01:00.300,0:01:03.200
where you think you've discovered[br]something and then somebody else,
0:01:03.300,0:01:04.800
says, "Oh, I knew that."
0:01:05.000,0:01:08.600
But there definitely was a need to[br]convince people that this was worth knowing,
0:01:09.000,0:01:12.800
that instrumental variables estimates[br]a causal effect for compliers.
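A standard way to write the result being discussed, as a sketch in potential-outcomes notation (the notation here is ours, not quoted from the paper): with a binary instrument Z and binary treatment D, under independence, exclusion, and monotonicity,

\[
\frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]}
= E\big[\,Y(1) - Y(0) \mid D(1) > D(0)\,\big],
\]

that is, the usual IV (Wald) estimand equals the average causal effect for compliers, the people whose treatment status is moved by the instrument.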
0:01:13.200,0:01:18.000
Yeah, but even though it[br]took a long time to convince
0:01:18.600,0:01:20.400
a bigger audience,
0:01:20.700,0:01:24.600
actually fairly quickly the[br]reception was pretty good
0:01:24.800,0:01:27.000
among a small group of people.
0:01:27.200,0:01:31.500
Gary clearly liked it a lot from the beginning
0:01:31.800,0:01:34.600
and I remember, because at that point Josh had left for Israel,
0:01:34.700,0:01:37.400
but I remember explaining it to Don Rubin
0:01:37.600,0:01:43.700
and he was like, "Yeah, this really is something here."
0:01:43.800,0:01:47.200
Not right away though.[br]Don took some convincing.
0:01:47.500,0:01:48.400
By the time you got to Don,
0:01:48.500,0:01:51.500
there had been some back[br]and forth with him,
0:01:51.800,0:01:53.500
in correspondence, actually.
0:01:53.700,0:01:56.700
But I remember at some[br]point getting a call or email from him
0:01:56.700,0:02:02.300
saying that he was sitting at the[br]airport in Rome
0:02:02.500,0:02:03.700
and looking at the paper and thinking,
0:02:03.700,0:02:07.000
"Yeah, no actually, you guys are onto something."
0:02:07.000,0:02:10.600
We were happy about that, but it[br]took longer than I think you remember.
0:02:10.800,0:02:12.500
Yeah, it wasn't right away
0:02:12.600,0:02:13.600
[laughter]
0:02:13.700,0:02:16.500
because I know that I was back in[br]Israel by the time that happened.
0:02:16.500,0:02:18.300
I'd left for Israel in the summer--
0:02:18.400,0:02:22.300
I was only at Harvard for two years. [br]We had that one year.
0:02:22.600,0:02:25.700
It is remarkable, I mean, that[br]one year was so fateful for us.
0:02:25.900,0:02:27.200
- [Guido] Yes.
0:02:28.500,0:02:30.200
I think we understood there was[br]something good happening,
0:02:30.200,0:02:34.000
but maybe we didn't think it was[br]life-changing, only in retrospect.
0:02:34.400,0:02:35.400
♪ [music] ♪
0:02:35.800,0:02:39.300
- [Isaiah] As you said, it sounds like a small group[br]of people were initially quite receptive,
0:02:39.300,0:02:41.000
but perhaps it took some time for
0:02:41.100,0:02:44.200
a broader group of people to come[br]around to
0:02:44.400,0:02:47.500
seeing the LATE framework[br]as a valuable way to look at the world.
0:02:47.700,0:02:50.000
I guess, over the[br]course of that,
0:02:50.100,0:02:52.200
were there periods[br]where you thought
0:02:52.300,0:02:53.200
maybe the people
0:02:53.300,0:02:55.800
saying this wasn't a useful way to[br]look at the world were right?
0:02:55.800,0:02:58.400
Did you get discouraged? [br]How did you think about it?
0:02:58.400,0:03:00.900
I don't think I was discouraged[br]but the people who were saying
0:03:00.900,0:03:03.900
that were smart people, well-informed[br]econometricians,
0:03:05.000,0:03:08.000
sophisticated readers
0:03:08.900,0:03:11.800
and I think the substance[br]of the comment was,
0:03:11.800,0:03:15.600
this is not what econometrics is about.
0:03:16.300,0:03:20.700
Econometrics, as it was being transmitted[br]at that time, was about structure.
0:03:21.300,0:03:24.700
There was this idea that[br]there's structure in the economy
0:03:25.100,0:03:27.100
and it's our job to discover it
0:03:27.200,0:03:31.200
and what makes it structure[br]is that it's essentially invariant
0:03:31.900,0:03:34.800
and so we're saying, in the LATE theorem,
0:03:34.900,0:03:39.100
that every instrument produces[br]its own causal effect,
0:03:39.300,0:03:42.100
which is in contradiction to that[br]to some extent
0:03:42.400,0:03:45.300
and so that was where the tension was.
0:03:45.300,0:03:46.300
People didn't want to give up that idea.
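In the same sketch notation as above (ours, for illustration), the tension can be made concrete: two valid instruments Z and W for the same binary treatment generally identify different parameters,

\[
\mathrm{LATE}_Z = E\big[\,Y(1)-Y(0) \mid D_Z(1) > D_Z(0)\,\big],
\qquad
\mathrm{LATE}_W = E\big[\,Y(1)-Y(0) \mid D_W(1) > D_W(0)\,\big],
\]

because each conditions on its own complier group; only under an invariant, constant-effect structure would the two necessarily coincide.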
0:03:46.300,0:03:47.700
Yeah. I remember
0:03:48.100,0:03:50.500
once people started
0:03:51.200,0:03:56.100
arguing kind of more[br]vocally against it,
0:03:56.900,0:04:00.700
that never really bothered me[br]that much. It seemed, you know,
0:04:01.000,0:04:03.700
sort of clear that we had[br]a result there and it was
0:04:04.900,0:04:08.900
somewhat controversial, but[br]controversial in a good way. It was
0:04:09.100,0:04:10.400
clear that people felt
0:04:10.700,0:04:12.800
they had to come out
0:04:12.900,0:04:14.100
against it because well,
0:04:14.100,0:04:16.800
I think what we think is good[br]now, and it is good, at the time
0:04:17.500,0:04:21.700
they might not have loved it. Yeah,[br]you know, I remember being somewhat
0:04:21.800,0:04:26.400
more upset. There was some dinner[br]where someone said, no, no, that paper,
0:04:26.700,0:04:28.300
that paper with Josh, that was really
0:04:28.800,0:04:32.600
that was doing a disservice[br]to the profession.
0:04:32.600,0:04:34.400
We had two reactions like that. So
0:04:34.800,0:04:38.200
At some level, that[br]may be indicative of the culture
0:04:38.400,0:04:40.000
in economics in general at the time.
0:04:41.400,0:04:44.300
I thought about it later -- if that[br]were to happen now,
0:04:44.600,0:04:48.200
if I was the senior person sitting[br]in that conversation, I would
0:04:48.300,0:04:52.200
call that out, because it[br]really was not appropriate,
0:04:53.000,0:04:54.200
but it was so bad.
0:04:54.600,0:04:57.300
I think the criticism was --[br]No, no. No, it wasn't,
0:04:57.700,0:05:01.500
it wasn't completely[br]misguided or wrong.
0:05:01.800,0:05:04.700
No, no, but you can[br]say a paper is wrong.
0:05:05.000,0:05:05.300
Yeah,
0:05:05.400,0:05:10.200
but saying that it's a disservice to[br]the profession, that's really personal.
0:05:10.300,0:05:14.700
Yes. And doing that, not to me, but[br]in front of my senior colleagues, yeah.
0:05:14.900,0:05:18.700
But nobody was saying the result[br]was wrong. And I remember also,
0:05:18.700,0:05:20.900
some of the comments were, you know,
0:05:20.900,0:05:24.000
thought-provoking. We had some[br]negative reviews, I think, on the
0:05:24.400,0:05:26.300
average causal response paper. Yeah,
0:05:26.500,0:05:30.600
somebody said, you know, these compliers[br]you can't figure out who they are.
0:05:31.500,0:05:31.900
Right.
0:05:32.000,0:05:33.000
See it's one thing
0:05:33.000,0:05:36.200
to say you're estimating the effect of[br]treatment on the treated or something
0:05:36.200,0:05:38.400
like that. You can tell me who's treated,
0:05:38.600,0:05:42.600
you know, people in the CPS -- but[br]you can't tell me who's a complier.
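A compact way to state that objection, again in our sketch notation rather than the reviewers' words: complier status involves both potential treatment choices, and only one of them is ever observed for any given person,

\[
\text{complier: } D_i(1) = 1,\ D_i(0) = 0,
\qquad
\text{observed: } D_i = D_i(Z_i),
\]

so no individual in the data can be labeled a complier; under monotonicity only the share of compliers, E[D | Z=1] - E[D | Z=0], is identified.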
0:05:42.800,0:05:47.300
So that was a legitimate point.[br]That's only fair. And because of that --
0:05:47.800,0:05:53.700
yeah, that part made people[br]a little uneasy and uncomfortable.
0:05:53.800,0:05:54.200
Yeah,
0:05:54.300,0:05:56.400
but at the same time,
0:05:56.900,0:06:00.800
because it showed that you couldn't[br]really go beyond that, it was
0:06:01.500,0:06:06.300
a very useful thing to realize.[br]On the day
0:06:06.500,0:06:11.900
we got to the key result, I was thinking,[br]wow, you know, this is sort of
0:06:12.100,0:06:17.100
as good as it gets in terms of[br]actually having an insight. But clearly --
0:06:17.500,0:06:21.700
we had to sell it. It took quite[br]a few years of selling. We had to sell it. Yeah,
0:06:22.000,0:06:24.800
and it's proven to[br]be quite useful.
0:06:25.500,0:06:29.300
I don't think we understood that it[br]would be so useful at the time. No,
0:06:30.100,0:06:35.600
I did feel like early on this was[br]a substantial insight into something.
0:06:35.600,0:06:40.400
But yeah, I did not[br]think the Nobel was there. Yeah.
0:06:40.500,0:06:42.400
I felt like we were aiming for the Nobel.
0:06:43.900,0:06:46.300
We were very happy to get[br]that note in Econometrica.
0:06:49.900,0:06:52.800
- [Isaiah] Are there factors or ways of approaching[br]problems that lead people to be better at,
0:06:52.800,0:06:53.100
like
0:06:53.200,0:06:56.600
recognizing the good stuff and taking the[br]time to do it, as opposed to dismissing it?
0:06:56.600,0:06:57.700
Sometimes I think it's helpful.
0:06:57.700,0:07:01.300
If you're trying to convince somebody[br]that you have something useful to say
0:07:01.900,0:07:04.500
and maybe they don't, you[br]know, speak your language,
0:07:04.700,0:07:09.900
you might need to learn their language.[br]Yes, yes. That's what we did with Don, because
0:07:10.300,0:07:11.300
we figured out
0:07:11.400,0:07:12.200
how to... Remember,
0:07:12.200,0:07:17.600
we had a very hard time explaining[br]the exclusion restriction to Don --
0:07:17.800,0:07:19.700
rightfully so, probably, because
0:07:20.000,0:07:24.600
I think Guido and I eventually figured out[br]that it wasn't formulated very clearly,
0:07:24.900,0:07:25.400
you know,
0:07:25.400,0:07:29.700
and we came up with a way to do that in[br]the potential outcomes framework that
0:07:29.700,0:07:32.700
I think kind of worked[br]for the three of us. Yeah.
0:07:33.400,0:07:35.800
Well, it's worked for[br]the bigger literature, but
0:07:35.800,0:07:39.000
I think what you're saying there is exactly[br]right. You kind of need to figure out
0:07:39.100,0:07:41.400
how to -- not just kind of say, okay, well,
0:07:41.400,0:07:43.900
I've got this language[br]and this works great
0:07:43.900,0:07:45.900
and I've got to convince someone[br]else to use that language.
0:07:45.900,0:07:47.600
You need to first figure out what language
0:07:47.700,0:07:49.300
they're using, and then
0:07:49.600,0:07:51.200
only then can you try to say,
0:07:51.200,0:07:56.100
"Well, but here you're thinking of it this way."[br]But that's actually a pretty hard thing
0:07:56.700,0:08:00.000
to do. You get someone from a[br]different discipline -- convincing them
0:08:00.200,0:08:03.300
that, kind of, two junior faculty in a[br]different department actually have something
0:08:03.300,0:08:06.600
to say to you -- that[br]takes a fair amount of effort.
0:08:07.500,0:08:10.200
Yeah, I wrote, I wrote[br]Don a number of times.
0:08:10.300,0:08:10.600
Yeah,
0:08:10.700,0:08:14.500
fairly long letters. And I remember[br]thinking this is worth doing, you know,
0:08:14.600,0:08:17.600
that if I could convince[br]Don, that would sort of
0:08:17.800,0:08:19.600
validate the framework to some extent.
0:08:20.300,0:08:25.000
I think both you and Don were a little[br]bit more confident that you were right.
0:08:25.000,0:08:28.200
We used to argue a lot, and[br]you would sometimes referee.
0:08:29.800,0:08:30.700
That was fun.
0:08:30.800,0:08:34.100
I remember it wasn't. It wasn't hurtful.
0:08:35.200,0:08:39.800
I remember it getting a little testy[br]once when we had lunch at the Faculty Club
0:08:40.600,0:08:44.400
and we were talking, we were talking[br]about the draft lottery paper. Yeah,
0:08:45.200,0:08:47.600
talking about never-takers, the kind of
0:08:47.700,0:08:51.000
people who wouldn't serve in[br]the military irrespective of
0:08:51.200,0:08:53.700
whether they were getting drafted.
0:08:54.500,0:08:58.700
And you or Don said something[br]about shooting yourself in the foot
0:08:59.800,0:09:03.400
as a way of getting out of the military,[br]and that maybe the exclusion restriction
0:09:03.400,0:09:05.300
for never-takers wasn't working.
0:09:06.300,0:09:08.900
And then whoever it was, the other[br]one said, "Well,
0:09:08.900,0:09:12.100
yes, you could do that, but why would[br]you want to shoot yourself in the foot?"
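The point at issue in that exchange, written in the same sketch notation (ours, not a quotation): with Z draft eligibility and D veteran status, the exclusion restriction requires outcomes to depend on Z only through D, so for a never-taker, with D(1) = D(0) = 0, it implies

\[
Y(Z=1, D=0) = Y(Z=0, D=0);
\]

someone who shoots himself in the foot to avoid service when drafted would violate this, since eligibility changes his outcome even though his service status does not.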
0:09:13.400,0:09:17.600
I usually go for[br]moving to Canada. Yeah, that's it.
0:09:17.900,0:09:19.400
[inaudible]
0:09:19.700,0:09:24.000
Yes, but these things are tricky. I mean, it's the --
0:09:24.100,0:09:27.900
you know, say, I get students coming[br]from computer science and they want to do
0:09:28.100,0:09:29.900
things on causal inference.
0:09:30.400,0:09:33.700
And it takes a huge amount of[br]effort to kind of figure out how
0:09:33.700,0:09:36.800
they're actually thinking about the problem,[br]whether there's something there.
0:09:37.000,0:09:38.500
And so, now over the years,
0:09:38.500,0:09:40.600
I've got a little more[br]appreciation for the fact that Don
0:09:40.800,0:09:42.100
was actually willing to kind of --
0:09:42.200,0:09:46.000
Yeah, it took a while, but he did[br]engage kind of first with Josh.
0:09:46.400,0:09:47.500
And then with both of us.
0:09:47.700,0:09:51.000
And rather than kind of dismissing it[br]and saying, okay, well, you know,
0:09:51.500,0:09:55.700
I can't figure out what these guys are[br]doing and it's probably just not really
0:09:55.800,0:09:56.700
that interesting
0:09:57.200,0:10:00.300
Everybody always wants to[br]figure things out quickly, you know,
0:10:00.300,0:10:04.300
you want to save time and you want to save[br]your brain cells for other things. So,
0:10:04.700,0:10:05.000
you know,
0:10:05.000,0:10:08.900
the fastest route to that is to figure[br]out why you should dismiss something. Yes.
0:10:09.000,0:10:11.300
Yes. I don't need to spend time on this.
0:10:12.700,0:10:15.900
If you'd like to watch more[br]Nobel Conversations, click here,
0:10:16.300,0:10:18.600
or if you'd like to learn[br]more about econometrics,
0:10:18.700,0:10:21.300
check out Josh's Mastering[br]Econometrics series.
0:10:21.800,0:10:24.700
If you'd like to learn more[br]about Guido, Josh, and Isaiah,
0:10:24.900,0:10:26.400
check out the links in the description.