1
00:00:00,900 --> 00:00:01,900
♪ [music] ♪
2
00:00:03,800 --> 00:00:05,700
- [Narrator] Welcome to
Nobel Conversations.
3
00:00:07,300 --> 00:00:10,240
In this episode, Josh Angrist
and Guido Imbens,
4
00:00:10,240 --> 00:00:12,000
sit down with Isaiah Andrews
5
00:00:12,000 --> 00:00:14,303
to discuss how the research
was initially received
6
00:00:14,900 --> 00:00:17,736
and how they responded
to criticism.
7
00:00:18,700 --> 00:00:19,300
At the time, did you feel like
you were on to something,
8
00:00:20,400 --> 00:00:24,000
you felt like this was
the beginning of a whole line of work
9
00:00:24,000 --> 00:00:27,400
that you felt like was going to be important or...?
10
00:00:27,600 --> 00:00:30,100
Not so much that it was
a whole line of work,
11
00:00:30,100 --> 00:00:32,600
but certainly I felt like, "Wow, this--"
12
00:00:32,600 --> 00:00:34,700
We proved something
that people didn't know before,
13
00:00:34,700 --> 00:00:39,000
that it was worth knowing.
14
00:00:39,000 --> 00:00:40,000
Yeah, going back compared to my
15
00:00:40,000 --> 00:00:41,400
job market papers--
16
00:00:41,600 --> 00:00:45,900
I felt this was actually a very clear, crisp result.
17
00:00:46,400 --> 00:00:48,400
But there definitely
18
00:00:48,900 --> 00:00:53,200
was a mixed reception, and I don't
think anybody said,
19
00:00:53,300 --> 00:00:56,900
"Oh, wow, this is really
something."
20
00:00:57,100 --> 00:00:59,600
No, which is the nightmare scenario for a researcher
21
00:01:00,300 --> 00:01:03,200
where you think you've discovered
something and then somebody else,
22
00:01:03,300 --> 00:01:04,800
says, "Oh, I knew that."
23
00:01:05,000 --> 00:01:08,600
But there definitely was a need to
convince people that this was worth knowing,
24
00:01:09,000 --> 00:01:12,800
that instrumental variables estimate
a causal effect for compliers.
25
00:01:13,200 --> 00:01:18,000
Yeah, but even though it
took a long time to convince
26
00:01:18,600 --> 00:01:20,400
a bigger audience,
27
00:01:20,700 --> 00:01:24,600
sometimes even fairly quickly, the
reception was pretty good
28
00:01:24,800 --> 00:01:27,000
among a small group of people.
29
00:01:27,200 --> 00:01:31,500
Gary clearly liked it a lot from the beginning
30
00:01:31,800 --> 00:01:34,600
and I remember, because at that point Josh had left for Israel,
31
00:01:34,700 --> 00:01:37,400
but I remember explaining it to Don Rubin
32
00:01:37,600 --> 00:01:43,700
and he was like, "Yeah, there really is something here."
33
00:01:43,800 --> 00:01:47,200
Not right away though.
Don took some convincing.
34
00:01:47,500 --> 00:01:48,400
By the time you got to Don,
35
00:01:48,500 --> 00:01:51,500
there had been some back
and forth with him
36
00:01:51,800 --> 00:01:53,500
and in correspondence actually.
37
00:01:53,700 --> 00:01:56,700
But I remember at some
point getting a call or email from him
38
00:01:56,700 --> 00:02:02,300
saying that he was sitting at the
airport in Rome
39
00:02:02,500 --> 00:02:03,700
and looking at the paper and thinking,
40
00:02:03,700 --> 00:02:07,000
"Yeah, no actually, you guys are onto something."
41
00:02:07,000 --> 00:02:10,600
We were happy about that, but it
took longer than I think you remember.
42
00:02:10,800 --> 00:02:12,500
Yeah, it wasn't right away
43
00:02:12,600 --> 00:02:13,600
[laughter]
44
00:02:13,700 --> 00:02:16,500
because I know that I was back in
Israel by the time that happened.
45
00:02:16,500 --> 00:02:18,300
I'd left for Israel in the summer--
46
00:02:18,400 --> 00:02:22,300
I was only at Harvard for two years.
We had that one year.
47
00:02:22,600 --> 00:02:25,700
It is remarkable, I mean, that
one year was so fateful for us.
48
00:02:25,900 --> 00:02:27,200
- [Guido] Yes.
49
00:02:28,500 --> 00:02:30,200
I think we understood there was
something good happening,
50
00:02:30,200 --> 00:02:34,000
but maybe we didn't think it was
life-changing, only in retrospect.
51
00:02:34,400 --> 00:02:35,400
♪ [music] ♪
52
00:02:35,800 --> 00:02:39,300
- [Isaiah] As you said, it sounds like a small group
of people were initially quite receptive,
53
00:02:39,300 --> 00:02:41,000
perhaps it took some time for
54
00:02:41,100 --> 00:02:44,200
a broader group of people to come
around to
55
00:02:44,400 --> 00:02:47,500
seeing the LATE framework
as a valuable way to look at the world.
56
00:02:47,700 --> 00:02:50,000
I guess, over the
course of that, did you--
57
00:02:50,100 --> 00:02:52,200
were there periods
where you thought,
58
00:02:52,300 --> 00:02:53,200
maybe the people
59
00:02:53,300 --> 00:02:55,800
saying this wasn't a useful way to
look at the world were right?
60
00:02:55,800 --> 00:02:58,400
Did you get discouraged?
How did you think about it?
61
00:02:58,400 --> 00:03:00,900
I don't think I was discouraged,
but the people who were saying
62
00:03:00,900 --> 00:03:03,900
that were smart people, well-informed metricians,
63
00:03:05,000 --> 00:03:08,000
sophisticated readers
64
00:03:08,900 --> 00:03:11,800
and I think the substance
of the comment was,
65
00:03:11,800 --> 00:03:15,600
this is not what econometrics is about.
66
00:03:16,300 --> 00:03:20,700
Econometrics as it was being transmitted at that time was about structure.
67
00:03:21,300 --> 00:03:24,700
There was this idea that
there's structure in the economy
68
00:03:25,100 --> 00:03:27,100
and it's our job to discover it
69
00:03:27,200 --> 00:03:31,200
and what makes it structure
is that it's essentially invariant,
70
00:03:31,900 --> 00:03:34,800
and so we're saying, in the LATE theorem,
71
00:03:34,900 --> 00:03:39,100
that every instrument produces
its own causal effect,
72
00:03:39,300 --> 00:03:42,100
which is in contradiction to that
to some extent
73
00:03:42,400 --> 00:03:45,300
and so that was where the tension was.
74
00:03:45,300 --> 00:03:46,300
People didn't want to give up that idea.
75
00:03:46,300 --> 00:03:47,700
Yeah. I remember
76
00:03:48,100 --> 00:03:50,500
once people started
77
00:03:51,200 --> 00:03:56,100
arguing kind of more
vocally against that, it--
78
00:03:56,900 --> 00:04:00,700
that never really bothered me
that much. It seemed, you know,
79
00:04:01,000 --> 00:04:03,700
sort of clear that we had
a result there and it was
80
00:04:04,900 --> 00:04:08,900
somewhat controversial, but it sort of
was controversial in a good way. It was
81
00:04:09,100 --> 00:04:10,400
clear that people felt
82
00:04:10,700 --> 00:04:12,800
they had to come out
83
00:04:12,900 --> 00:04:14,100
against it because well,
84
00:04:14,100 --> 00:04:16,800
I think what we think is good
now, at the time wasn't that--
85
00:04:17,500 --> 00:04:21,700
they might not have loved it at the time. Yeah,
you know, I remember being somewhat
86
00:04:21,800 --> 00:04:26,400
more upset. There was some dinner
where someone said, "No, no, that paper,
87
00:04:26,700 --> 00:04:28,300
that paper with Josh, that was really--
88
00:04:28,800 --> 00:04:32,600
that was doing a disservice
to the profession."
89
00:04:32,600 --> 00:04:34,400
We had two reactions like that. So
90
00:04:34,800 --> 00:04:38,200
At some level, that
may be indicative of the culture
91
00:04:38,400 --> 00:04:40,000
in general in economics at the time.
92
00:04:41,400 --> 00:04:44,300
I thought about it later--
if that were to happen now,
93
00:04:44,600 --> 00:04:48,200
if I was a senior person sitting
in that conversation, I would
94
00:04:48,300 --> 00:04:52,200
have called that out, because it
really was not appropriate,
95
00:04:53,000 --> 00:04:54,200
- But it wasn't so bad.
96
00:04:54,600 --> 00:04:57,300
I think the criticism was--
- No, no. No, it wasn't,
97
00:04:57,700 --> 00:05:01,500
it wasn't completely
misguided or wrong.
98
00:05:01,800 --> 00:05:04,700
No, no. But saying-- you can
say a paper is wrong, alright?
99
00:05:05,000 --> 00:05:05,300
Yeah,
100
00:05:05,400 --> 00:05:10,200
but saying that it's a disservice to
the profession-- that's not really-- that's personal.
101
00:05:10,300 --> 00:05:14,700
Yes. And doing that, not just to me, but
in front of my senior colleagues, yeah.
102
00:05:14,900 --> 00:05:18,700
But nobody was saying the result
was wrong. And I remember also,
103
00:05:18,700 --> 00:05:20,900
some of the comments were, you know,
104
00:05:20,900 --> 00:05:24,000
thought-provoking. We had some
negative reviews, I think, on the
105
00:05:24,400 --> 00:05:26,300
average causal response paper. Yeah,
106
00:05:26,500 --> 00:05:30,600
somebody said, you know, these compliers
you can't figure out who they are.
107
00:05:31,500 --> 00:05:31,900
Right.
108
00:05:32,000 --> 00:05:33,000
See, it's one thing
109
00:05:33,000 --> 00:05:36,200
to say you're estimating the effect of
treatment on the treated or something
110
00:05:36,200 --> 00:05:38,400
like that. You can tell me who's treated,
111
00:05:38,600 --> 00:05:42,600
you know, people in the CPS, you know,
you can't tell me who's a complier.
112
00:05:42,800 --> 00:05:47,300
So that was legitimate.
That's, that's only fair.
113
00:05:47,800 --> 00:05:53,700
Yeah, that part made people
a little uneasy and uncomfortable.
114
00:05:53,800 --> 00:05:54,200
Yeah,
115
00:05:54,300 --> 00:05:56,400
but, at the same time,
116
00:05:56,900 --> 00:06:00,800
because it showed that you couldn't
really go beyond that, it was
117
00:06:01,500 --> 00:06:06,300
a very useful thing to realize.
Even on the day
118
00:06:06,500 --> 00:06:11,900
we got to the key result, I was thinking,
wow, you know, this is, this is sort of
119
00:06:12,100 --> 00:06:17,100
as good as it gets in terms of
actually having insight. But clearly
120
00:06:17,500 --> 00:06:21,700
we had to sell it. It took
quite a few years of selling. Yeah,
121
00:06:22,000 --> 00:06:24,800
and it's proven it's
proven to be quite useful.
122
00:06:25,500 --> 00:06:29,300
I don't think we understood that. It
would be so useful at the time. No,
123
00:06:30,100 --> 00:06:35,600
I did feel like early on this was
a substantial insight.
124
00:06:35,600 --> 00:06:40,400
But yeah, but I did not
think a Nobel was there. Yeah.
125
00:06:40,500 --> 00:06:42,400
I felt like we were aiming for the Nobel.
126
00:06:43,900 --> 00:06:46,300
We were very happy to get
that note in Econometrica.
127
00:06:49,900 --> 00:06:52,800
- [Isaiah] Are there factors or ways of approaching
problems that lead people to be better at
128
00:06:52,800 --> 00:06:53,100
like
129
00:06:53,200 --> 00:06:56,600
recognizing the good stuff and taking the
time to do it as opposed to dismissing it.
130
00:06:56,600 --> 00:06:57,700
Sometimes I think it's helpful.
131
00:06:57,700 --> 00:07:01,300
If you're trying to convince somebody
that you have something useful to say
132
00:07:01,900 --> 00:07:04,500
and maybe they don't, you
know, speak your language.
133
00:07:04,700 --> 00:07:09,900
You might need to learn their language.
Yes, yes. That's what we did with Don, because
134
00:07:10,300 --> 00:07:11,300
we figured out.
135
00:07:11,400 --> 00:07:12,200
how to-- Remember,
136
00:07:12,200 --> 00:07:17,600
we had a very hard time explaining
the exclusion restriction to Don. Maybe
137
00:07:17,800 --> 00:07:19,700
rightfully so, probably.
138
00:07:20,000 --> 00:07:24,600
I think Guido and I eventually figured out
that it wasn't formulated very clearly,
139
00:07:24,900 --> 00:07:25,400
you know,
140
00:07:25,400 --> 00:07:29,700
and we came up with a way to do that in
the potential outcomes framework that
141
00:07:29,700 --> 00:07:32,700
I think kind of worked
for the three of us. Yeah.
142
00:07:33,400 --> 00:07:35,800
Well, it worked for
the bigger literature, but
143
00:07:35,800 --> 00:07:39,000
I think what you're saying that is exactly
right. You kind of need to figure out
144
00:07:39,100 --> 00:07:41,400
how to not just kind of say, "Okay, well,
145
00:07:41,400 --> 00:07:43,900
I've got this language
and this works great,
146
00:07:43,900 --> 00:07:45,900
and I've got to convince someone
else to use the language.
147
00:07:45,900 --> 00:07:47,600
You could first figure out what language
148
00:07:47,700 --> 00:07:49,300
they're using, and then
149
00:07:49,600 --> 00:07:51,200
only then can you try to say,
150
00:07:51,200 --> 00:07:56,100
"Well, but here you're thinking of it this way."
But that's actually a pretty hard thing
151
00:07:56,700 --> 00:08:00,000
to do. You've got someone from a
different discipline-- convincing them
152
00:08:00,200 --> 00:08:03,300
that, kind of, two junior faculty in a
different department actually have something
153
00:08:03,300 --> 00:08:06,600
to say to you. That
takes a fair amount of effort.
154
00:08:07,500 --> 00:08:10,200
Yeah, I wrote, I wrote
Don a number of times.
155
00:08:10,300 --> 00:08:10,600
Yeah,
156
00:08:10,700 --> 00:08:14,500
Fairly long letters. And I remember
thinking this is worth doing, you know,
157
00:08:14,600 --> 00:08:17,600
that if I could convince
Don that would sort of,
158
00:08:17,800 --> 00:08:19,600
the framework, to some extent.
159
00:08:20,300 --> 00:08:25,000
I think both, both you and Don were a little
bit more confident that you were right.
160
00:08:25,000 --> 00:08:28,200
We used to argue a lot, and
you would sometimes referee.
161
00:08:29,800 --> 00:08:30,700
That was fun.
162
00:08:30,800 --> 00:08:34,100
I remember it wasn't-- it wasn't hurtful.
163
00:08:35,200 --> 00:08:39,800
I remember getting a little testy.
Once we had lunch in The Faculty Club
164
00:08:40,600 --> 00:08:44,400
and we're talking, we're talking
about the draft lottery paper. Yeah,
165
00:08:45,200 --> 00:08:47,600
talking about never-takers, the kind of
166
00:08:47,700 --> 00:08:51,000
people who wouldn't serve in
the military irrespective of
167
00:08:51,200 --> 00:08:53,700
whether they were getting drafted.
168
00:08:54,500 --> 00:08:58,700
And you or Don said something
about shooting yourself in the foot
169
00:08:59,800 --> 00:09:03,400
as a way of getting out of the military,
and that maybe the exclusion restriction
170
00:09:03,400 --> 00:09:05,300
for never-takers wasn't working.
171
00:09:06,300 --> 00:09:08,900
And then, I remember, whoever it was
said, well--
172
00:09:08,900 --> 00:09:12,100
Yes, you could do that. But why would
you want to shoot yourself in the foot?
173
00:09:13,400 --> 00:09:17,600
I usually go for
moving to Canada. Yeah. That's it.
174
00:09:17,900 --> 00:09:19,400
[inaudible]
175
00:09:19,700 --> 00:09:24,000
Yes, but I think it's tricky. I mean, it's the--
176
00:09:24,100 --> 00:09:27,900
you know, say, I get students coming
from computer science and they want to do
177
00:09:28,100 --> 00:09:29,900
things on causal inference.
178
00:09:30,400 --> 00:09:33,700
And it takes a huge amount of
effort to kind of figure out how
179
00:09:33,700 --> 00:09:36,800
they're actually thinking about the problem,
whether there's something there.
180
00:09:37,000 --> 00:09:38,500
And so, now over the years,
181
00:09:38,500 --> 00:09:40,600
I've got a little more
appreciation for the fact that Don
182
00:09:40,800 --> 00:09:42,100
was actually willing to kind of--
183
00:09:42,200 --> 00:09:46,000
Yeah, it took a while, but he did
engage kind of first with Josh.
184
00:09:46,400 --> 00:09:47,500
And then with both of us.
185
00:09:47,700 --> 00:09:51,000
And rather than kind of dismissing
it, saying, "Okay, well, you know,
186
00:09:51,500 --> 00:09:55,700
I can't figure out what these guys are
doing and it's probably just not really that
187
00:09:55,800 --> 00:09:56,700
that interesting
188
00:09:57,200 --> 00:10:00,300
everybody always wants to
figure out quickly, you know,
189
00:10:00,300 --> 00:10:04,300
you want to save time and you want to save
your brain cells for other things. So,
190
00:10:04,700 --> 00:10:05,000
you know,
191
00:10:05,000 --> 00:10:08,900
the fastest route to that is to figure
out why you should dismiss something. Yes.
192
00:10:09,000 --> 00:10:11,300
Yes. I don't need to spend time on this.
193
00:10:12,700 --> 00:10:15,900
If you'd like to watch more
Nobel Conversations, click here,
194
00:10:16,300 --> 00:10:18,600
or if you'd like to learn
more about econometrics,
195
00:10:18,700 --> 00:10:21,300
check out Josh's mastering
econometrics series.
196
00:10:21,800 --> 00:10:24,700
If you'd like to learn more
about Guido, Josh, and Isaiah,
197
00:10:24,900 --> 00:10:26,400
check out the links in the description.