1
00:00:00,900 --> 00:00:01,900
♪ [music] ♪
2
00:00:03,800 --> 00:00:05,700
- [Narrator] Welcome to
Nobel Conversations.
3
00:00:07,300 --> 00:00:10,240
In this episode, Josh Angrist
and Guido Imbens,
4
00:00:10,240 --> 00:00:12,000
sit down with Isaiah Andrews
5
00:00:12,000 --> 00:00:14,303
to discuss how the research
was initially received
6
00:00:14,900 --> 00:00:17,736
and how they responded
to criticism.
7
00:00:18,700 --> 00:00:19,300
- [Isaiah] At the time, did you feel like
you were on to something,
8
00:00:20,400 --> 00:00:24,000
that this was the beginning
of a whole line of work
9
00:00:24,000 --> 00:00:27,400
that was going
to be important or...?
10
00:00:27,600 --> 00:00:30,100
Not so much that it was
a whole line of work,
11
00:00:30,100 --> 00:00:32,600
but certainly I felt like,
"Wow, this--"
12
00:00:32,600 --> 00:00:34,700
We proved something
that people didn't know before,
13
00:00:34,700 --> 00:00:39,000
and that was worth knowing.
14
00:00:39,000 --> 00:00:40,000
Yeah, going back, compared to
my job market papers--
15
00:00:41,600 --> 00:00:45,900
I felt this was actually
a very clear, crisp result.
16
00:00:46,400 --> 00:00:48,400
But there definitely
was a mixed reception
17
00:00:48,900 --> 00:00:53,200
and I don't think anybody said that,
18
00:00:53,300 --> 00:00:56,900
"Oh, wow, this is already,
something."
19
00:00:57,100 --> 00:00:59,600
No, which is the nightmare scenario
for a researcher
20
00:01:00,300 --> 00:01:03,200
where you think you've
discovered something
21
00:01:03,300 --> 00:01:04,800
and then somebody else says,
"Oh, I knew that."
22
00:01:05,000 --> 00:01:08,000
But there definitely was
a need to convince people
23
00:01:08,000 --> 00:01:09,000
that this was worth knowing,
that instrumental variables
24
00:01:09,000 --> 00:01:12,800
estimates a causal effect
for compliers.
25
00:01:13,200 --> 00:01:18,000
Yeah, but even though it
took a long time
26
00:01:18,600 --> 00:01:20,400
to convince a bigger audience,
27
00:01:20,700 --> 00:01:24,600
in some cases even fairly quickly,
the reception was pretty good
28
00:01:24,800 --> 00:01:27,000
among a small group of people.
29
00:01:27,200 --> 00:01:31,500
Gary clearly liked it a lot
from the beginning
30
00:01:31,800 --> 00:01:34,600
and I remember, because at that point
Josh had left for Israel,
31
00:01:34,700 --> 00:01:37,400
but I remember explaining it
to Don Rubin
32
00:01:37,600 --> 00:01:43,700
and he was like,
"Yeah, there really is something here."
33
00:01:43,800 --> 00:01:47,200
Not right away though,
Don took some convincing.
34
00:01:47,500 --> 00:01:48,400
By the time you got to Don,
35
00:01:48,500 --> 00:01:51,500
there had been some back
and forth with him
36
00:01:51,800 --> 00:01:53,500
in correspondence actually.
37
00:01:53,700 --> 00:01:56,700
But I remember at some point
getting a call or email from him
38
00:01:56,700 --> 00:02:02,300
saying that he was sitting
at the airport in Rome
39
00:02:02,500 --> 00:02:03,700
and looking at the paper
and thinking,
40
00:02:03,700 --> 00:02:07,000
"Yeah, no actually,
you guys are onto something."
41
00:02:07,000 --> 00:02:09,800
We were happy about that--
42
00:02:09,800 --> 00:02:10,800
but that took longer
than I think you remember.
43
00:02:10,800 --> 00:02:12,500
Yeah, it wasn't right away
44
00:02:12,600 --> 00:02:13,600
[laughter]
45
00:02:13,700 --> 00:02:16,500
because I know that I was back
in Israel by the time that happened.
46
00:02:16,500 --> 00:02:18,300
I'd left for Israel in the summer--
47
00:02:18,400 --> 00:02:22,300
I was only at Harvard for two years.
We had that one year.
48
00:02:22,600 --> 00:02:25,700
It is remarkable, I mean, that
one year was so fateful for us.
49
00:02:25,900 --> 00:02:27,200
- [Guido] Yes.
50
00:02:28,500 --> 00:02:30,200
I think we understood there was
something good happening,
51
00:02:30,200 --> 00:02:34,000
but maybe we didn't think it was
life-changing, only in retrospect.
52
00:02:34,400 --> 00:02:35,400
♪ [music] ♪
53
00:02:35,800 --> 00:02:38,300
- [Isaiah] As you said, it sounds like
a small group of people
54
00:02:38,300 --> 00:02:39,300
were initially quite receptive,
55
00:02:39,300 --> 00:02:41,000
but it perhaps took some time for
a broader group of people
56
00:02:44,400 --> 00:02:46,700
to come around to seeing
the LATE framework
57
00:02:46,700 --> 00:02:47,700
as a valuable way to look
at the world.
58
00:02:47,700 --> 00:02:50,000
I guess, over the course of that,
59
00:02:50,100 --> 00:02:52,200
were there periods
where you thought,
60
00:02:52,300 --> 00:02:53,200
maybe the people saying
this wasn't a useful way
61
00:02:53,300 --> 00:02:55,800
to look at the world were right?
62
00:02:55,800 --> 00:02:58,400
Did you get discouraged?
How did you think about it?
63
00:02:58,400 --> 00:03:00,900
I don't think I was discouraged
but the people who were saying
64
00:03:00,900 --> 00:03:03,900
that were smart people,
well-informed econometricians,
65
00:03:05,000 --> 00:03:08,000
sophisticated readers
66
00:03:08,900 --> 00:03:11,800
and I think the substance
of the comment was,
67
00:03:11,800 --> 00:03:15,600
this is not what econometrics
is about.
68
00:03:16,300 --> 00:03:20,700
Econometrics, as it was being transmitted
at that time, was about structure.
69
00:03:21,300 --> 00:03:24,700
There was this idea that
there's structure in the economy
70
00:03:25,100 --> 00:03:27,100
and it's our job to discover it
71
00:03:27,200 --> 00:03:31,200
and what makes it structure
is that it's essentially invariant
72
00:03:31,900 --> 00:03:34,800
and so we're saying,
in the LATE theorem,
73
00:03:34,900 --> 00:03:39,100
that every instrument produces
its own causal effect,
74
00:03:39,300 --> 00:03:42,100
which is in contradiction to that
to some extent
75
00:03:42,400 --> 00:03:45,300
and so that was where the tension was.
76
00:03:45,300 --> 00:03:46,300
People didn't want
to give up that idea.
77
00:03:46,300 --> 00:03:47,700
Yeah, I remember once
people started
78
00:03:51,200 --> 00:03:56,100
arguing more vocally
against that,
79
00:03:56,900 --> 00:04:00,700
that never really
bothered me that much.
80
00:04:01,000 --> 00:04:03,700
It seemed clear that
we had a result there
81
00:04:04,900 --> 00:04:08,100
and it was somewhat
controversial,
82
00:04:08,100 --> 00:04:09,100
but controversial in a good way.
83
00:04:09,100 --> 00:04:10,400
It was clear that people felt
84
00:04:10,700 --> 00:04:12,800
they had to come out against it because--
85
00:04:14,100 --> 00:04:17,500
Well, I think what
we think is good now
86
00:04:17,500 --> 00:04:20,800
we might not have loved
at the time.
87
00:04:20,800 --> 00:04:21,800
I remember being
somewhat more upset--
88
00:04:21,800 --> 00:04:26,400
there was some dinner
where someone said,
89
00:04:26,700 --> 00:04:28,300
"No, no, that paper with Josh,
90
00:04:28,800 --> 00:04:32,600
that was doing a disservice
to the profession."
91
00:04:32,600 --> 00:04:34,400
We definitely had
reactions like that.
92
00:04:34,800 --> 00:04:38,200
At some level, that may be
indicative of the culture
93
00:04:38,400 --> 00:04:40,000
in general in economics
at the time.
94
00:04:41,400 --> 00:04:44,300
I thought back later,
what if that had happened now?
95
00:04:44,600 --> 00:04:48,200
if I was a senior person sitting
in that conversation,
96
00:04:48,300 --> 00:04:52,200
I would call that out because it
really was not appropriate--
97
00:04:53,000 --> 00:04:54,200
- [Josh] But it wasn't so bad.
98
00:04:54,600 --> 00:04:57,300
I think the criticism is--
99
00:04:57,700 --> 00:05:01,500
It wasn't completely misguided,
it was maybe wrong.
100
00:05:01,800 --> 00:05:04,700
No, no, but you can say
the paper is wrong
101
00:05:05,400 --> 00:05:07,300
but saying that
102
00:05:07,300 --> 00:05:08,300
it's a disservice
to the profession,
103
00:05:08,300 --> 00:05:09,300
that's not really--
104
00:05:09,300 --> 00:05:10,300
Personal.
105
00:05:10,300 --> 00:05:13,900
Yes, and doing that, not to me,
106
00:05:13,900 --> 00:05:14,900
but in front of
my senior colleagues.
107
00:05:14,900 --> 00:05:17,700
But nobody was saying
the result was wrong
108
00:05:17,700 --> 00:05:18,700
and I remember also,
109
00:05:18,700 --> 00:05:20,900
some of the comments
were thought-provoking
110
00:05:20,900 --> 00:05:24,000
so we had some negative reviews,
111
00:05:24,400 --> 00:05:26,300
I think on the average
causal response paper.
112
00:05:26,500 --> 00:05:30,600
Somebody said, "These compliers
you can't figure out who they are."
113
00:05:31,500 --> 00:05:31,900
Right.
114
00:05:32,000 --> 00:05:33,000
It's one thing to say
115
00:05:33,000 --> 00:05:35,200
you're estimating
the effect of treatment
116
00:05:35,200 --> 00:05:36,200
on the treated
or something like that.
117
00:05:36,200 --> 00:05:38,400
You can tell me who's treated,
118
00:05:38,600 --> 00:05:42,600
people in the CPS,
you can't tell me who's a complier.
119
00:05:42,800 --> 00:05:46,800
So that was a legitimate challenge.
120
00:05:46,800 --> 00:05:47,800
That's certainly fair
and I can see why
121
00:05:47,800 --> 00:05:53,700
that part made people
a little uneasy and uncomfortable.
122
00:05:53,800 --> 00:05:54,200
Yeah.
123
00:05:54,300 --> 00:05:56,400
But at the same time,
124
00:05:56,900 --> 00:06:00,800
because it showed that you couldn't
really go beyond that,
125
00:06:01,500 --> 00:06:05,500
it was a very useful thing
to realize.
126
00:06:05,500 --> 00:06:06,500
I remember that on the day
127
00:06:06,500 --> 00:06:11,100
we got to the key result,
I was thinking,
128
00:06:11,100 --> 00:06:12,100
"Wow, this is as good as it gets.
129
00:06:12,100 --> 00:06:16,500
Here we actually have an insight
but clearly--"
130
00:06:17,500 --> 00:06:21,000
And we had to sell it.
131
00:06:21,000 --> 00:06:22,000
For quite a few years,
we had to sell it
132
00:06:22,000 --> 00:06:24,800
and it's proven to be quite useful.
133
00:06:25,500 --> 00:06:29,300
I don't think we understood that it
would be so useful at the time.
134
00:06:30,100 --> 00:06:34,600
No, I did feel early on this was
a substantial insight.
135
00:06:34,600 --> 00:06:35,600
- [Josh] Yeah, we [learned] something.
136
00:06:35,600 --> 00:06:40,400
But I did not think
of goals like that.
137
00:06:40,500 --> 00:06:42,400
I don't think we were aiming
for the Nobel.
138
00:06:42,650 --> 00:06:43,650
[laughter]
139
00:06:43,900 --> 00:06:46,300
We were very happy to get
that note in Econometrica.
140
00:06:47,600 --> 00:06:48,600
♪ [music] ♪
141
00:06:49,900 --> 00:06:52,800
- [Isaiah] Are there factors
or ways of approaching problems
142
00:06:53,200 --> 00:06:55,600
that lead people to be better
at recognizing the good stuff
143
00:06:55,600 --> 00:06:56,600
and taking the time to do it
as opposed to dismissing it?
144
00:06:56,600 --> 00:06:57,700
- [Josh] Sometimes
I think it's helpful.
145
00:06:57,700 --> 00:07:00,900
If you're trying to
convince somebody
146
00:07:00,900 --> 00:07:01,900
that you have something
useful to say
147
00:07:01,900 --> 00:07:04,500
and maybe they don't
speak your language,
148
00:07:04,700 --> 00:07:09,900
you might need
to learn their language.
149
00:07:10,300 --> 00:07:10,850
Yes, yes, exactly.
150
00:07:10,850 --> 00:07:11,400
That's what we did with Don,
we figured out how to--
151
00:07:12,200 --> 00:07:16,800
I remember we had a very hard time
152
00:07:16,800 --> 00:07:17,800
explaining the exclusion
restriction to Don,
153
00:07:17,800 --> 00:07:19,700
maybe rightfully so,
154
00:07:20,000 --> 00:07:24,400
I think Guido and I
eventually figured out
155
00:07:24,400 --> 00:07:25,400
that it wasn't formulated
very clearly,
156
00:07:25,400 --> 00:07:28,700
and we came up
with a way to do that
157
00:07:28,700 --> 00:07:29,700
in the potential outcomes framework
158
00:07:29,700 --> 00:07:32,700
that I think worked
for the three of us.
159
00:07:33,400 --> 00:07:35,800
Yeah, well, it worked for
the bigger literature
160
00:07:35,800 --> 00:07:38,100
but I think what you're saying
there is exactly right,
161
00:07:38,100 --> 00:07:39,100
you need to figure out
how to not just say,
162
00:07:41,400 --> 00:07:43,900
"Okay well, I've got this language
and this works great
163
00:07:43,900 --> 00:07:45,900
and I've got to convince someone
else to use the language."
164
00:07:45,900 --> 00:07:47,600
You could first figure out
what language they're using
165
00:07:49,600 --> 00:07:51,200
and only then
can you try to say,
166
00:07:51,200 --> 00:07:55,700
"Wow, but here you thinking of it
this way,"
167
00:07:55,700 --> 00:07:56,700
but that's actually
a pretty hard thing to do,
168
00:07:56,700 --> 00:08:00,000
getting someone from
a different discipline,
169
00:08:00,200 --> 00:08:02,300
convincing them that two junior faculty
in a different department
170
00:08:02,300 --> 00:08:03,300
actually have something
to say to you,
171
00:08:03,300 --> 00:08:06,600
that takes
a fair amount of effort.
172
00:08:07,500 --> 00:08:10,200
Yeah, I wrote Don a number of times,
in fairly long letters.
173
00:08:10,700 --> 00:08:14,500
I remember thinking
this is worth doing,
174
00:08:14,600 --> 00:08:17,600
that if I could convince Don
175
00:08:18,450 --> 00:08:19,450
that would validate the framework
to some extent.
176
00:08:20,300 --> 00:08:24,000
I think both you and Don were
177
00:08:24,000 --> 00:08:25,000
a little bit more confident
that you were right.
178
00:08:25,000 --> 00:08:27,500
Well, we used to argue a lot
179
00:08:27,500 --> 00:08:28,500
and you would sometimes
referee those.
180
00:08:28,500 --> 00:08:29,500
[laughter]
181
00:08:29,800 --> 00:08:30,700
That was fun.
182
00:08:30,800 --> 00:08:34,100
It wasn't hurtful.
183
00:08:35,200 --> 00:08:39,600
I remember getting
a little testy once,
184
00:08:39,600 --> 00:08:40,600
we had lunch in The Faculty Club
185
00:08:40,600 --> 00:08:44,400
and we were talking about
the draft lottery paper.
186
00:08:45,200 --> 00:08:47,600
We were talking about never-takers
187
00:08:47,700 --> 00:08:51,000
as people who would serve
in the military irrespective of
188
00:08:51,200 --> 00:08:53,700
whether they were getting drafted
189
00:08:54,500 --> 00:08:58,700
and you and Don said something
about shooting yourself in the foot,
190
00:08:59,800 --> 00:09:02,400
as a way of getting out of the military
191
00:09:02,400 --> 00:09:03,400
and that maybe
the exclusion restriction
192
00:09:03,400 --> 00:09:05,300
for never-takers wasn't working
193
00:09:06,300 --> 00:09:08,900
and then the other one would say,
194
00:09:08,900 --> 00:09:11,250
"Well, yes you could do that
195
00:09:11,250 --> 00:09:12,250
but why would you want
to shoot yourself in the foot?"
196
00:09:12,250 --> 00:09:12,825
[laughter]
197
00:09:12,825 --> 00:09:13,400
It got a little [out of hand] there.
198
00:09:13,400 --> 00:09:17,600
I usually go for moving to Canada,
as my example,
199
00:09:18,150 --> 00:09:19,150
when I'm teaching that.
200
00:09:19,700 --> 00:09:24,000
But I think it's tricky,
201
00:09:24,100 --> 00:09:27,900
I get students coming
from computer science
202
00:09:28,100 --> 00:09:29,900
and they want to do things
on causal inference
203
00:09:30,400 --> 00:09:33,700
and it takes a huge amount
of effort to figure out
204
00:09:33,700 --> 00:09:36,000
how they're actually thinking
about the problem
205
00:09:36,000 --> 00:09:37,000
and whether there's something there
206
00:09:37,000 --> 00:09:38,500
and so now, over the years,
207
00:09:38,500 --> 00:09:40,600
I've got a little more appreciation
for the fact
208
00:09:40,800 --> 00:09:42,100
that Don was actually willing to--
209
00:09:42,200 --> 00:09:46,000
It took him a while,
but he did engage first with Josh
210
00:09:46,400 --> 00:09:47,500
and then with both of us
211
00:09:47,700 --> 00:09:51,000
and rather than dismissing it
and saying,
212
00:09:51,500 --> 00:09:54,800
"Well, okay I can't figure out
what these guys are doing
213
00:09:54,800 --> 00:09:55,800
and it's probably just
not really interesting."
214
00:09:57,200 --> 00:10:00,300
Everybody always wants
to figure things out quickly,
215
00:10:00,300 --> 00:10:04,000
you want to save time
216
00:10:04,000 --> 00:10:05,000
and you want to save your brain cells
for other things.
217
00:10:05,000 --> 00:10:07,000
The fastest route to
that is to figure out
218
00:10:07,000 --> 00:10:08,000
why you should dismiss something.
219
00:10:08,000 --> 00:10:09,000
Yes.
220
00:10:09,000 --> 00:10:11,300
"I don't need to spend time on this."
221
00:10:11,500 --> 00:10:12,500
♪ [music] ♪
222
00:10:12,700 --> 00:10:15,300
- [Narrator] If you'd like
to watch more
223
00:10:15,300 --> 00:10:16,300
Nobel Conversations, click here,
224
00:10:16,300 --> 00:10:18,600
or if you'd like to learn
more about econometrics,
225
00:10:18,700 --> 00:10:21,300
check out Josh's "Mastering
Econometrics" series.
226
00:10:21,800 --> 00:10:24,700
If you'd like to learn more
about Guido, Josh, and Isaiah
227
00:10:24,900 --> 00:10:26,400
check out the links
in the description.
228
00:10:26,702 --> 00:10:28,017
♪ [music] ♪