1
00:00:00,130 --> 00:00:02,200
♪ [music] ♪
2
00:00:03,645 --> 00:00:05,600
- [Narrator] Welcome
to Nobel Conversations.
3
00:00:07,250 --> 00:00:10,240
In this episode, Josh Angrist
and Guido Imbens,
4
00:00:10,240 --> 00:00:11,920
sit down with Isaiah Andrews
5
00:00:11,920 --> 00:00:14,303
to discuss how their research
was initially received
6
00:00:14,999 --> 00:00:17,090
and how they responded
to criticism.
7
00:00:18,700 --> 00:00:20,380
- [Isaiah] At the time,
did you feel like
8
00:00:20,380 --> 00:00:21,627
you were on to something,
9
00:00:21,627 --> 00:00:25,152
you felt this was the beginning
of a whole line of work
10
00:00:25,152 --> 00:00:27,202
that you felt like was going
to be important or...?
11
00:00:27,600 --> 00:00:30,000
- [Guido] Not so much
that it was a whole line of work,
12
00:00:30,000 --> 00:00:31,894
but certainly I felt like,
"Wow, this--"
13
00:00:32,277 --> 00:00:35,045
- [Josh] We've proved something
we didn't know before,
14
00:00:35,045 --> 00:00:36,114
and that it was worth knowing.
15
00:00:36,114 --> 00:00:38,033
- Yeah, going back to the...
16
00:00:38,741 --> 00:00:41,080
compared to my job market
paper or something--
17
00:00:41,080 --> 00:00:45,560
No, I felt this was actually
a very clear, crisp result.
18
00:00:46,400 --> 00:00:49,530
- But there was definitely
a mixed reception
19
00:00:49,530 --> 00:00:52,420
and I don't think anybody
said that,
20
00:00:52,420 --> 00:00:55,487
"Oh, well, this is
already known,"
21
00:00:56,043 --> 00:00:59,386
which is the nightmare scenario
for a researcher
22
00:01:00,230 --> 00:01:02,003
where you think
you've discovered something
23
00:01:02,003 --> 00:01:04,461
and then somebody else says,
"Oh, I knew that."
24
00:01:05,000 --> 00:01:07,220
But there definitely was
a need to convince people
25
00:01:07,220 --> 00:01:10,370
that this was worth knowing,
that instrumental variables
26
00:01:10,370 --> 00:01:12,687
estimates a causal effect
for compliers.
27
00:01:13,200 --> 00:01:16,178
- Yeah, but even though
it took a long time
28
00:01:16,178 --> 00:01:19,348
to convince a bigger audience,
29
00:01:19,820 --> 00:01:24,346
fairly quickly,
the reception was pretty good
30
00:01:24,800 --> 00:01:26,645
among a small group of people.
31
00:01:27,200 --> 00:01:31,297
Gary clearly liked it a lot
from the beginning,
32
00:01:31,800 --> 00:01:33,289
and I remember...
33
00:01:33,289 --> 00:01:35,645
because at that point
Josh had left for Israel,
34
00:01:35,645 --> 00:01:38,886
but I remember explaining it
to Don Rubin,
35
00:01:39,696 --> 00:01:43,700
and he was like, "You know,
this really is something here."
36
00:01:43,700 --> 00:01:45,197
- Not right away though.
37
00:01:45,932 --> 00:01:47,173
Don took some convincing.
38
00:01:47,500 --> 00:01:49,150
By the time you got to Don,
39
00:01:49,150 --> 00:01:51,226
there had been
some back-and-forth with him
40
00:01:51,226 --> 00:01:53,304
in correspondence, actually.
41
00:01:53,700 --> 00:01:57,103
- But I remember at some point
getting a call or email from him
42
00:01:57,103 --> 00:02:00,020
saying that he was sitting
at the airport in Rome
43
00:02:00,020 --> 00:02:03,700
and looking at the paper
and thinking,
44
00:02:03,700 --> 00:02:07,000
"Yeah, no actually,
you guys are onto something."
45
00:02:07,490 --> 00:02:08,594
- We were happy about that.
46
00:02:08,594 --> 00:02:10,550
But that took longer
than I think you remember.
47
00:02:11,030 --> 00:02:12,500
It wasn't right away.
48
00:02:12,500 --> 00:02:13,700
[laughter]
49
00:02:13,700 --> 00:02:15,325
Because I know
that I was back in Israel
50
00:02:15,325 --> 00:02:16,627
by the time that happened.
51
00:02:16,627 --> 00:02:18,750
I'd left for Israel
in the summer of--
52
00:02:19,390 --> 00:02:21,190
I was only at Harvard
for two years.
53
00:02:21,190 --> 00:02:22,540
We had that one year.
54
00:02:22,540 --> 00:02:25,700
It is remarkable, I mean, that
one year was so fateful for us.
55
00:02:25,900 --> 00:02:27,200
- [Guido] Yes.
56
00:02:27,690 --> 00:02:30,200
I think we understood there was
something good happening,
57
00:02:30,200 --> 00:02:33,700
but maybe we didn't think it was
life-changing, only in retrospect.
58
00:02:33,700 --> 00:02:35,620
♪ [music] ♪
59
00:02:35,620 --> 00:02:37,495
- [Isaiah] As you said, it sounds
like a small group of people
60
00:02:37,495 --> 00:02:39,190
were initially quite receptive,
61
00:02:39,190 --> 00:02:42,190
and it perhaps took some time
for a broader group of people
62
00:02:43,090 --> 00:02:45,912
to come around to seeing
the LATE framework
63
00:02:45,912 --> 00:02:47,620
as a valuable way to look
at the world.
64
00:02:47,620 --> 00:02:50,100
I guess, over
the course of that,
65
00:02:50,100 --> 00:02:52,128
were there periods
where you thought,
66
00:02:52,128 --> 00:02:54,450
maybe the people saying
this wasn't a useful way
67
00:02:54,450 --> 00:02:55,751
to look at the world were right?
68
00:02:55,751 --> 00:02:58,360
Did you get discouraged?
How did you think about it?
69
00:02:58,360 --> 00:02:59,755
- I don't think I was discouraged,
70
00:02:59,755 --> 00:03:02,271
but the people who were saying
that were smart people,
71
00:03:02,784 --> 00:03:06,117
well-informed econometricians,
72
00:03:06,117 --> 00:03:07,800
sophisticated readers,
73
00:03:08,900 --> 00:03:11,324
and I think the substance
of the comment
74
00:03:11,324 --> 00:03:14,297
was that this is not what
econometrics is about.
75
00:03:14,297 --> 00:03:20,572
Econometrics, as it was being transmitted
at that time, was about structure.
76
00:03:21,300 --> 00:03:24,490
There was this idea that
there's structure in the economy,
77
00:03:25,100 --> 00:03:27,200
and it's our job to discover it,
78
00:03:27,200 --> 00:03:30,952
and what makes it structure
is that it's essentially invariant.
79
00:03:32,570 --> 00:03:34,900
And so we're saying,
in the LATE theorem,
80
00:03:34,900 --> 00:03:37,699
that every instrument produces
its own causal effect,
81
00:03:37,699 --> 00:03:41,386
which is in contradiction to that
to some extent,
82
00:03:41,386 --> 00:03:43,640
and so that was
where the tension was.
83
00:03:43,640 --> 00:03:45,551
People didn't want
to give up that idea.
84
00:03:46,300 --> 00:03:50,369
- Yeah, I remember once
people started
85
00:03:51,200 --> 00:03:55,664
arguing more vocally against that,
86
00:03:56,900 --> 00:03:59,483
that never really
bothered me that much.
87
00:03:59,483 --> 00:04:03,051
It seemed clear that
we had a result there,
88
00:04:03,051 --> 00:04:05,878
and it became somewhat
controversial,
89
00:04:05,878 --> 00:04:08,395
but controversial in a good way.
90
00:04:08,620 --> 00:04:10,190
It was clear that people felt
91
00:04:10,820 --> 00:04:13,835
they had to come out
against it because--
92
00:04:13,970 --> 00:04:15,649
- Well, I think we think
it's good now.
93
00:04:17,426 --> 00:04:19,238
We might not have loved it
at the time.
94
00:04:20,168 --> 00:04:22,984
I remember being
somewhat more upset--
95
00:04:22,984 --> 00:04:24,780
there was some dinner
where someone said,
96
00:04:24,780 --> 00:04:27,455
"No, no, no,
that paper with Josh --
97
00:04:28,855 --> 00:04:30,749
that was doing a disservice
to the profession."
98
00:04:32,050 --> 00:04:33,850
- We definitely had
reactions like that.
99
00:04:35,410 --> 00:04:38,200
- At some level, that may be
indicative of the culture
100
00:04:38,400 --> 00:04:40,000
in general in economics
at the time.
101
00:04:41,400 --> 00:04:44,097
I thought back later,
what if that happened now?
102
00:04:44,600 --> 00:04:47,682
If I was a senior person
sitting in that conversation,
103
00:04:48,200 --> 00:04:51,898
I would call that out because
it really was not appropriate--
104
00:04:53,000 --> 00:04:54,200
- [Josh] It wasn't so bad.
105
00:04:54,600 --> 00:04:56,600
I think the criticism is...
106
00:04:57,700 --> 00:04:59,298
It wasn't completely misguided.
107
00:05:00,070 --> 00:05:01,351
It was maybe wrong.
108
00:05:01,800 --> 00:05:04,485
No, no, but you can say
that paper is wrong,
109
00:05:05,280 --> 00:05:06,440
but saying that
110
00:05:06,440 --> 00:05:08,128
it's a disservice
to the profession --
111
00:05:08,128 --> 00:05:10,300
- that's not really--
- [Isaiah] It's a bit personal.
112
00:05:10,300 --> 00:05:12,646
- Yes, and doing that not just to me
113
00:05:12,646 --> 00:05:14,442
but in front of
my senior colleagues.
114
00:05:15,191 --> 00:05:17,369
- But nobody was saying
the result was wrong,
115
00:05:17,369 --> 00:05:18,700
and I remember also,
116
00:05:18,700 --> 00:05:21,579
some of the comments
were thought-provoking.
117
00:05:21,579 --> 00:05:23,059
So we had some negative reviews,
118
00:05:23,059 --> 00:05:25,861
I think, on the average
causal response paper.
119
00:05:26,500 --> 00:05:30,361
Somebody said, "These compliers,
you can't figure out who they are."
120
00:05:31,967 --> 00:05:33,891
It's one thing to say
you're estimating
121
00:05:33,891 --> 00:05:35,678
the effect of treatment
on the treated
122
00:05:35,678 --> 00:05:36,840
or something like that.
123
00:05:36,840 --> 00:05:38,400
You can tell me who's treated,
124
00:05:38,700 --> 00:05:42,289
the people in the CPS,
but you can't tell me who's a complier.
125
00:05:42,929 --> 00:05:44,679
So that was a legitimate challenge.
126
00:05:44,679 --> 00:05:47,800
- That's certainly fair,
and I can see why
127
00:05:49,880 --> 00:05:53,502
that part made people
a little uneasy and uncomfortable.
128
00:05:54,300 --> 00:05:56,400
But at the same time,
129
00:05:56,900 --> 00:06:00,244
because it showed that you couldn't
really go beyond that,
130
00:06:00,800 --> 00:06:03,775
it was a very useful thing
to realize.
131
00:06:04,630 --> 00:06:06,180
I remember, on the day
132
00:06:06,500 --> 00:06:09,771
we got to the key result,
I was thinking,
133
00:06:09,771 --> 00:06:13,113
"Wow, this is as good as it gets.
134
00:06:14,221 --> 00:06:16,978
Here we actually have an insight
but clearly--"
135
00:06:17,500 --> 00:06:19,250
And we had to sell it
at some point.
136
00:06:19,480 --> 00:06:21,261
For quite a few years,
we had to sell it
137
00:06:23,480 --> 00:06:24,892
and it's proven to be quite useful.
138
00:06:25,500 --> 00:06:28,761
I don't think we understood that
it would be so useful at the time.
139
00:06:28,761 --> 00:06:29,871
No.
140
00:06:30,170 --> 00:06:34,600
I did feel early on this was
a substantial insight.
141
00:06:34,600 --> 00:06:36,440
- [Josh] Yeah, we [did] something.
142
00:06:36,440 --> 00:06:40,041
But I did not think
those goals were there.
143
00:06:40,700 --> 00:06:42,600
I don't think we were aiming
for the Nobel.
144
00:06:42,600 --> 00:06:43,730
[laughter]
145
00:06:43,730 --> 00:06:46,243
We were very happy to get
that note in Econometrica.
146
00:06:46,859 --> 00:06:48,829
♪ [music] ♪
147
00:06:49,770 --> 00:06:51,500
- [Isaiah] Are there factors
or ways of approaching problems
148
00:06:51,500 --> 00:06:54,186
that lead people to be better
at recognizing the good stuff
149
00:06:54,186 --> 00:06:56,600
and taking the time to do it
as opposed to dismissing it?
150
00:06:56,600 --> 00:06:57,830
- [Josh] Sometimes
I think it's helpful.
151
00:06:57,830 --> 00:06:59,478
If you're trying to
convince somebody
152
00:06:59,478 --> 00:07:01,247
that you have something
useful to say
153
00:07:01,900 --> 00:07:04,176
and maybe they don't
speak your language,
154
00:07:04,894 --> 00:07:06,541
you might need
to learn their language.
155
00:07:06,761 --> 00:07:07,910
- Yes, yes, exactly.
156
00:07:07,910 --> 00:07:11,736
That's what we did with Don,
we figured out how to--
157
00:07:11,736 --> 00:07:14,052
I remember we had a very hard time
158
00:07:14,052 --> 00:07:15,816
explaining the exclusion restriction
to Don,
159
00:07:17,430 --> 00:07:18,993
maybe rightfully so,
160
00:07:19,804 --> 00:07:21,948
I think Guido and I
eventually figured out
161
00:07:21,948 --> 00:07:24,420
that it wasn't formulated
very clearly,
162
00:07:25,400 --> 00:07:27,450
and we came up
with a way to do that
163
00:07:27,450 --> 00:07:29,316
in the potential outcomes framework
164
00:07:29,316 --> 00:07:32,218
that I think worked
for the three of us.
165
00:07:32,218 --> 00:07:33,419
- [Guido] Yeah.
166
00:07:33,419 --> 00:07:35,454
Well, it worked for
the bigger literature
167
00:07:35,454 --> 00:07:37,639
but I think what you're saying
there is exactly right,
168
00:07:37,639 --> 00:07:40,860
you need to figure out
how to not just say,
169
00:07:40,860 --> 00:07:43,894
"Okay well, I've got this language
and this works great
170
00:07:43,894 --> 00:07:45,900
and I've got to convince someone
else to use the language."
171
00:07:45,900 --> 00:07:48,188
You could first figure out
what language they're using
172
00:07:48,680 --> 00:07:51,028
and only then
can you try to say,
173
00:07:51,028 --> 00:07:53,140
"Well, but here you're thinking of it
this way,"
174
00:07:53,140 --> 00:07:56,880
but that's actually
a pretty hard thing to do,
175
00:07:56,880 --> 00:07:59,098
you get someone from
a different discipline,
176
00:07:59,098 --> 00:08:02,300
convincing them that two junior faculty
in a different department
177
00:08:02,300 --> 00:08:04,366
actually have something
to say to you,
178
00:08:04,596 --> 00:08:06,516
that takes a fair amount of effort.
179
00:08:07,500 --> 00:08:09,782
Yeah, I wrote Don
a number of times,
180
00:08:10,420 --> 00:08:11,868
in fairly long letters.
181
00:08:11,868 --> 00:08:13,805
I remember thinking
this is worth doing,
182
00:08:14,600 --> 00:08:16,006
that if I could convince Don
183
00:08:16,780 --> 00:08:19,444
that would validate the framework
to some extent.
184
00:08:20,300 --> 00:08:22,924
I think both you and Don were
185
00:08:22,924 --> 00:08:25,000
a little bit more confident
that you were right.
186
00:08:25,000 --> 00:08:26,438
Well, we used to argue a lot
187
00:08:26,438 --> 00:08:28,320
and you would sometimes
referee those.
188
00:08:28,320 --> 00:08:29,500
[laughter]
189
00:08:29,800 --> 00:08:30,800
That was fun.
190
00:08:32,760 --> 00:08:34,125
It wasn't hurtful.
191
00:08:35,200 --> 00:08:37,492
I remember it getting
a little testy once,
192
00:08:37,935 --> 00:08:39,606
we had lunch in The Faculty Club
193
00:08:40,600 --> 00:08:44,077
and we're talking about
the draft lottery paper.
194
00:08:44,930 --> 00:08:47,430
We were talking about "never-takers"
195
00:08:47,430 --> 00:08:51,000
[as people who wouldn't serve]
in the military irrespective of
196
00:08:51,000 --> 00:08:53,500
whether they were getting drafted
197
00:08:54,500 --> 00:08:58,800
and you or Don said something
about shooting yourself in the foot,
198
00:08:58,800 --> 00:08:59,800
[laughter]
199
00:08:59,800 --> 00:09:01,530
as a way of getting
out of the military
200
00:09:01,530 --> 00:09:03,230
and that maybe
the exclusion restriction
201
00:09:03,230 --> 00:09:05,223
for never-takers wasn't working
202
00:09:06,300 --> 00:09:08,520
and then the other one was going,
203
00:09:08,520 --> 00:09:09,791
"Well, yes you could do that
204
00:09:09,791 --> 00:09:12,008
but why would you want
to shoot yourself in the foot?"
205
00:09:12,008 --> 00:09:13,225
[laughter]
206
00:09:13,225 --> 00:09:15,400
It got a little [out of hand there]--
207
00:09:15,400 --> 00:09:17,860
I usually go for moving to Canada,
for my example,
208
00:09:18,690 --> 00:09:20,096
when I'm teaching that.
209
00:09:20,096 --> 00:09:21,365
[laughter]
210
00:09:22,030 --> 00:09:23,575
But things are tricky,
211
00:09:24,860 --> 00:09:26,595
I get students coming
from Computer Science
212
00:09:26,595 --> 00:09:29,943
and they want to do things
on causal inference
213
00:09:30,566 --> 00:09:33,460
and it takes a huge amount
of effort to figure out
214
00:09:33,460 --> 00:09:35,230
how they're actually thinking
about the problem
215
00:09:35,230 --> 00:09:37,000
and whether there's something there
216
00:09:37,000 --> 00:09:38,310
and so, now over the years,
217
00:09:38,310 --> 00:09:40,302
I've got a little more appreciation
for the fact
218
00:09:40,302 --> 00:09:41,958
that Don was actually willing to--
219
00:09:42,630 --> 00:09:46,000
It took him a while,
but he did engage first with Josh
220
00:09:46,400 --> 00:09:47,500
and then with both of us
221
00:09:48,380 --> 00:09:50,163
and rather than dismissing it
and saying,
222
00:09:50,163 --> 00:09:53,348
"Okay, well I can't figure out
what these guys are doing
223
00:09:53,348 --> 00:09:56,435
and it's probably just
not really that interesting."
224
00:09:57,200 --> 00:09:59,736
Everybody always wants
to figure out quickly,
225
00:10:00,196 --> 00:10:01,376
you want to save time
226
00:10:01,376 --> 00:10:03,410
and you want to save
your brain cells
227
00:10:03,410 --> 00:10:04,583
for other things.
228
00:10:05,000 --> 00:10:07,000
The fastest route to
that is to figure out
229
00:10:07,000 --> 00:10:08,460
why you should dismiss something.
230
00:10:08,460 --> 00:10:09,560
Yes.
231
00:10:09,560 --> 00:10:11,100
I don't need to spend time on this.
232
00:10:11,100 --> 00:10:12,498
♪ [music] ♪
233
00:10:12,498 --> 00:10:13,880
- [Narrator] If you'd like
to watch more
234
00:10:13,880 --> 00:10:15,822
Nobel Conversations, click here,
235
00:10:16,220 --> 00:10:18,409
or if you'd like to learn
more about econometrics,
236
00:10:18,640 --> 00:10:21,240
check out Josh's "Mastering
Econometrics" series.
237
00:10:21,800 --> 00:10:24,540
If you'd like to learn more
about Guido, Josh, and Isaiah
238
00:10:24,860 --> 00:10:26,502
check out the links
in the description.
239
00:10:26,992 --> 00:10:28,307
♪ [music] ♪