
How Is Econometrics Changing? (Josh Angrist, Guido Imbens, Isaiah Andrews)

  • 0:04 - 0:06
    Welcome to Nobel Conversations.
  • 0:07 - 0:08
    In this episode,
  • 0:08 - 0:10
    Josh Angrist and Guido Imbens
  • 0:10 - 0:15
    sit down with Isaiah Andrews to discuss
    how the field of econometrics is evolving.
  • 0:16 - 0:18
    So Guido and Josh, you're both
  • 0:18 - 0:22
    pioneers in developing tools for
    empirical research in economics.
  • 0:22 - 0:25
    And so I'd like to explore sort of where
    you feel like the field is heading,
  • 0:25 - 0:29
    sort of economics, econometrics,
    the whole thing. To start,
  • 0:30 - 0:31
    I'd be interested to hear
  • 0:32 - 0:35
    about whether you feel like
    the sort of the way in which
  • 0:35 - 0:37
    the local average treatment effects framework
  • 0:37 - 0:39
    sort of took hold
  • 0:39 - 0:44
    has any lessons for how new empirical
    methods in economics develop and spread, or
  • 0:44 - 0:46
    how they should? That's a good question.
  • 0:47 - 0:48
    You go first.
  • 0:50 - 0:55
    Yeah, so maybe I think the important
    thing is to come up with
  • 0:55 - 0:59
    good, convincing cases where
  • 1:01 - 1:02
    the questions are clear
  • 1:02 - 1:06
    and where kind of the methods
    apply in general. So maybe
  • 1:06 - 1:12
    one thing, kind of looking back
    at the subsequent literature:
  • 1:12 - 1:13
    So I really like the
  • 1:14 - 1:17
    the regression discontinuity
    literature. There were
  • 1:17 - 1:21
    clearly a bunch of really convincing
    examples and that allowed people to kind of
  • 1:22 - 1:27
    think more clearly, look harder at
    the methodological questions,
  • 1:27 - 1:30
    kind of clear applications that
    allow you to kind of think about:
  • 1:30 - 1:34
    Well, do these types of assumption
    seem reasonable here? What kind of
  • 1:34 - 1:38
    what kind of things do we not
    like in these, the early papers?
  • 1:38 - 1:42
    How can we improve things? So having
    clear applications motivating,
  • 1:43 - 1:46
    these literatures — I think
    it's very helpful.
  • 1:47 - 1:49
    I'm glad you mentioned the
    regression discontinuity, Guido.
  • 1:49 - 1:53
    I think there's a lot of
    complementarity between IV and RD.
  • 1:55 - 2:00
    Instrumental variables and
    regression discontinuity. And
  • 2:01 - 2:06
    a lot of the econometric applications
    of regression discontinuity are what used
  • 2:06 - 2:08
    to be called fuzzy RD, where
  • 2:08 - 2:12
    you know, it's not discrete or
    deterministic at the cutoff, but
  • 2:12 - 2:15
    just a change in rates or intensity.
  • 2:15 - 2:20
    And the LATE framework helps us
    understand those applications and gives us
  • 2:20 - 2:25
    a clear interpretation for, say, something
    like, in my paper with Victor Lavy,
  • 2:25 - 2:28
    where we use Maimonides'
    rule, the class-size cutoffs.
  • 2:29 - 2:30
    What are you getting there?
  • 2:30 - 2:34
    So, of course, you can answer that question
    with a linear constant-effects model,
  • 2:34 - 2:38
    but it turns out we're not limited
    to that. And RD is still
  • 2:38 - 2:40
    very powerful and illuminating,
  • 2:41 - 2:42
    even when you know,
  • 2:42 - 2:46
    the correlation between the cutoff
    and the variable of interest — in
  • 2:46 - 2:51
    this case, class size — is partial,
    maybe even not that strong. [See the fuzzy RD sketch after the transcript.]
  • 2:52 - 2:56
    So there was definitely a kind of a
    parallel development. It's also interesting,
  • 2:57 - 3:00
    you know, nobody talked about
    regression discontinuity designs when we
  • 3:00 - 3:02
    were in graduate school. It was
  • 3:02 - 3:05
    something that other social
    scientists were interested in
  • 3:06 - 3:12
    and that kind of grew up alongside the
    LATE framework, and we've both done work on
  • 3:12 - 3:15
    both applications and methods there and
  • 3:15 - 3:20
    it's been very exciting to see that
    kind of develop and become so important.
  • 3:20 - 3:24
    It's part of a general evolution,
    I think, towards, you know,
  • 3:24 - 3:30
    credible identification strategies, causal
    effects — less, you know, making econo-
  • 3:30 - 3:33
    metrics more about causal
    questions than about models.
  • 3:34 - 3:39
    In terms of the future, I think one thing
    that LATE has helped facilitate is a move
  • 3:39 - 3:43
    towards more creative
    randomized trials, where,
  • 3:43 - 3:44
    you know, there's something of interest,
  • 3:46 - 3:51
    it's not possible or straightforward
    to Simply turn it off or on
  • 3:51 - 3:53
    but you can encourage it
  • 3:53 - 3:58
    or discourage it. So, you subsidize
    schooling with financial aid, for example,
  • 3:59 - 4:00
    so now we have a whole
  • 4:00 - 4:02
    framework for interpreting that. [See the LATE estimand sketched after the transcript.]
  • 4:03 - 4:04
    And,
  • 4:04 - 4:07
    and it kind of opens
    the doors to randomized
  • 4:07 - 4:09
    trials of things that maybe would,
  • 4:10 - 4:10
    you know,
  • 4:10 - 4:14
    not have seemed possible before. We've,
  • 4:14 - 4:20
    we've used that a lot in the work we do
    on schools in our, in the Blueprint Labs
  • 4:20 - 4:27
    at MIT, where we're exploiting random assignment
    in very creative ways, I think.
  • 4:28 - 4:32
    Related to that, do you see sort
    of particular factors that make for
  • 4:32 - 4:34
    useful research in econometrics?
  • 4:34 - 4:37
    You've alluded to it:
  • 4:37 - 4:40
    having a clear connection to
    problems that are actually coming up
  • 4:40 - 4:45
    in empirical practice is often a good
    idea — I'd say it's always a good idea.
  • 4:46 - 4:50
    I often find myself sitting in an
    econometrics theory seminar,
  • 4:51 - 4:52
    say the Harvard-MIT seminar,
  • 4:53 - 4:57
    and I'm thinking, what problem is
    this guy solving? Who has this
  • 4:57 - 5:00
    problem? And, you know,
  • 5:02 - 5:05
    sometimes there's an
    embarrassing silence if I ask
  • 5:05 - 5:08
    or there might be a
    fairly contrived scenario.
  • 5:09 - 5:12
    I want to see where the tool is useful.
  • 5:12 - 5:15
    There are some purely foundational tools.
  • 5:15 - 5:18
    I do take the point, you
    know, there are people who are
  • 5:18 - 5:22
    working on conceptual
    foundations of you know,
  • 5:23 - 5:25
    it's more, it becomes more like
    mathematical statistics.
  • 5:26 - 5:27
    I mean, I remember an early example
  • 5:27 - 5:28
    I believe that that I,
  • 5:28 - 5:32
    you know, I struggled to understand, was
    the idea of stochastic equicontinuity,
  • 5:32 - 5:37
    which one of my thesis advisors, Whitney
    Newey, was using to great effect, and
  • 5:38 - 5:40
    I was trying to understand
    that, and there isn't really —
  • 5:41 - 5:45
    it's really foundational. It's not
    an application that's driving that,
  • 5:46 - 5:47
    at least not immediately
  • 5:49 - 5:53
    But most things are not like that,
    and so there should be a problem.
  • 5:54 - 5:57
    And I think it's on the, it's on,
  • 5:57 - 6:00
    on the, the seller of that sort of thing,
  • 6:00 - 6:04
    you know, because there's an opportunity
    cost of the time and attention and effort
  • 6:04 - 6:07
    to understand things. So, you
    know, it's on the seller to say:
  • 6:07 - 6:09
    Hey, I'm solving this problem
  • 6:09 - 6:13
    and here's a set of results
    that show that it's useful.
  • 6:13 - 6:15
    And here's some insight that I get.
  • 6:16 - 6:17
    As you said, Josh, right,
  • 6:17 - 6:21
    sort of there's been a move in the
    direction of thinking more about causality
  • 6:21 - 6:23
    in economics and empirical
    work in economics,
  • 6:23 - 6:25
    any consequences of sort of the widespread,
  • 6:25 - 6:27
    the spread of that view, that
    surprised you, or anything
  • 6:27 - 6:31
    you view as downsides of sort of the way that
    kind of empirical economics has gone?
  • 6:32 - 6:32
    Sometimes,
  • 6:32 - 6:38
    I see somebody does IV and they get a
    result which seems implausibly large.
  • 6:39 - 6:40
    That's the usual case.
  • 6:42 - 6:44
    So it might be, you know,
  • 6:44 - 6:49
    an extraordinarily large causal effect
    of some relatively minor intervention,
  • 6:49 - 6:53
    which was randomized or for
    which you could make a case that
  • 6:53 - 6:57
    there's a good design.
    And then when I see that —
  • 6:57 - 6:58
    that, and,
  • 6:58 - 6:58
    you know, I think,
  • 6:58 - 6:59
    you know,
  • 6:59 - 7:03
    it's very hard for me to believe that this
    relatively minor intervention has such
  • 7:03 - 7:04
    a large effect,
  • 7:04 - 7:07
    the authors will sometimes
    resort to the local average
  • 7:07 - 7:09
    treatment effects theorem and say,
  • 7:09 - 7:13
    well, these compliers, you know,
    they're special in some way.
  • 7:13 - 7:18
    And, you know, they just benefit
    extraordinarily from this intervention
  • 7:18 - 7:22
    and I'm reluctant to take that
    at face value. I think, you know,
  • 7:22 - 7:24
    often when effects are too big,
  • 7:24 - 7:27
    it's because the exclusion
    restriction is failing. So
  • 7:27 - 7:32
    you don't really have the right endogenous
    variable to scale that result. [See the note on IV scaling after the transcript.]
  • 7:32 - 7:36
    And so I'm not too happy to see
  • 7:36 - 7:39
    you know, just sort of
    a generic heterogeneity
  • 7:39 - 7:44
    argument being used to excuse something
    that I think might be a deeper problem.
  • 7:45 - 7:47
    I think it played somewhat
    of an unfortunate role in
  • 7:47 - 7:52
    the discussions kind of between reduced-
    form and structural approaches, where
  • 7:53 - 7:54
    I feel that wasn't quite
  • 7:55 - 7:59
    right. The instrumental
    variables assumptions are
  • 8:00 - 8:05
    at the core structural assumptions about
    behavior. They were coming from economic
  • 8:07 - 8:10
    thinking about the economic
    behavior of agents,
  • 8:10 - 8:15
    and somehow it got
    pushed in a direction
  • 8:15 - 8:18
    that I think wasn't
    really very helpful.
  • 8:19 - 8:22
    The way, I think, initially,
  • 8:23 - 8:27
    we wrote things up, it was, it was describing
    what was happening: there was a set of
  • 8:28 - 8:32
    methods people were using; we
    clarified what those methods were doing,
  • 8:33 - 8:38
    and in a way that I think
    contained a fair amount of insight,
  • 8:39 - 8:45
    but somehow it got pushed into a corner
    that I think was not necessarily very —
  • 8:45 - 8:49
    Or even just the language of
    reduced form versus structural
  • 8:49 - 8:51
    I find kind of funny, in
    the sense that, right —
  • 8:51 - 8:53
    the local average treatment
    effect model, right?
  • 8:53 - 8:55
    the potential outcomes
    model — is a nonparametric
  • 8:55 - 8:56
    structural model,
  • 8:56 - 8:59
    if you want to think about it, as
    you sort of suggested, Guido.
  • 8:59 - 8:59
    So, there's something,
  • 9:00 - 9:04
    there's something a little funny about
    putting these two things in opposition, when —
  • 9:04 - 9:07
    Yes, well, that language, of
    course, comes from the era of the
  • 9:07 - 9:10
    simultaneous equations framework that we inherited.
  • 9:10 - 9:12
    It has the advantage that people seem
  • 9:12 - 9:15
    to know what you mean
    when you use it, but maybe
  • 9:15 - 9:18
    people are hearing different — different
    people are hearing different things.
  • 9:18 - 9:21
    Yeah. I think, I think "reduced
    form" has become used
  • 9:21 - 9:23
    a little bit as a
    pejorative. Yeah.
  • 9:23 - 9:28
    The word, which is not really quite
    what it was originally intended for.
  • 9:30 - 9:34
    I guess something else that strikes
    me in thinking about the effects of
  • 9:34 - 9:38
    the local average treatment effect
    framework is that often folks will appeal to
  • 9:38 - 9:42
    a local average treatment effects
    intuition for settings well beyond
  • 9:42 - 9:45
    ones where any sort of formal
    result has actually been
  • 9:45 - 9:50
    shown. And I'm curious, given
    all the work that you guys did to,
  • 9:50 - 9:53
    you know, establish LATE results in
    different, in different settings, I'm curious —
  • 9:53 - 9:58
    any thoughts on that? I think there's
    going to be a lot of cases where
  • 9:58 - 10:02
    the intuition does get you,
    get you some distance,
  • 10:03 - 10:08
    but it's going to be somewhat limited,
    and establishing formal results there
  • 10:08 - 10:13
    may be a little tricky, and they may
    only work in special circumstances,
  • 10:13 - 10:14
    you know.
  • 10:15 - 10:20
    And you end up with a lot of formality
    that may not quite capture the intuition
  • 10:20 - 10:23
    Sometimes I'm somewhat uneasy with them,
    and they are not necessarily the papers
  • 10:23 - 10:25
    I would want to write,
  • 10:25 - 10:30
    but I do think sometimes the intuition
    often does capture part of the
  • 10:30 - 10:31
    of the problem.
  • 10:33 - 10:36
    I think, in some sense we were
    kind of very fortunate there
  • 10:37 - 10:40
    in the way the LATE paper got handled. I
    don't know if that — actually, the editor
  • 10:41 - 10:42
    made it much shorter
  • 10:42 - 10:46
    and that then allowed us to kind of
    focus on very clear, crisp results
  • 10:47 - 10:50
    whereas, you know, there's this,
  • 10:50 - 10:54
    this somewhat unfortunate tendency in
    the literature of having the papers —
  • 10:55 - 10:59
    Well, you should be able to fix that, then.
    I'm trying; it takes some time to fix that.
  • 10:59 - 11:03
    I think this is an example where it's sort
    of very clear that having it be short —
  • 11:03 - 11:08
    You should actually impose that no paper can
    be longer than the LATE paper — wow.
  • 11:09 - 11:14
    Great. At least no theory, no theory paper.
    Yeah, and I think, I think there — well,
  • 11:14 - 11:17
    I'm trying very hard to get
    the papers to be shorter.
  • 11:17 - 11:19
    And I think there's a lot of value
  • 11:19 - 11:23
    to that, because it's often the second
    part of the paper that doesn't actually
  • 11:24 - 11:26
    get you much further
    in understanding things,
  • 11:27 - 11:32
    but it does make things much
    harder to read. And, you know,
  • 11:32 - 11:34
    it sort of goes back to
  • 11:34 - 11:38
    how I think kind of econometrics should
    be done: you should focus on the —
  • 11:39 - 11:41
    It should be reasonably
    close to empirical problems.
  • 11:42 - 11:44
    They should be very clear problems.
  • 11:45 - 11:49
    But then often the theory
    doesn't need to be quite so long.
  • 11:49 - 11:49
    Yeah,
  • 11:51 - 11:53
    I think, yeah, things have
  • 11:54 - 11:55
    gone a little off track.
  • 11:56 - 11:58
    A relatively recent change has been a
  • 11:58 - 12:02
    seemingly big increase in demand for
    people with sort of econometrics,
  • 12:02 - 12:05
    causal effect estimation
    skills in the tech sector.
  • 12:05 - 12:09
    I'm interested whether either of you have
    thoughts on sort of how that's gonna,
  • 12:09 - 12:12
    how that's going to interact with
    the development of empirical methods,
  • 12:12 - 12:14
    or empirical research in
    economics going forward — sort of
  • 12:15 - 12:21
    Well, there's sort of a meta point, which
    is there's this new kind of employer —
  • 12:22 - 12:26
    the Amazons and the Ubers and, you know,
  • 12:26 - 12:28
    the rest of that world,
  • 12:28 - 12:29
    and I think that's great.
  • 12:29 - 12:33
    And I'd like to tell my students about
    that, you know. Especially at MIT,
  • 12:33 - 12:37
    we have a lot of computer science
    majors — that's our biggest major —
  • 12:37 - 12:43
    and I try to seduce some of those folks
    into economics by saying, you know,
  • 12:43 - 12:46
    you can go work for these,
  • 12:46 - 12:48
    you know companies that
    people are very keen to
  • 12:49 - 12:51
    work for because the work seems exciting,
  • 12:52 - 12:56
    you know, the skills that you get in
    econometrics are as good or better
  • 12:56 - 13:01
    than any competing discipline
    has to offer. So you should at least
  • 13:01 - 13:04
    take some econ — take some
    econometrics and some econ.
  • 13:05 - 13:07
    I did a fun project with Uber
  • 13:08 - 13:13
    on labor supply of Uber drivers, and it was
    very, very exciting to be part of that.
  • 13:13 - 13:15
    Plus, I got to drive for Uber for a while,
  • 13:16 - 13:21
    and I thought that was fun, although I did
    not make enough that I was tempted to
  • 13:21 - 13:25
    give up my MIT job, but
    I enjoyed the experience.
  • 13:25 - 13:26
    I see a
  • 13:26 - 13:31
    challenge to our model
    of graduate education here,
  • 13:32 - 13:37
    which is, if we're training people
    to go work at Amazon, you know,
  • 13:38 - 13:43
    it's not clear why, you know, we should
    be paying graduate stipends for that.
  • 13:43 - 13:45
    Why should the taxpayer effectively
  • 13:46 - 13:51
    be subsidizing that? Our graduate education
    in the U.S. is generously subsidized,
  • 13:51 - 13:56
    even in private universities. It's,
    ultimately, there's a lot of public money
  • 13:56 - 13:59
    in there. And I think the
    traditional rationale for that is,
  • 14:00 - 14:04
    you know, we're training educators and
    scholars, and there's a great externality
  • 14:04 - 14:06
    from the work that we do.
  • 14:06 - 14:10
    It's either the research externality,
    or a teaching externality.
  • 14:10 - 14:15
    But, you know, if many of our students
    are going to work in the private sector,
  • 14:16 - 14:22
    that's fine, but then maybe their
    employers should pay for that.
  • 14:22 - 14:25
    Is that so different from
    people working for a consulting
  • 14:26 - 14:27
    firm?
  • 14:27 - 14:33
    It's not clear to me that the number
    of jobs in academics has changed.
  • 14:33 - 14:38
    It's just, I feel like this is a
    growing sector, whereas consulting —
  • 14:38 - 14:42
    you're right to raise that, it might
    be the same for consulting.
  • 14:43 - 14:44
    But this,
  • 14:44 - 14:48
    you know, I'm placing more and
    more students in these businesses.
  • 14:48 - 14:50
    So, it's on my mind in
    a way that I've sort of,
  • 14:51 - 14:56
    you know, not been attentive to consulting
    jobs. You know, consulting was always
  • 14:56 - 15:00
    important, and I think also
    there's some movement from consulting back
  • 15:00 - 15:03
    into research. It's a little more fluid.
  • 15:03 - 15:04
    The,
  • 15:04 - 15:05
    a lot of the work in the
  • 15:06 - 15:10
    in both domains, I have to say,
    is not really different. But,
  • 15:10 - 15:14
    you know, people who are working
    in the tech sector are doing things
  • 15:14 - 15:17
    that are potentially of scientific
    interest, but mostly it's hidden.
  • 15:17 - 15:21
    Then you really have to say, you know,
    why, why is the government paying for this?
  • 15:22 - 15:26
    Yeah, although, yeah, I mean, to Guido's point,
    I guess there's a, there's a data
  • 15:26 - 15:30
    question here of whether the sort
    of total, you know, sort of, say,
  • 15:30 - 15:31
    private
  • 15:31 - 15:34
    for-profit sector employment of econ Ph.D.
  • 15:34 - 15:38
    program graduates increased, or has
    it just been a substitution from
  • 15:38 - 15:40
    finance and consulting towards tech?
  • 15:40 - 15:44
    I may be reacting to something
    that's not really happening.
  • 15:44 - 15:48
    So, but I've actually done some work
    with some of these tech companies.
  • 15:49 - 15:52
    So I don't disagree with Josh's
    point that we need to think
  • 15:52 - 15:55
    a little bit about the funding model —
    who is in the end paying for the
  • 15:56 - 15:59
    education. But from a
    scientific perspective,
  • 16:00 - 16:04
    not only do these places have
    great data — and nowadays
  • 16:04 - 16:07
    they tend to be very careful
    with that for privacy reasons —
  • 16:08 - 16:09
    but they also have great questions.
  • 16:10 - 16:11
    I find it very
  • 16:12 - 16:13
    inspiring kind of to listen to
  • 16:13 - 16:16
    the people there and kind of see
    what kind of questions they have
  • 16:16 - 16:17
    and often their questions
  • 16:18 - 16:21
    are ones that also come up outside of these,
  • 16:21 - 16:25
    these companies. I have a couple of
    papers with Raj Chetty
  • 16:26 - 16:26
    and
  • 16:26 - 16:32
    Susan Athey, kind of, where we
    look at ways of combining experimental data
  • 16:32 - 16:34
    and observational data. And, kind of, there,
  • 16:36 - 16:39
    Raj Chetty was interested
    in what is the effect
  • 16:39 - 16:45
    of early childhood programs on outcomes
    later in life — not just kind of test scores,
  • 16:45 - 16:48
    but on earnings and stuff — and
    we kind of developed methods
  • 16:49 - 16:52
    that would help you shed
    light on that in, in some,
  • 16:53 - 16:55
    in some settings. And the same problems
  • 16:56 - 17:00
    came up kind of in these
    tech company settings.
  • 17:01 - 17:04
    And so from my perspective, it's
  • 17:04 - 17:08
    the same kind of as talking to
    people doing empirical work.
  • 17:08 - 17:12
    I try to kind of look at these
    specific problems and then try to come up
  • 17:12 - 17:18
    with more general problems —
    reformulating the problems at a higher level,
  • 17:18 - 17:23
    so that I can think about solutions
    that work in a range of settings.
  • 17:23 - 17:26
    And so from that perspective, the
  • 17:26 - 17:30
    discussions with the tech companies are
    just very valuable and very useful.
  • 17:31 - 17:31
    You know,
  • 17:32 - 17:34
    we do have students now spending time
  • 17:34 - 17:37
    doing internships there and
    then coming back and writing
  • 17:37 - 17:43
    more interesting theses as a
    result of their experiences there.
  • 17:45 - 17:48
    If you'd like to watch more
    Nobel Conversations, click here,
  • 17:48 - 17:50
    or if you'd like to learn
    more about econometrics,
  • 17:51 - 17:53
    check out Josh's Mastering
    Econometrics series.
  • 17:54 - 17:56
    If you'd like to learn more about Guido,
  • 17:56 - 17:58
    Josh, and Isaiah, check out
    the links in the description.
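
Editor's note: the fuzzy RD discussion around 2:01-2:51 can be illustrated with a small simulation. This is a hypothetical sketch on made-up data, not the Angrist-Lavy Maimonides'-rule analysis; the point is only that when treatment take-up jumps in probability (rather than from 0 to 1) at the cutoff, the cutoff indicator acts as an instrument, and the effect is the jump in the outcome divided by the jump in take-up.

```python
# Hypothetical fuzzy-RD sketch (simulated data; not the Angrist-Lavy study).
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, bandwidth = 5_000, 0.0, 0.5

running = rng.uniform(-1, 1, n)            # running variable (e.g. enrollment)
above = (running >= cutoff).astype(float)  # indicator for crossing the cutoff
# Fuzzy design: crossing the cutoff raises take-up by ~40 points, not from 0 to 1.
treated = (rng.uniform(size=n) < 0.3 + 0.4 * above).astype(float)
y = 1.0 + 2.0 * treated + 0.5 * running + rng.normal(size=n)   # true effect = 2

# Local linear fit within the bandwidth, with separate slopes on each side.
keep = np.abs(running - cutoff) <= bandwidth
X = np.column_stack([np.ones(keep.sum()), above[keep],
                     running[keep], above[keep] * running[keep]])
jump_y = np.linalg.lstsq(X, y[keep], rcond=None)[0][1]        # reduced form: jump in outcome
jump_d = np.linalg.lstsq(X, treated[keep], rcond=None)[0][1]  # first stage: jump in take-up

print("fuzzy RD (Wald) estimate:", jump_y / jump_d)  # ~2: effect for compliers at the cutoff
```

Dividing the reduced-form jump by the first-stage jump is exactly the IV logic described in the conversation: the estimand is a local average treatment effect for compliers at the cutoff.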
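
Editor's note: the encouragement-design point around 3:34-4:02 rests on the LATE result the speakers refer to. With a randomized encouragement Z (for example, a financial-aid offer), a treatment D that Z shifts but does not fully determine, and an outcome Y, the Wald ratio identifies the average effect for compliers under random assignment, exclusion, and monotonicity:

```latex
\[
  \frac{E[\,Y \mid Z = 1\,] - E[\,Y \mid Z = 0\,]}
       {E[\,D \mid Z = 1\,] - E[\,D \mid Z = 0\,]}
  \;=\;
  E\bigl[\,Y_{1} - Y_{0} \mid D_{1} > D_{0}\,\bigr],
\]
```

that is, the average treatment effect for the people whose treatment status is actually moved by the encouragement.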
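
Editor's note: the worry about implausibly large IV estimates around 6:42-7:44 can be made concrete. With a single instrument, the IV estimate is the reduced form divided by the first stage:

```latex
\[
  \hat{\beta}_{\mathrm{IV}}
  \;=\;
  \frac{\widehat{\mathrm{Cov}}(Y, Z)}{\widehat{\mathrm{Cov}}(D, Z)}
  \;=\;
  \frac{\text{effect of the instrument on the outcome (reduced form)}}
       {\text{effect of the instrument on treatment (first stage)}}.
\]
```

If the instrument moves the outcome through channels other than the measured treatment (an exclusion violation), that extra effect sits in the numerator and is scaled up by a first stage that is too small for it, which is one way a modest intervention can appear to have an enormous effect for compliers.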