[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.10,0:00:02.05,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:00:03.62,0:00:05.70,Default,,0000,0000,0000,,- [Narrator] Welcome\Nto Nobel Conversations.
Dialogue: 0,0:00:07.00,0:00:10.04,Default,,0000,0000,0000,,In this episode, Josh Angrist\Nand Guido Imbens
Dialogue: 0,0:00:10.04,0:00:13.68,Default,,0000,0000,0000,,sit down with Isaiah Andrews\Nto discuss and disagree
Dialogue: 0,0:00:13.68,0:00:16.58,Default,,0000,0000,0000,,over the role of machine learning\Nin applied econometrics.
Dialogue: 0,0:00:17.90,0:00:19.77,Default,,0000,0000,0000,,- [Isaiah] So, of course,\Nthere are a lot of topics
Dialogue: 0,0:00:19.77,0:00:21.09,Default,,0000,0000,0000,,where you guys largely agree,
Dialogue: 0,0:00:21.09,0:00:22.31,Default,,0000,0000,0000,,but I'd like to turn to one
Dialogue: 0,0:00:22.31,0:00:24.24,Default,,0000,0000,0000,,where maybe you have\Nsome differences of opinion.
Dialogue: 0,0:00:24.24,0:00:25.73,Default,,0000,0000,0000,,I'd love to hear\Nsome of your thoughts
Dialogue: 0,0:00:25.73,0:00:26.88,Default,,0000,0000,0000,,about machine learning
Dialogue: 0,0:00:26.88,0:00:29.90,Default,,0000,0000,0000,,and the role that it's playing\Nand is going to play in economics.
Dialogue: 0,0:00:30.20,0:00:33.35,Default,,0000,0000,0000,,- [Guido] I've looked at some data\Nlike that -- proprietary data,
Dialogue: 0,0:00:33.35,0:00:35.15,Default,,0000,0000,0000,,so there's\Nno published paper there.
Dialogue: 0,0:00:36.12,0:00:38.16,Default,,0000,0000,0000,,There was an experiment\Nthat was done
Dialogue: 0,0:00:38.16,0:00:39.50,Default,,0000,0000,0000,,on some search algorithm,
Dialogue: 0,0:00:39.70,0:00:41.33,Default,,0000,0000,0000,,and the question was --
Dialogue: 0,0:00:42.90,0:00:45.60,Default,,0000,0000,0000,,it was about ranking things\Nand changing the ranking.
Dialogue: 0,0:00:45.90,0:00:47.29,Default,,0000,0000,0000,,And it was sort of clear
Dialogue: 0,0:00:48.40,0:00:50.61,Default,,0000,0000,0000,,that there was going to be\Na lot of heterogeneity there.
Dialogue: 0,0:00:52.16,0:00:56.28,Default,,0000,0000,0000,,If you look for, say,
Dialogue: 0,0:00:57.83,0:01:00.62,Default,,0000,0000,0000,,a picture of Britney Spears --
Dialogue: 0,0:01:00.62,0:01:02.49,Default,,0000,0000,0000,,it doesn't really matter\Nwhere you rank it
Dialogue: 0,0:01:02.49,0:01:05.50,Default,,0000,0000,0000,,because you're going to figure out\Nwhat you're looking for,
Dialogue: 0,0:01:06.20,0:01:07.87,Default,,0000,0000,0000,,whether you put it\Nin the first or second
Dialogue: 0,0:01:07.87,0:01:09.92,Default,,0000,0000,0000,,or third position of the ranking.
Dialogue: 0,0:01:10.10,0:01:12.50,Default,,0000,0000,0000,,But if you're looking\Nfor the best econometrics book,
Dialogue: 0,0:01:13.30,0:01:16.43,Default,,0000,0000,0000,,if you put your book first\Nor your book tenth --
Dialogue: 0,0:01:16.43,0:01:18.10,Default,,0000,0000,0000,,that's going to make\Na big difference
Dialogue: 0,0:01:18.60,0:01:20.98,Default,,0000,0000,0000,,in how often people\Nare going to click on it.
Dialogue: 0,0:01:21.83,0:01:23.42,Default,,0000,0000,0000,,And so there you --
Dialogue: 0,0:01:23.42,0:01:27.12,Default,,0000,0000,0000,,- [Josh] Why do I need\Nmachine learning to discover that?
Dialogue: 0,0:01:27.12,0:01:29.20,Default,,0000,0000,0000,,It seems like -- because\NI can discover it simply.
Dialogue: 0,0:01:29.20,0:01:30.44,Default,,0000,0000,0000,,- [Guido] So in general --
Dialogue: 0,0:01:30.44,0:01:32.10,Default,,0000,0000,0000,,- [Josh] There were lots\Nof possible...
Dialogue: 0,0:01:32.10,0:01:35.04,Default,,0000,0000,0000,,- You want to think about\Nthere being lots of characteristics
Dialogue: 0,0:01:35.49,0:01:37.28,Default,,0000,0000,0000,,of the items,
Dialogue: 0,0:01:37.61,0:01:41.68,Default,,0000,0000,0000,,and you want to understand\Nwhat drives the heterogeneity
Dialogue: 0,0:01:42.18,0:01:43.43,Default,,0000,0000,0000,,in the effect of --
Dialogue: 0,0:01:43.43,0:01:45.01,Default,,0000,0000,0000,,- But you're just predicting.
Dialogue: 0,0:01:45.01,0:01:47.66,Default,,0000,0000,0000,,In some sense, you're solving\Na marketing problem.
Dialogue: 0,0:01:47.66,0:01:49.38,Default,,0000,0000,0000,,- No, it's a causal effect.
Dialogue: 0,0:01:49.38,0:01:51.91,Default,,0000,0000,0000,,- It's causal, but it has\Nno scientific content.
Dialogue: 0,0:01:51.91,0:01:53.14,Default,,0000,0000,0000,,Think about...
Dialogue: 0,0:01:53.66,0:01:57.30,Default,,0000,0000,0000,,- No, but there are similar things\Nin medical settings.
Dialogue: 0,0:01:58.00,0:02:01.30,Default,,0000,0000,0000,,If you do an experiment,\Nyou may actually be very interested
Dialogue: 0,0:02:01.30,0:02:03.90,Default,,0000,0000,0000,,in whether the treatment works\Nfor some groups or not.
Dialogue: 0,0:02:03.90,0:02:06.14,Default,,0000,0000,0000,,And you have a lot\Nof individual characteristics,
Dialogue: 0,0:02:06.14,0:02:08.00,Default,,0000,0000,0000,,and you want\Nto systematically search --
Dialogue: 0,0:02:08.00,0:02:09.50,Default,,0000,0000,0000,,- Yeah. I'm skeptical about that --
Dialogue: 0,0:02:09.50,0:02:12.60,Default,,0000,0000,0000,,that sort of idea that there's\Nthis personal causal effect
Dialogue: 0,0:02:12.60,0:02:14.00,Default,,0000,0000,0000,,that I should care about,
Dialogue: 0,0:02:14.00,0:02:15.74,Default,,0000,0000,0000,,and that machine learning\Ncan discover it
Dialogue: 0,0:02:15.74,0:02:17.26,Default,,0000,0000,0000,,in some way that's useful.
Dialogue: 0,0:02:17.26,0:02:20.04,Default,,0000,0000,0000,,So think about -- I've done\Na lot of work on schools,
Dialogue: 0,0:02:20.04,0:02:22.34,Default,,0000,0000,0000,,going to, say, a charter school,
Dialogue: 0,0:02:22.34,0:02:24.43,Default,,0000,0000,0000,,a publicly funded private school,
Dialogue: 0,0:02:25.22,0:02:27.39,Default,,0000,0000,0000,,effectively,\Nthat's free to structure
Dialogue: 0,0:02:27.39,0:02:29.40,Default,,0000,0000,0000,,its own curriculum --\Nfor context there.
Dialogue: 0,0:02:29.40,0:02:31.37,Default,,0000,0000,0000,,Some types of charter schools
Dialogue: 0,0:02:31.37,0:02:33.70,Default,,0000,0000,0000,,generate spectacular\Nachievement gains,
Dialogue: 0,0:02:33.70,0:02:36.40,Default,,0000,0000,0000,,and in the data set\Nthat produces that result,
Dialogue: 0,0:02:36.40,0:02:37.80,Default,,0000,0000,0000,,I have a lot of covariates.
Dialogue: 0,0:02:37.80,0:02:41.35,Default,,0000,0000,0000,,So I have baseline scores,\Nand I have family background,
Dialogue: 0,0:02:41.35,0:02:43.21,Default,,0000,0000,0000,,the education of the parents,
Dialogue: 0,0:02:43.58,0:02:45.80,Default,,0000,0000,0000,,the sex of the child,\Nthe race of the child.
Dialogue: 0,0:02:45.80,0:02:49.80,Default,,0000,0000,0000,,And, well, as soon as I put\Nhalf a dozen of those together,
Dialogue: 0,0:02:49.80,0:02:51.90,Default,,0000,0000,0000,,I have a very\Nhigh-dimensional space.
Dialogue: 0,0:02:52.30,0:02:55.20,Default,,0000,0000,0000,,I'm definitely interested\Nin coarse features
Dialogue: 0,0:02:55.20,0:02:56.46,Default,,0000,0000,0000,,of that treatment effect,
Dialogue: 0,0:02:56.46,0:02:58.74,Default,,0000,0000,0000,,like whether it's better for people
Dialogue: 0,0:02:58.74,0:03:02.05,Default,,0000,0000,0000,,who come from\Nlower-income families.
Dialogue: 0,0:03:02.60,0:03:05.76,Default,,0000,0000,0000,,I have a hard time believing\Nthat there's an application
Dialogue: 0,0:03:07.27,0:03:09.87,Default,,0000,0000,0000,,for the very high-dimensional\Nversion of that,
Dialogue: 0,0:03:09.87,0:03:12.41,Default,,0000,0000,0000,,where I discovered\Nthat for non-white children
Dialogue: 0,0:03:12.41,0:03:14.97,Default,,0000,0000,0000,,who have high family incomes
Dialogue: 0,0:03:14.97,0:03:17.80,Default,,0000,0000,0000,,but baseline scores\Nin the third quartile
Dialogue: 0,0:03:18.17,0:03:21.78,Default,,0000,0000,0000,,and only went to public school\Nin the third grade
Dialogue: 0,0:03:21.78,0:03:23.00,Default,,0000,0000,0000,,but not the sixth grade...
Dialogue: 0,0:03:23.00,0:03:25.80,Default,,0000,0000,0000,,So that's what that\Nhigh-dimensional analysis produces.
Dialogue: 0,0:03:25.80,0:03:28.10,Default,,0000,0000,0000,,It's a very elaborate\Nconditional statement.
Dialogue: 0,0:03:28.30,0:03:30.60,Default,,0000,0000,0000,,There are two things that are wrong\Nwith that in my view.
Dialogue: 0,0:03:30.60,0:03:31.80,Default,,0000,0000,0000,,First, I don't see it as --
Dialogue: 0,0:03:31.80,0:03:34.00,Default,,0000,0000,0000,,I just can't imagine\Nwhy it's actionable.
Dialogue: 0,0:03:34.60,0:03:36.60,Default,,0000,0000,0000,,I don't know why\Nyou'd want to act on it.
Dialogue: 0,0:03:36.60,0:03:39.46,Default,,0000,0000,0000,,And I know also that\Nthere's some alternative model
Dialogue: 0,0:03:39.46,0:03:41.20,Default,,0000,0000,0000,,that fits almost as well,
Dialogue: 0,0:03:41.80,0:03:43.40,Default,,0000,0000,0000,,that flips everything.
Dialogue: 0,0:03:43.40,0:03:45.35,Default,,0000,0000,0000,,Because machine learning\Ndoesn't tell me
Dialogue: 0,0:03:45.35,0:03:48.58,Default,,0000,0000,0000,,that this is really\Nthe predictor that matters --
Dialogue: 0,0:03:48.58,0:03:50.96,Default,,0000,0000,0000,,it just tells me\Nthat this is a good predictor.
Dialogue: 0,0:03:51.49,0:03:54.87,Default,,0000,0000,0000,,And so, I think\Nthere is something different
Dialogue: 0,0:03:54.87,0:03:57.59,Default,,0000,0000,0000,,about the social science context.
Dialogue: 0,0:03:57.94,0:04:00.19,Default,,0000,0000,0000,,- [Guido] I think\Nthe social science applications
Dialogue: 0,0:04:00.19,0:04:01.63,Default,,0000,0000,0000,,you're talking about
Dialogue: 0,0:04:01.63,0:04:02.74,Default,,0000,0000,0000,,are ones where,
Dialogue: 0,0:04:03.40,0:04:08.10,Default,,0000,0000,0000,,I think, there's not a huge amount\Nof heterogeneity in the effects.
Dialogue: 0,0:04:09.78,0:04:11.54,Default,,0000,0000,0000,,- [Josh] Well, there might be\Nif you allow me
Dialogue: 0,0:04:11.54,0:04:13.47,Default,,0000,0000,0000,,to fill that space.
Dialogue: 0,0:04:13.47,0:04:15.74,Default,,0000,0000,0000,,- No... not even then.
Dialogue: 0,0:04:15.74,0:04:18.61,Default,,0000,0000,0000,,I think for a lot\Nof those interventions,
Dialogue: 0,0:04:18.61,0:04:22.76,Default,,0000,0000,0000,,you would expect that the effect\Nis the same sign for everybody.
Dialogue: 0,0:04:24.37,0:04:27.60,Default,,0000,0000,0000,,There may be small differences\Nin the magnitude, but it's not...
Dialogue: 0,0:04:28.20,0:04:30.23,Default,,0000,0000,0000,,For a lot of these\Neducational interventions --
Dialogue: 0,0:04:30.23,0:04:31.70,Default,,0000,0000,0000,,they're good for everybody.
Dialogue: 0,0:04:34.17,0:04:36.03,Default,,0000,0000,0000,,It's not that they're bad\Nfor some people
Dialogue: 0,0:04:36.03,0:04:37.60,Default,,0000,0000,0000,,and good for other people,
Dialogue: 0,0:04:37.60,0:04:39.10,Default,,0000,0000,0000,,and that there are kind\Nof very small pockets
Dialogue: 0,0:04:39.10,0:04:40.90,Default,,0000,0000,0000,,where they're bad.
Dialogue: 0,0:04:40.90,0:04:44.01,Default,,0000,0000,0000,,But there may be some variation\Nin the magnitude,
Dialogue: 0,0:04:44.01,0:04:46.96,Default,,0000,0000,0000,,but you would need very,\Nvery big data sets to find those.
Dialogue: 0,0:04:47.91,0:04:49.08,Default,,0000,0000,0000,,I agree that in those cases,
Dialogue: 0,0:04:49.08,0:04:51.40,Default,,0000,0000,0000,,they probably wouldn't be\Nvery actionable anyway.
Dialogue: 0,0:04:51.70,0:04:53.80,Default,,0000,0000,0000,,But I think there are a lot\Nof other settings
Dialogue: 0,0:04:54.10,0:04:56.60,Default,,0000,0000,0000,,where there is\Nmuch more heterogeneity.
Dialogue: 0,0:04:57.25,0:04:59.10,Default,,0000,0000,0000,,- Well, I'm open\Nto that possibility,
Dialogue: 0,0:04:59.10,0:05:04.92,Default,,0000,0000,0000,,and I think the example you gave\Nis essentially a marketing example.
Dialogue: 0,0:05:06.32,0:05:09.66,Default,,0000,0000,0000,,- No, those have\Nimplications for it
Dialogue: 0,0:05:09.66,0:05:11.07,Default,,0000,0000,0000,,and that's the organization,
Dialogue: 0,0:05:12.25,0:05:14.33,Default,,0000,0000,0000,,whether you need\Nto worry about the...
Dialogue: 0,0:05:15.47,0:05:17.90,Default,,0000,0000,0000,,- Well, I need to see that paper.
Dialogue: 0,0:05:18.40,0:05:21.07,Default,,0000,0000,0000,,- So the sense\NI'm getting is that --
Dialogue: 0,0:05:21.47,0:05:23.10,Default,,0000,0000,0000,,- We still disagree on something.\N- Yes.
Dialogue: 0,0:05:23.10,0:05:25.44,Default,,0000,0000,0000,,- We haven't converged\Non everything.
Dialogue: 0,0:05:25.44,0:05:27.20,Default,,0000,0000,0000,,- I'm getting that sense.\N[laughter]
Dialogue: 0,0:05:27.20,0:05:28.68,Default,,0000,0000,0000,,- Actually, we've diverged on this
Dialogue: 0,0:05:28.68,0:05:30.83,Default,,0000,0000,0000,,because this wasn't around\Nto argue about.
Dialogue: 0,0:05:30.83,0:05:32.33,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:05:33.06,0:05:34.77,Default,,0000,0000,0000,,- Is it getting a little warm here?
Dialogue: 0,0:05:35.82,0:05:38.15,Default,,0000,0000,0000,,- Warmed up. Warmed up is good.
Dialogue: 0,0:05:38.15,0:05:41.19,Default,,0000,0000,0000,,The sense I'm getting is,\NJosh, you're not saying
Dialogue: 0,0:05:41.19,0:05:43.12,Default,,0000,0000,0000,,that you're confident\Nthat there is no way
Dialogue: 0,0:05:43.12,0:05:45.35,Default,,0000,0000,0000,,that there is an application\Nwhere this stuff is useful.
Dialogue: 0,0:05:45.35,0:05:47.03,Default,,0000,0000,0000,,You are saying\Nyou are unconvinced
Dialogue: 0,0:05:47.03,0:05:49.49,Default,,0000,0000,0000,,by the existing\Napplications to date.
Dialogue: 0,0:05:49.92,0:05:52.02,Default,,0000,0000,0000,,- Fair enough.\N- I'm very confident.
Dialogue: 0,0:05:52.02,0:05:53.70,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:05:54.16,0:05:55.19,Default,,0000,0000,0000,,- In this case.
Dialogue: 0,0:05:55.19,0:05:56.56,Default,,0000,0000,0000,,- I think Josh does have a point
Dialogue: 0,0:05:56.56,0:06:00.45,Default,,0000,0000,0000,,that even in the prediction cases
Dialogue: 0,0:06:01.64,0:06:04.52,Default,,0000,0000,0000,,where a lot of the machine learning\Nmethods really shine
Dialogue: 0,0:06:04.52,0:06:06.74,Default,,0000,0000,0000,,is where there's just a lot\Nof heterogeneity.
Dialogue: 0,0:06:07.30,0:06:10.77,Default,,0000,0000,0000,,- You don't really care much\Nabout the details there, right?
Dialogue: 0,0:06:10.77,0:06:11.84,Default,,0000,0000,0000,,- [Guido] Yes.
Dialogue: 0,0:06:11.84,0:06:15.00,Default,,0000,0000,0000,,- It doesn't have\Na policy angle or something.
Dialogue: 0,0:06:15.20,0:06:18.23,Default,,0000,0000,0000,,- The kind of recognizing\Nhandwritten digits and stuff --
Dialogue: 0,0:06:18.80,0:06:20.09,Default,,0000,0000,0000,,it does much better there
Dialogue: 0,0:06:20.09,0:06:24.00,Default,,0000,0000,0000,,than building\Nsome complicated model.
Dialogue: 0,0:06:24.40,0:06:28.18,Default,,0000,0000,0000,,But a lot of the social science,\Na lot of the economic applications,
Dialogue: 0,0:06:28.18,0:06:30.38,Default,,0000,0000,0000,,we actually know a huge amount\Nabout the relationship
Dialogue: 0,0:06:30.38,0:06:32.10,Default,,0000,0000,0000,,between the variables.
Dialogue: 0,0:06:32.10,0:06:34.70,Default,,0000,0000,0000,,A lot of the relationships\Nare strictly monotone.
Dialogue: 0,0:06:37.17,0:06:39.42,Default,,0000,0000,0000,,Education is going to increase\Npeople's earnings,
Dialogue: 0,0:06:39.70,0:06:41.95,Default,,0000,0000,0000,,irrespective of the demographic,
Dialogue: 0,0:06:41.95,0:06:44.93,Default,,0000,0000,0000,,irrespective of the level\Nof education you already have.
Dialogue: 0,0:06:44.93,0:06:46.18,Default,,0000,0000,0000,,- Until they get to a Ph.D.
Dialogue: 0,0:06:46.18,0:06:47.96,Default,,0000,0000,0000,,- Is that true for graduate school?
Dialogue: 0,0:06:47.96,0:06:49.23,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:06:49.23,0:06:50.70,Default,,0000,0000,0000,,- Over a reasonable range.
Dialogue: 0,0:06:51.60,0:06:55.49,Default,,0000,0000,0000,,It's not going\Nto go down very much.
Dialogue: 0,0:06:56.10,0:06:58.12,Default,,0000,0000,0000,,In a lot of the settings
Dialogue: 0,0:06:58.12,0:07:00.10,Default,,0000,0000,0000,,where these machine learning\Nmethods shine,
Dialogue: 0,0:07:00.10,0:07:01.90,Default,,0000,0000,0000,,there's a lot of non-monotonicity,
Dialogue: 0,0:07:02.10,0:07:04.90,Default,,0000,0000,0000,,kind of multimodality\Nin these relationships,
Dialogue: 0,0:07:05.30,0:07:08.92,Default,,0000,0000,0000,,and they're going to be\Nvery powerful.
Dialogue: 0,0:07:08.92,0:07:11.79,Default,,0000,0000,0000,,But I still stand by that.
Dialogue: 0,0:07:12.41,0:07:14.98,Default,,0000,0000,0000,,These methods just have\Na huge amount to offer
Dialogue: 0,0:07:15.92,0:07:17.56,Default,,0000,0000,0000,,for economists,
Dialogue: 0,0:07:17.56,0:07:21.70,Default,,0000,0000,0000,,and they're going to be\Na big part of the future.
Dialogue: 0,0:07:21.93,0:07:23.02,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:07:23.02,0:07:24.60,Default,,0000,0000,0000,,- [Isaiah] It feels like\Nthere's something interesting
Dialogue: 0,0:07:24.60,0:07:25.91,Default,,0000,0000,0000,,to be said about\Nmachine learning here.
Dialogue: 0,0:07:25.91,0:07:28.10,Default,,0000,0000,0000,,So, Guido, I was wondering,\Ncould you give some more...
Dialogue: 0,0:07:28.10,0:07:29.81,Default,,0000,0000,0000,,maybe some examples\Nof the sorts of examples
Dialogue: 0,0:07:29.81,0:07:30.91,Default,,0000,0000,0000,,you're thinking about
Dialogue: 0,0:07:30.91,0:07:32.66,Default,,0000,0000,0000,,with applications coming out\Nat the moment?
Dialogue: 0,0:07:32.66,0:07:34.18,Default,,0000,0000,0000,,- So one area is where
Dialogue: 0,0:07:34.70,0:07:36.95,Default,,0000,0000,0000,,instead of looking\Nfor average causal effects,
Dialogue: 0,0:07:36.95,0:07:39.35,Default,,0000,0000,0000,,we're looking for\Nindividualized estimates,
Dialogue: 0,0:07:41.35,0:07:43.29,Default,,0000,0000,0000,,predictions of causal effects,
Dialogue: 0,0:07:43.29,0:07:46.11,Default,,0000,0000,0000,,and there,\Nthe machine learning algorithms
Dialogue: 0,0:07:46.11,0:07:47.94,Default,,0000,0000,0000,,have been very effective.
Dialogue: 0,0:07:47.94,0:07:51.42,Default,,0000,0000,0000,,Traditionally, we would have done\Nthese things using kernel methods,
Dialogue: 0,0:07:51.42,0:07:54.00,Default,,0000,0000,0000,,and theoretically, they work great,
Dialogue: 0,0:07:54.00,0:07:55.64,Default,,0000,0000,0000,,and there are some arguments
Dialogue: 0,0:07:55.64,0:07:57.61,Default,,0000,0000,0000,,that, formally,\Nyou can't do any better.
Dialogue: 0,0:07:57.61,0:07:59.58,Default,,0000,0000,0000,,But in practice,\Nthey don't work very well.
Dialogue: 0,0:08:00.90,0:08:03.53,Default,,0000,0000,0000,,Random causal forest-type things
Dialogue: 0,0:08:03.53,0:08:06.92,Default,,0000,0000,0000,,that Stefan Wager and Susan Athey\Nhave been working on
Dialogue: 0,0:08:06.92,0:08:09.45,Default,,0000,0000,0000,,are used very widely.
Dialogue: 0,0:08:09.45,0:08:12.20,Default,,0000,0000,0000,,They've been very effective\Nin these settings
Dialogue: 0,0:08:12.40,0:08:19.21,Default,,0000,0000,0000,,at actually getting causal effects\Nthat vary by covariates.
Dialogue: 0,0:08:20.70,0:08:23.73,Default,,0000,0000,0000,,I think this is still just\Nthe beginning of these methods.
Dialogue: 0,0:08:23.73,0:08:25.70,Default,,0000,0000,0000,,But in many cases,
Dialogue: 0,0:08:27.35,0:08:31.60,Default,,0000,0000,0000,,these algorithms are very effective\Nat searching over big spaces
Dialogue: 0,0:08:31.80,0:08:37.13,Default,,0000,0000,0000,,and finding the functions\Nthat fit very well
Dialogue: 0,0:08:37.13,0:08:40.95,Default,,0000,0000,0000,,in ways that we couldn't\Nreally do beforehand.
Dialogue: 0,0:08:41.50,0:08:42.70,Default,,0000,0000,0000,,- I don't know of an example
Dialogue: 0,0:08:42.70,0:08:45.30,Default,,0000,0000,0000,,where machine learning\Nhas generated insights
Dialogue: 0,0:08:45.30,0:08:47.66,Default,,0000,0000,0000,,about a causal effect\Nthat I'm interested in.
Dialogue: 0,0:08:47.66,0:08:49.61,Default,,0000,0000,0000,,And I do know of examples
Dialogue: 0,0:08:49.61,0:08:51.30,Default,,0000,0000,0000,,where it's potentially\Nvery misleading.
Dialogue: 0,0:08:51.30,0:08:53.70,Default,,0000,0000,0000,,So I've done some work\Nwith Brigham Frandsen,
Dialogue: 0,0:08:54.10,0:08:57.78,Default,,0000,0000,0000,,using, for example, random forests\Nto model covariate effects
Dialogue: 0,0:08:57.78,0:09:00.27,Default,,0000,0000,0000,,in an instrumental\Nvariables problem
Dialogue: 0,0:09:00.27,0:09:03.38,Default,,0000,0000,0000,,where you need\Nto condition on covariates.
Dialogue: 0,0:09:04.40,0:09:06.53,Default,,0000,0000,0000,,And you don't particularly\Nhave strong feelings
Dialogue: 0,0:09:06.53,0:09:08.20,Default,,0000,0000,0000,,about the functional form for that,
Dialogue: 0,0:09:08.20,0:09:10.00,Default,,0000,0000,0000,,so maybe you should curve...
Dialogue: 0,0:09:10.90,0:09:12.80,Default,,0000,0000,0000,,be open to flexible curve fitting.
Dialogue: 0,0:09:12.80,0:09:14.50,Default,,0000,0000,0000,,And that leads you down a path
Dialogue: 0,0:09:14.50,0:09:16.85,Default,,0000,0000,0000,,where there are a lot\Nof nonlinearities in the model,
Dialogue: 0,0:09:17.38,0:09:19.93,Default,,0000,0000,0000,,and that's very dangerous with IV
Dialogue: 0,0:09:19.93,0:09:23.00,Default,,0000,0000,0000,,because any sort\Nof excluded nonlinearity
Dialogue: 0,0:09:23.30,0:09:25.84,Default,,0000,0000,0000,,potentially generates\Na spurious causal effect,
Dialogue: 0,0:09:25.84,0:09:29.29,Default,,0000,0000,0000,,and Brigham and I showed that\Nvery powerfully, I think,
Dialogue: 0,0:09:29.29,0:09:32.20,Default,,0000,0000,0000,,in the case of two instruments
Dialogue: 0,0:09:32.94,0:09:35.11,Default,,0000,0000,0000,,that come from a paper of mine\Nwith Bill Evans,
Dialogue: 0,0:09:35.11,0:09:37.60,Default,,0000,0000,0000,,where if you replace...
Dialogue: 0,0:09:38.71,0:09:40.82,Default,,0000,0000,0000,,a traditional two-stage\Nleast squares estimator
Dialogue: 0,0:09:40.82,0:09:42.60,Default,,0000,0000,0000,,with some kind of random forest,
Dialogue: 0,0:09:42.90,0:09:46.81,Default,,0000,0000,0000,,you get very precisely estimated\Nnonsense estimates.
Dialogue: 0,0:09:49.17,0:09:51.10,Default,,0000,0000,0000,,I think that's a big caution.
Dialogue: 0,0:09:51.94,0:09:55.10,Default,,0000,0000,0000,,In view of those findings,\Nin an example I care about
Dialogue: 0,0:09:55.10,0:09:57.10,Default,,0000,0000,0000,,where the instruments\Nare very simple
Dialogue: 0,0:09:57.40,0:09:59.10,Default,,0000,0000,0000,,and I believe that they're valid,
Dialogue: 0,0:09:59.30,0:10:01.10,Default,,0000,0000,0000,,I would be skeptical of that.
Dialogue: 0,0:10:02.90,0:10:06.44,Default,,0000,0000,0000,,Nonlinearity and IV\Ndon't mix very comfortably.
Dialogue: 0,0:10:06.44,0:10:09.42,Default,,0000,0000,0000,,- No, it sounds like that's already\Na more complicated...
Dialogue: 0,0:10:10.21,0:10:11.84,Default,,0000,0000,0000,,- Well, it's IV...\N- Yeah.
Dialogue: 0,0:10:12.59,0:10:14.03,Default,,0000,0000,0000,,- ...but then we work on that.
Dialogue: 0,0:10:14.40,0:10:15.91,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:10:15.91,0:10:17.29,Default,,0000,0000,0000,,- Fair enough.
Dialogue: 0,0:10:17.29,0:10:18.41,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:10:18.41,0:10:20.00,Default,,0000,0000,0000,,- [Guido] As editor\Nof Econometrica,
Dialogue: 0,0:10:20.00,0:10:22.05,Default,,0000,0000,0000,,a lot of these papers\Ncross my desk,
Dialogue: 0,0:10:22.70,0:10:26.82,Default,,0000,0000,0000,,but the motivation is not clear
Dialogue: 0,0:10:27.56,0:10:29.50,Default,,0000,0000,0000,,and, in fact, really lacking.
Dialogue: 0,0:10:29.80,0:10:31.03,Default,,0000,0000,0000,,They're not...
Dialogue: 0,0:10:31.59,0:10:34.93,Default,,0000,0000,0000,,big old-style semiparametric\Nfoundational papers.
Dialogue: 0,0:10:35.32,0:10:37.15,Default,,0000,0000,0000,,So that's a big problem.
Dialogue: 0,0:10:38.76,0:10:42.66,Default,,0000,0000,0000,,A related problem is that we have\Nthis tradition in econometrics
Dialogue: 0,0:10:42.66,0:10:46.56,Default,,0000,0000,0000,,of being very focused\Non these formal asymptotic results.
Dialogue: 0,0:10:48.80,0:10:53.29,Default,,0000,0000,0000,,We just have a lot of papers\Nwhere people propose a method,
Dialogue: 0,0:10:53.29,0:10:55.70,Default,,0000,0000,0000,,and then they establish\Nthe asymptotic properties
Dialogue: 0,0:10:56.30,0:10:59.42,Default,,0000,0000,0000,,in a very kind of standardized way.
Dialogue: 0,0:11:00.87,0:11:02.06,Default,,0000,0000,0000,,- Is that bad?
Dialogue: 0,0:11:02.90,0:11:06.42,Default,,0000,0000,0000,,- Well, I think it's sort\Nof closed the door
Dialogue: 0,0:11:06.42,0:11:09.04,Default,,0000,0000,0000,,for a lot of work\Nthat doesn't fit into that,
Dialogue: 0,0:11:09.04,0:11:11.60,Default,,0000,0000,0000,,whereas in the machine\Nlearning literature,
Dialogue: 0,0:11:11.90,0:11:13.45,Default,,0000,0000,0000,,a lot of things\Nare more algorithmic.
Dialogue: 0,0:11:13.81,0:11:18.50,Default,,0000,0000,0000,,People had algorithms\Nfor coming up with predictions
Dialogue: 0,0:11:18.80,0:11:20.88,Default,,0000,0000,0000,,that turn out\Nto actually work much better
Dialogue: 0,0:11:20.88,0:11:23.60,Default,,0000,0000,0000,,than, say, nonparametric\Nkernel regression.
Dialogue: 0,0:11:24.00,0:11:26.80,Default,,0000,0000,0000,,For a long time, we were doing all\Nthe nonparametrics in econometrics,
Dialogue: 0,0:11:26.80,0:11:28.95,Default,,0000,0000,0000,,and we were using\Nkernel regression,
Dialogue: 0,0:11:28.95,0:11:31.21,Default,,0000,0000,0000,,and that was great\Nfor proving theorems.
Dialogue: 0,0:11:31.21,0:11:32.58,Default,,0000,0000,0000,,You could get confidence intervals
Dialogue: 0,0:11:32.58,0:11:34.68,Default,,0000,0000,0000,,and consistency\Nand asymptotic normality,
Dialogue: 0,0:11:34.68,0:11:35.74,Default,,0000,0000,0000,,and it was all great,
Dialogue: 0,0:11:35.74,0:11:37.00,Default,,0000,0000,0000,,but it wasn't very useful.
Dialogue: 0,0:11:37.30,0:11:39.10,Default,,0000,0000,0000,,And the things they did\Nin machine learning
Dialogue: 0,0:11:39.10,0:11:41.05,Default,,0000,0000,0000,,are just way, way better.
Dialogue: 0,0:11:41.05,0:11:42.56,Default,,0000,0000,0000,,But they didn't have the problem --
Dialogue: 0,0:11:42.56,0:11:44.45,Default,,0000,0000,0000,,- That's not my beef\Nwith machine learning,
Dialogue: 0,0:11:44.45,0:11:45.87,Default,,0000,0000,0000,,that the theory is weak.
Dialogue: 0,0:11:45.87,0:11:47.14,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:11:47.14,0:11:51.25,Default,,0000,0000,0000,,- No, but I'm saying there,\Nfor the prediction part,
Dialogue: 0,0:11:51.25,0:11:52.39,Default,,0000,0000,0000,,it does much better.
Dialogue: 0,0:11:52.39,0:11:54.50,Default,,0000,0000,0000,,- Yeah, it's a better\Ncurve-fitting tool.
Dialogue: 0,0:11:54.90,0:11:57.61,Default,,0000,0000,0000,,- But it did so in a way
Dialogue: 0,0:11:57.61,0:11:59.78,Default,,0000,0000,0000,,that would not have made\Nthose papers
Dialogue: 0,0:11:59.78,0:12:04.23,Default,,0000,0000,0000,,initially easy to get into\Nthe econometrics journals,
Dialogue: 0,0:12:04.23,0:12:06.27,Default,,0000,0000,0000,,because it wasn't proving\Nthe type of things...
Dialogue: 0,0:12:06.79,0:12:09.86,Default,,0000,0000,0000,,When Breiman was doing\Nhis regression trees --
Dialogue: 0,0:12:09.86,0:12:11.20,Default,,0000,0000,0000,,they just didn't fit in.
Dialogue: 0,0:12:12.94,0:12:14.93,Default,,0000,0000,0000,,I think he would have had\Na very hard time
Dialogue: 0,0:12:14.93,0:12:18.40,Default,,0000,0000,0000,,publishing these things\Nin econometrics journals.
Dialogue: 0,0:12:20.19,0:12:23.66,Default,,0000,0000,0000,,I think we've limited\Nourselves too much
Dialogue: 0,0:12:24.70,0:12:27.83,Default,,0000,0000,0000,,in a way that closed things off
Dialogue: 0,0:12:27.83,0:12:29.62,Default,,0000,0000,0000,,for a lot of these\Nmachine-learning methods
Dialogue: 0,0:12:29.62,0:12:31.16,Default,,0000,0000,0000,,that are actually very useful.
Dialogue: 0,0:12:31.16,0:12:34.00,Default,,0000,0000,0000,,I mean, I think, in general,
Dialogue: 0,0:12:34.90,0:12:36.53,Default,,0000,0000,0000,,that literature,\Nthe computer scientists,
Dialogue: 0,0:12:36.53,0:12:40.01,Default,,0000,0000,0000,,have brought a huge number\Nof these algorithms there --
Dialogue: 0,0:12:40.58,0:12:42.63,Default,,0000,0000,0000,,have proposed a huge number\Nof these algorithms
Dialogue: 0,0:12:42.63,0:12:43.89,Default,,0000,0000,0000,,that actually are very useful,
Dialogue: 0,0:12:43.89,0:12:46.07,Default,,0000,0000,0000,,and that are affecting
Dialogue: 0,0:12:46.07,0:12:49.10,Default,,0000,0000,0000,,the way we're going\Nto be doing empirical work.
Dialogue: 0,0:12:49.80,0:12:52.10,Default,,0000,0000,0000,,But we've not fully\Ninternalized that,
Dialogue: 0,0:12:52.10,0:12:53.57,Default,,0000,0000,0000,,because we're still very focused
Dialogue: 0,0:12:53.57,0:12:57.50,Default,,0000,0000,0000,,on getting point estimates\Nand getting standard errors
Dialogue: 0,0:12:58.60,0:13:00.16,Default,,0000,0000,0000,,and getting P values
Dialogue: 0,0:13:00.16,0:13:03.21,Default,,0000,0000,0000,,in a way that we need\Nto move beyond
Dialogue: 0,0:13:03.21,0:13:06.09,Default,,0000,0000,0000,,to fully harness the force,
Dialogue: 0,0:13:06.55,0:13:08.35,Default,,0000,0000,0000,,the benefits
Dialogue: 0,0:13:08.35,0:13:10.98,Default,,0000,0000,0000,,from the machine\Nlearning literature.
Dialogue: 0,0:13:11.20,0:13:13.55,Default,,0000,0000,0000,,- On the one hand, I guess I very\Nmuch take your point
Dialogue: 0,0:13:13.55,0:13:16.85,Default,,0000,0000,0000,,that sort of the traditional\Neconometrics framework
Dialogue: 0,0:13:16.85,0:13:19.82,Default,,0000,0000,0000,,of propose a method,\Nprove a limit theorem
Dialogue: 0,0:13:19.82,0:13:23.87,Default,,0000,0000,0000,,under some asymptotic story...
Dialogue: 0,0:13:24.42,0:13:27.06,Default,,0000,0000,0000,,publish a paper -- is constraining,
Dialogue: 0,0:13:27.22,0:13:30.13,Default,,0000,0000,0000,,and that, in some sense,\Nby thinking more broadly
Dialogue: 0,0:13:30.13,0:13:31.83,Default,,0000,0000,0000,,about what a methods paper\Ncould look like,
Dialogue: 0,0:13:31.83,0:13:33.49,Default,,0000,0000,0000,,we may write... In some sense,
Dialogue: 0,0:13:33.49,0:13:35.23,Default,,0000,0000,0000,,certainly, the machine\Nlearning literature
Dialogue: 0,0:13:35.23,0:13:37.19,Default,,0000,0000,0000,,has found a bunch of things\Nwhich seem to work quite well
Dialogue: 0,0:13:37.19,0:13:38.30,Default,,0000,0000,0000,,for a number of problems
Dialogue: 0,0:13:38.30,0:13:41.27,Default,,0000,0000,0000,,and are now having\Nsubstantial influence in economics.
Dialogue: 0,0:13:41.27,0:13:43.26,Default,,0000,0000,0000,,I guess a question\NI'm interested in
Dialogue: 0,0:13:43.26,0:13:46.46,Default,,0000,0000,0000,,is how do you think\Nabout the role of...
Dialogue: 0,0:13:48.66,0:13:51.20,Default,,0000,0000,0000,,Do you think there is no value\Nin the theory part of it?
Dialogue: 0,0:13:51.60,0:13:54.19,Default,,0000,0000,0000,,Because I guess a question\Nthat I often have
Dialogue: 0,0:13:54.19,0:13:56.80,Default,,0000,0000,0000,,when seeing the output\Nfrom a machine learning tool,
Dialogue: 0,0:13:56.80,0:13:58.21,Default,,0000,0000,0000,,and actually a number\Nof the methods
Dialogue: 0,0:13:58.21,0:13:59.22,Default,,0000,0000,0000,,that you talked about
Dialogue: 0,0:13:59.22,0:14:00.68,Default,,0000,0000,0000,,do have\Ninferential results
Dialogue: 0,0:14:00.68,0:14:01.94,Default,,0000,0000,0000,,developed for them,
Dialogue: 0,0:14:02.52,0:14:03.96,Default,,0000,0000,0000,,something that\NI always wonder about,
Dialogue: 0,0:14:03.96,0:14:06.66,Default,,0000,0000,0000,,a sort of uncertainty\Nquantification and just...
Dialogue: 0,0:14:06.66,0:14:08.00,Default,,0000,0000,0000,,I have my prior,
Dialogue: 0,0:14:08.00,0:14:11.00,Default,,0000,0000,0000,,I come into the world with my view,\NI see the result of this thing.
Dialogue: 0,0:14:11.00,0:14:12.40,Default,,0000,0000,0000,,How should I update based on it?
Dialogue: 0,0:14:12.40,0:14:13.87,Default,,0000,0000,0000,,And in some sense,\Nif I'm in a world
Dialogue: 0,0:14:13.87,0:14:15.91,Default,,0000,0000,0000,,where things\Nare normally distributed,
Dialogue: 0,0:14:15.91,0:14:17.28,Default,,0000,0000,0000,,I know how to do it --
Dialogue: 0,0:14:17.28,0:14:18.30,Default,,0000,0000,0000,,here I don't.
Dialogue: 0,0:14:18.30,0:14:21.03,Default,,0000,0000,0000,,And so I'm interested to hear\Nwhat you think about that.
Dialogue: 0,0:14:21.50,0:14:24.42,Default,,0000,0000,0000,,- I don't see this\Nas sort of saying, well,
Dialogue: 0,0:14:24.70,0:14:26.56,Default,,0000,0000,0000,,these results are not interesting,
Dialogue: 0,0:14:26.56,0:14:27.97,Default,,0000,0000,0000,,but there are going to be a lot of cases
Dialogue: 0,0:14:27.97,0:14:30.23,Default,,0000,0000,0000,,where it's going to be incredibly\Nhard to get those results,
Dialogue: 0,0:14:30.23,0:14:32.49,Default,,0000,0000,0000,,and we may not\Nbe able to get there,
Dialogue: 0,0:14:32.49,0:14:34.94,Default,,0000,0000,0000,,and we may need to do it in stages
Dialogue: 0,0:14:34.94,0:14:36.44,Default,,0000,0000,0000,,where first someone says,
Dialogue: 0,0:14:36.44,0:14:40.90,Default,,0000,0000,0000,,"Hey, I have\Nthis interesting algorithm
Dialogue: 0,0:14:40.90,0:14:42.20,Default,,0000,0000,0000,,for doing something,"
Dialogue: 0,0:14:42.20,0:14:47.21,Default,,0000,0000,0000,,and it works well\Nby some criterion
Dialogue: 0,0:14:47.21,0:14:49.90,Default,,0000,0000,0000,,on this particular data set,
Dialogue: 0,0:14:51.00,0:14:52.60,Default,,0000,0000,0000,,and we should put it out there,
Dialogue: 0,0:14:52.60,0:14:55.41,Default,,0000,0000,0000,,and maybe someone\Nwill figure out a way
Dialogue: 0,0:14:55.41,0:14:57.83,Default,,0000,0000,0000,,that you can later actually\Nstill do inference
Dialogue: 0,0:14:57.83,0:14:59.46,Default,,0000,0000,0000,,under some conditions,
Dialogue: 0,0:14:59.46,0:15:02.10,Default,,0000,0000,0000,,and maybe those are not\Nparticularly realistic conditions.
Dialogue: 0,0:15:02.10,0:15:03.80,Default,,0000,0000,0000,,Then we kind of go further.
Dialogue: 0,0:15:03.80,0:15:08.42,Default,,0000,0000,0000,,But I think we've been\Nconstraining things too much
Dialogue: 0,0:15:08.42,0:15:09.52,Default,,0000,0000,0000,,where we said,
Dialogue: 0,0:15:09.52,0:15:13.18,Default,,0000,0000,0000,,"This is the type of thing\Nthat we need to do."
Dialogue: 0,0:15:13.18,0:15:14.50,Default,,0000,0000,0000,,And in some sense,
Dialogue: 0,0:15:15.70,0:15:18.20,Default,,0000,0000,0000,,that goes back\Nto the way Josh and I
Dialogue: 0,0:15:19.70,0:15:21.98,Default,,0000,0000,0000,,thought about things for the local\Naverage treatment effect.
Dialogue: 0,0:15:21.98,0:15:23.14,Default,,0000,0000,0000,,That wasn't quite the way
Dialogue: 0,0:15:23.14,0:15:25.14,Default,,0000,0000,0000,,people were thinking\Nabout these problems before.
Dialogue: 0,0:15:25.80,0:15:28.86,Default,,0000,0000,0000,,There was a sense\Nthat some of the people said
Dialogue: 0,0:15:29.50,0:15:31.90,Default,,0000,0000,0000,,the way you need to do\Nthese things is you first say
Dialogue: 0,0:15:32.20,0:15:34.14,Default,,0000,0000,0000,,what you're interested\Nin estimating,
Dialogue: 0,0:15:34.14,0:15:37.51,Default,,0000,0000,0000,,and then you do the best job\Nyou can in estimating that.
Dialogue: 0,0:15:38.10,0:15:43.87,Default,,0000,0000,0000,,And what you guys are doing\Nis you're doing it backwards.
Dialogue: 0,0:15:44.30,0:15:46.70,Default,,0000,0000,0000,,You kind of say,\N"Here, I have an estimator,
Dialogue: 0,0:15:47.30,0:15:50.64,Default,,0000,0000,0000,,and now I'm going to figure out\Nwhat it's estimating."
Dialogue: 0,0:15:50.64,0:15:53.90,Default,,0000,0000,0000,,And I suppose you're going to say\Nwhy you think that's interesting
Dialogue: 0,0:15:53.90,0:15:56.60,Default,,0000,0000,0000,,or maybe why it's not interesting,\Nand that's not okay.
Dialogue: 0,0:15:56.60,0:15:58.60,Default,,0000,0000,0000,,You're not allowed\Nto do that in that way.
Dialogue: 0,0:15:59.00,0:16:02.03,Default,,0000,0000,0000,,And I think we should\Njust be a little bit more flexible
Dialogue: 0,0:16:02.03,0:16:06.65,Default,,0000,0000,0000,,in thinking about\Nhow to look at problems,
Dialogue: 0,0:16:06.65,0:16:08.33,Default,,0000,0000,0000,,because I think\Nwe've missed some things
Dialogue: 0,0:16:08.33,0:16:11.30,Default,,0000,0000,0000,,by not doing that.
Dialogue: 0,0:16:11.30,0:16:12.82,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:16:12.82,0:16:14.75,Default,,0000,0000,0000,,- [Josh] So you've heard\Nour views, Isaiah,
Dialogue: 0,0:16:14.75,0:16:18.19,Default,,0000,0000,0000,,and you've seen that we have\Nsome points of disagreement.
Dialogue: 0,0:16:18.19,0:16:20.40,Default,,0000,0000,0000,,Why don't you referee\Nthis dispute for us?
Dialogue: 0,0:16:20.95,0:16:22.39,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:16:22.50,0:16:24.10,Default,,0000,0000,0000,,- Oh, it's so nice of you\Nto ask me a small question.
Dialogue: 0,0:16:24.10,0:16:26.21,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:16:26.42,0:16:27.99,Default,,0000,0000,0000,,So I guess, for one,
Dialogue: 0,0:16:27.99,0:16:33.20,Default,,0000,0000,0000,,I very much agree with something\Nthat Guido said earlier of...
Dialogue: 0,0:16:34.10,0:16:35.71,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:16:35.92,0:16:37.15,Default,,0000,0000,0000,,So one thing where
Dialogue: 0,0:16:37.15,0:16:40.07,Default,,0000,0000,0000,,the case for machine learning\Nseems relatively clear
Dialogue: 0,0:16:40.07,0:16:43.32,Default,,0000,0000,0000,,is in settings where\Nwe're interested in some version
Dialogue: 0,0:16:43.32,0:16:45.10,Default,,0000,0000,0000,,of a nonparametric\Nprediction problem.
Dialogue: 0,0:16:45.10,0:16:46.39,Default,,0000,0000,0000,,So I'm interested in estimating
Dialogue: 0,0:16:46.39,0:16:49.70,Default,,0000,0000,0000,,a conditional expectation\Nor conditional probability,
Dialogue: 0,0:16:50.00,0:16:52.02,Default,,0000,0000,0000,,and in the past, maybe\NI would have run a kernel...
Dialogue: 0,0:16:52.02,0:16:53.53,Default,,0000,0000,0000,,I would have run\Na kernel regression,
Dialogue: 0,0:16:53.53,0:16:55.18,Default,,0000,0000,0000,,or I would have run\Na series regression,
Dialogue: 0,0:16:55.18,0:16:57.40,Default,,0000,0000,0000,,or something along those lines.
Dialogue: 0,0:16:57.98,0:17:00.35,Default,,0000,0000,0000,,It seems like, at this point,\Nwe have a fairly good sense
Dialogue: 0,0:17:00.35,0:17:03.10,Default,,0000,0000,0000,,that in a fairly wide range\Nof applications,
Dialogue: 0,0:17:03.10,0:17:05.67,Default,,0000,0000,0000,,machine learning methods\Nseem to do better
Dialogue: 0,0:17:05.67,0:17:08.61,Default,,0000,0000,0000,,for estimating conditional\Nmean functions,
Dialogue: 0,0:17:08.61,0:17:09.81,Default,,0000,0000,0000,,or conditional probabilities,
Dialogue: 0,0:17:09.81,0:17:12.00,Default,,0000,0000,0000,,or various other\Nnonparametric objects
Dialogue: 0,0:17:12.40,0:17:15.31,Default,,0000,0000,0000,,than more traditional\Nnonparametric methods
Dialogue: 0,0:17:15.31,0:17:17.29,Default,,0000,0000,0000,,that were studied\Nin econometrics and statistics,
Dialogue: 0,0:17:17.29,0:17:19.10,Default,,0000,0000,0000,,especially in\Nhigh-dimensional settings.
Dialogue: 0,0:17:19.50,0:17:21.85,Default,,0000,0000,0000,,- So you're thinking of maybe\Nthe propensity score
Dialogue: 0,0:17:21.85,0:17:23.16,Default,,0000,0000,0000,,or something like that?
Dialogue: 0,0:17:23.16,0:17:25.06,Default,,0000,0000,0000,,- Yeah, exactly.\N- Nuisance functions.
Dialogue: 0,0:17:25.06,0:17:27.10,Default,,0000,0000,0000,,- Yeah, so things\Nlike propensity scores.
Dialogue: 0,0:17:27.87,0:17:29.96,Default,,0000,0000,0000,,Even objects of more direct
Dialogue: 0,0:17:29.96,0:17:32.40,Default,,0000,0000,0000,,interest, like conditional\Naverage treatment effects,
Dialogue: 0,0:17:32.40,0:17:35.10,Default,,0000,0000,0000,,which are the difference of two\Nconditional expectation functions,
Dialogue: 0,0:17:35.10,0:17:36.62,Default,,0000,0000,0000,,potentially things like that.
Dialogue: 0,0:17:36.62,0:17:40.57,Default,,0000,0000,0000,,Of course, even there,\Nthe theory...
Dialogue: 0,0:17:40.57,0:17:43.62,Default,,0000,0000,0000,,the theory for inference,\Nthe theory for how to interpret,
Dialogue: 0,0:17:43.62,0:17:45.80,Default,,0000,0000,0000,,how to make large-sample statements\Nabout some of these things,
Dialogue: 0,0:17:45.80,0:17:47.73,Default,,0000,0000,0000,,is less well-developed\Ndepending on
Dialogue: 0,0:17:47.73,0:17:50.10,Default,,0000,0000,0000,,the machine learning\Nestimator used.
Dialogue: 0,0:17:50.10,0:17:52.98,Default,,0000,0000,0000,,And so I think\Nsomething that is tricky
Dialogue: 0,0:17:52.98,0:17:55.70,Default,,0000,0000,0000,,is that we can have these methods,\Nwhich work a lot,
Dialogue: 0,0:17:55.70,0:17:58.00,Default,,0000,0000,0000,,which seem to work\Na lot better for some purposes
Dialogue: 0,0:17:58.00,0:18:01.23,Default,,0000,0000,0000,,but which we need to be a bit\Ncareful in how we plug them in
Dialogue: 0,0:18:01.23,0:18:03.30,Default,,0000,0000,0000,,or how we interpret\Nthe resulting statements.
Dialogue: 0,0:18:03.60,0:18:05.98,Default,,0000,0000,0000,,But, of course, that's a very,\Nvery active area right now
Dialogue: 0,0:18:05.98,0:18:07.67,Default,,0000,0000,0000,,where people are doing\Ntons of great work.
Dialogue: 0,0:18:07.67,0:18:10.69,Default,,0000,0000,0000,,So I fully expect\Nand hope to see
Dialogue: 0,0:18:10.69,0:18:12.80,Default,,0000,0000,0000,,much more going forward there.
Dialogue: 0,0:18:13.00,0:18:16.78,Default,,0000,0000,0000,,So one issue with machine learning\Nthat always seems a danger is...
Dialogue: 0,0:18:16.78,0:18:18.52,Default,,0000,0000,0000,,or that is sometimes a danger
Dialogue: 0,0:18:18.52,0:18:20.94,Default,,0000,0000,0000,,and has sometimes\Nled to applications
Dialogue: 0,0:18:20.94,0:18:22.14,Default,,0000,0000,0000,,that have made less sense
Dialogue: 0,0:18:22.14,0:18:27.31,Default,,0000,0000,0000,,is when folks start with a method\Nthat they're very excited about
Dialogue: 0,0:18:27.31,0:18:28.68,Default,,0000,0000,0000,,rather than a question.
Dialogue: 0,0:18:28.90,0:18:30.49,Default,,0000,0000,0000,,So sort of starting with a question
Dialogue: 0,0:18:30.49,0:18:33.78,Default,,0000,0000,0000,,where here's the object\NI'm interested in,
Dialogue: 0,0:18:33.78,0:18:35.23,Default,,0000,0000,0000,,here is the parameter\Nof interest --
Dialogue: 0,0:18:35.53,0:18:39.50,Default,,0000,0000,0000,,let me think about how I would\Nidentify that thing,
Dialogue: 0,0:18:39.50,0:18:41.82,Default,,0000,0000,0000,,how I would recover that thing\Nif I had a ton of data.
Dialogue: 0,0:18:41.82,0:18:44.00,Default,,0000,0000,0000,,Oh, here's a conditional\Nexpectation function,
Dialogue: 0,0:18:44.00,0:18:47.06,Default,,0000,0000,0000,,let me plug in a machine\Nlearning estimator for that --
Dialogue: 0,0:18:47.06,0:18:48.80,Default,,0000,0000,0000,,that seems very, very sensible.
Dialogue: 0,0:18:49.00,0:18:52.96,Default,,0000,0000,0000,,Whereas, you know,\Nif I regress quantity on price
Dialogue: 0,0:18:53.50,0:18:56.00,Default,,0000,0000,0000,,and say that I used\Na machine learning method,
Dialogue: 0,0:18:56.30,0:18:58.79,Default,,0000,0000,0000,,maybe I'm satisfied that\Nthat solves the endogeneity problem
Dialogue: 0,0:18:58.79,0:19:01.20,Default,,0000,0000,0000,,we're usually worried\Nabout there -- maybe I'm not.
Dialogue: 0,0:19:01.50,0:19:02.65,Default,,0000,0000,0000,,But, again, that's something
Dialogue: 0,0:19:02.65,0:19:06.30,Default,,0000,0000,0000,,where the way to address it\Nseems relatively clear.
Dialogue: 0,0:19:06.50,0:19:08.18,Default,,0000,0000,0000,,It's to find\Nyour object of interest
Dialogue: 0,0:19:08.18,0:19:09.78,Default,,0000,0000,0000,,and think about --
Dialogue: 0,0:19:09.78,0:19:11.49,Default,,0000,0000,0000,,- Just bring in the economics.
Dialogue: 0,0:19:11.49,0:19:12.74,Default,,0000,0000,0000,,- Exactly.
Dialogue: 0,0:19:12.74,0:19:14.27,Default,,0000,0000,0000,,- And think about\Nthe heterogeneity,
Dialogue: 0,0:19:14.27,0:19:17.07,Default,,0000,0000,0000,,but harness the power\Nof the machine learning methods
Dialogue: 0,0:19:17.07,0:19:20.15,Default,,0000,0000,0000,,for some of the components.
Dialogue: 0,0:19:20.35,0:19:21.39,Default,,0000,0000,0000,,- Precisely. Exactly.
Dialogue: 0,0:19:21.39,0:19:23.67,Default,,0000,0000,0000,,So the question of interest
Dialogue: 0,0:19:23.67,0:19:25.80,Default,,0000,0000,0000,,is the same as the question\Nof interest has always been,
Dialogue: 0,0:19:25.80,0:19:28.60,Default,,0000,0000,0000,,but we now have better methods\Nfor estimating some pieces of this.
Dialogue: 0,0:19:29.90,0:19:32.70,Default,,0000,0000,0000,,The place that seems\Nharder to forecast
Dialogue: 0,0:19:32.70,0:19:35.82,Default,,0000,0000,0000,,is obviously there's\Na huge amount going on
Dialogue: 0,0:19:35.82,0:19:37.50,Default,,0000,0000,0000,,in the machine learning literature,
Dialogue: 0,0:19:37.50,0:19:40.22,Default,,0000,0000,0000,,and the limited ways\Nof plugging it in
Dialogue: 0,0:19:40.22,0:19:41.39,Default,,0000,0000,0000,,that I've referenced so far
Dialogue: 0,0:19:41.39,0:19:43.09,Default,,0000,0000,0000,,are a limited piece of that.
Dialogue: 0,0:19:43.09,0:19:45.39,Default,,0000,0000,0000,,So I think there are all sorts\Nof other interesting questions
Dialogue: 0,0:19:45.39,0:19:46.63,Default,,0000,0000,0000,,about where...
Dialogue: 0,0:19:47.10,0:19:49.30,Default,,0000,0000,0000,,where does this interaction go?\NWhat else can we learn?
Dialogue: 0,0:19:49.30,0:19:52.93,Default,,0000,0000,0000,,And that's something where\NI think there's a ton going on,
Dialogue: 0,0:19:52.93,0:19:54.41,Default,,0000,0000,0000,,which seems very promising,
Dialogue: 0,0:19:54.41,0:19:56.40,Default,,0000,0000,0000,,and I have no idea\Nwhat the answer is.
Dialogue: 0,0:19:57.00,0:20:00.30,Default,,0000,0000,0000,,- No, I totally agree with that,
Dialogue: 0,0:20:00.30,0:20:03.54,Default,,0000,0000,0000,,but that makes it very exciting.
Dialogue: 0,0:20:03.54,0:20:06.10,Default,,0000,0000,0000,,And I think there's just\Na little work to be done there.
Dialogue: 0,0:20:06.60,0:20:08.72,Default,,0000,0000,0000,,- Alright. So Isaiah agrees\Nwith me there.
Dialogue: 0,0:20:08.72,0:20:10.17,Default,,0000,0000,0000,,[laughter]
Dialogue: 0,0:20:10.17,0:20:11.63,Default,,0000,0000,0000,,- I didn't say that per se.
Dialogue: 0,0:20:12.93,0:20:14.42,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:20:14.42,0:20:16.83,Default,,0000,0000,0000,,- [Narrator] If you'd like to watch\Nmore Nobel Conversations,
Dialogue: 0,0:20:16.83,0:20:18.01,Default,,0000,0000,0000,,click here.
Dialogue: 0,0:20:18.01,0:20:20.50,Default,,0000,0000,0000,,Or if you'd like to learn\Nmore about econometrics,
Dialogue: 0,0:20:20.50,0:20:23.10,Default,,0000,0000,0000,,check out Josh's\NMastering Econometrics series.
Dialogue: 0,0:20:23.60,0:20:26.57,Default,,0000,0000,0000,,If you'd like to learn more\Nabout Guido, Josh, and Isaiah,
Dialogue: 0,0:20:26.57,0:20:28.55,Default,,0000,0000,0000,,check out the links\Nin the description.
Dialogue: 0,0:20:28.55,0:20:30.54,Default,,0000,0000,0000,,♪ [music] ♪