[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.10,0:00:02.35,Default,,0000,0000,0000,,♪ [music] ♪
Dialogue: 0,0:00:03.70,0:00:05.70,Default,,0000,0000,0000,,- [narrator] Welcome\Nto Nobel Conversations.
Dialogue: 0,0:00:07.00,0:00:10.13,Default,,0000,0000,0000,,In this episode, Josh Angrist\Nand Guido Imbens
Dialogue: 0,0:00:10.13,0:00:13.70,Default,,0000,0000,0000,,sit down with Isaiah Andrews\Nto discuss and disagree
Dialogue: 0,0:00:13.70,0:00:16.58,Default,,0000,0000,0000,,over the role of machine learning\Nin applied econometrics.
Dialogue: 0,0:00:18.30,0:00:19.77,Default,,0000,0000,0000,,- [Isaiah] So, of course,\Nthere are a lot of topics
Dialogue: 0,0:00:19.77,0:00:21.09,Default,,0000,0000,0000,,where you guys largely agree,
Dialogue: 0,0:00:21.09,0:00:22.31,Default,,0000,0000,0000,,but I'd like to turn to one
Dialogue: 0,0:00:22.31,0:00:24.24,Default,,0000,0000,0000,,where maybe you have\Nsome differences of opinion.
Dialogue: 0,0:00:24.24,0:00:25.73,Default,,0000,0000,0000,,So I'd love to hear\Nsome of your thoughts
Dialogue: 0,0:00:25.73,0:00:26.88,Default,,0000,0000,0000,,about machine learning
Dialogue: 0,0:00:26.88,0:00:29.90,Default,,0000,0000,0000,,and the role that it's playing\Nand is going to play in economics.
Dialogue: 0,0:00:30.20,0:00:33.35,Default,,0000,0000,0000,,- [Guido] I've looked at some data\Nlike that. It's proprietary,
Dialogue: 0,0:00:33.35,0:00:35.10,Default,,0000,0000,0000,,so there's\Nno published paper there.
Dialogue: 0,0:00:36.72,0:00:38.16,Default,,0000,0000,0000,,There was an experiment\Nthat was done
Dialogue: 0,0:00:38.16,0:00:39.50,Default,,0000,0000,0000,,on some search algorithm.
Dialogue: 0,0:00:39.70,0:00:41.50,Default,,0000,0000,0000,,And the question was...
Dialogue: 0,0:00:42.90,0:00:45.60,Default,,0000,0000,0000,,it was about ranking things\Nand changing the ranking.
Dialogue: 0,0:00:45.90,0:00:47.50,Default,,0000,0000,0000,,And it was sort of clear...
Dialogue: 0,0:00:48.40,0:00:50.60,Default,,0000,0000,0000,,that there was going to be\Na lot of heterogeneity there.
Dialogue: 0,0:00:51.70,0:00:58.12,Default,,0000,0000,0000,,You know, if you look for, say,
Dialogue: 0,0:00:58.30,0:01:00.35,Default,,0000,0000,0000,,a picture of Britney Spears,
Dialogue: 0,0:01:00.35,0:01:02.40,Default,,0000,0000,0000,,it doesn't really matter\Nwhere you rank it,
Dialogue: 0,0:01:02.40,0:01:05.50,Default,,0000,0000,0000,,because you're going to figure out\Nwhat you're looking for,
Dialogue: 0,0:01:06.20,0:01:07.87,Default,,0000,0000,0000,,whether you put it\Nin the first or second
Dialogue: 0,0:01:07.87,0:01:09.80,Default,,0000,0000,0000,,or third position of the ranking.
Dialogue: 0,0:01:10.10,0:01:12.50,Default,,0000,0000,0000,,But if you're looking\Nfor the best econometrics book,
Dialogue: 0,0:01:13.30,0:01:16.50,Default,,0000,0000,0000,,if you put your book\Nfirst or your book tenth,
Dialogue: 0,0:01:16.50,0:01:18.10,Default,,0000,0000,0000,,that's going to make\Na big difference
Dialogue: 0,0:01:18.60,0:01:21.83,Default,,0000,0000,0000,,to how often people\Nare going to click on it.
Dialogue: 0,0:01:21.83,0:01:23.42,Default,,0000,0000,0000,,And so there you go --
Dialogue: 0,0:01:23.42,0:01:27.22,Default,,0000,0000,0000,,- [Josh] Why do I need\Nmachine learning to discover that?
Dialogue: 0,0:01:27.22,0:01:29.10,Default,,0000,0000,0000,,It seems like\NI could discover it simply.
Dialogue: 0,0:01:29.30,0:01:31.80,Default,,0000,0000,0000,,- [Guido] So in general, there\Nwere lots of possibilities.
Dialogue: 0,0:01:32.10,0:01:36.30,Default,,0000,0000,0000,,You want to think about there\Nbeing lots of characteristics of
Dialogue: 0,0:01:36.40,0:01:42.00,Default,,0000,0000,0000,,the items, and you want to understand\Nwhat drives the heterogeneity
Dialogue: 0,0:01:42.30,0:01:45.60,Default,,0000,0000,0000,,in the effect of the ranking.\N- [Josh] You know, in some sense,
Dialogue: 0,0:01:45.60,0:01:47.70,Default,,0000,0000,0000,,you're solving a marketing problem.
Dialogue: 0,0:01:48.40,0:01:51.80,Default,,0000,0000,0000,,It's a causal effect,\Nbut it has no scientific content.
Dialogue: 0,0:01:51.80,0:01:53.30,Default,,0000,0000,0000,,- [Guido] Think about,
Dialogue: 0,0:01:54.10,0:01:57.30,Default,,0000,0000,0000,,though, similar things\Nin medical settings.
Dialogue: 0,0:01:58.00,0:02:01.20,Default,,0000,0000,0000,,If you do an experiment, you\Nmay actually be very interested
Dialogue: 0,0:02:01.30,0:02:03.80,Default,,0000,0000,0000,,in whether the treatment\Nworks for some groups or not.
Dialogue: 0,0:02:03.90,0:02:06.50,Default,,0000,0000,0000,,And you have a lot of individual\Ncharacteristics and you want
Dialogue: 0,0:02:06.50,0:02:09.50,Default,,0000,0000,0000,,to systematically search.\N- [Josh] Yeah, I'm skeptical about that --
Dialogue: 0,0:02:09.50,0:02:13.90,Default,,0000,0000,0000,,that sort of idea that there's this personal\Ncausal effect that I should care about,
Dialogue: 0,0:02:14.00,0:02:18.20,Default,,0000,0000,0000,,and that machine learning can discover it\Nin some way that's useful. So think about --
Dialogue: 0,0:02:18.30,0:02:21.40,Default,,0000,0000,0000,,I've done a lot of work\Non schools, going to, say,
Dialogue: 0,0:02:21.40,0:02:26.50,Default,,0000,0000,0000,,a charter school -- a publicly funded\Nprivate school, effectively, you know,
Dialogue: 0,0:02:26.50,0:02:29.30,Default,,0000,0000,0000,,that's free to structure its own\Ncurriculum, for context there.
Dialogue: 0,0:02:29.30,0:02:32.70,Default,,0000,0000,0000,,Some types of charter schools\Ngenerate spectacular
Dialogue: 0,0:02:32.70,0:02:36.40,Default,,0000,0000,0000,,achievement gains, and in the data\Nset that produces that result,
Dialogue: 0,0:02:36.40,0:02:37.80,Default,,0000,0000,0000,,I have a lot of covariates.
Dialogue: 0,0:02:37.80,0:02:41.20,Default,,0000,0000,0000,,So I have baseline scores,\Nand I have family background,
Dialogue: 0,0:02:41.20,0:02:45.80,Default,,0000,0000,0000,,the education of the parents, the sex\Nof the child, the race of the child.
Dialogue: 0,0:02:45.80,0:02:48.30,Default,,0000,0000,0000,,And, well, as soon as I put
Dialogue: 0,0:02:48.40,0:02:51.90,Default,,0000,0000,0000,,half a dozen of those together, I\Nhave a very high-dimensional space.
Dialogue: 0,0:02:52.30,0:02:54.90,Default,,0000,0000,0000,,I'm definitely interested\Nin sort of coarse
Dialogue: 0,0:02:54.90,0:02:59.40,Default,,0000,0000,0000,,features of that treatment effect,\Nlike whether it's better for people who
Dialogue: 0,0:02:59.90,0:03:02.10,Default,,0000,0000,0000,,come from lower-income families.
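A quick way to see how fast the half-dozen covariates Josh lists blow up into a very high-dimensional space is to count the cells a fully saturated analysis would need. The level counts below are hypothetical discretizations chosen only for illustration, not the coding of any actual charter-school data set.

```python
# Sketch: how many covariate cells a fully saturated model would need.
# The level counts are hypothetical discretizations, chosen only to
# illustrate the combinatorics Josh describes.
from math import prod

levels = {
    "baseline_score_quartile": 4,
    "family_income_bracket": 5,
    "parent_education": 5,
    "sex": 2,
    "race": 5,
    "grade_entered": 7,
}

cells = prod(levels.values())
print(f"{cells:,} cells")  # 4*5*5*2*5*7 = 7,000 cells

# With a few thousand students, most cells hold zero or one observation,
# so a cell-specific "personal" causal effect is essentially
# unidentified without heavy regularization.
```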
Dialogue: 0,0:03:02.60,0:03:06.00,Default,,0000,0000,0000,,I have a hard time believing\Nthat there's an application,
Dialogue: 0,0:03:06.40,0:03:10.30,Default,,0000,0000,0000,,you know, for the very high-dimensional\Nversion of that, where
Dialogue: 0,0:03:10.50,0:03:13.20,Default,,0000,0000,0000,,I discover that for\Nnon-white children who have
Dialogue: 0,0:03:13.80,0:03:17.80,Default,,0000,0000,0000,,high family incomes, but baseline\Nscores in the third quartile,
Dialogue: 0,0:03:18.30,0:03:23.00,Default,,0000,0000,0000,,and only went to public school in the\Nthird grade, but not the sixth grade...
Dialogue: 0,0:03:23.00,0:03:25.50,Default,,0000,0000,0000,,That's what that high-dimensional\Nanalysis produces:
Dialogue: 0,0:03:25.80,0:03:28.10,Default,,0000,0000,0000,,this very elaborate conditional statement.
Dialogue: 0,0:03:28.30,0:03:31.00,Default,,0000,0000,0000,,There are two things that are wrong\Nwith that, in my view. First,
Dialogue: 0,0:03:31.00,0:03:34.00,Default,,0000,0000,0000,,I just can't\Nimagine why it's actionable --
Dialogue: 0,0:03:34.60,0:03:36.60,Default,,0000,0000,0000,,I don't know why you'd want to act on it.
Dialogue: 0,0:03:36.60,0:03:41.20,Default,,0000,0000,0000,,And I know also that there's some\Nalternative model that fits almost as well
Dialogue: 0,0:03:41.80,0:03:43.00,Default,,0000,0000,0000,,that flips everything,
Dialogue: 0,0:03:43.20,0:03:47.50,Default,,0000,0000,0000,,right? Because machine learning doesn't\Ntell me that this is really the predictor
Dialogue: 0,0:03:47.90,0:03:48.10,Default,,0000,0000,0000,,that matters --
Dialogue: 0,0:03:48.40,0:03:52.30,Default,,0000,0000,0000,,it just tells me that this\Nis a good predictor. And so,
Dialogue: 0,0:03:52.80,0:03:55.90,Default,,0000,0000,0000,,you know, I think there is\Nsomething different about
Dialogue: 0,0:03:56.00,0:03:58.40,Default,,0000,0000,0000,,most social science contexts. So I think
Dialogue: 0,0:03:58.50,0:04:02.60,Default,,0000,0000,0000,,the social science applications\Nyou're talking about are ones where
Dialogue: 0,0:04:03.40,0:04:08.10,Default,,0000,0000,0000,,I think there's not a huge amount\Nof heterogeneity in the effects.
Dialogue: 0,0:04:08.40,0:04:14.00,Default,,0000,0000,0000,,- [Guido] Well, there might be a few --\N- [Josh] Allow me to fill that space. No,
Dialogue: 0,0:04:14.60,0:04:18.10,Default,,0000,0000,0000,,not even then. I think for\Na lot of those
Dialogue: 0,0:04:18.30,0:04:22.00,Default,,0000,0000,0000,,interventions, even, you would expect\Nthat the effect is the same sign
Dialogue: 0,0:04:22.10,0:04:22.90,Default,,0000,0000,0000,,for everybody.
Dialogue: 0,0:04:23.40,0:04:27.60,Default,,0000,0000,0000,,There may be small differences\Nin the magnitude, but
Dialogue: 0,0:04:28.20,0:04:31.70,Default,,0000,0000,0000,,a lot of these education\Ninterventions, they're good for everybody.
Dialogue: 0,0:04:31.80,0:04:32.30,Default,,0000,0000,0000,,They're --
Dialogue: 0,0:04:32.90,0:04:37.60,Default,,0000,0000,0000,,it's not that they're bad for some\Npeople and good for other people, and
Dialogue: 0,0:04:37.60,0:04:40.80,Default,,0000,0000,0000,,there are kind of very small\Npockets where they're bad,
Dialogue: 0,0:04:40.90,0:04:43.90,Default,,0000,0000,0000,,but there may be some\Nvariation in the magnitude.
Dialogue: 0,0:04:44.00,0:04:48.20,Default,,0000,0000,0000,,But you would need very, very big\Ndata sets to find those, and
Dialogue: 0,0:04:48.40,0:04:51.40,Default,,0000,0000,0000,,even then, in those cases, they probably\Nwouldn't be very actionable anyway.
Dialogue: 0,0:04:51.70,0:04:53.80,Default,,0000,0000,0000,,- [Guido] But I think there are\Na lot of other settings
Dialogue: 0,0:04:54.10,0:04:56.60,Default,,0000,0000,0000,,where there is much more heterogeneity.
Dialogue: 0,0:04:57.40,0:05:01.60,Default,,0000,0000,0000,,- [Josh] Well, I'm open to that possibility,\Nbut I think the example you gave
Dialogue: 0,0:05:01.90,0:05:05.00,Default,,0000,0000,0000,,is essentially a marketing example.
Dialogue: 0,0:05:06.40,0:05:08.40,Default,,0000,0000,0000,,- [Guido] Now, maybe -- say there's a --
Dialogue: 0,0:05:08.50,0:05:10.70,Default,,0000,0000,0000,,it may have implications for\Nindustrial organization,
Dialogue: 0,0:05:10.70,0:05:13.90,Default,,0000,0000,0000,,how you actually need to --\Nwhether you need to worry about
Dialogue: 0,0:05:14.00,0:05:17.90,Default,,0000,0000,0000,,market power.\N- [Josh] Well, I'd like to see that paper.
Dialogue: 0,0:05:18.40,0:05:21.20,Default,,0000,0000,0000,,- [Isaiah] So the sense\NI'm getting is that
Dialogue: 0,0:05:21.50,0:05:23.50,Default,,0000,0000,0000,,we still disagree on something.\N- [Josh] Yes.
Dialogue: 0,0:05:24.10,0:05:26.70,Default,,0000,0000,0000,,We haven't converged on\Neverything. I'm getting that sense.
Dialogue: 0,0:05:27.20,0:05:31.00,Default,,0000,0000,0000,,Actually, we've diverged on this, because\Nthis wasn't around to argue about.
Dialogue: 0,0:05:33.20,0:05:38.00,Default,,0000,0000,0000,,- [Isaiah] Is it getting a little warm here?\N- [Josh] Yeah, warmed up. Warmed up is good.
Dialogue: 0,0:05:38.10,0:05:40.80,Default,,0000,0000,0000,,- [Isaiah] The sense I'm getting is,\NJosh, sort of, you're not
Dialogue: 0,0:05:40.90,0:05:43.40,Default,,0000,0000,0000,,saying that you're confident\Nthat there is no way
Dialogue: 0,0:05:43.40,0:05:45.40,Default,,0000,0000,0000,,that there is an application\Nwhere this stuff is useful.
Dialogue: 0,0:05:45.40,0:05:48.20,Default,,0000,0000,0000,,You're saying you're\Nunconvinced by the existing
Dialogue: 0,0:05:48.30,0:05:52.20,Default,,0000,0000,0000,,applications to date.\N- [Josh] Fair. That, I'm very confident about.
Dialogue: 0,0:05:54.20,0:05:55.00,Default,,0000,0000,0000,,- [Guido] Yeah, in this case,
Dialogue: 0,0:05:55.30,0:05:57.50,Default,,0000,0000,0000,,I think Josh does have a point. Today,
Dialogue: 0,0:05:58.00,0:06:02.10,Default,,0000,0000,0000,,even in the prediction cases, where
Dialogue: 0,0:06:02.30,0:06:05.00,Default,,0000,0000,0000,,a lot of the machine learning\Nmethods really shine is
Dialogue: 0,0:06:05.00,0:06:06.60,Default,,0000,0000,0000,,where there's just a lot of heterogeneity
Dialogue: 0,0:06:07.30,0:06:10.60,Default,,0000,0000,0000,,and you don't really care much\Nabout the details there, right?
Dialogue: 0,0:06:10.90,0:06:15.00,Default,,0000,0000,0000,,- [Josh] Yes -- it doesn't have\Na policy angle or something.
Dialogue: 0,0:06:15.20,0:06:18.10,Default,,0000,0000,0000,,- [Guido] Like recognizing\Nhandwritten digits and stuff --
Dialogue: 0,0:06:18.30,0:06:24.00,Default,,0000,0000,0000,,there it does much better than\Nbuilding some complicated model.
Dialogue: 0,0:06:24.40,0:06:28.10,Default,,0000,0000,0000,,But in a lot of the social science,\Na lot of the economic applications,
Dialogue: 0,0:06:28.30,0:06:32.10,Default,,0000,0000,0000,,we actually know a huge amount about the\Nrelationships between various variables.
Dialogue: 0,0:06:32.10,0:06:34.60,Default,,0000,0000,0000,,A lot of the relationships\Nare strictly monotone.
Dialogue: 0,0:06:35.40,0:06:39.40,Default,,0000,0000,0000,,Education is going\Nto increase people's earnings,
Dialogue: 0,0:06:39.80,0:06:44.10,Default,,0000,0000,0000,,irrespective of the demographic,\Nirrespective of the level of education
Dialogue: 0,0:06:44.10,0:06:47.80,Default,,0000,0000,0000,,you already have.\N- [Josh] Until they get to a PhD.
Dialogue: 0,0:06:49.50,0:06:50.70,Default,,0000,0000,0000,,- [Guido] Yeah -- over a reasonable range,
Dialogue: 0,0:06:51.60,0:06:55.90,Default,,0000,0000,0000,,it's not going to\Ngo down very much. Whereas
Dialogue: 0,0:06:56.10,0:06:59.70,Default,,0000,0000,0000,,in a lot of the settings where these\Nmachine learning methods shine,
Dialogue: 0,0:06:59.70,0:07:01.90,Default,,0000,0000,0000,,there's a lot\Nof non-monotonicity,
Dialogue: 0,0:07:02.10,0:07:04.90,Default,,0000,0000,0000,,kind of multi-modality,\Nin these relationships,
Dialogue: 0,0:07:05.30,0:07:11.50,Default,,0000,0000,0000,,and there they're going to be very\Npowerful. But I still stand by the claim that
Dialogue: 0,0:07:11.70,0:07:16.10,Default,,0000,0000,0000,,these methods just\Nhave a huge amount to offer
Dialogue: 0,0:07:16.40,0:07:18.10,Default,,0000,0000,0000,,for economists, and they're going
Dialogue: 0,0:07:18.20,0:07:21.70,Default,,0000,0000,0000,,to be a big part of the future.
Dialogue: 0,0:07:23.40,0:07:25.80,Default,,0000,0000,0000,,- [Isaiah] It feels like there's something\Ninteresting to be said about machine learning here.
Dialogue: 0,0:07:25.80,0:07:27.70,Default,,0000,0000,0000,,So here I was wondering,\Ncould you give some more,
Dialogue: 0,0:07:28.00,0:07:29.00,Default,,0000,0000,0000,,maybe some examples
Dialogue: 0,0:07:29.00,0:07:32.50,Default,,0000,0000,0000,,of the sorts of applications\Nyou're thinking about at the moment?
Dialogue: 0,0:07:32.50,0:07:34.10,Default,,0000,0000,0000,,- [Guido] So, one is areas where,
Dialogue: 0,0:07:34.70,0:07:36.40,Default,,0000,0000,0000,,instead of looking for average
Dialogue: 0,0:07:36.50,0:07:42.20,Default,,0000,0000,0000,,causal effects, we're looking for\Nindividualized estimates and predictions
Dialogue: 0,0:07:42.40,0:07:47.50,Default,,0000,0000,0000,,of causal effects, and there machine\Nlearning algorithms have been very effective.
Dialogue: 0,0:07:48.30,0:07:51.50,Default,,0000,0000,0000,,Traditionally, we would have done\Nthese things using kernel methods.
Dialogue: 0,0:07:51.60,0:07:54.50,Default,,0000,0000,0000,,And theoretically they work great, and
Dialogue: 0,0:07:54.60,0:07:57.40,Default,,0000,0000,0000,,there are sort of arguments that,\Nformally, you can't do any better.
Dialogue: 0,0:07:57.60,0:08:00.50,Default,,0000,0000,0000,,But in practice, they\Ndon't work very well.
Dialogue: 0,0:08:00.90,0:08:05.40,Default,,0000,0000,0000,,Random forests, random causal forest-type\Nthings -- the stuff that Stefan Wager
Dialogue: 0,0:08:05.40,0:08:09.50,Default,,0000,0000,0000,,and Susan Athey have been working on --\Nare used very widely.
Dialogue: 0,0:08:09.60,0:08:12.20,Default,,0000,0000,0000,,They've been very effective,\Nkind of, in these settings
Dialogue: 0,0:08:12.40,0:08:18.10,Default,,0000,0000,0000,,to actually get causal effects\Nthat vary by
Dialogue: 0,0:08:18.20,0:08:19.90,Default,,0000,0000,0000,,covariates. And this kind of --
Dialogue: 0,0:08:20.70,0:08:25.70,Default,,0000,0000,0000,,I think this is still just the beginning\Nof these methods. But in many cases,
Dialogue: 0,0:08:26.40,0:08:31.60,Default,,0000,0000,0000,,these algorithms are very\Neffective at searching over big spaces
Dialogue: 0,0:08:31.80,0:08:35.60,Default,,0000,0000,0000,,and finding the functions that fit
Dialogue: 0,0:08:35.90,0:08:41.10,Default,,0000,0000,0000,,very well, in ways that we\Ncouldn't really do beforehand.
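For readers who want to see the kind of individualized-effect estimation Guido describes, here is a minimal sketch of one simple ML-based approach: a "T-learner" with scikit-learn random forests on simulated experimental data. It is not Wager and Athey's honest causal forest, and the variable names and design are illustrative assumptions, not anything from the conversation.

```python
# Minimal sketch of ML-based heterogeneous-effect estimation in a
# randomized experiment: a "T-learner" with random forests. This is
# not Wager and Athey's honest causal forest, just the simplest
# illustration of the idea. Data are simulated.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 5000, 5
X = rng.normal(size=(n, p))          # covariates
D = rng.integers(0, 2, size=n)       # randomized treatment
tau = 0.5 + 1.0 * (X[:, 0] > 0)      # true effect varies with X[:, 0]
Y = X[:, 1] + tau * D + rng.normal(size=n)

# Fit separate outcome models for treated and control units.
m1 = RandomForestRegressor(min_samples_leaf=50, random_state=0).fit(X[D == 1], Y[D == 1])
m0 = RandomForestRegressor(min_samples_leaf=50, random_state=0).fit(X[D == 0], Y[D == 0])

# CATE estimate: difference of the two fitted regression functions.
cate_hat = m1.predict(X) - m0.predict(X)
print("mean CATE where x0 > 0: ", cate_hat[X[:, 0] > 0].mean())   # ~1.5
print("mean CATE where x0 <= 0:", cate_hat[X[:, 0] <= 0].mean())  # ~0.5
```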
Dialogue: 0,0:08:41.50,0:08:45.30,Default,,0000,0000,0000,,- [Josh] I don't know of an example where\Nmachine learning has generated insights
Dialogue: 0,0:08:45.30,0:08:48.10,Default,,0000,0000,0000,,about a causal effect that\NI'm interested in, and I
Dialogue: 0,0:08:48.30,0:08:51.30,Default,,0000,0000,0000,,do know of examples where it's\Npotentially very misleading.
Dialogue: 0,0:08:51.30,0:08:53.70,Default,,0000,0000,0000,,So I've done some work\Nwith Brigham Frandsen,
Dialogue: 0,0:08:54.10,0:08:55.10,Default,,0000,0000,0000,,using, for example,
Dialogue: 0,0:08:55.10,0:08:59.90,Default,,0000,0000,0000,,random forests to model covariate effects\Nin an instrumental variables problem,
Dialogue: 0,0:09:00.20,0:09:01.20,Default,,0000,0000,0000,,where you need --
Dialogue: 0,0:09:01.60,0:09:03.50,Default,,0000,0000,0000,,you need to condition on covariates,
Dialogue: 0,0:09:04.40,0:09:08.20,Default,,0000,0000,0000,,and you don't particularly have strong\Nfeelings about the functional form for that.
Dialogue: 0,0:09:08.20,0:09:10.00,Default,,0000,0000,0000,,So maybe you should,
Dialogue: 0,0:09:10.50,0:09:10.90,Default,,0000,0000,0000,,you think,
Dialogue: 0,0:09:10.90,0:09:14.50,Default,,0000,0000,0000,,be open to flexible curve fitting,\Nand that leads you down a path
Dialogue: 0,0:09:14.50,0:09:18.00,Default,,0000,0000,0000,,where there are a lot of\Nnonlinearities in the model.
Dialogue: 0,0:09:18.20,0:09:23.00,Default,,0000,0000,0000,,That's very dangerous with IV, because\Nany sort of excluded nonlinearity
Dialogue: 0,0:09:23.30,0:09:27.60,Default,,0000,0000,0000,,potentially generates a spurious causal\Neffect, and Brigham and I showed that
Dialogue: 0,0:09:27.90,0:09:32.20,Default,,0000,0000,0000,,very powerfully, I think, in\Nthe case of two instruments
Dialogue: 0,0:09:32.70,0:09:36.00,Default,,0000,0000,0000,,that come from a paper of mine\Nwith Bill Evans, where if you,
Dialogue: 0,0:09:36.50,0:09:37.60,Default,,0000,0000,0000,,you know, replace
Dialogue: 0,0:09:38.10,0:09:42.60,Default,,0000,0000,0000,,a traditional two-stage least squares\Nestimator with some kind of random forest,
Dialogue: 0,0:09:42.90,0:09:48.00,Default,,0000,0000,0000,,you get very precisely\Nestimated nonsense estimates.
Dialogue: 0,0:09:49.00,0:09:51.10,Default,,0000,0000,0000,,You know, I think that's\Na big caution.
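Josh's caution can be seen in a small simulation. This is not the Angrist-Frandsen analysis or the Angrist-Evans instruments, just an assumed design making the general point: an overfit, nonlinear first stage lets the unobservable leak back in, while the simple linear IV estimator stays near the truth.

```python
# Sketch of why overfit/nonlinear first stages and IV mix badly.
# Simulated design (assumed, not Angrist-Evans data): Z is a valid
# instrument, U is an unobserved confounder, true effect beta = 1.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 5000
Z = rng.integers(0, 2, size=n).astype(float)   # valid binary instrument
U = rng.normal(size=n)                          # confounder
X = rng.normal(size=(n, 10))                    # irrelevant covariates
D = 0.5 * Z + U + rng.normal(size=n)            # first stage
Y = 1.0 * D + U + rng.normal(size=n)            # beta = 1, corr(D, U) != 0

# OLS is badly biased because D is endogenous.
c = np.cov(D, Y)
ols = c[0, 1] / c[0, 0]

# Linear 2SLS with a single instrument (the Wald estimator).
tsls = np.cov(Z, Y)[0, 1] / np.cov(Z, D)[0, 1]

# "ML first stage": fit D on Z plus noise covariates with a forest and
# use the in-sample fitted values as the instrument. The fit partly
# memorizes U, so the exclusion restriction fails in-sample.
W = np.column_stack([Z, X])
Dhat = RandomForestRegressor(n_estimators=100, random_state=0).fit(W, D).predict(W)
forest_iv = np.cov(Dhat, Y)[0, 1] / np.cov(Dhat, D)[0, 1]

print(f"OLS: {ols:.2f}  2SLS: {tsls:.2f}  forest-IV: {forest_iv:.2f}")
# Expected pattern: 2SLS near 1.0; OLS and the naive forest-IV both
# pushed toward the confounded answer (~1.5 in this design).
```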
Dialogue: 0,0:09:51.10,0:09:53.40,Default,,0000,0000,0000,,And, you know, in view of those findings,
Dialogue: 0,0:09:53.70,0:09:57.10,Default,,0000,0000,0000,,in an example I care about, where\Nthe instruments are very simple
Dialogue: 0,0:09:57.40,0:09:59.10,Default,,0000,0000,0000,,and I believe that they're valid,
Dialogue: 0,0:09:59.30,0:10:01.60,Default,,0000,0000,0000,,you know, I would be skeptical of that. So
Dialogue: 0,0:10:02.90,0:10:06.80,Default,,0000,0000,0000,,nonlinearity and IV don't mix\Nvery comfortably.
Dialogue: 0,0:10:07.20,0:10:11.40,Default,,0000,0000,0000,,- [Guido] Now, I'd say, in some sense that's\Nalready more complicated. - [Josh] Well, it's IV.
Dialogue: 0,0:10:11.60,0:10:11.90,Default,,0000,0000,0000,,- [Guido] Yeah,
Dialogue: 0,0:10:12.50,0:10:16.70,Default,,0000,0000,0000,,but then, we can work on that\Nand iron it out.
Dialogue: 0,0:10:18.60,0:10:22.30,Default,,0000,0000,0000,,As editor of Econometrica, actually,\Na lot of these papers cross my desk,
Dialogue: 0,0:10:22.70,0:10:29.50,Default,,0000,0000,0000,,but the motivation is not\Nclear and, in fact, really lacking.
Dialogue: 0,0:10:29.80,0:10:35.10,Default,,0000,0000,0000,,They're not the old-type\Nsemiparametric foundational papers.
Dialogue: 0,0:10:35.40,0:10:37.10,Default,,0000,0000,0000,,So that's a big problem.
Dialogue: 0,0:10:38.00,0:10:42.40,Default,,0000,0000,0000,,And a kind of related problem is that\Nwe have this tradition in econometrics
Dialogue: 0,0:10:42.60,0:10:47.50,Default,,0000,0000,0000,,of being very focused on these formal\Nasymptotic results, kind of, where
Dialogue: 0,0:10:48.80,0:10:52.60,Default,,0000,0000,0000,,we just have a lot of papers\Nwhere people propose
Dialogue: 0,0:10:52.80,0:10:55.70,Default,,0000,0000,0000,,a method and then establish\Nthe asymptotic properties
Dialogue: 0,0:10:56.30,0:11:01.90,Default,,0000,0000,0000,,in a very kind of standardized way.\N- [Isaiah] Is that bad?
Dialogue: 0,0:11:02.90,0:11:07.20,Default,,0000,0000,0000,,- [Guido] Well, I think it has sort of\Nclosed the door for a lot of work
Dialogue: 0,0:11:07.20,0:11:11.60,Default,,0000,0000,0000,,that doesn't fit into that,\Nwhereas in the machine learning literature,
Dialogue: 0,0:11:11.90,0:11:14.30,Default,,0000,0000,0000,,a lot of things are\Nmore algorithmic. People
Dialogue: 0,0:11:15.70,0:11:18.50,Default,,0000,0000,0000,,had algorithms for coming\Nup with predictions
Dialogue: 0,0:11:18.80,0:11:23.60,Default,,0000,0000,0000,,that turn out to actually work much better\Nthan, say, nonparametric kernel regression.
Dialogue: 0,0:11:24.00,0:11:26.80,Default,,0000,0000,0000,,For the longest time, we were doing all\Nthe nonparametrics in econometrics --
Dialogue: 0,0:11:26.80,0:11:31.10,Default,,0000,0000,0000,,we did it using kernel regression, and\Nit was great for proving theorems.
Dialogue: 0,0:11:31.30,0:11:34.80,Default,,0000,0000,0000,,You could get confidence intervals and\Nconsistency and asymptotic normality,
Dialogue: 0,0:11:34.80,0:11:37.00,Default,,0000,0000,0000,,and it was all great, but\Nit wasn't very useful.
Dialogue: 0,0:11:37.30,0:11:40.90,Default,,0000,0000,0000,,And the things they did in machine\Nlearning are just way, way better,
Dialogue: 0,0:11:41.00,0:11:45.10,Default,,0000,0000,0000,,but they didn't have the proofs.\N- [Josh] That's not my beef with machine learning theory,
Dialogue: 0,0:11:45.30,0:11:51.20,Default,,0000,0000,0000,,as you know.\N- [Guido] No, I'm saying, for the prediction part,
Dialogue: 0,0:11:51.40,0:11:54.50,Default,,0000,0000,0000,,it does much better.\N- [Josh] Yeah, it's a better curve-fitting tool.
Dialogue: 0,0:11:54.90,0:11:56.50,Default,,0000,0000,0000,,- [Guido] But it did so
Dialogue: 0,0:11:57.10,0:12:02.70,Default,,0000,0000,0000,,in a way that would not have made\Nthose papers initially easy to get into
Dialogue: 0,0:12:03.00,0:12:06.30,Default,,0000,0000,0000,,the econometrics journals, because it\Nwasn't proving the type of things --
Dialogue: 0,0:12:06.40,0:12:11.20,Default,,0000,0000,0000,,you know, when Breiman was doing his\Nregression trees, that just didn't fit in.
Dialogue: 0,0:12:11.80,0:12:15.10,Default,,0000,0000,0000,,And I think he would have\Nhad a very hard time
Dialogue: 0,0:12:15.20,0:12:18.40,Default,,0000,0000,0000,,publishing these things\Nin econometrics journals.
Dialogue: 0,0:12:18.90,0:12:24.40,Default,,0000,0000,0000,,So I think we limited\Nourselves too much, and
Dialogue: 0,0:12:24.70,0:12:27.90,Default,,0000,0000,0000,,that left us closed off
Dialogue: 0,0:12:28.00,0:12:30.80,Default,,0000,0000,0000,,from a lot of these machine learning\Nmethods that are actually very useful.
Dialogue: 0,0:12:30.90,0:12:34.00,Default,,0000,0000,0000,,I mean, I think, in general,
Dialogue: 0,0:12:34.90,0:12:36.20,Default,,0000,0000,0000,,in that literature, the computer
Dialogue: 0,0:12:36.20,0:12:39.30,Default,,0000,0000,0000,,scientists have brought\Na huge number of these algorithms --
Dialogue: 0,0:12:39.60,0:12:43.90,Default,,0000,0000,0000,,have proposed a huge number of these\Nalgorithms that are actually very useful
Dialogue: 0,0:12:44.00,0:12:44.70,Default,,0000,0000,0000,,and that are
Dialogue: 0,0:12:45.50,0:12:49.10,Default,,0000,0000,0000,,affecting the way we're going\Nto be doing empirical work.
Dialogue: 0,0:12:49.80,0:12:55.10,Default,,0000,0000,0000,,But we've not fully internalized that,\Nbecause we're still very focused on getting
Dialogue: 0,0:12:55.30,0:12:57.50,Default,,0000,0000,0000,,point estimates and\Ngetting standard errors
Dialogue: 0,0:12:58.60,0:13:01.20,Default,,0000,0000,0000,,and getting p-values, in a way that
Dialogue: 0,0:13:01.70,0:13:03.10,Default,,0000,0000,0000,,we need to move beyond
Dialogue: 0,0:13:03.30,0:13:04.30,Default,,0000,0000,0000,,to fully harness
Dialogue: 0,0:13:04.30,0:13:10.70,Default,,0000,0000,0000,,the benefits from the\Nmachine learning literature.
Dialogue: 0,0:13:10.90,0:13:15.10,Default,,0000,0000,0000,,- [Isaiah] Hmm. On the one hand, I guess I very\Nmuch take your point that sort of the
Dialogue: 0,0:13:15.20,0:13:18.60,Default,,0000,0000,0000,,traditional econometrics framework\Nof sort of, propose a method,
Dialogue: 0,0:13:18.60,0:13:22.60,Default,,0000,0000,0000,,prove a limit theorem under some\Nasymptotic story,
Dialogue: 0,0:13:22.60,0:13:26.90,Default,,0000,0000,0000,,publish a paper --\Nis constraining.
Dialogue: 0,0:13:26.90,0:13:29.70,Default,,0000,0000,0000,,And that, in some sense, by thinking more
Dialogue: 0,0:13:29.70,0:13:33.20,Default,,0000,0000,0000,,broadly about what a methods paper could\Nlook like, we may gain something. In some sense,
Dialogue: 0,0:13:33.20,0:13:35.90,Default,,0000,0000,0000,,certainly, the machine learning\Nliterature has found a bunch of things
Dialogue: 0,0:13:35.90,0:13:38.30,Default,,0000,0000,0000,,which seem to work quite\Nwell for a number of problems
Dialogue: 0,0:13:38.30,0:13:42.40,Default,,0000,0000,0000,,and are now having substantial influence\Nin economics. I guess a question
Dialogue: 0,0:13:42.40,0:13:44.80,Default,,0000,0000,0000,,I'm interested in is, how do you think
Dialogue: 0,0:13:45.20,0:13:47.60,Default,,0000,0000,0000,,about the role of theory?
Dialogue: 0,0:13:47.90,0:13:51.20,Default,,0000,0000,0000,,Sort of, do you think there's\Nno value in the theory part of it?
Dialogue: 0,0:13:51.60,0:13:54.80,Default,,0000,0000,0000,,Because I guess it's sort of a question\Nthat I often have, sort of, seeing
Dialogue: 0,0:13:54.80,0:13:56.90,Default,,0000,0000,0000,,the output from a machine learning tool --
Dialogue: 0,0:13:56.90,0:13:59.40,Default,,0000,0000,0000,,though actually a number of the\Nmethods that you talked about
Dialogue: 0,0:13:59.40,0:14:01.80,Default,,0000,0000,0000,,do have inferential\Nresults developed for them --
Dialogue: 0,0:14:02.60,0:14:06.40,Default,,0000,0000,0000,,something that I always wonder about is sort\Nof uncertainty quantification. And just,
Dialogue: 0,0:14:06.50,0:14:08.00,Default,,0000,0000,0000,,you know, I have my prior,
Dialogue: 0,0:14:08.00,0:14:11.00,Default,,0000,0000,0000,,I come into the world with my view,\NI see the result of this thing --
Dialogue: 0,0:14:11.00,0:14:14.50,Default,,0000,0000,0000,,how should I update based on it?\NIn some sense, if I'm in a world where
Dialogue: 0,0:14:14.60,0:14:15.10,Default,,0000,0000,0000,,things are
Dialogue: 0,0:14:15.20,0:14:18.20,Default,,0000,0000,0000,,normally distributed, I know\Nhow to do it. Here, I don't.
Dialogue: 0,0:14:18.20,0:14:21.40,Default,,0000,0000,0000,,And so I'm interested to hear\Nhow you think about that.
Dialogue: 0,0:14:21.50,0:14:24.30,Default,,0000,0000,0000,,- [Guido] So, I don't see this as sort\Nof closing it, saying, well,
Dialogue: 0,0:14:24.40,0:14:26.50,Default,,0000,0000,0000,,these results\Nare not interesting.
Dialogue: 0,0:14:26.60,0:14:27.70,Default,,0000,0000,0000,,But there are going to be a lot of cases
Dialogue: 0,0:14:28.00,0:14:31.20,Default,,0000,0000,0000,,where it's going to be incredibly hard to\Nget those results, and we may not be able
Dialogue: 0,0:14:31.20,0:14:33.20,Default,,0000,0000,0000,,to get there, and
Dialogue: 0,0:14:33.40,0:14:37.70,Default,,0000,0000,0000,,we may need to do it in stages, where\Nfirst someone says, hey, I have this
Dialogue: 0,0:14:39.60,0:14:44.80,Default,,0000,0000,0000,,interesting algorithm for doing\Nsomething, and it works well by some
Dialogue: 0,0:14:45.60,0:14:49.90,Default,,0000,0000,0000,,criterion on this\Nparticular data set,
Dialogue: 0,0:14:51.00,0:14:53.40,Default,,0000,0000,0000,,and I put it\Nout there. And then
Dialogue: 0,0:14:53.70,0:14:58.00,Default,,0000,0000,0000,,maybe someone will figure out a way that\Nyou can later actually still do inference,
Dialogue: 0,0:14:58.00,0:14:59.10,Default,,0000,0000,0000,,under some conditions.
Dialogue: 0,0:14:59.10,0:15:02.10,Default,,0000,0000,0000,,And maybe those are not\Nparticularly realistic conditions,
Dialogue: 0,0:15:02.10,0:15:05.50,Default,,0000,0000,0000,,so then we kind of go further.\NBut I think we've been
Dialogue: 0,0:15:06.70,0:15:11.40,Default,,0000,0000,0000,,constraining things too much, where we\Nsaid, you know, this is the type of thing
Dialogue: 0,0:15:12.10,0:15:14.40,Default,,0000,0000,0000,,that we need to do. And in some sense,
Dialogue: 0,0:15:15.70,0:15:18.20,Default,,0000,0000,0000,,that goes back to kind of\Nthe way Josh and I
Dialogue: 0,0:15:19.70,0:15:21.90,Default,,0000,0000,0000,,thought about things for the\Nlocal average treatment effect.
Dialogue: 0,0:15:21.90,0:15:24.60,Default,,0000,0000,0000,,That wasn't quite the way people\Nwere thinking about these problems
Dialogue: 0,0:15:24.60,0:15:29.20,Default,,0000,0000,0000,,before. There was a sense that\Nsome of the people said, you know,
Dialogue: 0,0:15:29.50,0:15:31.90,Default,,0000,0000,0000,,the way you need to do these\Nthings is, you first say
Dialogue: 0,0:15:32.20,0:15:36.30,Default,,0000,0000,0000,,what you're interested in estimating,\Nand then you do the best job you can
Dialogue: 0,0:15:36.50,0:15:37.70,Default,,0000,0000,0000,,in estimating that.
Dialogue: 0,0:15:38.10,0:15:44.20,Default,,0000,0000,0000,,And what you guys are doing\Nis doing it backwards.
Dialogue: 0,0:15:44.30,0:15:46.70,Default,,0000,0000,0000,,You're going to say,\Nhere, I have an estimator,
Dialogue: 0,0:15:47.30,0:15:49.60,Default,,0000,0000,0000,,and now I'm going to figure out
Dialogue: 0,0:15:49.80,0:15:51.40,Default,,0000,0000,0000,,what it's estimating, and then, ex post,
Dialogue: 0,0:15:51.40,0:15:53.90,Default,,0000,0000,0000,,you're going to say why you\Nthink that's interesting,
Dialogue: 0,0:15:53.90,0:15:56.60,Default,,0000,0000,0000,,or maybe why it's not interesting,\Nand that's not okay --
Dialogue: 0,0:15:56.60,0:15:58.60,Default,,0000,0000,0000,,you're not allowed to do it that way.
Dialogue: 0,0:15:59.00,0:16:04.10,Default,,0000,0000,0000,,And I think we should just be a little\Nbit more flexible in thinking about
Dialogue: 0,0:16:04.30,0:16:06.30,Default,,0000,0000,0000,,how to look at
Dialogue: 0,0:16:06.40,0:16:11.30,Default,,0000,0000,0000,,problems, because I think we've missed\Nsome things by not doing that.
Dialogue: 0,0:16:13.00,0:16:16.60,Default,,0000,0000,0000,,- [Josh] So you've heard our views,\NIsaiah. You've seen that we have
Dialogue: 0,0:16:17.00,0:16:20.40,Default,,0000,0000,0000,,some points of disagreement. Why\Ndon't you referee this dispute for us?
Dialogue: 0,0:16:22.50,0:16:28.10,Default,,0000,0000,0000,,- [Isaiah] Oh, so nice of you to ask me\Na small question. So, I guess, for one,
Dialogue: 0,0:16:28.20,0:16:33.20,Default,,0000,0000,0000,,I very much agree with something\Nthat Guido said earlier,
Dialogue: 0,0:16:36.50,0:16:37.90,Default,,0000,0000,0000,,which is that where the
Dialogue: 0,0:16:37.90,0:16:41.40,Default,,0000,0000,0000,,case for machine learning seems\Nrelatively clear is in settings where,
Dialogue: 0,0:16:41.50,0:16:45.10,Default,,0000,0000,0000,,you know, we're interested in some version\Nof a nonparametric prediction problem.
Dialogue: 0,0:16:45.10,0:16:49.70,Default,,0000,0000,0000,,So I'm interested in estimating a conditional\Nexpectation or conditional probability,
Dialogue: 0,0:16:50.00,0:16:52.10,Default,,0000,0000,0000,,and in the past, maybe I would\Nhave run a kernel regression
Dialogue: 0,0:16:52.10,0:16:55.80,Default,,0000,0000,0000,,or I would have run\Na series regression, or
Dialogue: 0,0:16:56.10,0:16:57.40,Default,,0000,0000,0000,,something along those lines.
Dialogue: 0,0:16:57.70,0:16:58.70,Default,,0000,0000,0000,,Sort of, it seems like,
Dialogue: 0,0:16:58.70,0:17:02.00,Default,,0000,0000,0000,,at this point, we have a fairly good\Nsense that in a fairly wide range
Dialogue: 0,0:17:02.00,0:17:06.30,Default,,0000,0000,0000,,of applications, machine learning\Nmethods seem to do better for,
Dialogue: 0,0:17:06.40,0:17:06.80,Default,,0000,0000,0000,,you know,
Dialogue: 0,0:17:06.80,0:17:08.80,Default,,0000,0000,0000,,estimating conditional mean functions
Dialogue: 0,0:17:08.80,0:17:12.00,Default,,0000,0000,0000,,or conditional probabilities or\Nvarious other nonparametric objects
Dialogue: 0,0:17:12.40,0:17:16.60,Default,,0000,0000,0000,,than more traditional nonparametric\Nmethods that were studied in econometrics
Dialogue: 0,0:17:16.60,0:17:19.10,Default,,0000,0000,0000,,and statistics, especially\Nin high-dimensional settings.
Dialogue: 0,0:17:19.50,0:17:23.10,Default,,0000,0000,0000,,- [Guido] So you're thinking of maybe the\Npropensity score or something like that?
Dialogue: 0,0:17:23.10,0:17:25.30,Default,,0000,0000,0000,,- [Isaiah] Exactly -- so nuisance functions, yeah.
Dialogue: 0,0:17:25.30,0:17:28.90,Default,,0000,0000,0000,,So things like propensity scores,\Nor, I mean, even objects
Dialogue: 0,0:17:28.90,0:17:30.10,Default,,0000,0000,0000,,of more direct inferential
Dialogue: 0,0:17:30.20,0:17:32.40,Default,,0000,0000,0000,,interest, like conditional\Naverage treatment effects, right?
Dialogue: 0,0:17:32.40,0:17:35.10,Default,,0000,0000,0000,,Which are the difference of two\Nconditional expectation functions,
Dialogue: 0,0:17:35.10,0:17:36.30,Default,,0000,0000,0000,,potentially -- things like that.
Dialogue: 0,0:17:36.50,0:17:40.40,Default,,0000,0000,0000,,Of course, even there,\Nright, the theory
Dialogue: 0,0:17:40.50,0:17:43.70,Default,,0000,0000,0000,,for inference, or the theory for\Nsort of how to interpret,
Dialogue: 0,0:17:43.70,0:17:45.90,Default,,0000,0000,0000,,how to make large-sample statements\Nabout some of these things, is
Dialogue: 0,0:17:46.00,0:17:50.10,Default,,0000,0000,0000,,less well developed, depending on the\Nmachine learning estimator used.
Dialogue: 0,0:17:50.10,0:17:53.80,Default,,0000,0000,0000,,And so I think something\Nthat is tricky is that we
Dialogue: 0,0:17:53.90,0:17:55.70,Default,,0000,0000,0000,,can have these methods,
Dialogue: 0,0:17:55.70,0:17:58.00,Default,,0000,0000,0000,,which seem to work a lot\Nbetter for some purposes,
Dialogue: 0,0:17:58.00,0:18:01.60,Default,,0000,0000,0000,,but which we need to be a bit\Ncareful in how we plug them in, or how
Dialogue: 0,0:18:01.60,0:18:03.30,Default,,0000,0000,0000,,we interpret the resulting statements.
Dialogue: 0,0:18:03.60,0:18:06.20,Default,,0000,0000,0000,,But of course, that's a very,\Nvery active area right now, where
Dialogue: 0,0:18:06.40,0:18:10.40,Default,,0000,0000,0000,,people are doing tons of great work.\NAnd so I fully expect and hope
Dialogue: 0,0:18:10.40,0:18:12.80,Default,,0000,0000,0000,,to see much more going forward there.
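One concrete version of "plug the machine learning estimator in carefully" for the nuisance functions Isaiah mentions is a cross-fitted doubly robust (AIPW) estimate of an average treatment effect. The sketch below uses random forests for the propensity score and outcome regressions on a simulated, assumed-unconfounded design; it is an illustration of the idea, not a method endorsed by the speakers.

```python
# Sketch: cross-fitted AIPW estimate of an average treatment effect
# with random-forest nuisance estimates (propensity score and outcome
# regressions). Design is simulated and assumed unconfounded given X.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 4000, 5
X = rng.normal(size=(n, p))
e = 1 / (1 + np.exp(-X[:, 0]))              # true propensity score
D = (rng.uniform(size=n) < e).astype(int)
Y = X[:, 1] + 1.0 * D + rng.normal(size=n)  # true ATE = 1

psi = np.zeros(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    # Nuisances fit on one fold, evaluated on the other (cross-fitting).
    ps_model = RandomForestClassifier(min_samples_leaf=50, random_state=0)
    ps = ps_model.fit(X[train], D[train]).predict_proba(X[test])[:, 1]
    ps = ps.clip(0.01, 0.99)  # trim to avoid exploding weights
    m1_model = RandomForestRegressor(min_samples_leaf=25, random_state=0)
    m1 = m1_model.fit(X[train][D[train] == 1], Y[train][D[train] == 1]).predict(X[test])
    m0_model = RandomForestRegressor(min_samples_leaf=25, random_state=0)
    m0 = m0_model.fit(X[train][D[train] == 0], Y[train][D[train] == 0]).predict(X[test])
    # AIPW influence-function values on the held-out fold.
    psi[test] = (m1 - m0
                 + D[test] * (Y[test] - m1) / ps
                 - (1 - D[test]) * (Y[test] - m0) / (1 - ps))

ate, se = psi.mean(), psi.std(ddof=1) / np.sqrt(n)
print(f"ATE ~ {ate:.2f} (se {se:.2f})")  # should be near 1.0
```

The point of the final step is exactly the one made in the conversation: the forests handle the nonparametric pieces, while the estimate of interest is still a simple average with a conventional standard error.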
Dialogue: 0,0:18:13.00,0:18:17.30,Default,,0000,0000,0000,,So, one issue with machine learning\Nthat always seems a danger, or
Dialogue: 0,0:18:17.40,0:18:20.30,Default,,0000,0000,0000,,that is sometimes a danger\Nand has sometimes led to
Dialogue: 0,0:18:20.50,0:18:22.60,Default,,0000,0000,0000,,applications that have\Nmade less sense, is
Dialogue: 0,0:18:22.80,0:18:25.10,Default,,0000,0000,0000,,when folks start with a method --
Dialogue: 0,0:18:25.30,0:18:28.50,Default,,0000,0000,0000,,start with a method that they're very\Nexcited about, rather than a question,
Dialogue: 0,0:18:28.90,0:18:32.10,Default,,0000,0000,0000,,right? So, sort of, starting with\Na question where, here's the
Dialogue: 0,0:18:32.50,0:18:36.20,Default,,0000,0000,0000,,object I'm interested in, here is\Nthe parameter of interest, let me,
Dialogue: 0,0:18:36.70,0:18:37.10,Default,,0000,0000,0000,,you know,
Dialogue: 0,0:18:37.30,0:18:39.50,Default,,0000,0000,0000,,think about how I would\Nidentify that thing,
Dialogue: 0,0:18:39.50,0:18:41.80,Default,,0000,0000,0000,,how I would recover that\Nthing if I had a ton of data --
Dialogue: 0,0:18:41.90,0:18:44.00,Default,,0000,0000,0000,,oh, here's a conditional\Nexpectation function,
Dialogue: 0,0:18:44.00,0:18:47.10,Default,,0000,0000,0000,,let me plug in a machine\Nlearning estimator for that --
Dialogue: 0,0:18:47.20,0:18:48.80,Default,,0000,0000,0000,,that seems very, very sensible.
Dialogue: 0,0:18:49.00,0:18:53.10,Default,,0000,0000,0000,,Whereas, you know, if I\Nregress quantity on price
Dialogue: 0,0:18:53.70,0:18:56.00,Default,,0000,0000,0000,,and say that I used a\Nmachine learning method,
Dialogue: 0,0:18:56.30,0:18:58.90,Default,,0000,0000,0000,,maybe I'm satisfied that that\Nsolves the endogeneity problem
Dialogue: 0,0:18:58.90,0:19:01.20,Default,,0000,0000,0000,,we're usually worried\Nabout there -- maybe I'm not.
Dialogue: 0,0:19:01.50,0:19:03.20,Default,,0000,0000,0000,,But again, that's something where
Dialogue: 0,0:19:03.40,0:19:06.30,Default,,0000,0000,0000,,the way to address it seems\Nrelatively clear, right?
Dialogue: 0,0:19:06.50,0:19:09.00,Default,,0000,0000,0000,,It's: define your object of interest and
Dialogue: 0,0:19:09.20,0:19:11.60,Default,,0000,0000,0000,,think about --\N- [Josh] Is that just bringing in the economics?
Dialogue: 0,0:19:11.70,0:19:12.20,Default,,0000,0000,0000,,- [Isaiah] Exactly.
Dialogue: 0,0:19:12.20,0:19:15.40,Default,,0000,0000,0000,,- [Josh] And can I think about\Nthe identification, but harness
Dialogue: 0,0:19:15.40,0:19:18.30,Default,,0000,0000,0000,,the power of the machine\Nlearning methods precisely
Dialogue: 0,0:19:18.50,0:19:22.80,Default,,0000,0000,0000,,for some of the components?\N- [Isaiah] Precisely, exactly. So, sort of, you know,
Dialogue: 0,0:19:22.90,0:19:25.60,Default,,0000,0000,0000,,the question of interest is the same as\Nthe question of interest has always been,
Dialogue: 0,0:19:25.60,0:19:29.50,Default,,0000,0000,0000,,but we now have better methods for\Nestimating some pieces of it, right?
Dialogue: 0,0:19:29.90,0:19:31.60,Default,,0000,0000,0000,,The place that seems harder to, uh,
Dialogue: 0,0:19:31.90,0:19:33.40,Default,,0000,0000,0000,,harder to forecast is, right,
Dialogue: 0,0:19:33.40,0:19:36.30,Default,,0000,0000,0000,,obviously, there's a huge amount\Ngoing on in the machine
Dialogue: 0,0:19:36.40,0:19:37.40,Default,,0000,0000,0000,,learning literature,
Dialogue: 0,0:19:37.50,0:19:39.70,Default,,0000,0000,0000,,and the sort of limited ways
Dialogue: 0,0:19:39.70,0:19:42.90,Default,,0000,0000,0000,,of plugging it in that I've referenced\Nso far are a limited piece of that.
Dialogue: 0,0:19:43.00,0:19:46.10,Default,,0000,0000,0000,,And so I think there are all sorts of\Nother interesting questions about where,
Dialogue: 0,0:19:46.30,0:19:46.90,Default,,0000,0000,0000,,right, sort of,
Dialogue: 0,0:19:47.10,0:19:49.30,Default,,0000,0000,0000,,where does this interaction\Ngo? What else can we learn?
Dialogue: 0,0:19:49.30,0:19:52.00,Default,,0000,0000,0000,,And that's something where,\Nyou know, I think there's
Dialogue: 0,0:19:52.20,0:19:56.40,Default,,0000,0000,0000,,a ton going on which seems very promising,\Nand I have no idea what the answer is.
Dialogue: 0,0:19:57.00,0:20:01.20,Default,,0000,0000,0000,,- [Guido] No, I totally\Nagree with that, but
Dialogue: 0,0:20:01.80,0:20:03.50,Default,,0000,0000,0000,,that's what makes it very exciting.
Dialogue: 0,0:20:03.80,0:20:06.10,Default,,0000,0000,0000,,And I think there's just a\Nlot of work to be done there.
Dialogue: 0,0:20:06.60,0:20:11.40,Default,,0000,0000,0000,,- [Josh] All right. So Isaiah\Nagrees with me there.
Dialogue: 0,0:20:14.50,0:20:17.70,Default,,0000,0000,0000,,- [narrator] If you'd like to watch more\NNobel Conversations, click here,
Dialogue: 0,0:20:18.00,0:20:20.40,Default,,0000,0000,0000,,or if you'd like to learn\Nmore about econometrics,
Dialogue: 0,0:20:20.50,0:20:23.10,Default,,0000,0000,0000,,check out Josh's Mastering\NEconometrics series.
Dialogue: 0,0:20:23.60,0:20:26.50,Default,,0000,0000,0000,,If you'd like to learn more\Nabout Guido, Josh, and Isaiah,
Dialogue: 0,0:20:26.70,0:20:28.20,Default,,0000,0000,0000,,check out the links in the description.