Hi. In this lecture we're going to look at our fourth category of reasons why you'd want to take a course in modeling, why modeling is so important, and that is to help you make better decisions, strategize better, and design things better. So let's get started; this should be a lot of fun.

All right, the first reason models are so useful: they're good decision aids; they help you make better decisions. Let me give you an example to get us going. What you see here is a whole bunch of different financial institutions, companies like Bear Stearns, AIG, Citigroup, and Morgan Stanley, and the diagram represents the relationships between these companies, in terms of how the economic success of one depends on another. Now imagine you're the federal government and you've got a financial crisis. Some of these companies are starting to fail, and you've got to decide: do I bail them out, do I save one of these companies? Well, let's use one of these very simple models to help make that decision. To do that we need a little more understanding of what these numbers represent. Look at AIG, which is right here, and JP Morgan, which is right here. We see a number of 466 between the two of them. That number represents how correlated JP Morgan's success is with AIG's success, in particular how correlated their failures are. If AIG has a bad day, how likely is it that JP Morgan has a bad day? And we see that it's a really big number. Now, if you look up here at this 94, it represents the link between Wells Fargo and Lehman Brothers. What that tells us is that if Lehman Brothers has a bad day, it only has a small effect on Wells Fargo, and vice versa. So now you're the government and you've got to decide: who do I want to bail out, nobody or somebody? Let's look at Lehman Brothers. There are only three lines going in and out of Lehman Brothers, and one is a 94.
I guess four lines, actually: one is a 103, one is a 158, and one is a 155. Those are relatively small numbers. So if you're the government you say: okay, Lehman Brothers has been around a long time and it's an important company, but these numbers are pretty small; if they fail, it doesn't look like these other companies would fail. Now look at AIG. We've got a 466, a 441, a 456, a 390, and a 490. There are huge numbers associated with AIG, and because of those huge numbers you basically have to figure: we probably have to prop AIG back up, even if we don't want to, because if we don't, there's the possibility that this whole system will fail. So what we see here is the incredible power of models to help us make a better decision. The government did let Lehman Brothers fail, and that was terrible for Lehman Brothers, but the economy sort of soldiered on. They didn't let AIG fail; we don't know for sure that it would have failed, and we don't know for sure that the whole financial apparatus of the United States would have come down with it, but they propped up AIG and the country made it through. It looks like they made a reasonable decision.
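To make that reasoning concrete, here is a minimal sketch of ranking institutions by their total linkage weight. Only the weights quoted above come from the lecture; the counterparties that the lecture does not name are filled in with hypothetical placeholders (Bank A through Bank G), so the ranking is purely illustrative.

```python
# Rank institutions by the sum of their co-failure link weights.
# Unnamed counterparties are hypothetical; only the weights quoted
# in the lecture are taken from it.
links = [
    ("AIG", "JP Morgan", 466),
    ("Wells Fargo", "Lehman Brothers", 94),
    ("Lehman Brothers", "Bank A", 103),
    ("Lehman Brothers", "Bank B", 158),
    ("Lehman Brothers", "Bank C", 155),
    ("AIG", "Bank D", 441),
    ("AIG", "Bank E", 456),
    ("AIG", "Bank F", 390),
    ("AIG", "Bank G", 490),
]

total = {}
for a, b, weight in links:
    total[a] = total.get(a, 0) + weight  # each link counts toward both endpoints
    total[b] = total.get(b, 0) + weight

for name, w in sorted(total.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:16s} total linkage = {w}")
```

AIG's total dwarfs Lehman's, which is the intuition behind propping up AIG and letting Lehman go.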
All right, so that's big financial decisions. Let's look at something more fun. This is a simple sort of logic puzzle that will help us see how models can be useful. It's a game called the Monty Hall problem, named after Monty Hall, the host of a game show called Let's Make a Deal that aired during the 1970s. The problem I'm going to describe is a characterization of an event that could happen on the show; it's one of several scenarios on the show. Here's basically how it works. There are three doors. Behind one of these doors is a prize; behind the other two doors there's some silly thing like a goat, or a woman dressed up in a ballerina's outfit. So one of them has something fantastic like a new car or a washing machine.

Now, what you get to do is pick one door. So maybe you pick door number one. Monty knows where the prize is, so of the two doors you didn't pick, at least one always has a silly prize behind it, which means he can always show you one of those other two doors. So you pick door number one, and what Monty does is open up door number three and say, here's a goat. Then he says: hey, do you want to switch to door number two? Well, do you?

All right, that's a hard problem, so let's first try to get the logic right and then we'll write down a formal model. It's easier to see the logic for this problem by increasing the number of doors. Suppose there are five doors, and you pick this bright blue door. The probability that you're correct is 1/5; one of the doors has the prize, so the probability you're correct is 1/5 and the probability that you're not correct is 4/5. So there's a 1/5 chance you're correct and a 4/5 chance you're not. Now, Monty Hall is also playing this game, and again, he knows the answer. So Monty is thinking: okay, I'm going to show you that it's not behind the yellow door. And then he says, you know what else I'm going to show you, that it's not behind the pink door. And, I'm going to be nice, I'm going to show you it's not behind the green door. Now he says: do you want to switch from the light blue door to the dark blue door? Well, in this case you should start thinking: initially the probability I was right was only 1/5, and he revealed all those other doors that don't seem to have the prize. It seems much more likely that that's the correct door than that mine is the correct door, and in fact it is much more likely.
The probability is 4/5 that it's behind that dark blue door and only 1/5 that it's behind your door. So you should switch, and you should also switch in the original three-door case. Now let's formalize this. We'll use a simple decision tree model to show why, in fact, you should switch. Let's start with some basic probability. There are three doors. You pick door number one; the probability you're right is a third, the probability that it's door number two is a third, and the probability that it's door number three is a third. What we want to do is break this into two sets: there's a 1/3 chance that you're right and a 2/3 chance that you're wrong. After you pick door number one, the prize can't be moved, so it's either behind door number two or door number three, or, if you got it right, behind door number one. Now let's think about what Monty can do. If the prize is behind door number one or door number two, he can show you door number three and say: look, there's the goat. If he does that, because he can always show you one of those doors, nothing happens to your probability of 1/3. There was a 1/3 chance you were right before, and since he can always show you a door, there's still only a 1/3 chance you're right. Alternatively, suppose the prize is behind door number three; then he can show you door number two and say the goat's here. So it's still the case that nothing happens to your probability. The reason, when you think about these two sets, is that you didn't learn anything: you learn nothing about the other set, the 2/3 chance you're wrong, because he can always show you a goat. Your initial probability of being correct was 1/3, and your final probability of being correct is still 1/3. So just this idea of drawing circles and writing down probabilities lets us see that the correct decision in the Monty Hall problem is to switch.
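If you want to check that logic rather than take it on faith, here is a minimal Monte Carlo sketch (not from the lecture) that plays the game many times with any number of doors, with Monty opening every losing door except one:

```python
import random

def monty_trial(n_doors, switch):
    """One play: Monty opens every losing door except one, then the
    player either stays with the original pick or switches."""
    prize = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if switch:
        # The single unopened door besides your pick is the prize door if
        # you were wrong, otherwise a randomly chosen losing door.
        if pick == prize:
            pick = random.choice([d for d in range(n_doors) if d != prize])
        else:
            pick = prize
    return pick == prize

def win_rate(n_doors, switch, trials=100_000):
    return sum(monty_trial(n_doors, switch) for _ in range(trials)) / trials

for n in (3, 5):
    print(f"{n} doors: stay ~ {win_rate(n, False):.3f}, switch ~ {win_rate(n, True):.3f}")
```

With three doors the switcher wins about 2/3 of the time and the stayer about 1/3; with five doors it's roughly 4/5 versus 1/5, matching the argument above.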
Just like when we looked at that financial decision the federal government had to make: with the circles and the arrows, you draw it out, and you realize the best decision is to let Lehman Brothers fail and bail out AIG.

All right, let's move on and look at the next reason models can be helpful, and that is comparative statics. What do I mean by that? Here is a standard model from economics. Comparative statics means you move from one equilibrium to another. What you see here is that S is a supply curve for some good, and D1 and D2 are demand curves. What you see is demand shifting out, and when demand shifts out in this way, more of the good is sold: the quantity goes up and the price goes up. People want more of something, more of it gets sold, and the price rises. This is where you start seeing how the equilibrium moves, so this is again a simple example of how models help us understand how the world, the equilibrium, will change, just by drawing some simple figures.
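The move in that figure can be reproduced with a couple of lines of algebra. Here is a minimal sketch assuming made-up linear curves (the numbers are not from the lecture): supply is Qs = c + d*p, demand is Qd = a - b*p, and shifting demand out means raising the intercept a.

```python
def equilibrium(a, b, c, d):
    """Solve a - b*p = c + d*p for the market-clearing price and quantity."""
    p = (a - c) / (b + d)
    q = c + d * p
    return p, q

p1, q1 = equilibrium(a=100, b=2, c=10, d=3)  # demand curve D1
p2, q2 = equilibrium(a=130, b=2, c=10, d=3)  # D2: demand shifted outward
print(f"D1 equilibrium: price {p1:.1f}, quantity {q1:.1f}")
print(f"D2 equilibrium: price {p2:.1f}, quantity {q2:.1f}")
```

Both the price and the quantity rise when demand shifts out, which is exactly the comparative-statics result in the figure.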
All right, reason number three: counterfactuals. What do I mean by that? Well, you only get to run the world once; you only get to run the tape one time. But if we write models of the world, we can, in a sense, re-run the tape using those models. Here's an example. In April 2009, the spring of 2009, the federal government decided to implement a recovery plan. What you see here is the effect: this line shows what happened with the recovery plan, and this line shows what a model says would have happened without the recovery plan. Now, we can't be sure that's what would have happened, but at least we have some understanding, perhaps, of what the effect of the recovery plan was, which is great. These counterfactuals are not going to be exact, they're going to be approximate, but they still help us figure out, after the fact, whether a policy was a good policy or not.

Reason number four: to identify and rank levers. What we're going to do is look at a simple model of contagion of failure. This is a model where one country might fail; in this case that country is going to be England. Then we can ask what happens over time. You can see that initially, after England fails, Ireland and Belgium fail, after that France fails, and after that Germany fails. What this tells us is that, in terms of its effect on the world's financial system, London is a big lever, so London is something we care about a great deal. Now let's take another policy issue: climate change. One of the big things in climate change is the carbon cycle; simple carbon models are ones you use all the time. We know the total amount of carbon is fixed; it can be up in the air or down on the earth, and if it's down on the earth that's better, because it doesn't contribute to global warming. So if you want to think about where to intervene, you want to ask: where in this cycle are the big numbers? Look here at surface radiation: that's a big number. Think of solar radiation coming in: that's a big number coming in. So when you think about where you want a policy to have an effect, you want to think about where those numbers are large. If you look at the amount of radiation reflected by the surface, that's only a 30; that's not a very big lever.
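Here is a minimal sketch of the kind of contagion model described above. The exposures and the failure threshold are invented for illustration (the lecture only names the countries and the order in which they fail), so treat the numbers as placeholders.

```python
# A country fails once its total exposure to already-failed countries
# crosses a threshold; rank each country as a "lever" by how much of
# the system its failure brings down. All numbers are illustrative.
exposure = {  # exposure[a][b] = how hard b's failure hits a
    "Ireland": {"England": 0.6},
    "Belgium": {"England": 0.5},
    "France":  {"Ireland": 0.3, "Belgium": 0.3},
    "Germany": {"France": 0.5, "England": 0.1},
    "England": {},
}
THRESHOLD = 0.5

def cascade(seed):
    failed = {seed}
    while True:
        newly = {c for c, deps in exposure.items()
                 if c not in failed
                 and sum(w for b, w in deps.items() if b in failed) >= THRESHOLD}
        if not newly:
            return failed
        failed |= newly

for country in sorted(exposure, key=lambda c: -len(cascade(c))):
    print(f"{country:8s} -> brings down {sorted(cascade(country))}")
```

In this toy network England's failure drags down Ireland and Belgium, then France, then Germany, which is what makes it the biggest lever; the same ranking idea applies to the big flows in the carbon cycle.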
Okay, reason five: experimental design. What do I mean by experimental design? Well, suppose you want to come up with some new policies. For example, when the federal government was trying to decide how to auction off the federal airwaves for cell phones, they wanted to raise as much money as possible, and to test which auction designs were best, they ran some experiments. Here is an example of such an experiment: what you see is a round from an auction, these are the different bidders, and these are the prices they paid. What you want to think about is: how do I run the best possible experiment, the most informative possible experiment? And one way to do that is to construct some simple models.
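As one illustration of using a model before running a real experiment, here is a minimal sketch comparing the average revenue of two textbook sealed-bid auction formats with simulated bidders. The value distribution and the two formats are assumptions made for the sketch, not the actual designs the government tested.

```python
import random

def average_revenue(n_bidders, rule, rounds=50_000):
    """Average revenue with bidders whose private values are uniform on (0, 1).
    'first'  = first-price sealed bid; bidders use the standard equilibrium
               strategy of bidding (n-1)/n of their value.
    'second' = second-price sealed bid; bidders bid their value and the
               winner pays the second-highest bid."""
    total = 0.0
    for _ in range(rounds):
        values = sorted(random.random() for _ in range(n_bidders))
        if rule == "first":
            total += values[-1] * (n_bidders - 1) / n_bidders
        else:
            total += values[-2]
    return total / rounds

for rule in ("first", "second"):
    print(rule, round(average_revenue(4, rule), 3))
```

Under these assumptions the two formats raise about the same revenue on average (the revenue-equivalence result); richer models of bidders and license structures are what you would use to find where real designs differ, and that is what guides which experiments are worth running.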
All right, reason six: institutional design. Now, this is a biggie, and it's one that means a lot to me. The person you see at the top here is Stan Reiter, who was one of my advisors in graduate school, and the man at the bottom is Leo Hurwicz, one of my mentors in graduate school. Leo won the Nobel Prize in economics for the field known as mechanism design. Now, this diagram is called the Mount-Reiter diagram, named after Stan Reiter from the previous picture and Ken Mount, one of his co-authors. Let me explain the diagram, because it's very important. What you see here is this theta. It represents the environment: the set of technologies, people's preferences, those types of things. X over here represents the outcomes, what we want to have happen: how we want to use our technologies, our labor, whatever we have at our disposal, to create good outcomes. Now, this arrow here is what we desire; it's as if we could sit around and decide collectively what kind of outcomes we'd like to have given the technology. This is something called a social choice correspondence or a social choice function: what would be the ideal outcome for society? The thing is, society doesn't automatically get that ideal outcome, because to get outcomes you have to use mechanisms, and that's what this M stands for: mechanisms. A mechanism might be something like a market, a political institution, or a bureaucracy. What we want to ask is: is the outcome we get through the mechanism, which goes like this, equal to the outcome that we would ideally get? The better the mechanism is, the closer it comes to what we ideally want.

Example: with my undergraduate students, for a homework assignment one time, I said: suppose we allocated classes by a market, so you had to bid for classes. Would that be a good thing or a bad thing? Currently the way we do it is a hierarchy: seniors, the fourth-year students, register first, then juniors, then sophomores, then freshmen. So the students were asking, should we have a market? Their first reaction was yes, because markets work: you have a market, and what you get is roughly what you'd like to get, so the two are close to equal. But when they thought about choosing classes, everybody said, wait a minute, markets may not work well here, and the reason is that you need to graduate. Seniors need specific courses, and that's why we let seniors register first. If people could bid for courses, then the freshmen who had a lot of money might bid the courses away from the seniors, and people might never graduate from college. So markets may be a good institution in some settings and not in others, and the way we figure that out is by using models.
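Here is a minimal sketch of that classroom example, comparing the two mechanisms on one hypothetical cohort. The student numbers, budgets, and seat count are all made up; the point is only to show how a small model lets you compare a market against the seniority hierarchy on the outcome you actually care about, namely whether the students who must graduate get seats.

```python
import random

random.seed(1)

# Hypothetical students: year-4 students (seniors) need the course to
# graduate; bids reflect budget, which has nothing to do with that need.
students = [{"year": random.choice([1, 2, 3, 4]),
             "budget": random.uniform(0, 100)} for _ in range(200)]
SEATS = 60

def share_of_seniors_served(allocated):
    seniors = [s for s in students if s["year"] == 4]
    return sum(s in allocated for s in seniors) / len(seniors)

# Mechanism 1: market -- seats go to the highest bidders.
market = sorted(students, key=lambda s: s["budget"], reverse=True)[:SEATS]
# Mechanism 2: hierarchy -- seniors register first, then juniors, and so on.
hierarchy = sorted(students, key=lambda s: s["year"], reverse=True)[:SEATS]

print("market   :", round(share_of_seniors_served(market), 2), "of seniors get a seat")
print("hierarchy:", round(share_of_seniors_served(hierarchy), 2), "of seniors get a seat")
```

The hierarchy serves the students who have to graduate, while the market serves whoever happens to have money, which is exactly the students' second thought in the example above.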
Reason seven: to help choose among policies and institutions. Simple example: suppose we're considering a market for pollution permits, or a cap-and-trade system. We can write down a simple model, and it can tell us which one is going to work better. Or here's another example. This is a picture of the city of Ann Arbor, and if you look here you see some green areas; these are green spaces. There's a question: should the city of Ann Arbor create more green spaces? You might think, of course, green space is a good thing. The problem is, if you buy up a bunch of green space, like this area here that's all green, what can happen is people say: let's move next to that, let's build little houses all around it, because it's always going to be green, and that can actually lead to more sprawl. So what can seem like a really good, simple idea may not be a good idea once you actually construct a model to think it through.

Okay, we've covered a lot, so let's give a quick summary. How can models help us? First, they can help us be better real-time decision makers; they can help us figure out when to intervene and when not to intervene. Second, they can help us with comparative statics: we can figure out what's likely to happen if we make a particular choice. Third, they can help us with counterfactuals: after we implement a policy, we can run the model and think about what would have happened if we hadn't chosen that policy. Fourth, we can use them to identify and rank levers: often you've got lots of choices to make, and models can help figure out which choice might be the best or the most influential. Fifth, they can help us with experimental design: they can help us design experiments in order to develop better policies and better strategies. Sixth, they can help us design institutions themselves: should we have a market here, should we have a democracy, should we use a bureaucracy? And seventh, finally, they can help us choose among policies and institutions: if we're thinking about one policy or another, we can use models to decide between the two. All right. Thank you.