Hi, in this lecture we are going to look at our fourth category of reasons for why you'd want to take a course in modeling, why modeling is so important. And that is to help you make better decisions, strategize better, and design things better. So let's get started; this should be a lot of fun.

All right, so the first reason why models are so useful: they are good decision aids, they help you make better decisions. Let me give you an example; this'll get us going here. What you see is a whole bunch of different financial institutions — these are companies like Bear Stearns, AIG, Citigroup, Morgan Stanley — and this represents the relationships between these companies, in terms of how each one's economic success depends on the others'. Now imagine you are the federal government and you've got a financial crisis. A lot of these companies, or some of these companies, are starting to fail, and you've got to decide: okay, do I bail them out, do I save one of these companies? Well, now let's use one of these very simple models to help make that decision.

To do that we need a little more of an understanding of what these numbers represent. So let's look at AIG, which is right here, and JP Morgan, which is right here. Now we see a number, 466, between the two of them. What that number represents is how correlated JP Morgan's success is with AIG's success — in particular, how correlated their failures are. So if AIG has a bad day, how likely is it that JP Morgan has a bad day? And we see that it is a really big number. Now if you look up here at this 94, this represents the link between Wells Fargo and Lehman Brothers. What that tells us is that if Lehman Brothers has a bad day, well, it only has a small effect on Wells Fargo, and vice versa.

So now you are the government and you've got to decide: okay, who do I want to bail out? Nobody, or somebody? Let's look at Lehman Brothers. There are only three lines going in and out of Lehman Brothers — I guess four lines — and one is a 94, one is a 103, one is a 158, and one is a 155. Those are relatively small numbers. So if you're the government you say: okay, Lehman Brothers has been around a long time and it's an important company, but these numbers are pretty small; if they fail, it doesn't look like these other companies would fail. But now let's look at AIG. We've got a 466, we've got a 441, we've got a 456, we've got a 390 and a 490. So there are huge numbers associated with AIG. Because there are huge numbers, you basically have to figure: you know what, we probably have to prop AIG back up, even if you don't want to, because if you don't, there is the possibility that this whole system will fail.
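As a rough illustration of that reasoning — a minimal sketch, not the model behind the lecture's diagram — you can treat the figure as a weighted network, take the handful of co-movement numbers quoted above, and sum each institution's edge weights as a crude measure of systemic importance. The "Bank A" through "Bank G" names below are hypothetical stand-ins for the unnamed counterparties in the figure.

```python
# Sketch only: a partial edge list built from the numbers quoted in the lecture.
# "Bank A" ... "Bank G" are hypothetical stand-ins for unnamed counterparties.
edges = [
    ("AIG", "JP Morgan", 466),
    ("AIG", "Bank A", 441),
    ("AIG", "Bank B", 456),
    ("AIG", "Bank C", 390),
    ("AIG", "Bank D", 490),
    ("Lehman Brothers", "Wells Fargo", 94),
    ("Lehman Brothers", "Bank E", 103),
    ("Lehman Brothers", "Bank F", 158),
    ("Lehman Brothers", "Bank G", 155),
]

# Weighted degree: sum the co-movement numbers attached to each institution.
importance = {}
for a, b, w in edges:
    importance[a] = importance.get(a, 0) + w
    importance[b] = importance.get(b, 0) + w

# Rank institutions by how strongly the rest of the system moves with them.
for name, total in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total}")

# AIG's total (2243) dwarfs Lehman Brothers' (510), which is the intuition
# behind propping up AIG while letting Lehman Brothers go.
```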
So what we see here is the incredible power of models to help us make a better decision. The government did let Lehman Brothers fail — terrible for Lehman Brothers, but the economy sort of soldiered on. They didn't let AIG fail; we don't know for sure that it would have failed, and we don't know for sure that the whole financial apparatus of the United States would have gone down, but they propped up AIG and, you know, we made it, the country made it. It looks like they made a reasonable decision.

All right, so that is big financial decisions. Let's look at something more fun. This is a simple sort of logic puzzle that will help us see how models can be useful. This is a game called the Monty Hall Problem, and it's named after Monty Hall, the host of a game show called Let's Make a Deal that aired during the 1970s. Now the problem I'm going to describe to you is a characterization of an event that could happen on the show; it's one of several scenarios on the show.

Here's basically how it works. There are three doors. Behind one of these doors is a prize; behind the other two doors there's some, you know, silly thing like a goat, right, or a woman dressed up in a ballerina's outfit. So one of them has something fantastic like a new car or a washing machine. Now what you get to do is pick one door. So maybe you pick door number one, right, so you pick door number one. Now Monty knows where the prize is, so of the two doors you didn't pick, one of those always has a goat — you know, a silly prize — behind it. Because one of them always has a silly prize behind it, he can always show you one of those other two doors. So you pick door number one, and what Monty does is he then opens up door number three and says, here's a goat. Then he says, hey, do you want to switch to door number two? Well, do you?

All right, that's a hard problem, so let's first try to get the logic right; then we'll write down a formal model. It's easier to see the logic for this problem by increasing the number of doors. So let's suppose there are five doors, and let's suppose you pick this blue door, this bright blue door. The probability that you're correct is 1/5 — one of the doors has the prize, so the probability you're correct is 1/5, and the probability that you're not correct is 4/5. There's a 1/5 chance you're correct, and a 4/5 chance you're not. Now let's suppose that Monty Hall is also playing this game, because again, he knows the answer. So Monty is thinking: okay, well, you know what, I'm gonna show you that it's not behind the yellow door.
And then he says, you know what else I'm going to show you — that it's not behind the pink door. [inaudible] I'm gonna be nice, I'm gonna show you it's not behind the green door. Now he says, do you want to switch from the light blue door to the dark blue door? Well, in this case you should start thinking: you know, initially the probability I was right was only 1/5, and he revealed all those other doors that don't seem to have the prize. It seems much more likely that this is the correct door than that mine is the correct door — and in fact it is much more likely. The probability is 4/5 that it's behind that dark blue door and only 1/5 that it's behind your door. So you should switch, and you should also switch in the original three-door case.

Now let's formalize this. We'll use a simple decision tree model to show why, in fact, you should switch. All right, so let's start out, we'll just do some basic probability. There are three doors. You pick door number one; the probability you're right is a third, the probability that it's door number two is a third, and the probability that it's door number three is a third. Now, what we want to do is break this into two sets: there's a 1/3 chance that you're right and there's a 2/3 chance that you're wrong. After you pick door number one, the prize can't be moved. So it's either behind door number two or door number three — or, if you got it right, it's behind door number one.

So let's think about what Monty can do. If it's behind door number one or door number two, he can show you door number three. He can say, look, there's the goat. Well, if he does that, because he can always show you one of these doors, nothing happened to your probability of 1/3. There was a 1/3 chance you were right before, and since he can always show you a door, there's still only a 1/3 chance you're right. Alternatively, suppose it was behind door number three; well, then he can show you door number two. He can say, the goat's here. So it's still the case that nothing happens to your probability. The reason why, when you think about these two sets, is that you didn't learn anything. You learned nothing about this other set right here — the 2/3 chance you're wrong — because he can always show you a goat. So your initial probability of being correct was 1/3, and your final probability of being correct is still 1/3. So just this sort of idea of drawing circles and writing probabilities allows us to see that the correct decision on the Monty Hall problem is to switch, right.
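If the decision tree argument feels slippery, here is a quick Monte Carlo check — a sketch of my own, not something from the lecture — that simulates the game and compares staying with switching.

```python
import random

def play_round(switch, n_doors=3):
    """One round of the Monty Hall game; returns True if the contestant wins."""
    doors = list(range(n_doors))
    prize = random.choice(doors)
    pick = random.choice(doors)

    # Monty opens every door except the contestant's pick and one other door,
    # and he never reveals the prize. So the single door he leaves closed is
    # the prize door whenever the contestant's first pick was wrong.
    others = [d for d in doors if d != pick and d != prize]
    keep_closed = prize if prize != pick else random.choice(others)

    final_choice = keep_closed if switch else pick
    return final_choice == prize

def win_rate(switch, n_doors=3, trials=100_000):
    wins = sum(play_round(switch, n_doors) for _ in range(trials))
    return wins / trials

print("stay:  ", round(win_rate(switch=False), 3))  # about 0.333
print("switch:", round(win_rate(switch=True), 3))   # about 0.667
```

With three doors, switching wins about two-thirds of the time; setting n_doors=5 gives roughly 0.2 versus 0.8, matching the five-door intuition above.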
Just like when we looked at that financial decision the federal government had to make: with the circles and the arrows, you draw that out, and you realize the best decision is to let Lehman Brothers fail and bail out AIG.

All right, so let's move on and look at the next reason that models can be helpful, and that is comparative statics. What do I mean by that? Well, here is a standard model from economics. Comparative statics means, you know, you move from one equilibrium to another. What you see here is that S is a supply curve — that is, a supply curve for some good — and D1 and D2 are demand curves. So what you see is demand shifting out. When this demand shifts out, what we get is that more goods are sold — the quantity goes up — and the price goes up. So if people want more of something, more is going to get sold and the price goes up. This is where you start seeing how the equilibrium moves, so this is again a simple example of how models help us understand how the world — the equilibrium world — will change, just by drawing some simple figures.

All right, reason number three: counterfactuals. What do I mean by that? Well, you only get to run the world once, you only get to run the tape one time. But if we write models of the world, we can sort of re-run the tape using those models. So here is an example. In April of 2009, the spring of 2009, the federal government decided to implement a recovery plan. What you see here is sort of the effect: this line right here shows the effect with the recovery plan, and this line shows what a model says would have happened without the recovery plan. Now we can't be sure that that is what would have happened, but, you know, at least we have some understanding, perhaps, of what the effect of the recovery plan was, which is great. So these counterfactuals are not going to be exact — they're going to be approximate — but still they help us figure out, after the fact, whether a policy was a good policy or not.

Reason number four: to identify and rank levers. What we are going to do is look at a simple model of contagion of failure. This is a model where one country might fail — in this case that country is going to be England — and then we can ask what happens over time. You can see that initially, after England fails, we see Ireland and Belgium fail, after that we see France fail, and after that we see Germany fail. So what this tells us is that, in terms of its effect on the world's financial system, London is a big lever; London is something we care about a great deal — the sketch below shows the kind of calculation involved. Now let's take another policy issue: climate change.
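Here is that sketch: a minimal threshold contagion model for ranking levers. The exposure network and the 0.5 threshold are invented for illustration; they are not the data behind the lecture's map.

```python
# Invented exposure network, for illustration only (not the lecture's data):
# exposures[a][b] = how exposed country a is to country b.
exposures = {
    "England": {"Ireland": 0.3, "Belgium": 0.3, "France": 0.3, "Germany": 0.2},
    "Ireland": {"England": 0.7, "Belgium": 0.2},
    "Belgium": {"England": 0.6, "France": 0.4},
    "France":  {"Ireland": 0.3, "Belgium": 0.4, "Germany": 0.3},
    "Germany": {"France": 0.4, "England": 0.2},
}
THRESHOLD = 0.5  # a country fails once its exposure to failed countries reaches this

def cascade(seed):
    """Return the set of countries that eventually fail if `seed` fails first."""
    failed = {seed}
    changed = True
    while changed:
        changed = False
        for country, exp in exposures.items():
            if country not in failed and \
               sum(w for other, w in exp.items() if other in failed) >= THRESHOLD:
                failed.add(country)
                changed = True
    return failed

# Rank levers: seed a failure in each country and count how far it spreads.
for country in exposures:
    print(country, "->", sorted(cascade(country)))

# With these made-up numbers, a failure seeded in England eventually takes down
# the whole network, while a failure seeded anywhere else stays contained --
# England is the big lever.
```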
One of the big things in climate change is the carbon cycle; it's one of the models people use all the time — simple carbon models. We know that the total amount of carbon is fixed; it can be up in the air or down on the earth, and if it is down on the earth, that's better, because it doesn't contribute to global warming. So if you want to think about where to intervene, you want to ask: where in this cycle are there big numbers? So you look here at surface radiation — that's a big number. Or you think of solar radiation coming in — that's a big number coming in. So when you think about where you want to have a policy effect, you want to think about it in terms of where those numbers are large. If you look at the amount of radiation reflected by the surface, that's only a 30 — that's not a very big lever.

Okay, reason five: experimental design. What do I mean by experimental design? Well, suppose you want to come up with some new policies. For example, when the federal government was trying to decide how to auction off the federal airwaves for cell phones, they wanted to raise as much money as possible. Well, to test which auction designs were best, they ran some experiments. Here is an example of such an experiment: what you see is a round from some auction, these are different bidders, and this is what they paid. What you want to think about is: how do I run the best possible experiment, the most informative possible experiment? And one way to do that is to construct some simple models.

All right, reason six: institutional design. Now this is a biggie, and this is one that means a lot to me. The person you see at the top here is Stan Reiter — he was one of my advisors in graduate school — and the man at the bottom is Leo Hurwicz. He was one of my mentors in graduate school, and Leo won the Nobel Prize in economics for a field known as mechanism design. Now this diagram is called the Mount-Reiter diagram, named after Stan Reiter from the previous picture and Ken Mount, one of his co-authors. Let me explain this diagram to you, because it's very important. What you see here is this theta. What it is supposed to represent is the environment: the set of technologies, people's preferences, those types of things. X over here represents the outcomes, what we want to have happen — how we want to use our technologies and use our labor and use, you know, whatever we have at our disposal to create good outcomes.
Now this arrow here is sort of what we desire. It's like, if we could sit around and decide collectively what kind of outcomes we'd like to have given the technology, this is what we'd collectively decide. This is something called a social choice correspondence, or a social choice function — sort of, what would be the ideal outcome for society? The thing is, society doesn't automatically get the ideal outcome, because to get those outcomes you have to use mechanisms, and that's what this M stands for: mechanisms. A mechanism might be something like a market, a political institution, or a bureaucracy. What we want to ask is: is the outcome we get through the mechanism — which goes like this — equal to the outcome that we would ideally get? The better the mechanism is, the closer it comes to what we ideally want.

An example: with my undergraduate students, for a homework assignment one time, I said, suppose we allocated classes by a market. So, you know, if you had to bid for classes, would that be a good thing or a bad thing? Currently the way we do it is there's a hierarchy: seniors — you know, fourth-year students — register first, and then juniors, then sophomores, and then freshmen. And the students were asking, should we have a market? Their first reaction is yes, because markets work, right? You have a market, and what you get is sort of what you expect to get, what you'd like to get, so it's sort of equal. But when they thought about choosing classes, everybody goes, wait a minute, markets may not work well here, and the reason why is, you need to graduate. Seniors need specific courses, and that's why we let seniors register first. If people could bid for courses, then the freshmen that had a lot of money might bid away the courses from seniors, and people might never graduate from college. So as far as good institutions go, markets may be good in some settings and they may not be in others. The way we figure that out is by using models.

Reason seven: to help choose among policies and institutions. Simple example: suppose [inaudible] a market for pollution permits or a cap-and-trade system. We can write down a simple model, and it can tell us which one is going to work better. Or here is another example: this is a picture of the city of Ann Arbor, and if you look here you see some green areas — these green things are green spaces. There is a question: should the city of Ann Arbor create more green space? You might think, of course, green space is a good thing.
The problem is, if you buy up a bunch of green space — like this area here, which is all green — what can happen is people say, let's move next to that, let's build little houses all around here, because it is always going to be green, and that can actually lead to more sprawl. So what can seem like a really good, simple idea may not be a good idea if you actually construct a model to think it through.

Okay, we've covered a lot, so let's give a quick summary here. How can models help us? Well, the first thing they can do is serve as real-time decision aids; they can help us figure out when we should intervene and when we shouldn't. Second, they can help us with comparative statics: we can figure out, you know, what's likely to happen if we make this choice. Third, they can help us with counterfactuals: after we implement a policy, we can run a model and think about what would have happened if we hadn't chosen that policy. Fourth, we can use them to identify and rank levers: often you've got lots of choices to make, and models can figure out which choice might be the best or the most influential. Fifth, they can help us with experimental design: they can help us design experiments in order to develop better policies and better strategies. Sixth, they can help us design institutions themselves: figuring out, should we have a market here, should we have a democracy, should we use a bureaucracy. And seventh, finally, they can help us choose among policies and institutions, so if we are thinking about one policy or another policy, we can use models to decide between the two. All right. Thank you.