Hi, my name is Scott E. Page. I'm a professor of complex systems, political science, and economics at the University of Michigan in Ann Arbor, and I'd like to welcome you to this free online course called Model Thinking. In this opening lecture, I just want to do four things. The first thing I want to do is explain why I personally think it's so important, and fun, to take a course in models. Second, I want to give you a sense of the outline of the course: what we're going to cover and how it's going to be structured. Third, I'll talk a little bit about taking an online course. I've never taught an online course before, and you probably haven't taken one, so let's talk about how it's structured, how it's set up, what's out there on the web, that sort of thing. And the last thing I'll do is talk about how I'm going to structure particular units. Each unit will focus on a single model or a class of models, and I want to give you some sense of exactly how we're going to unpack those things, analyze them, and think about them. Alright? Okay, so let's get started.

Why model? The first reason, I think, is this: in order to be an intelligent citizen of the world, you have to understand models. Why do I say that? Well, when you think about a liberal arts education, some of us classically think of the great books, this long shelf of books that everyone should know. And when the great books curriculum was formed (I'll talk about this some in the next lecture), most of human knowledge didn't have models in it. Models are a relatively new phenomenon. So whether you go from anthropology to zoology or anywhere in between, when you go to college you'll find: oh my gosh, I'm learning models in this course. And we'll talk in a minute about some of the reasons why we're using models. But models are everywhere, and so just to be involved in a conversation these days, it's important that you can use and understand models.

Alright, reason number two. The reason models are everywhere, everywhere from anthropology to zoology, is that they're better: they make us clearer, better thinkers. Any time anybody has run a horse race between people using models to make decisions and people not using models, the people with models do better.
So models just make you a better thinker. The reason why is that they weed out the logical inconsistencies. They're a crutch: on our own we think silly things and can't think through all the logical consequences. Models tie us to the mast a little bit, and in doing so, we think better. We get better at what we do.

Alright, reason number three: to use and understand data. There's just so much data out there. When I first became a social scientist, it was a real effort for somebody to go grab a data set. Now there's just a ton of it. I like to think of it as a fire hose of data, but my friends who are computer scientists call it a hairball of data, because it's all mangled and messed up. Models take that data, structure it into information, and then turn that information into knowledge. Without models, all we've got is a whole bunch of numbers out there. With models, we actually get information and knowledge, and eventually maybe even some wisdom. At least we can hope, right? Okay.

Reason number four, the last of the main reasons. And by the way, in the next four lectures I'm going to work through and unpack each of these four reasons in more depth; I just want to lay them out in this first lecture. So, reason number four: to decide, strategize, and design. When you've got to make a decision, whether you're the President of the United States or whether you're running your local PTO, it helps to structure your information in a way that lets you make better decisions. So we'll learn about things like decision trees and game theory models, to help us make better decisions and strategize better. And at the very end of the class we'll talk about design issues: you can use models to design things like institutions and policies. So models just make us better at making choices, better at taking actions.

Okay, so those are the big four. Now let's talk a little bit about the outline of the course, what it's like. This isn't going to be a typical course, not just because it's online, but because the structure of the course is very different. Most courses, like a math course, start here and move along, with each thing building on the thing before it.
Now, the difficulty with a course like that is that if you ever fall off the train, fall behind, that's it, you're just lost, because to follow lecture six you need to know lecture five, and for lecture five you need to know lecture four. Well, this course is going to be very different. This course is going to be a little bit more like a trip to the zoo. [laugh] We're going to learn about giraffes, then we're going to learn about rhinos, and then we'll go over to the lion cage. So if you didn't quite understand the rhinos, it's not going to hurt you too much when we get to the lions. It's more like moving from one topic to the next. They're somewhat related, in that they're all animals, but you don't need to fully know the giraffe to move on to the rhino. Obviously, though, we're not going to study giraffes and rhinos; we're going to study models and [inaudible].

So what kind of models? We're going to study collective action models. These are models where individuals have to decide how much to contribute to something that's for the public good. We'll study things like the wisdom of crowds: how is it that groups of people can be really smart? We'll study models with fancy names, like [inaudible] models and Markov models. These are models of processes; they sound scary, but they're actually fun and interesting. We'll study game theory models. We'll study something called the Colonel Blotto game, which is a game where you have to decide how many resources to allocate across different fronts. It can be thought of as a really interesting model of war, but it can also be an interesting model of firm competition, or sports, or legal defenses, all sorts of stuff.
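To make the Colonel Blotto idea a little more concrete, here's a minimal sketch in Python of how one play of the game might be scored. The three-front setup, the troop counts, and the name blotto_payoff are illustrative assumptions for this sketch, not details from the lecture.

```python
def blotto_payoff(alloc_a, alloc_b):
    """Score one play of a simple Blotto game.

    Each player splits troops across fronts; a front goes to whoever
    sends more troops, and the player winning more fronts wins overall.
    Returns 1 if A wins, -1 if B wins, 0 for a tie.
    """
    wins_a = sum(a > b for a, b in zip(alloc_a, alloc_b))
    wins_b = sum(b > a for a, b in zip(alloc_a, alloc_b))
    return (wins_a > wins_b) - (wins_a < wins_b)

# Both players split 100 troops across 3 fronts (illustrative numbers).
# A spreads evenly; B concentrates on two fronts and concedes the third.
print(blotto_payoff([34, 33, 33], [50, 50, 0]))  # -1: B takes fronts 1 and 2
```

The example hints at why the game is interesting: spreading evenly loses to an opponent who concentrates forces, so there's no single allocation that beats all others.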
So we're going to play with a whole bunch of models, everything from economic growth to tipping points, and a whole bunch in between. It should be lots and lots of fun.

Okay, what's the format for this? How is it going to work? What does an online course even look like? Well, let's think about it. First, there are these videos; you're watching one right now. I'm going to try to keep them between eight and fifteen minutes in length. Sometimes I may sneak up to sixteen, but mostly I'll keep them to fifteen minutes. And inside the videos there'll be questions. So all of a sudden the video may stop and it'll say: what's the capital of Delaware? Well, actually it won't say that, but something germane, hopefully, to the lecture. So there'll be these eight- to fifteen-minute lectures, and each module, each section, will have somewhere between three and six of those. Okay.

In addition, there'll be readings. On the wiki you'll find links to the readings. Now, a bunch of these readings will come out of some books that I've written, and one that I'm about to write about, actually, this course, and it'll all be free. You should know that Princeton University Press has been very generous in letting a lot of the content of my books be out there, so you'll be free to download whatever you need to look at. Alright?

There are some assignments. You'll see a little assignment thing on the web page, all sorts of assignments, so just make sure you're following what's going on with the course. And then there will be some quizzes, just to make sure: hey, am I really getting this? You'll watch me and you'll think, yeah, Scott gets these models, but that's not what this is about; this is about you understanding the models. So there'll be some quizzes, but all in good fun.

Okay, and finally, there's the discussion forum. There are 40 to 50,000 people in this class, so office hours could get sort of crowded. So we're going to have a discussion forum where people can ask questions. I'll answer some; I've got some graduate students who'll answer some; other students can answer things. It'll be a place for people to share ideas, share thoughts, and give feedback, and it should hopefully be really useful and structured in a way that works for everybody.

Okay, so how does it work? What is one of these sections going to look like? Well, each section, and there are going to be 21 of them, is going to be focused on a particular model. We'll talk about the model and ask: What is the model? What are the assumptions? What are the parts? How does it work? What are the main results? What are the applications? We'll just talk through how the thing plays out. Then we'll go into some technical details, sometimes in the same lecture where I present the model, sometimes in later lectures. This will be more technical stuff, a little bit more mathematics. Now, I'll try to be very clear about whether the math is easy, medium, or hard.
I'll let you know upfront: okay, this may require a lot of algebra, or this is just simple logical thinking. So I'll be pretty clear about how much effort it's going to take to trudge through some of the examples. And there will be practice problems you can work on as well.

The other thing I'm going to do in every one of these sections is talk about the fertility of the models. Remember the Colonel Blotto model, which could be used to model war or sports or legal defenses? Most of these models were developed for one purpose, but we can apply them to other purposes. So we're going to talk a lot about: okay, now that we've just learned this model, where can we apply it? Where else does it work? Right? Okay, so that's it. That's how it's going to work.

Learning models is really important. It makes you a more intelligent citizen, probably just a more engaged person out there in the world, because so much of how people think and what people do is now based on models. It makes you a clearer thinker; that's why so many people are using models. It helps you use and understand data, and it's going to help you make better decisions, strategize better, and even design things better. So the course should be really, really useful. We're going to cover a lot of topics, and they don't necessarily build on the ones before. There'll be some quizzes and videos and that sort of stuff. And this should be just a great time. Alright. Welcome, and let's get started. Thank you.