
13. Sequential games: moral hazard, incentives, and hungry lions


Showing Revision 1 created 07/26/2012 by Amara Bot.

Professor Ben Polak: Okay, so I want to set up a new topic today, and to get us started I thought we'd play a game. So the game is going to be called cash in a hat, and we'll see what this game is like in a minute. So the idea is this: there are going to be two players. I'm going to pick on two of you. I hope I can find people who brought some money to class. So Player I will have a choice: Player I can put nothing, $1, or $3 into a hat. The hat will then be given to Player II, and Player II can look at what's in the hat and either she can match what's in the hat, i.e., add the same amount in, so 1 if it's 1 or 3 if it's 3, or she can simply take the cash.

The payoffs of this game will be as follows. Player I, if they put nothing in, they're not going to get anything. If they put 1 in and it's matched, then they double their money, so they get 2 back. If they put 3 in and it's matched, then they double their money, so they get 6 back, so they'll net 3. However, if they put 1 in and Player II takes it, they've just lost their dollar, and if they put $3 in and Player II takes it, they've lost $3. Everyone understand that? Pretty simple.

From Player II's point of view, if Player II matches then they get their investment back plus $1.50 if they matched 1, so they get 2.50 back in all; and they get their investment back plus $2 if they match 3, so they get 5 back in all. Of course, if they simply take the money out of the hat and put it in their pocket, then that's what they get in the game. So once again, if Player II matches with $1, there's $1 in the hat and she matches, then she gets back 2.50 for a profit of 1.50. If there's $3 and she matches 3, she gets back 5 for a profit of 2; and if she simply takes the money out of the hat and puts it in her pocket, then she gets however much money was in the hat.
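The payoff rules just described can be collected in a short sketch (Python; the function name and encoding here are my own, purely illustrative, not part of the lecture):

```python
# Net payoffs in the cash-in-a-hat game, as described above.
# Player I puts 0, 1, or 3 dollars into the hat; Player II then
# either matches the amount or takes it.

def payoffs(invest, action):
    """Return (net payoff to Player I, net payoff to Player II)."""
    if invest == 0:
        return (0, 0)          # nothing in, nothing out
    if action == "match":
        # Player I doubles her money, netting +invest.
        # Player II gets back 2.50 on a matched 1 (net 1.50)
        # or 5 on a matched 3 (net 2).
        return (invest, 1.5 if invest == 1 else 2.0)
    else:  # "take": Player II pockets the hat; Player I loses her stake
        return (-invest, invest)

print(payoffs(1, "match"))  # (1, 1.5)
print(payoffs(3, "take"))   # (-3, 3)
```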
So I here will be the provider of the hat, we'll use the same hat we used before, and if necessary I'll provide sheets of paper we can write IOUs on, but I'm hoping we can play for real cash. Later on--it would be nice to get a chair here--but for now, let me just go down here and I'm going to grab this mike. Who here wants to be Player I? Who brought some money to class today? I'm quite willing to pick on somebody. All right, good, so Player I is, what's your name again?

Student: Justin.

Professor Ben Polak: Justin. So hold onto that hat for a second, and who wants to be Player II? All right, so this gentleman here, whose name is?

Student: Nate.

Professor Ben Polak: Nate. All right, so we have Justin and Nate. So Justin, you have to decide how much money you're going to put in the hat. Did you bring some money too, Nate? Better make sure you have, otherwise borrow some from your neighbors.

Student: I'm going to put in a buck.

Professor Ben Polak: All right, so the money's being put in the hat and delivered to Nate.

Student: I'm going to put in a buck too so that I get 2.50.
Professor Ben Polak: All right, so if we empty out the hat--here's our beautiful pink hat. I should really have the hat passed across the room rather than have me do it, but we have $2 in the hat, so I'm going to pay--what am I going to pay? I'm going to pay Player I $2: I'm going to give them back their original $1 and give them $2 in a second, and I'm going to pay Player II $1.50. Just hold the hat.

Student: $2.50.

Professor Ben Polak: $2.50, you got your money back. Trying to cheat me, is that it? So Player I, I'll get you the rest in a second, hang on a second for now, so we double the money. So one reason to take this class is, if you play well, you get money.

All right, now let's just play again. So we had one round of this, let's have another round. Who wants to be Player I now? All right, so Katie is Player I. Katie, did you bring some money?

Student: Yeah, sure.

Professor Ben Polak: All right, and who wants to be Player II? Someone further away, let's have, Steven isn't it? Is that right?

Student: Yeah.

Professor Ben Polak: So Steven's going to be Player II. Did you bring some cash?

Student: I think so.

Professor Ben Polak: All right, otherwise borrow it from your neighbors, I have IOUs here. So here's the hat. You've got coins, great, so pass the hat down. Let's see if it makes it there, test the honesty of the class here. It's working its way down, I'm going to have to, there we go, all the way down to Steven. Check how much is in there.

Student: There's a dollar in here.

Professor Ben Polak: There's a dollar there, so you can decide whether to match or not.

Student: Okay, I'll match it.

Professor Ben Polak: All right, so Steven's going to match as well. Once again, we have this time, rather annoyingly, $1 and some coins in here, which I will empty on the stage to prove to everybody, and I will--all right, there are coins in here--so I'll match this in a second and give it back to you guys.
So everyone understand the game? Everyone understand how this game works? So we're going to spend a while discussing this. We'd go on playing, except I'd eventually lose so much money I'm going to run out of lunch money and the T.A.s would object.

All right, so what I want to do is I want to analyze this game. I want to think about what this game's about, what's really going on here. So first, this is just a little game involving putting some money in a hat. What I want to suggest to you is that this is a toy version of a much more important game. This is a toy version of a game involving a lender and a borrower. So you could think of our Player I's--that was Katie in the second round and Justin in the first round. Imagine them working in an investment bank, or perhaps in a venture capital firm, and what they're doing is they're giving a loan to somebody, some budding entrepreneur who's come up with a new project. So Steven in Round 2, and who was my budding entrepreneur in Round 1? What was your name?

Student: Nate.

Professor Ben Polak: Nate. So Steven in Round 2 and Nate in Round 1 have come up with some budding project. They've left Yale, perhaps after their junior year. They've left early because they've got some great idea which is going to make them millions. So either it's a new mousetrap or a new version of Facebook or something--and that's Harvard isn't it, so something better than a new version of Facebook--and they go to this venture capital firm. Maybe it's some firm in New York or maybe it's the Yale Investment Trust or whatever, and they explain this great idea to them and they ask for some money to invest in this firm, to buy machinery and to pay early wages and so on.
The lender, the guy who works at the venture capital firm, has to decide how much money to invest in this project. After this money has been invested in the project, the person who borrowed it faces choices. They could go forward with their project, work hard, spend the money on the things they're supposed to spend it on, or they could just disappear to Mexico (or go back to Yale for that matter). There's actually something in between they could do which is less dramatic than disappearing to Mexico: they could just not work particularly hard, shirk, perhaps join in someone else's project, and just let the money run down without the lender really knowing about it.

So this is a very real problem that some of you, some of the budding entrepreneurs in the room, are going to face when you leave Yale, and more of you, who are going to end up working for investment banks, perhaps unfortunately, are going to face as well.

Okay, so we're going to spend the whole of today analyzing this, but before we even start, let's notice that there's something different about this game, other than its sheer triviality. There's something different about this game than the games we've played all the way up to the mid-term so far.
What's different about this game? Can I get a mike in here? Yes, what's different?

Student: Moves aren't simultaneous.

Professor Ben Polak: The moves are not simultaneous. This is a sequential move game. Put the rules up here and bring this down. So for the first time we're looking at a sequential move game, and we're going to spend most of the rest of the term looking at sequential move games, or at least looking at games that involve sequential elements.

Now what really makes this a sequential move game? Let's be careful about this. What makes it a sequential move game is not simply that Player I moved first and Player II moved second, although that's true. What really made this a sequential move game is that Player II knew what Player I had done before he got to make his move. It isn't the timing per se that matters; it's going to turn out it's the fact that Player II could observe Player I's choice before having to make his or her choice that matters. Notice, just while we're on the subject, more than that: Player I knew this was going to be the case. So let's just get that down.

So in this game, Player II knows Player I's choice before she chooses--she being Player II, let's say before II chooses. Second, Player I knows that this will be the case. And the way we dramatized this just now is Player I put the money in the hat, the hat was transported across the classroom--with people very honestly not stealing the money--to Player II, and Player II could look in the hat, could see how much money was in there, before Steven or Nate had to make that choice.
Is that right? So how do we go about analyzing games that have this sequential structure? So it turns out that a useful way to analyze them, and the way I want to get us used to today already, is to draw a tree. So in the first half of the class we drew a lot of matrices; now we're going to draw trees. For those people who haven't seen trees before, not a big deal. It's an object that looks like a tree, that's why it's called a tree.

So let's have a look. So in this particular game, here is Player I making her choice, and in fact she has three possible choices: she can put nothing, 1, or 3 into the hat. Player II then chooses, although really Player II only has choices to make if there's some money in the hat, and Player II has two choices in this particular game in either case. Either Player II is going to add $1 or $3 into the hat, or they're going to disappear off to Mexico, taking our investors' lunch money with them.

Let's put down the payoffs. So let's put down as payoffs the net gains they make. So if nothing goes in, no one makes any money, so (0,0) will be the payoff. If 1 is matched by 1, then the investor, the lender, doubles her money, so she makes a net profit of 1. If the money is taken out by Player II, then she loses 1. Down here, if II matches her investment of 3, she doubles her money again, so she makes a net profit of 3, and if the money is taken out she loses 3. Everyone all right with that?

Let's do Player II's payoffs. In the case in which he matches 1, he makes a net profit of 1.50. In the case in which he just steals the 1, he makes a net profit of 1. And down here, in the case where he matches 3, he makes a net profit of 2--he gets 5 back but he'd invested 3 to start with--and here, if he just takes the money out, he makes a profit of 3.

So this game is just illustrated on the board, and for some of you this is the first time you've seen a tree, but I'm going to claim it isn't such a complicated object. We could spend a lot of time analyzing this in terms of connected graphs and so on, but it isn't really worth it: it's just a tree.
Okay, so let's just make sure we understand that the payoffs here represent Player I's payoff and Player II's payoff respectively. So how do we analyze what to do in these games? It's a good idea always to write out the tree, like it's a good idea to write out the matrix, but how to analyze what we should do? Well, before we do that, let me just grab the mike and find out from our players what they should do. So where was my first investor? So Justin, why did you put $1 in there?

Student: Well, if I put $1 in then he has an incentive to put $1 in too. If I put in $3 he's just going to take the money and run.

Professor Ben Polak: Okay, so what's Justin doing? Justin is looking forward, if you like. He's putting himself in Player II's shoes. Who was our receiver? Nate. Nate's shoes, and anticipating what Nate's going to do. And, as Justin said, if he puts in 3, Nate's going to have an incentive to take the money and run. In fact, what would Nate have done? If 3 had been in the hat, what would you have done?

Student: I like Justin, but I would have taken the money.

Professor Ben Polak: All right, even though you know Justin, you would have taken the money and disappeared somewhere, we don't quite know where, right? Is the same true for the other player? So Katie, you also put 1 in, is that your explanation as well?

Student: Same reasoning.

Professor Ben Polak: Same reasoning, okay. So everyone figured out what was going on, and once again, we think probably--let's try. Steven, if it had been $3 in the hat, what would you have done?

Student: I would have taken it.

Professor Ben Polak: Taken it, all right. So we're basically getting what's predicted here out of the game. Let's just see that idea more formally. So the idea here is that the players who move early on in the game should do something we've always talked about before. They should put themselves in the shoes of the other players, but here that takes the form of anticipation. They're anticipating what players down the tree are going to do.

So the key idea here is anticipation. What they're going to do is they're going to look forward down the tree; imagine themselves in the shoes of the later players; look at the incentives facing those later players; imagine that those later players do the best they can; and then walk backwards through the tree. So the key idea here is to look forward to the end of the tree and work back: look forward and work back. That's exactly the process that Katie and Justin described themselves as doing.

Now in this tree, let's have a go at this. So let's imagine ourselves in Steven or Nate's shoes, having found--as they did in fact--$1 in the hat. They then are making a choice between adding $1, which nets them $1.50, or taking the $1, which nets them $1. 1.50 is bigger than 1, so we think they're going to go this way. Everyone happy with that? Conversely, if they found themselves with $3 in the hat, then if they add $3--they match--they're going to net $2; and if they take the money out, they're going to get away with $3. 3 seems bigger than 2, so we think they're going to go this way. Again, we're assuming that these are their real payoffs.

Okay, so from Player I's point of view, since Player I knows that Player II will observe the choice before making her own choice, and since Player I can put him or herself in Player II's shoes and anticipate this choice, Player I is essentially choosing between what? If she puts in 0 she gets 0; if she puts in 1 she knows II will match, and she'll double her money and get 1; and if she puts in 3, then she knows Player II will take the money and run, in which case our lender will have lost $3. So she's choosing between 0, 1, and -3, and she's going to choose $1, which is exactly what happened in the game. Everyone happy with how we did that?
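The look-forward-and-work-back reasoning just walked through can be sketched as a small solver over this tree (Python; the `TREE` encoding and function names are my own assumptions, not from the lecture):

```python
# Backward induction on the cash-in-a-hat tree described above.
# At each of Player II's decision nodes we pick II's best action;
# Player I then chooses the investment anticipating those choices.

# Terminal payoffs (Player I, Player II) for each investment and
# each of Player II's responses, taken from the board.
TREE = {
    0: {"none": (0, 0)},
    1: {"match": (1, 1.5), "take": (-1, 1)},
    3: {"match": (3, 2.0), "take": (-3, 3)},
}

def solve():
    """Solve the game by backward induction.

    Returns (Player I's investment, Player II's reply, payoffs)."""
    best = None
    for invest, responses in TREE.items():
        # Look forward: Player II's best reply at this node is the
        # action maximizing II's own payoff.
        reply = max(responses, key=lambda a: responses[a][1])
        payoff = responses[reply]
        # Work back: Player I compares her anticipated payoffs.
        if best is None or payoff[0] > best[2][0]:
            best = (invest, reply, payoff)
    return best

print(solve())  # (1, 'match', (1, 1.5))
```

With these payoffs the sketch reproduces what Justin and Katie did: Player II's best reply to 3 is to take, so Player I anticipates that and invests only 1.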
So this idea of starting at the last player, the player who moves last, solving out what they're going to do, and then working our way back through the tree has a name. The name for this is "backward induction." I apologize to the business-school students in the room who have had this hammered into them too many times already, so I apologize to them in advance. Backward induction. Now it turns out there's precisely one controversial thing about backward induction, which is whether you should say backward induction or backwards induction. So I'm going to leave the "s" off and call it backward induction, but it's up to you which you think the correct English is.
Okay, so since we've got this game here, let's talk about it a bit. This idea of the lender and the borrower--it's not an unimportant situation. We've simplified it down to a very simple choice here, but it's a very basic choice out there in the real world. It's an important choice; after all, what keeps the economy running is the ability of lenders to lend money to businesses that then invest it profitably and are able to make returns. So this underlies a whole bunch of stuff going on in the economy.

So let's talk about this, and the first thing I want to talk about is, I want to point out that there's a problem here. There's a bad thing here, much like there was a bad thing in the very first class of the semester. In the very first class of the semester the bad thing was a Prisoner's Dilemma; here it's a little different, but let's focus on it a while.

So the bad thing is we ended up here. Not a disaster: the lender doubled her money. The borrower was able to go ahead with the project on a small scale and make some money. The bad thing, however, is that we would have liked to end up here. Had we been able to have a large project funded with $3, and had the borrower repaid those dollars, so matched in effort or whatever that happens to be, then we'd have ended up at an outcome that is better not just for the lender--it's obviously better for the lender--but it's also better for the borrower. Does everyone see that? So the outcome we would like to have reached gets the lender 3, whereas she's only getting 1 where we ended up, and it gets the borrower 2, where he's only getting 1.50 where we actually ended up. Why is it that we were unable, through both of our pairs, both Justin and Nate, and Katie and Steven--why were we unable to get to here? Any takers? What prevents us from getting to this good outcome?
Let's go back to Katie a second and see if we can--go ahead. I'll come, it's all right. So Katie was saying why she invested 1. I picked on Justin before, so here's this good outcome: it's better for you, it's better for Steven who's your borrower, why aren't we getting to this 3,2 outcome?

Student: Because if I play 3, him matching my money is dominated, strictly dominated.

Professor Ben Polak: All right, so the problem here is Katie would like to invest 3 if she knew that in fact Steven was going to match, but she knows Steven's not going to match. Steven, in fact, is going to run off with the cash. It's going to end up being spent at Starbucks or something. From Steven's point of view, he would actually like to be able to borrow the $3. He would like to be able to end up at this 3,2 outcome as well. The problem is he knows that Katie knows, and Katie knows that he knows, that in fact he's going to run off with the cash. Is that right? So we're in a bit of a problem here. We'd like to get this good outcome, but incentives are preventing us from getting there.

What do we call this kind of situation? What do we call this kind of problem? Anybody? No takers? This is called "moral hazard." How many of you have heard the term moral hazard before? A number of people have heard of it. So moral hazard here is the problem that the agent, in this case the borrower, has incentives--will have incentives--to do things that are not in the interests of the principal. If we're not careful, for example, by giving too big a loan, the incentives of the borrower will be such that they will actually not be aligned with the incentives of the lender, and that will prove to be bad for the lender. But notice the existence of this moral hazard problem isn't only bad for the lender. It ends up being bad for the borrower as well.
Let me give you another example: in insurance there's a classic, probably the classic, moral-hazard problem. If an insurance company insures me against having my car stolen, there's a moral hazard problem in that, if the car is fully insured, I might now have the incentive not to bother to lock my car, or to leave it anywhere on the street in New Haven, because I'm not bearing the cost. So the insurance company needs to worry about that in writing an insurance contract for me. And in practice, what the insurance company does is it forces me to take a deductible. It doesn't allow me full insurance. It makes sure that some of the cost of having a car stolen is going to fall on me. That's bad for me, by the way, because it means I can't get full insurance.

And notice that what happened here is rather similar. In this case, both the lender and the borrower would rather have a big project, a big loan, and have the agent pay back that loan. They'd prefer this (3,2) outcome, but they can't get there. And therefore, rather than having a full scale project, Katie or Justin only offered the borrowers a small scale project. So the idea here in this example was that we kept the size of the project, or the size of the loan--you can think of this as the project if they're borrowing something--small, to reduce the temptation to cheat, to reduce the temptation not to repay the loan. So these problems are all over society, but they're particularly prevalent in situations involving lenders and borrowers.
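The keep-the-loan-small logic amounts to a one-line incentive check (a Python sketch; the names are my own, and the matching profits are the ones from the game on the board):

```python
# Incentive check: Player II repays a loan of size s only if the
# profit from matching exceeds the temptation of taking the hat.
# Matching profits come from the game above: 1.50 on a $1 loan,
# 2 on a $3 loan.

MATCH_PROFIT = {1: 1.5, 3: 2.0}

def will_repay(loan):
    """True if the borrower's matching profit beats just taking the loan."""
    return MATCH_PROFIT[loan] > loan

print(will_repay(1))  # True: 1.50 > 1, the small loan is safe
print(will_repay(3))  # False: 2 < 3, the big loan gets taken
```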
So a good question is how might we solve this problem? A lot of you are about to go out and try and borrow money to set up your projects, run your new version of Facebook or whatever. And a lot more of you, as we said before, are going to end up loaning money, perhaps someone else's money probably, at least for a while. And we want to know how you might solve this moral hazard problem. We've seen one way to solve it. One way is for me to keep the size of the project small. What else could we do? Let me come down and try and have a bit of a discussion. What else could we do? Steven, what else could we do? You were the borrower, what else could you do?

Student: Impose laws.

Professor Ben Polak: We could impose laws. Okay, so we could impose laws much like we tried in the Prisoner's Dilemma; we could regulate this market. There's going to be a little bit of a problem we're going to run into in using the law here. What's the problem we're going to run into here? We do have laws that regulate borrowers and lenders. What's the law that regulates borrowers and lenders? Bankruptcy law, right? So we have law that regulates to some extent what lenders and borrowers can do. One of the things that we do in that law is we limit the degree to which we're allowed to punish the borrowers. So bankruptcy law pretty much says we're going to put some limit on how much Katie can inflict pain on Steven for running off with the lunch money. Steven can simply say I can't repay, I'm bankrupt, and he's more or less allowed to have a fresh start. So the law would be good here, but there are limits to what we can do. We could go back to the Dickensian world of throwing Steven in jail for not repaying the loan, but it's tough. There are downsides to that, it turns out. So what else could we do?
Yeah, I'm going to have to dive in there, so shout into the mike.

Student: Venture capitalists can impose restrictions on what the person can do with the money.

Professor Ben Polak: So you could try and say there are only certain things you can do with the money. And, for example, to make that credible, what you might have to do is say: show me what you're going to do with the money, show me the receipts. Or another thing you could say is, I will only give you the money if you show me exactly how you're going to spend it. That's a good solution. Notice that you could regard that as changing the order of play. It's a little bit like saying, let's have the borrower move first; let's have the borrower commit to what their actions are going to be, how they're going to spend the money--you know, show the contracts--before the lender lends them the money.

That's a good solution, but again there are limitations to this solution. What are the limitations to that solution in the real world? Let's go back to all of you who are thinking about making millions with your new Facebook: what are the limitations of having some contracts with the bank, with Chase, not only saying exactly how you're going to spend the money but actually detailing it already with contracts ahead of time? What are the problems with that?
Someone else who's thinking of going into business here--let me pick on somebody a second. So what's your name, sir?

Student: Kelly.

Professor Ben Polak: Kelly, so imagine you have some project, you're going to borrow some money from Chase or Mr. Swenson to set up this fund. Why might you worry about having everything written out contractually ahead of time?

Student: I mean, then it restricts your freedom on what your ideas are, I guess.

Professor Ben Polak: Right, so one problem is just sheer lack of flexibility. If, in fact, the entrepreneur is not going to be flexible on how they actually run the project, there's not much point in being an entrepreneur in the first place. There's a limit to how much control you want to give to the lender. The lenders don't really want to run the project for you. There's also a severe problem of timing. It's simply not going to be possible to specify all the expenditures of a project up front. You're going to need some money to do some things ahead of time. So it's a good idea--changing the order of play is a good idea--but there are going to be limitations to this. What else could you do as the lender? What else could you do? Thinking of loaning some money in here, loaning some money to one of your classmates for some project.
Let's try here.

Student: Something like loan money over time in stages and observe progress.

Professor Ben Polak: Good, so one possibility is to break the loan up and let it come in small installments. If you do well on the first installment, I'll give you a bigger installment next time. Notice that that's a little bit like something we talked about in the very first week. It's a little bit like taking this one-shot game and turning it into a repeated game. It's turning a one-shot interaction into a repeated interaction. We'll come back and talk about repeated interactions later on in the course. Again, notice that there might be some limitations to this. I mean, I think it's a good idea, and in general you should be trying to do this, but it could just be that the set up--for Nate to set up his new Facebook and for Steven to set up his new mousetrap factory--they just need a chunk of money at the beginning. There may be some limit to how little money you can give them at the beginning, since they have to hire workers, and set up the factory, and make some big splash in advertising, and so on and so forth. So again, it's a good idea but there may be limitations here. What else could you do?
  540. Let's have a look at the
    numbers on the board.
  541. Right now, the numbers on the
    board say that if Steven's
  542. mousetrap project--were you the
    Facebook or the mousetrap?--if
  543. Steven's mousetrap project goes
    well, it ends up being a big
  544. project with lots of money
    available.
  545. There are five units of money
    available in profit,
  546. 3 plus 2.
    The way we've designed things
  547. so far, our lender--I gave you
    these payoffs in the game--the
  548. lender is going to double her
    money.
  549. She's going to get 6 back to
    net 3 and Steven's going to get
  550. 2.
    So the division of that 5 is 3
  551. and 2.
    But we don't have to divide the
  552. money like that.
    If you're a lender you may want
  553. to think about how you divide up
    the spoils of a successful
  554. project.
    So how else could you divide up
  555. those $5, that 3 plus 2,
    in a way that might be better
  556. for everybody?
    Nate?
  557. Student: Give the
    investor $3.01 if they do that
  558. other choice and then take a
    little bit of a hit but that way
  559. you'll still have something
    that's pretty optimal to the
  560. other choices that we came to
    before.
  561. Professor Ben Polak: I'm
    guessing you all didn't hear
  562. that but that was right,
    so let me just say it.
  563. That's exactly right.
    So what you could do here is
  564. you could redesign the payoffs
    of this project in the event
  565. that it's successful.
    What you could do is take these
  566. 5 units of profit and instead of
    having (3,2)--let's write that
  567. in a different color--we could
    give the borrower,
  568. the entrepreneur 3.1,
    leaving 1.9 to the lender.
  569. All right, so we could change
    the payoffs of this contract to
  570. be (1.9,3.1) in the event of a
    big project.
  571. Why does that help?
    Because now our borrower,
  572. Nate or Steven,
    were they to end up with a hat
  573. with $3 in it,
    they now will find it in their
  574. interest to match the $3 and get
    3.1 rather than 3.
  575. So what's this an example of?
    This idea of changing the
  576. contract to give the borrower
    incentives to repay,
  577. incentives not to shirk,
    incentives not to disappear
  578. with the cash,
    is an example of what's called
  579. "incentive design."
  580. So the idea here is,
    in a lot of businesses,
  581. incentives are not given by
    God.
  582. They're designed by the people
    who write the contracts.
  583. If you're either the borrower
    or the lender,
  584. you should be thinking about
    writing a contract that's going
  585. to achieve the end that you
    want.
  586. Now notice in this particular
    example, exactly as Nate said,
  587. the lender's going to take a
    bit of a hit here,
  588. they're not going to get a 100%
    return, but they're ending up
  589. doing better than they would
    have done by giving a small
  590. loan.
    They've ended up getting 1.9
  591. rather than 1,
    so in this particular
  592. example--it won't always be the
    case but in this
  593. example--they're taking a
    smaller share of a larger pie.
  594. Sometimes a smaller share of a
    larger pie is bigger than a
  595. larger share of a smaller pie:
    not always but sometimes.
  596. The reason the pie is larger is
    we're able now in an incentive
  597. compatible way--we're able to go
    ahead and have a large project.
  598. So in this example,
    the smaller share of the larger
  599. pie actually ended up being
    bigger than the larger share of
  600. the smaller pie.
    So a smaller share of a larger
  601. pie can be bigger than a large
    share of a small pie,
  602. not always of course.
    This is just an example.
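The backward induction behind the redesigned contract can be checked in a few lines of Python. The payoff numbers are the ones from the lecture (matching the $1 loan gives the lender 1 and the borrower 1.5; the $5 surplus on the $3 loan is split either (3, 2) or (1.9, 3.1)); the code itself is only an illustrative sketch, not part of the lecture.

```python
# Backward induction for the "cash in a hat" lending game.
# All payoff pairs are (lender, borrower), using the lecture's numbers.

def borrower_choice(match_payoffs, take_payoffs):
    """Borrower picks whichever action maximizes her own (second) payoff."""
    return match_payoffs if match_payoffs[1] >= take_payoffs[1] else take_payoffs

def lender_choice(outcomes):
    """Lender anticipates the borrower's reply and picks the best loan size."""
    return max(outcomes.items(), key=lambda kv: kv[1][0])

def solve(split_big):
    """split_big = (lender, borrower) division of the $5 surplus
    in the event the $3 loan is matched."""
    lender_big, borrower_big = split_big
    outcomes = {
        0: (0.0, 0.0),                                 # no loan
        1: borrower_choice((1.0, 1.5), (-1.0, 1.0)),   # $1 loan: match vs. take
        3: borrower_choice((lender_big, borrower_big), (-3.0, 3.0)),
    }
    return lender_choice(outcomes)

print(solve((3.0, 2.0)))   # (1, (1.0, 1.5)): lender prefers the small loan
print(solve((1.9, 3.1)))   # (3, (1.9, 3.1)): the big loan is now safe
```

Under the original (3, 2) split the borrower takes the $3, so the lender lends only $1; shifting the split to (1.9, 3.1) makes matching the borrower's best reply, and the big loan goes through.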
  603. Now let's just be a little bit
    careful here.
  604. So in this particular example,
    even though this 1.9--this
  605. return that our lender is able
    to make by accepting a smaller
  606. share of a larger pie--even
    though it's a bigger absolute
  607. return,
    it's smaller in another sense.
  608. In what sense is it smaller?
    Let's be careful here.
  609. We're tempted to say look it's
    1.9.
  610. 1.9 is bigger than 1.
    That's got to be a good thing.
  611. But since some of you actually
    are going to go out into
  612. investment banks,
    and you might be posed this
  613. question,
    let me just make sure we don't
  614. make a blunder that's going to
    spoil your interview.
  615. So what might you be worried
    about as an investment banker
  616. about this deal,
    about this contract?
  617. Somebody else, anybody else?
    I'll come down again,
  618. that's all right.
    What might you be worried about?
  619. Who here, be honest,
    who here has been interviewing
  620. with either investment banks or
    venture capital firms,
  621. or private equity firms?
    Raise your hands.
  622. Oh come on there has got to be
    more than that.
  623. There was a hand.
    Where was the hand at the back?
  624. There was a hand way back here.
    No, yeah?
  625. What might be wrong with this
    project from your point of view?
  626. Can I get the mike on that side?
    This gentleman here,
  627. what might be wrong with the
    project from your point of view?
  628. Student: If you could
    actually just find three small
  629. projects then you could still
    get a 100% return just by
  630. dividing it up.
    Professor Ben Polak:
  631. Good, you're going to make lots
    of money for Morgan Stanley or
  632. whatever it is,
    good.
  633. Your name is?
    Brian, right so it's a good ad
  634. to hire Brian at Morgan Stanley.
    So what Brian said was it's
  635. true that you're making a bigger
    absolute return here.
  636. This is a bigger absolute
    return.
  637. But this return up here is a
    100% return on your capital.
  638. The rate of return on capital
    is higher making a 100% return
  639. on an investment of 1,
    than making a 1.9 return on an
  640. investment of 3.
    The rate of return is better in
  641. the smaller case.
    So which is the right answer?
  642. Should we worry about the rate
    of return, the 100% rate of
  643. return or should we worry about
    the absolute return?
  644. If you're in the business of
    investment banking,
  645. who thinks we should worry
    about the absolute return?
  646. Who thinks we should worry
    about the rate of return?
  647. Who thinks it depends?
    What does it depend on?
  648. (Keep me from falling off this
    thing.) What does it depend on?
  649. Somebody said it depends.
    What does it depend on?
  650. Depends is only a good
    compromise vote if you know what
  651. it depends on.
    Yeah, the gentleman here.
  652. Student: Since an
    investment banking firm only has
  653. limited amount of money to
    invest,
  654. you want to see if you can get
    a higher rate of return
  655. investing elsewhere.
    Professor Ben Polak: All
  656. right, so there's two--I think
    you're on the right line.
  657. So there's two things to worry
    about, one is what is your
  658. supply of funds to invest and
    the other is--what's the other
  659. supply you are worried about as
    an investment banker or venture
  660. capitalist firm,
    or a private equity investor?
  661. What's the other supply coming
    your way?
  662. The number of projects coming
    across your door.
  663. The demand for your cash.
    The supply of projects.
  664. What we have to worry about
    here is what is the true
  665. opportunity cost of that money?
    If there were very few projects
  666. out there then you basically
    just want to go down the list of
  667. projects picking off absolute
    return,
  668. however, you go to the other
    extreme if there's an infinite
  669. supply of projects,
    all coming in,
  670. all of which will offer a 100%
    return, then as our banker at
  671. the back told us,
    you should just go for three
  672. small projects rather than one
    large one.
  673. But the question is,
    which is better?
  674. It isn't as simple as saying,
    yes, we should always go for
  675. three small projects.
    There may not be three small
  676. projects out there that offer a
    100% return.
  677. So what's going to matter here
    is what's the true opportunity
  678. cost of the capital,
    and that's going to depend on
  679. what projects are available to
    you.
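Brian's point about opportunity cost can be made concrete with a small sketch. The helper function is hypothetical; it assumes, as in the lecture, that each small project takes $1 and returns 100% (nets $1), while the one big project takes $3 and nets 1.9.

```python
# Absolute return vs. rate of return: which wins depends on how many
# other 100%-return projects are coming across your desk.

def best_use_of_capital(capital, small_projects_available):
    """Net profit from the best allocation of `capital` dollars,
    in a stylized world with $1 projects netting $1 each and
    one $3 project netting 1.9."""
    # Option A: fund only small projects, one dollar each.
    small_only = min(capital, small_projects_available) * 1.0
    if capital < 3:
        return small_only
    # Option B: fund the big project, put the rest into small ones.
    with_big = 1.9 + min(capital - 3, small_projects_available) * 1.0
    return max(small_only, with_big)

print(best_use_of_capital(3, 3))  # 3.0: plenty of small projects, skip the big one
print(best_use_of_capital(3, 1))  # 1.9: projects are scarce, the big loan wins
```

With an unlimited supply of 100% projects the rate of return is what matters; with few alternatives, the 1.9 absolute return is the better use of the $3.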
  680. This was just an example of
    incentive design,
  681. just one example of incentive
    design.
  682. Let's talk about other examples
    of incentive design.
  683. Incentive design's been a lot
    in the news lately.
  684. Who else tends to have
    incentive contracts other than
  685. borrowers?
    Who else out there tends to
  686. have incentive contracts?
    It's a big topic in the news
  687. these days.
    So who has incentives written
  688. into their contracts?
    Way over here,
  689. let me run over at the risk of
    blurring the camera.
  690. So who else has incentives in
    their contracts?
  691. Student: Managers
    Professor Ben Polak:
  692. Managers, all right.
    So CEOs have huge incentive
  693. clauses in their contracts,
    sometimes in the form of
  694. options.
    Why do they have those
  695. incentive clauses in their
    contract?
  696. Well there's two
    interpretations of this.
  697. One interpretation is,
    it's just a bad thing and
  698. they're just trying to screw the
    world.
  699. But there's a more moderate
    view of this.
  700. The more moderate view of this
    is that those incentive clauses
  701. in their contracts are attempts
    to align the interest of the
  702. managers with the interest of
    the shareholders,
  703. just as this contract is
    attempting to align the interest
  704. of the lender and the interest
    of the borrower.
  705. So by managers I took him to
    mean CEOs, but what other kind
  706. of managers recently in the
    press,
  707. very recently,
    this weekend,
  708. have been talking about
    incentive contracts?
  709. Way over there,
    I'm caught on the wrong side as
  710. usual.
    Can I squeeze through?
  711. I'll try and squeeze through.
    Can I squeeze through?
  712. Sorry.
    Where was it,
  713. where was the answer to my
    question?
  714. Here we go.
    Student: Managers of
  715. baseball teams.
    Professor Ben Polak: All
  716. right, so managers of sports
    teams also have incentive
  717. contracts these days:
    incentives to achieve certain
  718. levels.
    Here it's not quite clear to me
  719. exactly what's going on.
    We call those incentive
  720. contracts but you might want to
    worry a little bit.
  721. I mean do we really think that
    the manager has an incentive to
  722. lose otherwise?
    So part of what's going on with
  723. incentive contracts isn't
    actually about incentives,
  724. part of it is about sharing the
    risk.
  725. In particular,
    the kind of contract that has
  726. been written about over the
    weekend--Joe Torre's contract
  727. for the Yankees,
    which he turned down--
  728. is perhaps partly to give him
    incentives to try and win,
  729. but perhaps it's more to do
    with sharing the risk among the
  730. general manager and the manager
    of the team if they don't make
  731. it to the playoffs.
    But all right,
  732. they're still incentive
    contracts and we see incentive
  733. contracts elsewhere,
    and we've seen them since the
  734. Middle Ages.
    So in the Middle Ages,
  735. incentive contracts took
    particular forms,
  736. and they were called two
    things:
  737. piece rates, and in agriculture
    it took the form of share
  738. cropping.
    What are piece rates?
  739. Somebody, what are piece rates?
    Anybody who knows what a piece
  740. rate is?
    What is share cropping?
  741. Somebody must be from a farming
    state.
  742. Somebody out here,
    the guy in orange.
  743. Student: Share cropping
    is where the landowner takes a
  744. portion of the farmer's crop at
    the end of the farming season.
  745. Professor Ben Polak: All
    right, so share cropping is a
  746. contract whereby part of the
    output of the farm goes to the
  747. landowner and part is kept by
    the farmer.
  748. Going back to the Middle Ages,
    part of the output would stay
  749. with the peasant and part of it
    would go to the lord of the
  750. manor.
    Piece rates are a similar idea.
  751. Rather than paying wages to
    workers, just bundles of cash,
  752. the amount which they're paid
    depends on output.
  753. So piece rates would say if you
    produce seventeen yards of cloth
  754. then you'll get paid the revenue
    that's derived from three of
  755. those.
    Both of these forms of
  756. contracting are there partly to
    create incentives.
  757. They create incentives because
    they create the incentive for
  758. the farmer to increase yields
    and they create incentives on
  759. the part of the worker to
    increase output.
  760. So piece rates are a big part
    of the story.
  761. Incentive designs are a pretty
    big topic and we don't really
  762. have time to go into it much in
    this course.
  763. But it's worth mentioning that,
    in fact, the Nobel Prize that
  764. was won last week by those three
    economists we talked about was
  765. partly about incentive design.
    So it's regarded as an
  766. important enough topic,
    the design of incentives across
  767. society as a whole to have been
    worthy of winning the Nobel
  768. Prize,
    and my guess is that there's
  769. another Nobel Prize that's going
    to come out in this same area in
  770. the next five or ten years.
    Just to make it clear,
  771. you can have problems:
    you can write incentives badly.
  772. It's a classic story that
    you've all heard of I'm sure,
  773. that often in the provision of
    public goods,
  774. let's say bridges or highways,
    or tunnels, the incentives for
  775. the construction companies tend
    to be rather badly written.
  776. They're meant to have incentive
    clauses but they never seem to
  777. quite get them right.
    It's easy to get these
  778. incentives wrong.
    Just to prove this doesn't just
  779. happen in management,
    let me just tell a story
  780. against myself.
    So a year ago,
  781. a year and a bit ago,
    I was picking fruit with my
  782. then three year old daughter and
    I wanted to make sure that we
  783. got some fruit home to eat for
    dinner,
  784. and she didn't just eat all the
    fruit as we picked it.
  785. These were raspberries.
    So I decided to have a rule
  786. that sounds efficient.
    I said any item of fruit that
  787. gets picked that's in good
    condition we'll take it home for
  788. dinner,
    but if it's squashed,
  789. then we'll eat it then and
    there while we're picking the
  790. fruit.
    So my three year old proceeded
  791. to go along the bushes of
    raspberries, picking the
  792. raspberry,
    turning around to me and saying
  793. "squashed" and putting it in her
    mouth: "squashed,"
  794. "squashed," which shows that a
    three year old knows more than a
  795. professor about moral hazard at
    some level.
  796. So one thing we can do is worry
    about incentive design in the
  797. design of the contract.
    There's another thing we can do
  798. here.
    What's the other thing we can
  799. do here?
    We're missing out on something
  800. very basic.
    What is it that our borrowers
  801. Steven and Nate can do?
    To put it more generally,
  802. for all of you,
    if you go out and start a
  803. company,
    and you borrow some money to
  804. start your company when you
    leave Yale, what are you going
  805. to have to do?
    What's the borrower--what's the
  806. lender, what's the bank going to
    insist--what is it going to
  807. insist on?
    Anybody?
  808. They're going to insist on
    collateral.
  809. What is collateral?
    I'm somewhat charmed to see how
  810. few of you are going to end up
    borrowing money,
  811. but for those of you who
    haven't got trust funds,
  812. you're going to end up
    borrowing money.
  813. So what is collateral?
    Yes, what's collateral?
  814. Student: You give
    someone the whole--You let the
  815. bank hold onto something.
    Professor Ben Polak:
  816. Right, you let the bank hold
    onto something in the event that
  817. you default on the loan.
    You're going to let the bank
  818. hold onto something in the event
    that you default on the loan.
  819. Now for most people here,
    what are they going to post as
  820. collateral?
    Let's assume that we're back to
  821. our original contract here--we
    can't do this--we're back to our
  822. original case.
    What are most of you going to
  823. end up using as collateral if
    you borrow money?
  824. Everyone's saying a house,
    I'm guessing that for most of
  825. you --I'm not going to look
    around the room because I don't
  826. know how many trust fund babies
    I've got in the room--but for
  827. most of you,
    it's going to be what?
  828. It's going to be your parents'
    house.
  829. So why is it that your parents'
    house being posted as collateral
  830. is going to prevent you from
    disappearing to Mexico with a
  831. loan?
    Well let's have a look at how
  832. that changes the payoffs.
    Now if you take the money and
  833. run, you disappear to Mexico,
    but you end up with $1 minus a
  834. house here, and $3 minus a house
    there.
  835. Of course, we're relying here
    on you liking your parents
  836. enough not to mind about the
    fact that they'll be sleeping in
  837. the rain,
    but as long as that's the case,
  838. I think with high probability
    you're now going to decide to
  839. return the loan,
    at least if you can.
  840. Is that right?
    Now notice the way in
  841. which this collateral works.
    There's a subtlety to how this
  842. collateral works.
    You might think that the way in
  843. which collateral works is that
    now the lender feels safer,
  844. because even if you end up
    defaulting on the loan,
  845. at least they get this house.
    That's true.
  846. There's some truth in that,
    but that's really not the key
  847. here.
    Frankly, most bank lenders
  848. don't want your parents' house.
    Your parents might have very
  849. nice houses but this isn't a
    good housing market right now
  850. and they don't particularly want
    to see your parents on the
  851. streets.
    The way in which this works is
  852. not so much that it gives an
    extra positive return to the
  853. lender,
    the way in which it works is it
  854. gives an extra negative return
    to the borrower.
  855. So let's just do this backward
    induction again with the house
  856. in place.
    So with this house in place,
  857. as before, you'll return the
    loan up here,
  858. but now provided you like your
    parents enough,
  859. you would also return the loan
    here because you prefer $2 to $3
  860. minus your parents' house,
    in which case you'll get a loan
  861. of 3.
    So the way in which the
  862. collateral worked was by hurting
    you enough (or in this case
  863. hurting your parents enough) in
    the event of default,
  864. in such a way as to change your
    behavior.
  865. But notice it helped you.
    So here's a subtlety coming in
  866. here.
    By issuing collateral,
  867. we actually lowered the payoff
    of the borrower at certain end
  868. points of the tree.
    We lowered the payoff of the
  869. borrower at certain end points
    of the tree.
  870. But as a consequence we were
    able to end up at an outcome
  871. that was better for the
    borrower.
  872. Sometimes having lower payoffs
    can actually make you better
  873. off, and the reason it makes you
    better off,
  874. is it changes the behavior of
    other players in the game in
  875. such a way as may benefit you;
    in this example,
  876. by inducing the lender to give
    you a bigger loan.
  877. So the way the collateral works
    is it lowers your payoffs if you
  878. do not repay,
    but it leads to you being
  879. better off.
    The reason it makes you better
  880. off is it changes the choices of
    others in a way that helps you.
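A minimal sketch of how collateral changes the borrower's endpoints, using the lecture's matching payoffs (net 1.5 on the $1 loan, net 2 on the $3 loan). The dollar value of the house to the borrower is a hypothetical stand-in; any value above $3 produces the same behavior.

```python
# Collateral works by lowering the borrower's payoff at the
# "take the money and run" endpoints, not by enriching the lender.
HOUSE = 10.0  # hypothetical value of the parents' house to the borrower

def borrower_takes(loan, collateral):
    """True if the borrower prefers absconding to matching the loan."""
    match_payoff = {1: 1.5, 3: 2.0}[loan]  # lecture's matching payoffs
    take_payoff = loan - (HOUSE if collateral else 0.0)
    return take_payoff > match_payoff

# Without collateral the $3 borrower absconds; with it, she repays.
print(borrower_takes(3, collateral=False))  # True
print(borrower_takes(3, collateral=True))   # False
```

Lowering the borrower's default payoff changes the lender's anticipation, which is exactly what lets the borrower get the $3 loan in the first place.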
  881. We're going to see lots of
    examples of that in the next
  882. couple of days.
    Now collateral is an example,
  883. we talked about incentive
    design, collateral is also an
  884. example of something larger.
    The larger thing is the idea of
  885. commitment.
    Collateral is a commitment
  886. strategy.
    It commits you to repaying the
  887. loan.
    I want to give another example
  888. of a commitment strategy.
    I think this idea of
  889. commitments is an important one,
    and we have just about time to
  890. do it.
    So I wanted to go completely
  891. away from our context of our
    lender-borrower game for a second
  892. and look somewhere else
    entirely,
  893. so we get rid of this for now.
    I want to take us back to 1066,
  894. what happened in 1066?
    What happened in 1066,
  895. I know this is America and no
    one gets an education anymore,
  896. but still, what happened in
    1066?
  897. The Norman Conquest took place
    in 1066.
  898. So the Norman Army invades from
    Normandy led by William the
  899. Conqueror and lands on the
    Sussex beaches at Hastings.
  900. What I want to imagine is a
    game involving the Norman Army,
  901. or William the Conqueror in
    particular, and the Saxon Army.
  902. So here's the little game.
    So we're in the area of
  903. medieval military strategy,
    and here is the game.
  904. We'll start with the Saxons
    here, the Normans have already
  905. invaded, so I guess the Normans
    have already moved, but here's
  906. the Saxon Army making a
    decision.
  907. They're deciding between
    fighting and the important
  908. medieval military tactic known
    as running away.
  909. Of course the Normans are going
    to know whether the Saxons are
  910. actually going to charge down
    the beach and attack them,
  911. and they are going to have to
    decide whether to fight or run
  912. away.
    Let's put the payoffs--now
  913. leave some space below here,
    leave some space down here on
  914. our notes--let's put the payoffs
    in.
  915. So I'm going to put the Norman
    payoffs first and the Saxon
  916. payoffs second.
    Norman payoffs first,
  917. Saxon payoffs second.
    So the payoffs are going to be
  918. like this.
    If both sides fight,
  919. a lot of people end up dead and
    we'll call that (0,0).
  920. If the Saxons fight and the
    Normans run away,
  921. the Saxons get 2 and the
    Normans get 1 because at least
  922. they're alive.
    If the Saxons run away and the
  923. Normans fight,
    then the Normans get 2 and the
  924. Saxons get 1.
    I just flip things around.
  925. And if both lots run away then
    ultimately it's good for the
  926. Saxons.
    So this is the game.
  927. Let's be careful,
    I've put the Saxon payoff
  928. second.
    If we analyze this game by
  929. backward induction we can see
    that the Normans are in trouble.
  930. Why are the Normans in trouble?
    Because if they are attacked by
  931. the Saxons then fighting yields
    0 and running away yields 1,
  932. so they're going to run away.
    Conversely, if the Saxons run
  933. away then in fact,
    the Normans will stay and fight
  934. because it's easy to stay and
    fight against people who are
  935. running away.
    Even I can do that.
  936. So they're going to go for 2
    rather than 1.
  937. But now the Saxons face this
    choice, they know that if they
  938. attack the Normans, the Normans
    will run away and the Saxons
  939. will get 2.
    But if the Saxons run away then
  940. the Normans will stay and
    fight, and the Saxons will only
  941. get 1.
    So the Saxons are going to end
  942. up fighting.
    Everyone see how we did the
  943. backward induction?
    So this is not a good situation
  944. if you happen to be the Norman
    commander, if you're the guy who
  945. led these troops across the
    channel.
  946. So what did William the
    Conqueror do?
  947. He burned his ships.
    So let's now make this game a
  948. little bit more like the real
    historical situation,
  949. here's a move that William the
    Conqueror, the King of the
  950. Normans can do.
    He can either not burn his
  951. ships or he can burn them.
    If he burns them then the game
  952. is rather different.
    Once again, the Saxons can
  953. decide whether to fight or run
    away, but now the Normans really
  954. don't have any choice.
    They have to stay and fight
  955. because swimming back to France
    wearing armor is difficult.
  956. So the payoffs become (0,0) and
    (2,1) as before.
  957. So we know now that the Normans
    are going to stay and fight.
  958. That's all that's available to
    them.
  959. I've put it that they don't
    even have the option of running
  960. away.
    If you want,
  961. you could put running yields
    them minus infinity or whatever
  962. it is to get drowned.
    So now, if the Normans
  963. have burnt the boats,
    the Saxons know that if they
  964. fight they will get 0,
    and if they run away
  965. they'll get 1,
    so now they'll run away.
  966. Then from William the
    Conqueror's point of view,
  967. what's he choosing between?
    He knows that if he doesn't
  968. burn his boats the Saxons will
    fight, and his troops will run
  969. away.
    But if he does burn his boats,
  970. the Saxons will run away and
    his troops will fight so he
  971. wants to burn his boats.
    Everyone okay with the example?
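The burn-the-boats argument can be checked by backward induction in code. Payoffs are (Norman, Saxon) as read off the board: both fight (0, 0); Saxons fight and Normans run (1, 2); Saxons run and Normans fight (2, 1). The both-run payoff is not fully specified in the transcript ("ultimately it's good for the Saxons"), so (1, 2) is assumed here; any Norman payoff below 2 in that cell gives the same answer.

```python
# Backward induction for the burning-the-boats game.
# PAYOFFS[(saxon_action, norman_action)] = (norman_payoff, saxon_payoff).
PAYOFFS = {
    ("fight", "fight"): (0, 0),
    ("fight", "run"):   (1, 2),
    ("run",   "fight"): (2, 1),
    ("run",   "run"):   (1, 2),  # assumed: "good for the Saxons"
}

def solve(boats_burned):
    """Saxons move first; Normans reply. Burning removes the Normans' 'run'."""
    norman_options = ("fight",) if boats_burned else ("fight", "run")

    def outcome(saxon_action):
        # Normans pick the reply maximizing their own (first) payoff.
        return max((PAYOFFS[(saxon_action, n)] for n in norman_options),
                   key=lambda p: p[0])

    # Saxons anticipate that reply and maximize their own (second) payoff.
    best_saxon = max(("fight", "run"), key=lambda s: outcome(s)[1])
    return best_saxon, outcome(best_saxon)

print(solve(boats_burned=False))  # ('fight', (1, 2)): the Normans run away
print(solve(boats_burned=True))   # ('run', (2, 1)): the Saxons retreat
# William compares his payoff across the two branches: 2 > 1, so burn.
```

Removing the Normans' option to run changes the Saxons' best reply from fighting to retreating, which is why fewer choices make William better off.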
  972. Now this is probably the
    classic example of a commitment
  973. strategy.
    William the Conqueror,
  974. in this example,
    or Cortez in perhaps the more
  975. famous example,
    burned the boats to get rid of
  976. some strategies,
    to get rid of some choices
  977. altogether.
    And you might think how can
  978. getting rid of choices make me
    better off?
  979. In a game,
    particularly in a sequential
  980. game,
    the reason getting rid of
  981. choices can make me better off
    is it changes the behavior of
  982. people on the other side of the
    game in such a way as to benefit
  983. me.
    In this particular example,
  984. it makes the other side less
    likely to fight.
  985. So once again this is a
    commitment and the way it works
  986. is--this example of a commitment
    is actually to have fewer
  987. options and it changes the
    behavior of others.
  988. If it didn't change the
    behavior of others it wouldn't
  989. be worth doing.
    So let's just talk about this a
  990. little bit more.
    There's another famous example
  991. of a military commitment
    strategy.
  992. It occurs in the movie Dr.
    Strangelove.
  993. How many of you have seen the
    movie Dr.
  994. Strangelove?
    How many of you have not seen
  995. the movie Dr.
    Strangelove?
  996. Good homework exercise for the
    weekend: rent the movie Dr.
  997. Strangelove.
    It's a very good movie.
  998. Stanley Kubrick I think.
    So in Dr.
  999. Strangelove,
    there's a famous commitment
  1000. strategy.
    What is the famous commitment
  1001. strategy in Dr.
    Strangelove?
  1002. Somebody?
    Let's get a mike.
  1003. Ale can we get a mike down here?
    There's one on the way.
  1004. The guy in white, there you go.
    Student: They lose radio
  1005. contact with the base,
    which means that they're going
  1006. to end up dropping the bomb.
    Professor Ben Polak: All
  1007. right, that's true.
    I was thinking of something
  1008. more deliberate.
    What's the more deliberate--the
  1009. guy in red--while you're there,
    somebody in red here.
  1010. Here we go.
    Student: The Soviets
  1011. construct a doomsday device so
    if they're attacked they
  1012. automatically launch all of
    their nuclear weapons.
  1013. Professor Ben Polak:
    Good, so the worry that the
  1014. Soviets have is they think the
    Americans might attack them,
  1015. and they worry that the
    Americans might attack them
  1016. thinking that they won't
    retaliate,
  1017. because they won't really have
    any incentive to retaliate once
  1018. their cities are destroyed.
So they construct a device they called the doomsday device that will automatically launch nuclear missiles back on the Americans if the Russians are attacked. It's a commitment device to make it credible that they will actually respond were they to be attacked by the crazy Americans. So there's a mistake, however, that the Russians make. What is the mistake that the Russians make? The guy in the Yale sweatshirt here.

Student: They don't tell anyone about the machine.

Professor Ben Polak: Right, they don't tell the Americans that they've built the doomsday device. So this is really an important idea hidden in what's really a great movie. Commitment strategies are a good idea. They can make things credible. But the only reason that having fewer strategies is good for you is because it changes the behavior on the other side. In this case, it changes the behavior of the Saxons. It doesn't work if the other side doesn't know about it. It's crucial that the other side knows. So the other players must know.

Just to hammer this home, notice that the expression in the English language is "burning your boats." It's not "going over to your boats and quietly drilling a little hole in them," it's burning your boats. You want the other side to see this big bonfire on the beach and to know that the boats are no longer available. Just as in Dr. Strangelove, it's crucial, and it's a terrible mistake for the Russians not to tell the Americans that they've built this machine. So knowledge here is crucial, knowledge of the other side.

Now I should just complete the history of this. This didn't work in the case of the Battle of Hastings. The Saxons attacked anyway and it just turned into a big bloodbath, and the Normans really only won this battle because somebody got a lucky shot and hit King Harold in the eye. So this is a good story, but as a military tactic not actually a true story. I'm adding that in, because otherwise I'm going to get hundreds of letters from people who watched the video.
Now we've had a bunch of lessons today, but I'm not done, I want to play one more game. Before I do though I want to make sure we understand--we're going to play another game so everybody stay put. I want to make sure. There's a bunch of lessons on the board, and I want you to understand what's important and what's more important. Everything is important, but some things are more important. So the idea of commitment is an important idea, and the idea of incentive design is an important idea, but there's one idea here that is more important than anything else, and that idea is backward induction. I'll put a little sun around it okay.

How important is backward induction? Backward induction is the most important thing I'm going to teach you in the second half of the semester. It's probably the most important thing I'm going to teach you in the whole semester, and it may be the most important thing you're going to learn at Yale. It's the answer to all questions. If I ask you a question, and, as happens from time to time, you've been sleeping in class and you wake up startled because I've now asked you a question, and that camera is on you, and you don't know what the question is, then the answer is backward induction. I'm stressing this because six weeks from now, whatever it is, in December, there's going to be a final exam, and I guarantee now that there will be a question on the final exam that involves backward induction, and I guarantee that at least one person in the room will forget to solve it by backward induction. So are you going to use backward induction on the final exam? Yes, you're all going to remember it.

Okay good, now we'll play one more game just for fun. But before we play one more game, I'll do one other thing--sorry, one other thing before we play one more game, I apologize.
The one other thing I want to do here is I put up a tree here, but I forgot to tell you anything about the tree. This was a tree, and I would be remiss if I didn't tell you a few things about trees. So there's a certain amount of jargon that goes along with trees, and I want you to go home and read the textbook and know what that jargon is. So these spots, these dots in the tree, these things are called nodes. And in fact the end points of the tree where the payoffs are, these are also called nodes, but they're called end nodes. These branches in the tree, the branches, here they are, the branches are called edges. So a tree consists of nodes, edges, and these missing dots are the end nodes. The end nodes are associated with payoffs. Do we need to write that? That's obvious right: they're associated with payoffs. And every other node belongs to somebody whose turn it is to make a decision at that node. These other nodes, the ones that aren't end nodes, are called decision nodes. It's perfectly possible, for example, in our Norman Conquest game, for somebody to have several nodes. Somebody could move once and then later on again. There may also be nodes that are never reached in a game. One last piece of jargon: a route through a tree, from the beginning to an end, is called a path, a way of getting through the tree from beginning to end.
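The jargon above can be made concrete in a few lines of code. This is a minimal sketch, not part of the lecture: the `Node` class and the small two-player entry game used to exercise it are illustrative assumptions, but the solver is a standard implementation of backward induction on a tree of decision nodes, edges, and end nodes.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class Node:
    # Decision nodes have a player whose turn it is; end nodes have payoffs.
    player: Optional[int] = None
    payoffs: Optional[Tuple[int, ...]] = None
    # Edges: action label -> child node.
    edges: Dict[str, "Node"] = field(default_factory=dict)

def backward_induction(node: Node):
    """Solve the tree from the end nodes back toward the root.

    Returns the payoff vector reached and the path (the list of edge
    labels) taken when every player plays optimally."""
    if not node.edges:                      # end node: payoffs are given
        return node.payoffs, []
    best = None
    for action, child in node.edges.items():
        payoffs, path = backward_induction(child)
        # The mover at this node picks the action maximizing her own payoff.
        if best is None or payoffs[node.player] > best[0][node.player]:
            best = (payoffs, [action] + path)
    return best

# A tiny illustrative tree: player 0 moves at the root; if she goes "In",
# player 1 moves at a second decision node.
tree = Node(player=0, edges={
    "Out": Node(payoffs=(0, 2)),
    "In": Node(player=1, edges={
        "Fight": Node(payoffs=(-1, -1)),
        "Accommodate": Node(payoffs=(1, 1)),
    }),
})

payoffs, path = backward_induction(tree)
print(payoffs, path)   # (1, 1) ['In', 'Accommodate']
```

Solving the lower decision node first (player 1 prefers Accommodate), then the root given that answer, is exactly the "start at the end nodes and work back" logic the lecture keeps stressing.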
So we'll have a pop quiz on the jargon later, and now let's play the last game I want to play. We've got ten minutes right? Everyone understand the jargon? I want to play a game now.

So the game we're going to play is a game we're going to call the "Hungry Lion Game." And what I'm going to do is I'm going to pick on a row of the class, actually I'm going to pick on this row of the class, and this row of the class, everyone can see this row? From up there, lean over the balcony. This row of the class, we're going to imagine that everybody in this row is a lion. Do they all look like lions? Not much, but never mind. Pretend they're lions: suspension of disbelief. They're all lions and they're all hungry, except for one person, who's this guy at the end whose name is? So Alex is a sheep. Alex had the misfortune of wandering into this pack of--pride of lions and is liable to get eaten because they're hungry.

However, one thing stands between Alex the sheep and being eaten. The one thing that stands between him and being eaten is that in lion society, the only lion who is allowed to eat the sheep is the head lion, the big lion whose name is Ryan. Ryan is the head lion here. Only Ryan can eat Alex the sheep. No other lion can eat Alex the sheep. Now there's a catch if Ryan eats Alex--if Ryan the lion eats Alex the sheep. The catch is that if Ryan the lion eats Alex the sheep, then Ryan the lion will fall into a postprandial stupor, fall asleep, at which point the second largest lion, whose name is Chris, can eat Ryan. Now notice Chris can never eat the sheep. He can only ever eat the lion, the big lion, and only then if the lion has eaten the sheep and falls asleep. This is very strict. Lion society is very hierarchical, like England. If, however, Ryan the lion eats Alex the sheep and then falls asleep and is eaten by Chris the lion, then and only then Chris will fall into a stupor, at which point he can be eaten by Isabella, who's the third largest lion, and so on. Everyone understand the game?

No questions, no talking among yourselves. Write down what you would do if you were our key decision maker, who is Ryan the lion. But before we write it down, can I get Alex to stand up a second, stand up, camera on him, this is the menu. Thank you. So you have to write down in your notepad whether you, Ryan the lion, in Ryan's shoes or paws or whatever, whether you would eat Alex the sheep. Everybody written something down? Let's ask Ryan the lion what he's going to do, so Ryan are you going to eat Alex the sheep?
Student: No.

Professor Ben Polak: He says no, he's not going to eat it, why not?

Student: Well then because I would be eaten as well.

Professor Ben Polak: Then he'd be eaten. Well let's see if he would have been eaten. So had he eaten Alex the sheep, then he would have fallen asleep, and then he'd have been in danger of being eaten by Chris. Chris, would you have eaten Ryan?

Student: No.

Professor Ben Polak: So he wouldn't have eaten him in fact. In fact, it looks like Ryan would have got a free lunch. There are no free lunches in Economics, so something's wrong. Why is Chris worried about it? Chris is worried about it because--Chris is worried about Isabella eating him, and what am I doing wrong? What's wrong with this analysis? By the way, before we say what's wrong, how many of you in Ryan's shoes would have eaten the sheep? How many of you would not have eaten the sheep? Okay, what am I doing wrong? I'm in totally the wrong part of the room. I shouldn't be here at all. I shouldn't be at all in this part of the classroom. I can't get around. Wait a second. Excuse me. I should be--which row were my lions? Where's my lion row? I should be way over here with the baby lion, whose name is Agay.

So Agay is a little baby lion--very cute--but Agay, if he gets a chance to eat, is going to eat. Why is he going to eat? Because there's no one to eat him. So if Agay gets the chance to eat the second babiest lion, whose name is John, then Agay will eat. So if John gets a chance to eat Ben, is John going to eat Ben? John, are you going to eat Ben? No, John's not going to eat Ben, because John knows he'll get eaten by Agay. So Ben, if you get a chance to eat Vidur, are you going to eat Vidur? Wait a second. We just argued that if you eat, then the second littlest lion, whose name was John, is not going to eat you. Yes. So we know the littlest lion is going to eat. So the second littlest is not going to eat. So the third littlest lion can eat safely. So the fourth littlest lion should not eat. So the fifth littlest lion should eat. We've got eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, not eat, eat, is that right?
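That alternating eat/not-eat pattern is exactly what backward induction delivers here. A minimal sketch of the argument in code (the function name and the numbering of positions are my own framing, not the lecture's): start from the littlest lion, whom nobody can eat, and work back up the chain.

```python
def lion_eats(position, n):
    """Backward induction on the hungry lion game with n lions.

    Position 1 is the head lion (the only one who can eat the sheep);
    position n is the littlest lion, whom nobody can eat.  A lion eats
    its prey only if the lion just below it in the hierarchy would then
    NOT eat it while it sleeps."""
    if position == n:
        return True          # the littlest lion is safe, so it eats
    # Eat only if the next lion down would not eat you afterwards.
    return not lion_eats(position + 1, n)

# The head lion eats exactly when the row holds an odd number of lions.
for n in (1, 2, 3, 4, 5):
    print(n, lion_eats(1, n))
```

So whether Ryan should eat Alex flips with every lion you add to the row: the answer depends only on the parity of the chain, which is why you cannot reason about it from the front--you have to start at the end.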
So in fact he should have eaten him, but the more important point is how should we have analyzed this game? What should we have used? Backward induction. How many of you used backward induction? Be honest, so what's the point here? The point is this: five minutes ago I said the most important thing you'll ever learn at Yale is backward induction, then I distracted you with some fairly irrelevant nonsense, and five minutes later nobody used backward induction. So the main lesson from today, this is where we're going to pick up on Wednesday, is to use something--what are we going to use? Backward induction. All right, we'll see you on Wednesday.