20. Subgame perfect equilibrium: wars of attrition


  1. Professor Ben Polak:
    Last time we looked at how
  2. to apply our new idea of
    sub-game perfect equilibrium to
  3. a whole bunch of games,
    and our general idea for how to
  4. solve the sub-game perfect
    equilibrium is as follows.
  5. We looked at each sub-game.
    We solved for the Nash
  6. equilibrium in the sub-game,
    that is something we learned to
  7. do long ago.
    And then we rolled back the
  8. payoffs: we rolled them back up
    the tree.
  9. And towards the end we learned
    something interesting.
  10. I'm not going to go back to it
    today--I just want to emphasize
  11. it.
    We learned that strategic
  12. effects matter.
    So in that investment game we
  13. looked at last time,
    when you're considering whether
  14. to rent a new piece of
    machinery,
  15. it made a very big difference
    whether you considered how this
  16. action would affect the actions
    of the other side;
  17. in this case,
    how it affects your
  18. competition.
    This is a very general idea,
  19. a very general point.
    So just to give you a couple of
  20. more examples,
    when you're designing tax
  21. systems--I mentioned this last
    time--when you're designing a
  22. tax system,
    to make some changes in the
  23. U.S.
    tax system, it's not good
  24. enough to look at how people are
    behaving in the old tax system
  25. and just calculate in an
    accounting manner how much more
  26. money you're going to raise,
    or how much money it's going to
  27. cost you.
    You have to take into account
  28. how that's going to lead to
    changes in behavior.
  29. Once again, that's a strategic
    effect, and in the homework that
  30. you're handing in today,
    all of you will have had a nice
  31. example of that in the toll
    booth problem.
  32. So in the toll booth problem,
    when you're putting tolls on
  33. roads--or more generally,
    when you're building new roads,
  34. new bridges,
    new flyovers,
  35. new bypasses,
    you need to take into account
  36. how those new tolls,
    how those new roads will affect
  37. all of traffic flow.
    Traffic flow down the tree will
  38. form a new equilibrium and you
    need to consider that in
  39. designing your tolls and
    designing your road system.
  40. So that's another example of
    strategic effects mattering.
  41. So today I want to do something
    quite different,
  42. a little bit like what we did
    with duel,
  43. I want to play a game today,
    and probably spend the whole of
  44. today analyzing this one game.
    So it's quite a complicated
  45. game, but it's quite a fun game.
    So what's the game we're going
  46. to look at?
    The game is going to involve
  47. two players, and each player in
    each period, they choose--or
  48. each chooses I should say--each
    period,
  49. whether to fight or to quit.
    So F means fight and Q means
  50. quit, and they make this choice
    simultaneously.
  51. The game ends as soon as
    someone quits.
  52. So there's good news and bad
    news for this game.
  53. Let's do the good news first.
    The good news is that if the
  54. other player quits first you win
    a prize.
  55. Generally we'll call this prize
    V, but we'll play for some cash
  56. in a minute.
    The bad news is,
  57. each period in which both
    fight--so each period in which
  58. both players choose to
    fight--each player pays a cost,
  59. so they pay -C.
    Just to keep things interesting
  60. let's fill in the other thing
    here which is if both quit at
  61. once--so if both quit at once
    then they get 0 that period.
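The rules just laid out can be summarized as a one-period payoff function. This is a minimal sketch, not part of the lecture itself; the function name `stage_payoffs` and the default values V = 1 and C = 0.75 (the stakes played in class a little later) are illustrative assumptions.

```python
def stage_payoffs(a, b, V=1.0, C=0.75):
    """One period of the war of attrition.

    a and b are 'F' (fight) or 'Q' (quit).  If the other player quits
    first you win the prize V; if both fight, each pays the cost C and
    the game continues; if both quit at once, each gets 0.
    """
    if a == 'F' and b == 'F':
        return (-C, -C)   # both fight: each pays the cost, play goes on
    if a == 'F' and b == 'Q':
        return (V, 0.0)   # A wins the prize
    if a == 'Q' and b == 'F':
        return (0.0, V)   # B wins the prize
    return (0.0, 0.0)     # both quit at once

print(stage_payoffs('F', 'Q'))  # (1.0, 0.0)
```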
  62. So this is a game we've seen a
    little bit before.
  63. We saw a little bit under the
    auspices of Hawk Dove.
  64. Those people in the MBA class
    saw a game a lot like this.
  65. But we're going to analyze this
    in much more detail than we did
  66. before.
    As I said, we're going to spend
  67. the whole of today talking about
    this game.
  68. So, to start this out,
    let's actually play this game.
  69. So I want two volunteers.
    Let me just,
  70. since this is college football
    season, let me see if I can play
  71. off the rivalries.
    So do I have anybody here from
  72. the great state of Texas?
    A whole bunch of Texans,
  73. keep your hands up.
    I want to use you a second and
  74. I guess the rivalry here is
    Oklahoma. Anybody from
  75. Oklahoma?
    No Oklahomans?
  76. What we'll do is we'll pick two
    Texans then.
  77. We'll assume this is Texas and
    Texas A&M.
  78. So Texans raise their hands
    again.
  79. All right, we're going to pick
    out two Texans and I'm going to
  80. give you a mike each.
    So your name is?
  81. Student: Nick.
    Professor Ben Polak:
  82. Why don't you keep hold of
    the mike and just point it
  83. towards you when you speak,
    but still shout because
  84. everyone here wants to hear you.
    So this is Nick,
  85. where was my other Texan back
    there?
  86. Why don't I go for the closer
    one.
  87. We'll start here.
    And your name is?
  88. Student: Alec.
    Professor Ben Polak:
  89. Alec, so shout it out.
    Student: Alec.
  90. Professor Ben Polak:
    That's better,
  91. okay.
    So the game is this.
  92. They're going to have to write
    down for the first period
  93. whether they choose fight or
    quit.
  94. Each player will have a referee.
    So the person behind Alec is
  95. going to be Alec's referee to
    make sure that Alec is actually
  96. saying what he says he's going
    to do.
  97. And what happened to my other
    Texan?
  98. I've lost my other Texan.
    There he is.
  99. Your name again was?
    Student: Nick.
  100. Professor Ben Polak:
    Nick is going to write down
  101. fight or quit.
    And to make this real,
  102. let's play for some real cash.
    So we'll make the prize--why
  103. don't we make the prize equal to
    a $1 and the cost equal to $.75.
  104. So I've got some dollars here.
    Here we go.
  105. What do we call this?
    In Texas we call this a
  106. fistful of dollars,
    is that right?
  107. So where are my players?
    Why don't you stand up,
  108. you guys.
    So everyone can see you.
  109. I made this difficult,
    because now you are going to
  110. have to write down your
    strategy.
  111. So you are going to have to
    grab a pen.
  112. I didn't make that easy for you.
    Let me come down to make it
  113. easier for the camera person.
    So why don't you write down
  114. what your strategy is going to
    be, and tell your neighbor what
  115. it's going to be,
    and we'll see what happened.
  116. Show your referee.
    Have you shown your referee?
  117. Nick, speaking into the
    microphone what did you do?
  118. Student: I quit.
    Professor Ben Polak:
  119. He quit.
    Student: I quit as well.
  120. Professor Ben Polak:
    What happened to remember
  121. the Alamo?
    All right, so Texas didn't work
  122. very well.
    Let's try a different state.
  123. I have to say my wife's from
    Texas and my wife's family is
  124. from Texas, and I thought they
    had more fight in them than
  125. that.
    Maybe that's why they are
  126. sliding in the polls.
    Let's try somebody from Ohio,
  127. anyone from Ohio?
    Nobody from Ohio in the whole
  128. class, that's no good.
    I was going to pick Ohio
  129. against Michigan.
    How about some people from some
  130. of our own teams?
    Are there any players on the
  131. football team other than the two
    I've picked on before?
  132. There we go.
    I need a different team,
  133. anybody from the hockey team?
    Anybody from the baseball team?
  134. Okay good.
    So our friend from the baseball
  135. team, your name is?
    Student: Chris.
  136. Professor Ben Polak:
    Chris and our new football
  137. team player is?
    Student: Ryland.
  138. Professor Ben Polak:
  139. Okay so Ryland and Chris are
    going to play this.
  140. And neither of you is from
    Texas, I take it,
  141. so we have some hope of
    something happening here.
  142. So write down what it is you're
    going to choose.
  143. Have you both written something
    down?
  144. Yeah, all right Ryland what did
    you choose?
  145. Student: Fight.
    Professor Ben Polak:
  146. Chris?
    Student: I'm going to
  147. quit.
    Professor Ben Polak:
  148. Well that was easy too.
    So the football team is looking
  149. pretty good here.
    So we're not getting much in
  150. the way of action going here.
    Anyone else want to try here?
  151. Another little state rivalry
    here, I don't suppose I've got
  152. anyone from Oregon,
    that's asking too much,
  153. anyone from Oregon?
    You guys must be from somewhere.
  154. There's got to be a state where
    at least one of you is from.
  155. Well let's try something like
    New Jersey, how about that?
  156. There's some players from New
    Jersey that's good.
  157. Here we go.
    And we'll try New Jersey and
  158. New York.
    That seems like there's a bit
  159. of a rivalry there.
    Are you from New York?
  160. Excellent, here we go,
    and your name is?
  161. Student: Geersen.
    Professor Ben Polak:
  162. Your name is?
    Student: Andy.
  163. Professor Ben Polak:
  164. Okay so Geersen and Andy.
    So stand up so everyone can see
  165. where you are.
    Let's see if there's any fight
  166. in New York and New Jersey.
    So write down your strategies.
  167. Andy what did you choose?
    Student: I'm going to
  168. fight.
    Student: Fight.
  169. Professor Ben Polak:
    Here we go,
  170. this is better now.
    I was getting worried there for
  171. a second.
    I know it's near the
  172. Thanksgiving break,
    but there has to be some sort
  173. of spark left in the class.
    So we have both people fighting
  174. which means right now they're
    down $.75 but the prize is $1.
  175. So the game goes on,
    so write down again.
  176. What you're going to do second
    period.
  177. The $.75 is gone so now we're
    just looking at this game for
  178. $1.
    Let's go to New York,
    what does New York do?
    Student: I'm going to
  180. fight.
    Student: Fight.
  181. Professor Ben Polak:
    Fight okay.
  182. So you have to stay on the east
    coast to get life.
  183. There's no point going west, is
    there?
  184. That makes sense.
    So write down again what you're
  185. going to do, and let's go the
    other way around,
  186. to New Jersey?
    Student: Fight.
  187. Student: Fight.
    Professor Ben Polak:
  188. Fight again all right,
    so right now we're down three
  189. $.75 whatever that is,
    and there's still this prize of
  190. $1, plus perhaps a bit of pride
  191. So write down again.
    Let's try again,
  192. so let's go with New York this
    time.
  193. Student: I'm going to
    fight.
  194. Student: Fight.
    Professor Ben Polak:
  195. Fight okay,
    I'm guessing we could keep this
  196. going for quite a while,
    is that right?
  197. Now it might make a difference,
    by the way, if they're allowed
  198. to talk to each other here,
    so let's see if it does.
  199. So let's allow New Jersey and
    New York to talk to each other.
  200. You can't insult each other
    about bridges and tunnels,
  201. just regular talk.
    So anything you want to say to
  202. your friend from New Jersey
    here?
  203. Student: I can't let New
    Jersey win, that's just New York
  204. pride.
    You guys are just worse in
  205. every realm so I'm sorry.
    It's just pride.
  206. Professor Ben Polak:
    Anything you want to say in
  207. reply?
    Student: Well I'm going
  208. to keep fighting,
    so your best choice is to give
  209. up.
    Professor Ben Polak:
  210. Let's see if that works.
    Did they get anything out of
  211. that?
    So choose the strategies again.
  212. New York?
    Student: I just can't
  213. let Jersey win.
  214. Student: Bring it on.
  215. Professor Ben Polak:
    All right,
  216. so it's clear that if we kept
    this going for a while,
  217. it would pay for my lunch,
    is that right?
  218. We'll hold it here for a second.
    We'll talk about it a bit,
  219. but thank you.
    A round of applause for our two
  220. feistier players.
    So what's going on here?
  221. So clearly we can see what can
    happen in this game.
  222. You can get people quitting
    early, and it could be that one
  223. side quits and the other side
    doesn't quit.
  224. That is also something that can
    happen.
  225. That can happen pretty quickly.
    But it's possible--we just saw
  226. it happen--it's possible that a
    fight could go on quite a while
  227. here.
    Now why?
  228. What's going on here?
    I mean the prize here was what?
  229. Was $1, and the cost was $.75.
    I could have raised the stakes
  230. maybe on these guys and see if
    that made a difference,
  231. but I think $1 and $.75 will
    do.
  232. And by the time they had fought
    the second time,
  233. they'd exhausted the possible
    prize of $1.
  234. So it's true that if you won
    this in the first period then
  235. that's fine because you just get
    a $1 and it wouldn't cost you
  236. anything.
    And even if you won in the
  237. second period you'd be okay,
    you'd only cost yourself $.75
  238. for fighting in the first
  239. but you're getting $1 so that's
    fine.
  240. But from then on--and "then on"
    went on for plenty of time in
  241. this case--from then on you're just
    accumulating losses,
  242. so what's going on?
    There are various conclusions
  243. possible here.
    One is that people from New
  244. York and New Jersey are crazy.
    That's a possible thing.
  245. But what else is going on?
    Why did we get involved in this
  246. fight?
    What happened here?
  247. Why do we tend to see fights
    like this emerging?
  248. I'm claiming this isn't such an
    implausible situation.
  249. Why do we see it emerging?
    Let's talk to our friend from
  250. New York, shout out.
    Student: By the time she
  251. fought with me on the second
    round, I knew I was going to be
  252. losing money anyway so why not
    just keep going and then,
  253. there was no reason,
    I wasn't going to win anyway,
  254. so I might as well just keep
    fighting until she quit.
  255. Professor Ben Polak:
    All right,
  256. I think there's two things.
    There's two parts to that
  257. answer.
    Part of the answer is:
  258. I have lost the money anyway,
    let's hold that piece of it.
  259. And the other part of it is--
  260. The other part of it is I'm
    really determined to win this
  261. thing.
    So there's two things going on
  262. there, and they're quite
    different.
  263. Let's take the second one first.
    It's possible that the reason
  264. these fights emerge and can go
    on for quite a while is that
  265. even though the prize is only $1
    in money,
  266. it could be that the actual
    thing that the players care
  267. about is what?
    What do the players actually
  268. care about here?
    Somebody just raise your hand,
  269. I'll put you on the mike.
    What do people tend to care
  270. about in these situations?
    Winning, they care about
  271. winning, or they care about
    pride.
  272. Is that right?
    That's why I started with
  273. Texas, but I couldn't find any
    pride in Texas,
  274. so we had to go to New York.
    So people care about winning
  275. per se.
    It's a pride thing.
  276. So it could be that $1 simply
    isn't a good description of the
  277. true payoffs here.
    It could be that the payoffs
  278. are actually about winning.
    It could also be that both of
  279. these guys know that they're
    going to be interacting with you
  280. at other times in the class,
    or other times at Yale.
  281. And they want to establish a
    reputation, both of them,
  282. as being the kind of guys who
    fight.
  283. In particular,
    when they got to talk about it,
  284. both of them said:
    "look I'm a fighter":
  285. something which we've seen
    before with Ale and his pizza
  286. shop.
    Both of them said:
  287. "I'm a fighter.
    You better back out."
  288. So both of them tried to signal
    the fact that they were going to
  289. fight to try and get the other
    side to quit.
  290. So that's about reputation,
    and that reputation could
  291. extend beyond this game.
    It could be that they're going
  292. to be involved in this kind of
    conflict later on in life.
  293. So both of those things are
    going on here.
  294. There's another element to
    this, and it's the other part of
  295. what our friend from New York
    said which is about the costs.
  296. What's true about the costs in
    this game as we move from period
  297. to period?
    Somebody said it.
  298. Say it again.
    Say it loudly.
  299. Student: Sunk cost.
    Professor Ben Polak:
  300. It's a sunk cost.
    So all of those costs that you
  301. accumulate as the game goes on,
    they're irrelevant looking
  302. forward because they're sunk.
    The fact I've played this game
  303. for ten periods and hence lost
    ten times $.75--which even I can
  304. do,
    that's $7.50--the fact that
  305. I've lost $7.50 is irrelevant
    because I've lost it anyway.
  306. I can't get that back.
    That's a sunk cost.
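The sunk-cost arithmetic just mentioned is easy to check. A small sketch, using the $1 prize and $0.75 per-period cost from the game played in class:

```python
V, C = 1.00, 0.75  # prize and per-period fighting cost, as played in class

# Cumulative losses from mutual fighting: after ten periods each player
# has sunk 10 * $0.75 = $7.50, far more than the $1 prize, but none of
# it can be recovered, so it is irrelevant looking forward.
for n in (1, 2, 10):
    sunk = n * C
    print(f"{n} period(s) fought: ${sunk:.2f} sunk, prize still ${V:.2f}")
```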
  307. So the game ten periods through
    looks exactly the same as the
  308. game did at the beginning,
    when fighting seemed a good
  309. option.
    So ten periods through the
  310. game, you have the same view
    about fighting as you did at the
  311. beginning.
    Now that's not quite true
  312. because at some point you're
    going to run out of money,
  313. but if we ignore that,
    basically, those sunk costs are
  314. irrelevant.
    So what we're seeing here is
  315. reasons why people fight and
    some of these reasons seem to be
  316. for standard economic reasons
    like sunk costs,
  317. and some of them seem to be
    about things that are outside
  318. the game like pride or possibly
    reputation.
  319. It's certainly the case that in
    the real world,
  320. we do see fights like this.
    Let's just spell out what the
  321. key feature of this is.
    The key feature of this is,
  322. in these fights over a period
    of time, even though you may
  323. only be losing a little piece in
    each period,
  324. over a period of time you could
    lose a lot.
  325. In fact, you could lose far
    more than the prize that was
  326. originally at stake.
    So the losses you could
  327. accumulate--and our friend from
    New Jersey and New York,
  328. the losses that they
    accumulated vastly outweighed
  329. the prize that was at stake
    after a while.
  330. That's a worry.
    So this can occur in real life
  331. not just in the classroom.
    What do we call these kinds of
  332. fights;
    these fights where they're
  333. holding out for this possibly
    small prize and incurring
  334. possibly small costs--but costs
    that accumulate to become
  335. large--each period.
    What do we call those fights?
  336. Let's think about some examples.
    Let's see if the word comes out
  337. of examples.
    So one example--let's do some
  338. examples here.
    One example is what happened in
  339. World War I.
    So in World War I,
  340. as I'm assuming most of you
    know, on the western front at
  341. least,
    the German army and its
  342. allies faced off with
    the British and French and
  343. allied armies,
    for an extraordinarily long
  344. time, fighting over
    extraordinarily small patches of
  345. land,
    little pieces of northern
  346. France and Belgium.
    You could argue that these
  347. pieces of northern France and
    Belgium--I don't wish to offend
  348. anyone French or Belgian
    here--but you could argue that
  349. those few acres of northern
    France and Belgium weren't worth
  350. a whole lot anyway.
  351. But the two sides kept on fighting
    from 1914 to 1918,
  352. and enormous losses of life
    were accumulated in that period.
  353. So that was a really costly,
    long term battle.
  354. Neither side would quit.
    Each year huge numbers of lives
  355. were lost.
    If you doubt that go and look
  356. at the war memorial in Yale that
    shows how many Yale American
  357. lives were lost,
    and America was only in that
  358. war for about a year.
    Okay, so that's an example.
  359. Another example--a more
    business example,
  360. an example we talked about in
    our MBA class but not in this
  361. class so far--is examples in
    business where there's a market
  362. and that market is really only
    going to hold one firm.
  363. That market is only going to
    hold one firm.
  364. You can end up in an extremely
    long fight about who's going to
  365. end up being the one firm in
    that market.
  366. So a famous example--actually
    it's a famous business school
  367. case--is the fight that occurred
    to control satellite
  368. broadcasting in Europe.
    So there was a fight between
  369. Sky Television and the British
    Satellite Broadcasting Company
  370. that went on for a number of
    years.
  371. And these companies were doing
    things like charging zero
  372. prices, and giving away
    satellite dishes,
  373. and this that and the other.
    And over the course of the
  374. fight they accumulated so many
    losses that, if you did the
  375. accounting,
    the entire future projected
  376. profit flow of winning this
    fight was vastly outweighed by
  377. the amount of money that they'd
    lost during the fight.
  378. That was a fight that involved
    on one side Rupert Murdoch and
  379. you can argue maybe Rupert
    Murdoch is a little crazy and
  380. had a reputation to keep up,
    but still it looks like another
  381. example of this.
    So that example was British
  382. Satellite Broadcasting versus
    Sky.
  383. So with those two examples
    there, anyone think of a general
  384. term we call these fights?
    How do people refer to the
  385. method of fighting in World War
    I--or for that matter during the
  386. American Civil War?
    Somebody in the back,
  387. shout it out.
    Student: War of
  388. attrition.
    Professor Ben Polak:
  389. It's a war of attrition.
    So these are wars of attrition.
  390. These are wars of attrition.
    And what we know about wars of
  391. attrition is that they can go on
    a long time, and a lot can be
  392. lost.
    A lot of life can be lost in
  393. the case of real wars.
    A lot of money can be lost in
  394. the case of business wars.
    Actually it turns out,
  395. a lot of games have this
    structure of a war of attrition.
  396. Let me give you one more
  397. Suppose that two companies are
    competing for a market not in
  398. the manner of BSB and Sky by
    advertising or whatever,
  399. but in the form of paying
    bribes.
  400. So suppose there's a company
    let's say in France and a
  401. company let's say in Britain,
    and these two companies are
  402. trying to win a contract in some
    country where paying bribes is a
  403. successful strategy.
    And here I'm going to be
  404. careful about the film and not
    mention any real companies,
  405. so let's call this imaginary
    country Freedonia,
  406. which comes from a Marx
    Brothers film.
  407. So here's this French company
    and this British company,
  408. and they both want this
    contract to build a bridge in
  409. Freedonia.
    And they start paying bribes to
  410. the general who controls
    the contract.
  411. And what happens?
    Well you're not going to get
  412. the bribe back.
    So both sides pay a few
  413. thousand dollars to this
    general, and then the general
  414. comes back and says,
    well you both paid $1,000.
  415. Which of you wants to pay the
    next thousand?
  416. So they go on,
    and they put more money in and
  417. more money in.
    And you can see once again this
  418. is a war of attrition.
    Those bribes that they've paid,
  419. they're never getting back,
    but once you've paid them
  420. they're a sunk cost.
    Once you've paid that bribe,
  421. good luck saying:
    I paid you this bribe.
  422. You didn't let me build the
    bridge.
  423. Give me my money back.
    There isn't a court in the
  424. world that's going to enforce
    that.
  425. So these bribery contests look
    a lot like this.
  426. There's a technical name for
    these bribe contests,
  427. they're sometimes called
    all-pay auctions.
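An all-pay auction is easy to state in code: every bidder forfeits their bid, and only the highest bidder gets the prize. This is an illustrative sketch; the tie-splitting rule is an added assumption for completeness, not something stated in the lecture.

```python
def all_pay_payoffs(bid_a, bid_b, prize=1.0):
    """All-pay auction: both bids are sunk; the highest bid wins the prize.

    Ties split the prize (an assumption for completeness).
    """
    if bid_a > bid_b:
        return (prize - bid_a, -bid_b)
    if bid_b > bid_a:
        return (-bid_a, prize - bid_b)
    return (prize / 2 - bid_a, prize / 2 - bid_b)

# As in the bribery contest, both sides can end up paying more than
# the prize is worth:
print(all_pay_payoffs(1.5, 1.2))  # winner nets -0.5, loser nets -1.2
```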
  428. So what do we want to establish
    here?
  429. We want to establish-- we want
    to talk about why fighting
  430. occurs here and we wanted to do
    so in some detail.
  431. So there may be informal
    reasons why fighting occurs and
  432. we've talked about that.
    It could be that one side is
  433. crazy.
    It could be that both sides are
  434. crazy.
    It could be that national or
  435. regional pride is at stake.
    All of these things could
  436. affect why we get fighting.
    But what I want to try and
  437. establish today is that you can
    get long fights emerging in
  438. potential wars of attrition even
    if everybody is rational,
  439. even if the payoff is just that
    one dollar, and even if there's
  440. no reputation at stake.
    So again, my goal today is to
  441. try and convince you that you
    can get huge loss of life in
  442. World War I or huge loss of
    money in these business contexts
  443. without having to argue
    something outside the model like
  444. irrationality or reputation.
    Even rational players can get
  445. themselves in trouble in wars of
    attrition.
  446. So for the rest of today,
    I want to try and analyze this.
  447. To get us started I want to
    look at a version of this game,
  448. at a simplified version,
    which only lasts for two
  449. periods.
    So we'll do a two period
  450. version of the game we played
    just now.
  451. Eventually, by the end of
    today, I want to look at the
  452. infinite version.
    So we'll start small.
  453. We'll start with a two period
    version.
  454. The trick in analyzing these
    things is to be able to come up
  455. with a tree and to be able to
    come up with payoffs,
  456. and be able to apply the
    analysis that we know about to
  457. get us to where we want to be.
    So here's the game I claim.
  458. I claim that it has the
    following tree.
  459. So first of all Player A
    chooses and Player A can either
  460. Fight or Quit.
    And let me put a [1],
  461. and we'll see what the [1]
    is in a second.
  462. Then we're going to model
    Player 2.
  463. But of course this is a
    simultaneous move game.
  464. This is a simultaneous move,
    so this is an information set.
  465. Let's not call him Player 2.
    Let's call him Player B.
  466. So Player B doesn't know what A
    has done that first period when
  467. B is making her choice.
    This is a simultaneous move.
  468. And B is choosing between
    fighting or quitting.
  469. Just to distinguish them,
    let me use small letters for B,
  470. so once again fight or quit,
    and fight or quit.
  471. Now, if both sides fight then
    the game continues.
  472. And everyone knows that both
    sides fought at that stage.
  473. So at this point on,
    we're actually at a singleton
  474. information node and it's Player
    A's turn again.
  475. So here we go again.
    So in the second period,
  476. once again, we've got A
    choosing whether to Fight or
  477. Quit.
    And this time we'll put a [2]
  478. to indicate we're in the second
    stage.
  479. And after Player 2 has moved
    once again--after Player A has
  480. moved--once again Player B is
    choosing.
  481. That's a simultaneous move.
    And once again they're choosing
  482. fight or quit.
    I'll put [2]
  483. to indicate that we're in the
    second stage.
  484. So that's the structure of this
    two period game.
  485. And let's write down what the
    payoffs are, starting with the
  486. easy payoffs.
    So if both people quit in the
  487. first stage, they get nothing.
    If A quits and B fights,
  488. then A gets nothing and B gets
    V.
  489. If A fights and B quits,
    then A gets V and B gets
  490. nothing.
    And if they both fight we go
  491. into the second stage.
    So let's write down the payoffs
  492. in the second stage.
    So in the second stage,
  493. if they both quit in the second
    stage then their payoffs are
  494. going to be,
    for A, -C, the costs they
  495. accumulated in the first stage,
    plus 0.
  496. And, for B, -C + 0.
    If A quits and B fights in the
  497. second stage then the payoffs
    are -C + 0 and -C + V.
  498. If A fights and B quits then
    the payoffs are -C + V and -C +
  499. 0.
    And if they both fight for two
  500. periods, we have a decision to
    make about how we're going to
  501. end the game in this two period
    version,
  502. but let's just assume that what
    we'll get here is -C -C and -C
  503. -C.
    We'll assume if they both fight
  504. the game ends and no one gets
    the prize, just to make life
  505. simple.
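The second-stage payoffs just written down can be tabulated directly. This sketch uses the class values V = $1 and C = $0.75 and the assumption just made that a second-period double fight ends the game with no one getting the prize:

```python
V, C = 1.0, 0.75  # prize and per-period cost, as played in class

# Payoffs if the game reaches the second stage; the leading -C in every
# entry is the sunk cost of having fought in the first period.
second_stage = {
    ('F', 'F'): (-C - C, -C - C),  # both fight again: no one gets the prize
    ('F', 'Q'): (-C + V, -C + 0),  # A wins the prize
    ('Q', 'F'): (-C + 0, -C + V),  # B wins the prize
    ('Q', 'Q'): (-C + 0, -C + 0),  # both quit
}
for (a, b), (pa, pb) in second_stage.items():
    print(f"A plays {a}, B plays {b}: A gets {pa:+.2f}, B gets {pb:+.2f}")
```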
    So this is a two period version
  506. of the game and the only change
    I've made, other than making it
  507. two periods, is I had to put in
    a payoff.
  508. I had to see what happened if
    the game didn't resolve.
  509. And I've assumed that if the
    game didn't resolve,
  510. no one got the prize.
    Now there's another assumption
  511. I'm going to have to make before
    we analyze this,
  512. there are two possible cases to
    consider here.
  513. There's the case when V >
    C which is the case we just
  514. played in the class;
    and there's also the converse
  515. case when C > V.
  516. So today we'll focus on the
    case V > C which is the case
  517. we just played in class.
    I'm going to leave you to
  518. analyze the other case,
    the case when the cost is
  519. bigger than the prize,
    as a homework assignment.
  520. So V > C is our assumption.
    So everyone okay with the tree?
  521. This tree I hope describes the
    game, at least a two period
  522. version of the game.
    Now the first thing I want to
  523. point out here is,
    if we look at the payoffs that
  524. are incurred at the end of the
    second period game,
  525. we notice that they all contain
    a -C.
  526. There's a -C everywhere.
    What is that -C?
  527. It's the cost that you
    accumulated from having fought
  528. in the first stage.
    But the observation I want to
  529. make straight away is that this
    cost--so here it is,
  530. here it is, here it is,
    here it is, here it is,
  531. here it is, and here it is,
    and here it is--this cost is
  532. sunk.
    These objects here are sunk
  533. costs.
    There's nothing you can do once
  534. you're in the second period of
    the game to get these sunk costs
  535. back.
    They're just gone.
  536. They're there,
    but the fact that they enter
  537. everywhere is going to make them
    strategically irrelevant.
  538. Okay, so what we want to do
    here is we want to find all of
  539. the sub-game perfect equilibria
    of this little game.
  540. Let me get rid of the rules.
    We all know the rules by now.
  541. So our goal here is to use our
    solution concept,
  542. which is sub-game perfect
    equilibrium, to try and analyze
  543. this game.
    So how are we going to analyze
  544. this game in terms of sub-game
    perfect equilibria?
  545. How are we going to start that
  546. We've got a lot of work to do
    here, where are we going to
  547. start in finding sub-game
    perfect equilibria?
  548. What's the first thing we
    should do?
  549. Well, I claim the first thing
    we should do is just figure out
  550. what the sub-games are.
    Let's start with that.
  551. So having just pushed it far
    away I'm going to need to use
  552. the pointer.
    I claim that the obvious
  553. sub-game to analyze first is
    this sub-game.
  554. It's the sub-game if you should
    end up in period [2].
  555. And notice that it is a
    sub-game: it starts from a
  556. singleton node;
    it doesn't break up any
  557. information set;
    and it contains all of the
  558. descendants of the node from
    which it starts.
  559. So that is genuinely a sub-game.
    So we're going to start our
  560. analysis by considering the
    second sub-game.
  561. So let's write down the matrix
    that corresponds to that second
  562. sub-game.
    And I'm going to write it down
  563. in the following way.
    So I claim, in this second
  564. sub-game, each player has two
    choices, they can fight or quit.
  565. And I'm going to write the
    payoffs in a particular way.
  566. I'm going to write the payoffs
    as -C plus this thing.
  567. So rather than keep that -C in
    all the boxes,
568. which is going to get boring
    after a while,
  569. I'm going to pull that -C out
    and just put it in front.
  570. Is that okay?
    So we had this sunk cost box
  571. everywhere and I'm going to pull
    out this sunk cost box and put
  572. it in front.
    So here it is.
  573. If you get into the second
    period of the game you've
  574. incurred this sunk cost.
    And your payoffs in this game,
  575. if you both fight then you
    incur -C for the second time.
  576. If A fights and B quits,
    then A is going to win the
  577. prize so they'll get V and
    Player B will get nothing.
  578. If B fights and A quits,
    then conversely,
  579. A gets nothing and Player B
    gets the prize.
  580. And if they both quit they just
    get nothing.
  581. So just notice what I did here,
    I could have written out the
  582. box with -C-C here;
    -C-C here;
  583. -C+V here;
    -C+0 here;
  584. -C+0 here, etc..
    But I just pulled out that -C
  585. because it's just distracting
  586. So I pulled out that -C and put
    it in the front.
  587. Okay, so now we can analyze
    this little game,
  588. and let's start off by talking
    about pure strategy equilibria
  589. in this game.
    So again, our goal is to find
  590. sub-game perfect equilibria,
    so the way in which we find
591. sub-game perfect equilibria is as follows.
  592. We start at the last sub-games,
    we look for Nash equilibria in
  593. those last sub-games,
    and then eventually we're going
  594. to roll those back.
    So there's our last sub-game.
  595. There's the matrix for it.
    Let's just find the Nash
  596. equilibria.
    So if Player B is fighting then
  597. Player A's best response is to
    quit and if Player A is fighting
  598. then Player B's best response is
    to quit.
  599. Conversely, if Player B is
    quitting, Player A's best
  600. response is to fight,
    and if B is fighting--sorry:
  601. if A is quitting then Player
    B's best response is to fight.
  602. I didn't say that right.
    Let me try again.
  603. So if A is fighting,
    if the other side is fighting
  604. you want to quit.
    If the other side is quitting
  605. you want to fight.
    Is that clear?
  606. So there are actually
    two--let's be careful here--pure
  607. strategy equilibria,
    there are two pure strategy
608. Nash equilibria in this game.
  609. What are they?
    They are (Fight,quit) and
  610. (Quit, fight).
    So if we get into the second
  611. sub-game and if we know we're
    going to play a pure strategy in
  612. the second sub-game,
    this is what's going to happen,
  613. that's our claim.
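[This best-response reasoning can be checked mechanically. Below is a minimal sketch, not from the lecture, of the stage-2 matrix using the $1 prize and $0.75 cost from the classroom game; any V > C gives the same two pure equilibria.]

```python
# Stage-2 subgame of the war of attrition: A plays F/Q, B plays f/q.
# Payoffs are (A, B), with the sunk -C from stage 1 pulled out front.
V, C = 1.0, 0.75  # prize and per-period cost; we assume V > C

payoff = {
    ("F", "f"): (-C, -C),    # both fight: both pay the cost again
    ("F", "q"): (V, 0.0),    # A fights, B quits: A takes the prize
    ("Q", "f"): (0.0, V),    # A quits, B fights: B takes the prize
    ("Q", "q"): (0.0, 0.0),  # both quit: nobody wins, nobody pays
}

def pure_nash(payoff):
    """All strategy pairs where neither player gains by deviating."""
    eq = []
    for (a, b), (ua, ub) in payoff.items():
        best_a = all(ua >= payoff[(a2, b)][0] for a2 in ("F", "Q"))
        best_b = all(ub >= payoff[(a, b2)][1] for b2 in ("f", "q"))
        if best_a and best_b:
            eq.append((a, b))
    return sorted(eq)

print(pure_nash(payoff))  # [('F', 'q'), ('Q', 'f')]
```

[Fight against a quitter, quit against a fighter: exactly the two pure Nash equilibria found above.]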
    Now notice that it didn't
  614. matter, the sunk cost didn't
    matter there.
  615. I could have included the sunk
    cost in the payoffs,
  616. but I would have found exactly
    the same thing with or without
  617. the sunk costs.
    So, as we'd expect,
  618. the sunk cost is irrelevant.
    So we've got both the
  619. equilibria in the sub-game.
    Let's roll these back into the
  620. first stage of the game.
  621. The payoffs associated with
    this one are V and 0,
  622. and the payoff associated with
    this one is 0 and V.
  623. Everyone okay with that?
  624. Okay, let's revisit the first
    stage of this game.
  625. Now, things get a little bit
    more complicated,
  626. so what I'm going to do is I'm
    going to redraw the first stage
  627. of this game.
    So here it is.
  628. A can fight or quit just as
  629. And, following this,
    B can fight or quit just as
  630. before.
    So this is a picture of the
  631. first stage of the game,
    but I'm going to chop off the
  632. second stage.
    Let's put the payoffs in.
  633. So the payoffs down here are
    the same as they were before:
  634. (0,0);
    working up, (0, V);
  635. if A fights and B quits then
    its V,0.
  636. But what about the payoff if
    they both fight?
  637. So what I want to do now is I
    want to look at the payoff when
  638. they both fight by considering
    what they would get,
  639. if they both fight,
    in the second period of the
  640. game.
    So our idea is find the Nash
  641. equilibrium in the second period
    of the game and roll back these
  642. possible payoffs.
    So here the payoffs are going
  643. to be -C plus stage [2]
    Nash equilibrium payoffs for
  644. Player A;
    and -C plus the same thing,
  645. stage [2]
    Nash equilibrium payoffs for
  646. Player B.
    So just to understand what I've
  647. written here then,
    I put the same payoffs as
  648. before but I've replaced that
    enormous thing that was the
  649. second stage of the game,
    just with the payoffs that we
  650. know that we're going to get in
    the second stage of the game,
  651. if we play Nash equilibrium in
    the second stage.
  652. So these objects have a name,
    and the name is continuation
  653. payoffs.
    These objects are the
  654. continuation payoffs.
    They are the payoffs I'm going
  655. to get tomorrow--and possibly
    forward in a more complicated
  656. case--if,
    in this case,
657. if we both fight in the first period.
  658. Now what we want to do is we
    want to draw up the matrix that
  659. corresponds to this first stage
    game, and we're going to have to
  660. do so twice.
    We're going to have to do so
  661. once for the case where the
    continuation payoffs are (V,
  662. 0), where the equilibrium we're
    playing tomorrow is (Fight,
  663. quit).
    And we're going to have to do
  664. again for the case where the
    continuation payoffs tomorrow
  665. are (0, V), namely the
    continuation play is (Quit,
  666. fight).
    So we have to do it twice.
  667. [So, you've got the
    continuation payoffs down so
  668. let's delete this to give
    ourselves a little room.]
  669. So the matrix is going to look
    as follows: a nice big matrix,
  670. 2x2.
    Player A is choosing Fight or
  671. Quit, and Player B is choosing
    fight or quit.
  672. That much is easy.
    It's what we put in here that
  673. matters.
    So let's do all the easy cases
  674. first.
So (Quit, quit) is (0,0);
  675. (Quit, fight) is (0,V);
    (Fight, quit) is (V, 0);
  676. and in here it's going to
    depend which of these two
  677. games--which of these two
    equilibria is being played
  678. tomorrow.
    So what we're going to do here
  679. is we're going to do the case
    for the equilibrium (Fight,
  680. quit) in stage two.
    We're going to write out the
  681. matrix for the case where we're
    going to play (Fight,
  682. quit) tomorrow.
    [-Thank you,
  683. try and keep me consistent
    today, because it is very easy
  684. to slip up.
    So once again I'm going to use
  685. capital letters for Player A and
    small letters for Player B.
  686. ]
    So what happens if we both
  687. fight?
    We both incur costs of C from
  688. fighting, and tomorrow we're
    going to get the payoffs from
689. the equilibrium (Fight, quit).
  690. That's this equilibrium,
    so we're going to get V and 0
  691. tomorrow.
    So let's add those in.
  692. So this will be +V and this
    will be +0.
  693. And let's just put some chalk
    around here to indicate that
  694. these are going to be
    continuation payoffs.
  695. So this is the matrix we're
    going to use to analyze the
  696. first stage of the game in the
    case where the equilibrium we're
697. playing tomorrow is (Fight, quit).
  698. And as I promised,
    we have to do this twice.
  699. So the other case,
    of course, is if we're in the
  700. other equilibrium tomorrow.
    So let's just do that.
  701. So once again we have Fight
    [1], Quit [1];
702. fight [1], quit [1]; and the
    payoffs are--same as we
  703. had--(0,V); (0,0);
    (V, 0);
  704. and then here,
    this time, we're going to have
  705. -C + 0 and -C + V.
    And the reason for the change
  706. is that we're now looking at the
    continuation game,
  707. the continuation play where
    it's Player A who quits in the
  708. second stage.
    So this is for the case (Quit
  709. [2], fight [2]) in period 2.
    So let's just pause,
  710. let's make sure everyone's got
    that down, everyone okay?
  711. So what we've done here is we
    started off by analyzing what's
  712. going to happen in period 2,
    and that really wasn't very
  713. hard, is that right?
    That was a pretty simple game
  714. to analyze, pretty easy to find
    the equilibria.
  715. Then what we did was we rolled
    back the equilibrium payoffs
  716. from period 2 and we plunked
    them on top of the relevant
  717. payoffs in period 1.
    So in particular,
  718. if you both fight and you know
    you're going to play the (Fight,
  719. quit) equilibrium tomorrow,
then your payoffs will be -C
  720. + V and -C + 0.
    If you both fight and you know
  721. you're going to play the (Quit,
    fight) equilibrium tomorrow
722. then your payoffs will be -C +
    0 and -C + V.
  723. And just to emphasize once
    again, these four boxes we
  724. created correspond to the stage
    2 Nash equilibrium payoffs,
  725. so the continuation payoffs of
    the game.
  726. Okay, so now we're ready to
    analyze each of these games.
727. So this isn't going to be too hard.
  728. Let's try and find out the Nash
    equilibrium of this game.
  729. So let's start with the left
    hand one.
  730. This is the case where Player A
    is going to fight and win in
  731. period 2.
    So if Player B is going to quit
  732. in period 1 then,
    if Player A fights,
  733. she gets V;
    if she quits,
  734. she gets 0: so she's going to
    want to fight.
  735. Everyone okay with that?
If Player B fights in period 1
736. then, if Player A fights,
  737. she gets -C + V and if she
    quits she gets 0,
  738. and here's where our assumption
    is going to help us.
  739. We've assumed,
    what did we assume?
  740. We assumed V is bigger than C,
    just like we played in class.
  741. So because V is bigger than C
    this is going to be the best
  742. response: again fighting is
    going to be the best response.
  743. So we know that,
    in fact, Player A here has a
  744. dominant strategy,
    the dominant strategy is to
  745. fight in period 1 in this
    analysis of the game.
  746. And since A is fighting,
    not surprisingly,
  747. we're going to find that B is
    going to quit.
  748. So B's best response is to quit.
    So here is our Nash equilibrium
  749. in this sub-game.
    This game has a Nash
  750. equilibrium, it only has one,
    and the equilibrium is (Fight
  751. [1], quit [1]).
    Now let's just talk about it
  752. intuitively for a second.
753. If I know, if I'm playing Jake,
    and I know that Jake is going
  754. to fight tomorrow--sorry,
    other way round--I know that
  755. Jake's going to quit tomorrow
    and I'm going to fight
  756. tomorrow--I know that tomorrow
    I'm going to win.
  757. So that prize is there for me
    tomorrow, so why would I want to
  758. quit today?
    I'm going to get $1 tomorrow if
  759. I just fight in this period.
    So why would I want to quit
  760. today when, at worst case
    scenario, I'm only going to lose
  761. $.75 today.
    So if I know Jake is quitting
  762. tomorrow I'm going to stay on
    and fight now.
  763. And conversely,
    if Jake knows he's quitting
  764. tomorrow, hence,
    he knows I'm going to fight
  765. now, he may as well just quit.
    So what we're learning here is
  766. in this particular example,
    if we know that tomorrow I'm
  767. going to win the war,
    I'm actually going to win it
  768. today.
    Say it again,
  769. if we know that tomorrow I'm
    going to win the war,
  770. I'm actually going to win it
  771. The converse is true for the
    case where I'm the quitter and
  772. Jake's the fighter tomorrow.
    So once again it's pretty quick
  773. to see that from Jake's point of
    view if I'm going to fight he's
  774. going to want to fight.
    If I'm going to quit,
  775. he's going to want to fight.
    So in either case he's going to
  776. want to fight.
    So I'm going to want to quit.
  777. So the Nash equilibrium in this
    game is (Quit [1],
  778. fight [1]).
  779. So at this stage,
    we found all of the pure
  780. strategy sub-game perfect
    equilibria in the game.
  781. Let's describe them before I
    write them up.
  782. One pure strategy Nash
    equilibrium has me fighting in
  783. period 1 and Jake quitting in
    period 1;
  784. and if we got to period
    2--which in fact we won't--then
785. I fight again and he quits again.
  786. So let's write that equilibrium
    up and we'll do it here.
  787. We'll do it on the top board
  788. So let me get it right,
    when I write it up as a whole
  789. equilibrium.
    So I claim that I've now found
  790. all of the pure strategy SPE in
    this game.
  791. One of them involves my
    fighting in the first period and
  792. fighting in the second period,
    and Jake quitting in the first
  793. period and quitting in the
    second period.
  794. The other one just flips it
    around, I quit in the first
  795. period, and if I got there I
    would quit in the second period
  796. and Jake fights in the first
    period and,
  797. if he got there,
    he would also fight in the
  798. second period.
    So these are perfectly natural
  799. equilibria to think about.
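[The whole rollback can be sketched in code: plug each candidate continuation payoff pair into the (Fight, fight) cell of the period-1 matrix, then look for pure Nash equilibria. This is an illustrative sketch, not the lecture's notation; V = 1 and C = 0.75 are just the classroom values.]

```python
V, C = 1.0, 0.75  # prize and per-period cost, V > C

def stage1_matrix(cont):
    """Period-1 payoff matrix with the given stage-2 continuation
    payoffs (for A and B) rolled back into the (Fight, fight) cell."""
    ca, cb = cont
    return {
        ("F", "f"): (-C + ca, -C + cb),
        ("F", "q"): (V, 0.0),
        ("Q", "f"): (0.0, V),
        ("Q", "q"): (0.0, 0.0),
    }

def pure_nash(payoff):
    """All strategy pairs where neither player gains by deviating."""
    eq = []
    for (a, b), (ua, ub) in payoff.items():
        if (all(ua >= payoff[(a2, b)][0] for a2 in ("F", "Q"))
                and all(ub >= payoff[(a, b2)][1] for b2 in ("f", "q"))):
            eq.append((a, b))
    return sorted(eq)

# Continuation (V, 0): A wins tomorrow, so A fights today and B quits.
print(pure_nash(stage1_matrix((V, 0.0))))  # [('F', 'q')]
# Continuation (0, V): B wins tomorrow, so A quits today and B fights.
print(pure_nash(stage1_matrix((0.0, V))))  # [('Q', 'f')]
```

[Whoever is going to win the war tomorrow wins it today: the fighter fights in both periods and the quitter quits in both, matching the two pure SPE just described.]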
    If you want to get it
  800. intuitively, each of these
    equilibria involves a fighter
  801. and a quitter.
    The fighter always fights,
  802. the quitter always quits.
    If I know that I'm playing a
  803. quitter, I'm always going to
    fight, so that's a best
  804. response.
    If I know I'm facing a fighter,
  805. I'm going to want to quit,
    so that's a best response and
806. those are two very simple equilibria.
  807. That's the good news.
    What's the bad news here?
  808. The bad news is we haven't
    achieved our goal.
  809. Our goal was to argue that
    rational players might get
  810. involved in a fight,
    and notice that in each of
  811. these two pure strategy sub-game
    perfect equilibria,
  812. in each of them,
    no real fight occurs.
  813. Is that right?
    In each of them one person
  814. fights for the first period,
    but the other person just runs
  815. away.
    That isn't much of a fight.
  816. Let me say it again.
    In each of these equilibria,
  817. one side is willing to fight,
    but the other side isn't,
  818. so no fight occurs.
    In particular,
  819. no costs are incurred in either
    of these equilibria.
  820. But I claimed at the beginning,
    I wanted to explain how we
821. could have costs occur in equilibrium.
  822. Rational players are going to
    incur costs.
  823. So what am I missing here?
    What should I do to try and
  824. find a more costly equilibrium?
    I claim I'm still missing some
  825. equilibria here.
    What kind of equilibria am I
  826. missing?
    I'm missing the mixed strategy
  827. equilibria.
    So far, all we've done is solve
  828. out the pure-strategy equilibria
    but we need to go back and
  829. re-analyze the whole game
    looking now for mixed strategy
  830. equilibria.
  831. So we're going to do the
    entire--take a deep breath,
  832. because we're going to take the
    whole analysis we just did,
  833. we're going to repeat the
    entire analysis we just did,
  834. but this time we're going to
    look at mixed strategy
  835. equilibria.
    Everyone happy with what we're
  836. doing?
    So first of all,
  837. we're going to go back to the
    second sub-game.
  838. Here's the second sub-game,
    and we already found the pure
  839. strategy equilibria,
    so let me get rid of them,
  840. and in your notes you probably
    want to rewrite this matrix.
  841. But I'm not going to rewrite it
    here because we're a little bit
  842. short of time.
    This is exactly the same payoff
  843. matrix we saw before,
    but now I want to look for a
  844. mixed strategy equilibrium in
    this game.
  845. How do I go about finding--it's
    good review this--how do I go
  846. about finding a mixed strategy
    equilibrium in a game like this?
  847. What's the trick for finding
    mixed strategy equilibria?
  848. Should we try our guys from New
    Jersey and New York?
  849. Let's try our guys from New
    Jersey and New York.
  850. Where's my New Yorker?
    We'll have the true battle here
  851. between New York and New Jersey,
    how do we find a mixed strategy
  852. equilibrium?
    Student: You use the P's
  853. and Q's and set them equal to
    one another.
  854. That's a very crude explanation.
    Professor Ben Polak:
  855. That's a crude thing okay.
    So the answer was we find the
  856. P's and Q's and "set them equal
    to one another."
  857. What is it we're actually
    setting equal to what?
  858. Let's try and get some response
    on this.
  859. Did our New Jersey guy flee?
    Where's my New Jersey person?
  860. They fled.
    We could give our Texans
  861. another chance.
    Where's our Texan?
862. There was a Texan down here
    somewhere, what is it they set
  863. equal to what?
    Student: I guess the
  864. chances that one would quit and
    the other would fight.
  865. Professor Ben Polak:
    Not quite.
  866. The remark about using P's and
    Q's was right.
  867. This is good review for the
  868. What is it I'm going to do with
    those P's and Q's?
  869. Shout it out.
    Student: You use the
  870. other player's payoffs.
    Professor Ben Polak:
  871. Use the other player's
    payoffs and?
  872. Student: Make them
    indifferent between their
  873. strategies.
    Professor Ben Polak:
  874. Good.
    I'm going to choose Player B's
  875. mix in such a way as to make
    Player A indifferent between
  876. choosing fight and quit.
    So as to make it plausible that
  877. A is actually mixing.
    So again the intuition is for A
  878. to be mixing they must be
    indifferent between fight and
  879. quit.
    So I'm going to choose the mix
  880. of B to make A indifferent.
    So that's good review.
  881. Let's do that.
    So here I've usually used the
  882. letter Q but to avoid confusion
    here, let me use the letter P.
  883. We're going to choose P to make
    Player A indifferent.
  884. So if A fights then their
    payoff is what?
  885. Let's have a look.
    It's -C with probability P,
886. and V with probability of 1 - P.
  887. This should be coming back now.
    This is before the mid-term,
  888. but you guys were alive before
    the mid-term so you should
  889. remember this.
    So if they fight,
890. they get -C P + V [1 - P].
    If they quit then they get 0
  891. with probability P,
    and 0 again with probability 1
  892. - P.
    So we know that if A is mixing,
  893. B must be mixing in such a way
    as to make these two numbers
  894. equal.
    So we know these two must be
  895. equal to one another.
    Since they're equal we can now
  896. solve for P, so what's that
    going to give us?
897. It's going to give us V [1 - P] = P C,
898. and that I think is P = V / [V + C].
  899. Is that right?
    Someone just check my algebra.
  900. If you remember the game of
    Hawk Dove that we saw just
  901. before the mid-term--it was a
    game we looked at when we looked
  902. at evolution--this is
    essentially the same game more
  903. or less as that game,
    and that you'll notice it's the
  904. same kind of mixture we've got
  905. So P = V / [V + C]
    which means 1 - P = C / [V +
  906. C].
    I'm leaving it up there for a
  907. bit hoping that one of the
    T.A.'s is just going to do my
  908. algebra for me.
    I think that's right though.
  909. So this game is symmetric so we
    could do the same for B but
  910. we'll find the same thing:
    it's a symmetric game.
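[The indifference calculation is easy to verify numerically. A small sketch, using the classroom values V = 1 and C = 0.75 as an assumption; any V and C work the same way.]

```python
V, C = 1.0, 0.75  # prize and per-period cost, V > C

# Suppose B fights with probability p. A's payoff from Fight is
# -C*p + V*(1 - p); from Quit it is 0. Setting these equal gives
# V*(1 - p) = p*C, i.e. p = V / (V + C).
p_star = V / (V + C)

fight_payoff = -C * p_star + V * (1 - p_star)
quit_payoff = 0.0
assert abs(fight_payoff - quit_payoff) < 1e-12  # A is indifferent

print(p_star)  # 1/1.75, about 0.571
```

[At P = V / (V + C) the two lines in the indifference condition cross, so A really is willing to mix; by symmetry the same mix works for B.]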
  911. So the mixed strategy
    equilibrium, the mixed Nash
  912. equilibrium in this game has
    both mix, both fight with
913. probability equal to V / [V + C].
  914. Now this is good news,
    because at least we are getting
  915. some fighting going on,
    but we need to do something.
  916. We need to take this Nash
    equilibrium we've just found,
  917. which is a Nash equilibrium in
    the second sub-game.
  918. It's a Nash equilibrium in the
    sub-game way up here.
  919. And we need to roll back the
    payoffs from this sub-game,
  920. the equilibrium payoffs from
    this sub-game into the first
  921. stage.
    That's our method.
  922. How are we going to do that?
    Well we better first of all
  923. figure out what those payoffs
  924. So what are the payoffs in this
  925. The payoffs in this mixed Nash
    equilibrium are what?
  926. Anyone see what the payoffs are
    going to be if they're both
  927. playing this mix?
    Well presumably the payoff from
  928. fight and the payoff from quit
    must be the same,
  929. is that right?
    So we may as well choose the
  930. easier one.
    So I claim that the payoff from
931. quit is 0 x P + 0 x [1 - P],
which is 0 x
932. V / [V + C]
+ 0 x
933. C / [V + C],
but that's equal to what?
934. 0, okay good.
    So it's got to be the case
  935. (kind of conveniently) that if
    they do play this mixed strategy
  936. equilibrium in stage 2,
    the payoff they'll get from
  937. playing it is 0.
    That's going to make life a
  938. little bit easier later on.
    That's our new equilibrium in
  939. the second sub-game.
    Now let's roll that back to the
  940. first game.
    Here's our first game again,
  941. and everything about this is
    correct except what's down here.
942. So let's get rid of what's down here.
  943. Our analysis from before is
    more or less still intact.
  944. It's still the case,
    that if they both quit they'll
  945. get 0.
    If they (Quit,
946. fight) they'll get (0, V)
  947. or (V, 0) and it's still the
    case if they both fight they'll
  948. both incur costs of C and
    they'll both then get stage 2
  949. continuation Nash payoffs.
    Is that right?
  950. But now instead of those
    continuation Nash payoffs being
  951. (V, 0) or (0,
    V), those continuation Nash
  952. payoffs are going to be what?
    They're going to be 0.
  953. So what we're going to do here
    is we're going to backward
  954. induct, or roll back,
    those zero payoffs and come up
  955. with the corresponding matrix to
    describe the first stage of the
  956. game.
    Here it is.
  957. Fight, Quit--I'll try to get it
    right without Jake having to
  958. correct me this time--little f,
    little q.
  959. This is A.
    This is B.
  960. And the payoffs here are (0,0)
  961. (0, V);
    (V, 0) just as before;
  962. and, in this box now,
    we've got -C + 0 and -C + 0.
  963. So it's exactly the same box we
    saw before, but now the
  964. continuation payoffs are just 0.
    Again, what is this?
  965. This is for the Nash
    equilibrium--let's just say for
  966. the mixed Nash equilibrium in
    period 2.
  967. Now what I want to do is I want
    to find the mixed equilibrium in
  968. period 1.
    We found the mixed equilibrium
  969. in period 2.
    Now I want to find the mixed
  970. equilibrium in period 1.
    So what I could do here is I
  971. could spend a lot of time.
    I could put in a P and a 1 - P
  972. and I could work out what mix of
    Player B will make A
  973. indifferent.
    I could work out what mix of
974. Player A would make B indifferent.
  975. But has anybody noticed
    something about this matrix?
976. What do you notice about this matrix?
  977. Somebody help me out?
    Somebody's got to help me out
  978. here.
    Tell me something about this
  979. matrix.
    What's true about this matrix?
  980. Student: It's the same
    as the one above.
  981. Professor Ben Polak:
    It's the same as the one
  982. above.
    The matrix I just drew,
  983. when I rolled back the payoffs
    is exactly the same matrix that
  984. I had here.
    It's exactly the same matrix.
  985. So we already know what the
    mixed strategy equilibrium is in
  986. this.
    The mixed Nash equilibrium in
  987. this matrix is both fight with
    probability P = V / [V + C]
  988. So now we're ready to show our
    new sub-game perfect
  989. equilibrium,
    let's drag it down.
  990. Here's our whole game,
    we found the pure SPEs but now
991. we're ready to find the mixed one.
  992. The mixed sub-game perfect
    equilibrium has Player A--before
  993. I do that let me just give this
    P a name.
  994. Let me call this P, P*.
    So V / [V + C],
  995. let's call it P*.
    So the mixed sub-game perfect
996. equilibrium has Player 1 mixing,
    fighting with probability of P*
  997. in the first stage;
    and in the second stage,
  998. again mixing,
    fighting with probability of
  999. P*.
    So this is Player 1 and Player
  1000. 2 does exactly the same thing.
    While we're here,
  1001. what's the expected payoff for
    each player if they're playing
1002. this mixed sub-game perfect equilibrium?
  1003. It's 0--the payoff from
    this--the expected payoff is 0.
  1004. So now we're actually getting
    somewhere, now we're really
  1005. getting somewhere.
    So let's just take a deep
  1006. breath and see where we are.
    We broke this game down,
  1007. this complicated game we played
    in class, that conceivably--for
  1008. example,
    when New York is playing New
  1009. Jersey--conceivably it could go
    on all night.
  1010. Apparently not when the Texans
    are playing each other or the
  1011. football team is playing the
    baseball team,
  1012. but when we have New York and
    New Jersey it could go on all
  1013. night.
    We curtailed it to a two period
  1014. game, but in a minute,
    we're going to go back to the
  1015. infinite game.
    In this two period game,
  1016. I tried to argue--I'm trying to
    convince you--that you could get
  1017. fighting occurring just in
    equilibrium with absolutely
  1018. standard rational players:
    nothing to do with pride,
  1019. nothing to do with reputation,
    nothing to do with the fact
  1020. that these guys are crazy guys
who drank the water in New York
  1021. and New Jersey,
    God help them.
  1022. You can get fighting with
    ordinary people in equilibrium.
  1023. What we've shown is the way in
    which you can get fighting is in
  1024. a mixed strategy equilibrium.
    In each period of the game,
1025. people fight with probability P*.
  1026. That's just enough fight to
    give the other side an incentive
  1027. to quit and just enough
    probability of the other side
  1028. quitting to give the other side
    an incentive to fight;
  1029. just exactly enough.
    If they play that equilibrium
  1030. in every period,
    there's some chance of the game
1031. ending but with probability P²,
    the game goes forward to the
  1032. next period.
    So you could potentially have
  1033. fights for two periods.
    By the way, with what
  1034. probability would there be a
    fight in both periods?
  1035. That's a good homework
    question, I won't answer it
1036. here, you can work it out at home.
  1037. Should we do it here?
    Anybody want to tell me?
  1038. So okay, with what probability
    do we get a fight in the first
  1039. period;
    a real fight,
  1040. a fight involving both players?
    We need both players to fight,
  1041. each are fighting with
    probability of P,
  1042. so the probability of both of
    them fighting is what?
  1043. P².
    So to get a fight in the first
1044. period, the probability is P².
  1045. To get a fight in both periods
    is what then?
1046. P⁴.
But we get a fight with
1047. probability of P⁴ going on.
  1048. We get fighting in equilibrium.
    Moreover, we get some very
  1049. intuitive things that we already
    learned in the Hawk-Dove game.
  1050. The probability of fight--so in
    this equilibrium--the
  1051. probability of fights occurring
    goes up as V goes up.
  1052. So the prize gets bigger,
    you're more likely to see
  1053. fights occur:
    that seems right.
  1054. It goes down in C.
    So the probability of fights
  1055. occurring goes up in the size of
    the prize--that seems
  1056. intuitively right--and down in
    the cost of fighting.
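[Both observations, the P² and P⁴ fight probabilities and the comparative statics in V and C, can be checked directly. A quick sketch with illustrative numbers:]

```python
def p_star(V, C):
    """Equilibrium fight probability V/(V+C) from the indifference condition."""
    return V / (V + C)

def prob_fight_first_k_periods(V, C, k):
    """Both players fight independently with probability p* each period,
    so the chance of a real fight in each of the first k periods is (p*^2)^k."""
    return (p_star(V, C) ** 2) ** k

V, C = 1.0, 0.75
print(prob_fight_first_k_periods(V, C, 1))  # p*^2, about 0.33
print(prob_fight_first_k_periods(V, C, 2))  # p*^4, about 0.11

# Fights get more likely as the prize grows...
assert p_star(2.0, 0.75) > p_star(1.0, 0.75)
# ...and less likely as the cost of fighting grows.
assert p_star(1.0, 0.75) < p_star(1.0, 0.5)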
  1057. Now, okay, that's reasonable,
    but I claimed I could show you
  1058. this not in a two period game,
    but in an infinite period game.
  1059. So let me spend the last five
    minutes taking us to infinite
  1060. period games.
    So everybody take a deep breath.
  1061. We're now going to consider
    something, we've never done
  1062. before.
    We're going to consider a game
  1063. that could go on forever.
    It could go on forever.
  1064. The way we're going to do that
    is to use the following idea,
  1065. the following picture.
  1066. So I can't really draw a true
    tree for the infinite period
  1067. game.
    The reason I can't draw a true
  1068. tree for the infinite period
    game is: (1) I would run out of
  1069. chalk;
    and (2) I'd run into lunch time.
  1070. But you can imagine what it
    looks like.
  1071. It looks like this,
    roughly speaking.
  1072. The infinite period game looks
    something like this.
  1073. And then it goes again,
    and then it goes again,
  1074. and so on, and it would go
    right the way through the board
  1075. and work right the way across
    whatever street that is,
  1076. right across campus.
    That's what the infinite tree
  1077. would look like,
    so I clearly can't really
  1078. analyze that object.
    But what I want to show you is
  1079. that we can still solve this
    game even though it's an
  1080. infinite period game.
    How are we going to do that?
  1081. Let's look at a particular
  1082. Let's call the stage Stage
    4,503 whatever that number was:
  1083. 4,503, whatever it was.
    So here is the stage,
  1084. this arbitrary stage,
    and the tree for this arbitrary
  1085. stage looks like this.
    What I'm going to do is:
  1086. this is Stage 4,503,
    and what I'm going to add to
  1087. this is that before you get into
    this stage,
1088. you're going to incur sunk costs.
  1089. If you go on playing after this
    stage then you're going to get
  1090. continuation values.
    So going into the beginning of
  1091. the game, you've incurred some
    sunk costs and if you come out
  1092. on the other side and go on
  1093. you're going to play some
    equilibrium and get continuation
  1094. values.
    But otherwise everything else
  1095. is the same.
    We still have (0,0) here,
  1096. we still have (0,
    V) here, we still have (V,
  1097. 0) here and here we still have
    -C plus continuation values and
  1098. -C plus continuation values.
    This is something we've seen
  1099. before.
    This little box is something
  1100. we've seen before.
    Essentially we've got sunk
  1101. costs in front,
    but they're irrelevant.
  1102. We've got continuation values
    at the end but we know how to
  1103. handle them, we just put them
    into the payoffs.
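The stage-game payoffs described above, with the continuation values folded into the "both fight" cell, can be written out as a small sketch. The function name, and the sample values V = 1 and C = 3 used to exercise it, are illustrative choices of mine, not from the lecture.

```python
def stage_payoffs(V, C, cv):
    """Payoffs (row player, column player) for one stage of the war
    of attrition, with continuation value cv added after mutual
    fighting.  V is the prize, C the per-period cost of fighting.
    Actions are 'quit' and 'fight'.
    """
    return {
        ('quit',  'quit'):  (0, 0),
        ('quit',  'fight'): (0, V),
        ('fight', 'quit'):  (V, 0),
        # If both fight, each pays C now and the game continues,
        # which is worth the continuation value cv to each player.
        ('fight', 'fight'): (-C + cv, -C + cv),
    }

# When continuation play is the mixed equilibrium, cv = 0, and the
# matrix collapses back to the one-shot stage game seen earlier.
payoffs = stage_payoffs(1, 3, cv=0)
assert payoffs[('fight', 'fight')] == (-3, -3)
assert payoffs[('fight', 'quit')] == (1, 0)
```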
  1104. So suppose now that in the
    continuation game,
  1105. people play the mixed strategy
    that we just found.
  1106. Suppose that in the
    continuation game people mix
  1107. with probability P*:
    they fight with probability P*
  1108. and quit with probability 1 - P*.
    Suppose in the continuation
  1109. game they're playing a mixed strategy.
  1110. In that case,
    what is the continuation value
  1111. of the game?
    What is it?
  1112. It's 0 right.
    If they're mixing in the
  1113. future, they always have the
    option to quit so it must be
  1114. that the continuation value is 0.
  1115. So if they mix in the future
    then the continuation value is
  1116. (0,0).
    So now let's go back to this
  1117. board.
    To make this board equivalent
  1118. to the board above,
    all I need to do is one thing.
  1119. I need to add on some sunk
    costs at the front.
  1120. I've got sunk costs at the front.
  1121. I'm going to play the game.
    And then I'm going to get,
  1122. instead of stage 2 values,
    I'm going to get stage--what
  1123. was it?--4,503 and all stages in
    the future values in here and
  1124. here, but otherwise it's the
    same thing.
  1125. And what's convenient is:
    all of those are 0 anyway.
  1126. Since they're all 0 anyway,
    this matrix is still correct,
  1127. the continuation values are 0
    and 0, these are now the
  1128. continuation values.
  1129. And so if I look for a
    mixed-strategy equilibrium in
  1130. this game, it's something I've
    solved already.
  1131. What's the mixed strategy
    equilibrium in this game?
  1132. Anybody?
    It's exactly what we found
  1133. before.
    Just as before,
  1134. I'm going to mix with
    probability of V / [V + C].
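That indifference calculation can be checked numerically. This is a sketch under the lecture's notation (V for the prize, C for the per-period cost of fighting); the helper name and the sample values V = 1, C = 3 are my own.

```python
from fractions import Fraction

def fight_probability(V, C):
    """Equilibrium probability of fighting in each stage.
    A player mixes only if fighting and quitting give the same
    expected payoff.  Quitting pays 0; fighting against an opponent
    who fights with probability p pays (1 - p)*V + p*(-C), since the
    continuation value after mutual fighting is 0.  Setting this to
    0 gives p* = V / (V + C).
    """
    return Fraction(V, V + C)

p_star = fight_probability(1, 3)   # prize V = 1, fighting cost C = 3
assert p_star == Fraction(1, 4)
# Indifference check: expected payoff of fighting equals 0,
# the payoff of quitting.
assert (1 - p_star) * 1 + p_star * (-3) == 0
```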
  1135. Let's summarize,
    we did something today--just
    now--that we've never done before.
  1137. We've looked at an infinite or
    at least potentially infinite
  1138. period game: a game that could
    go on forever.
  1139. The way in which we handled the
    game that could go on forever
  1140. was what?
    We noticed two things.
  1141. We noticed that part of the
    game that comes before,
  1142. that part of the game that's
    passed already,
  1143. anything that happened there is
    just a sunk cost.
  1144. It's irrelevant.
    It hurts if it's a cost.
  1145. It's nice if it's a gain.
    But it's sunk,
  1146. you can't affect it now.
    Anything in the future can be
  1147. summarized by the value,
    the payoff I'm going to get in
  1148. the future by playing the
    equilibrium in the future.
  1149. In this case,
    the future meant mixing.
  1150. Mixing gave me the value of 0.
    So here I am getting 0 from the
  1151. future.
    Then I can just analyze the
  1152. game in the stage in which I'm
    in, just as if it was an
  1153. ordinary, bog-standard,
    simultaneous move game.
  1154. When we did so,
    in this particular
  1155. example--we're going to see more
    examples like this after the
  1156. break--but in this particular
  1157. example we found out something quite interesting.
  1158. This thing we found out was,
    in these war of attrition
  1159. settings, there is an
    equilibrium with rational
  1160. players--more than that,
    common knowledge of
  1161. rationality: everybody's
    rational, everyone knows
  1162. everyone else is rational--there
    are equilibria in which,
  1163. not only people fight but they
    could fight forever.
  1164. In every period they fight with
    some probability and we got an
  1165. extra prediction out of it,
    a prediction that we weren't
  1166. expecting.
    Let me just give you that
  1167. prediction, and then we'll leave
    the class with that.
  1168. The extra prediction is this,
    if we look at these wars of
  1169. attrition, and we keep track of
    the time in which the--hang on
  1170. guys don't rush to the back
    yet--one more thing.
  1171. Suppose we look at the time in which
    the games have gone on and keep
  1172. track of the probability that a
    war will end at that time.
  1173. So imagine this is World War I.
    You could imagine World War I
  1174. going for one year,
    or two years,
  1175. or three years,
    or 20 years or whatever.
  1176. The probability distribution in
    this war of attrition is going
  1177. to look like this.
    In every period,
  1178. the probability of continuing
    is just P*².
  1179. So, in every period,
    the chance that--as you get
  1180. further into the future there's
    a greater chance the war will
  1181. end.
    You can get very long,
  1182. very costly wars;
    that's the bad news.
  1183. The good news is it doesn't
    happen very often.
  1184. I guess we're all involved in a
    rather large and costly war
  1185. right now, so I'll leave you
    with that pleasant thought over
  1186. Thanksgiving.
    Have a good break and we'll see
  1187. you afterwards.