## 16. Backward induction: reputation and duels


Showing Revision 1 created 07/17/2012 by Amara Bot.

1. Professor Ben Polak: So
this is what we did last time:
2. we looked at a game involving
an entrant and an incumbent in a
3. market;
and the entrant had to decide
4. whether to enter that market or
not;
5. and if they stayed out the
incumbent remained a monopolist;
6. and the monopolist made 3
million in profit.
7. If the entrant goes in,
then the incumbent can decide
8. whether to accommodate the
entrant and just settle for
9. duopoly profits,
making a million each;
10. or the incumbent can fight,
in which case the incumbent
11. makes no money at all and the
entrant loses a million dollars.
12. We pointed out a number of things about this game.
13. One was that when we analyzed
it in a matrix form we quickly
14. found that there were two Nash
Equilibria, that Nash
15. Equilibrium were:
in and not fight;
16. and out and fight.
But we argued that backward
17. induction tells us that the
sensible answer is in and not
18. fight.
Once the incumbent knows the
19. entrant is in they're not going
to fight because 1 is bigger
20. than 0, and the entrant
anticipating this will enter.
21. When we talked a little bit
more we said this other
22. equilibrium, this out fight
equilibrium--it is an
23. equilibrium because if the
entrant believes the incumbent's
24. going to fight then the entrant
is going to stay out,
25. and it's costless for the
incumbent to "fight" if in fact
26. the entrant does stay out
because they never get called
27. upon to fight anyway.
So the idea of this was that
28. for the incumbent to say they're
going to fight is an "incredible
29. threat."
That's terrible English.
30. It's the way it is always
taught in the textbooks.
31. It needs to be called a "not
credible" threat.
32. And that "not credible" threat
is: he's not really going to
33. fight if the entrant comes in,
and therefore,
34. the entrant should come in and
in fact the incumbent will
35. accommodate it.
So what we've shown here is
36. that, if we believe this
argument, then the entrant will
37. come in and the incumbent is
going to let him in.
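(The backward-induction argument just described can be sketched in a few lines of Python. The payoff numbers are the ones from the game on the board; the function and variable names are mine, not the lecture's.)

```python
# A minimal sketch of the one-market entry game, solved by backward
# induction. Payoffs are (entrant, incumbent) in millions of dollars.

def solve_entry_game():
    # Terminal payoffs: (entrant, incumbent)
    stay_out = (0, 3)           # incumbent remains a monopolist
    enter_fight = (-1, 0)       # incumbent fights the entrant
    enter_accommodate = (1, 1)  # duopoly profits, a million each

    # Step 1 (back of the tree): after entry, the incumbent compares
    # accommodating (1) with fighting (0).
    if enter_accommodate[1] > enter_fight[1]:
        incumbent_choice, after_entry = "accommodate", enter_accommodate
    else:
        incumbent_choice, after_entry = "fight", enter_fight

    # Step 2: the entrant anticipates the incumbent's choice and compares
    # entering with staying out.
    entrant_choice = "enter" if after_entry[0] > stay_out[0] else "stay out"
    return entrant_choice, incumbent_choice

print(solve_entry_game())  # -> ('enter', 'accommodate')
```

The "out, fight" Nash equilibrium never survives this procedure, because backward induction only ever asks what the incumbent would do once entry has actually happened.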
38. At the end, we started talking
39. about a more elaborate setting,
so let's just remind you of
40. what that more elaborate setting
is.
41. The more elaborate setting is
suppose that there is one firm,
42. one monopolist,
and that monopolist holds a
43. monopoly in ten different
markets.
44. So we'll have our monopolist be
Ale.
45. So here's Ale.
He's our monopolist,
46. and he owns pizzeria monopolies
in ten different markets.
47. And each of these ten different
markets are separate,
48. they are different towns.
And in each of those ten
49. markets he thinks--he knows he's
going to face an entrant and
50. those entrants are going to come
in order.
51. So let's just talk about who
those entrants are going to be.
52. The entrants are going to be
this person, this person and so
53. on.
Let's find out who they are.
Student: Isabella
55. Professor Ben Polak: Where
are you from?
56. Student: Miami.
Professor Ben Polak:
57. Miami, so Miami is one of the
markets.
Student: Scott.
59. Professor Ben Polak:
From where?
60. Student: Wisconsin.
Professor Ben Polak:
61. Where in Wisconsin?
62. Professor Ben Polak:
63. We're just going to do towns
now.
64. Student: My name is
Lang.
65. I'm from Bridgeport,
Connecticut.
66. Professor Ben Polak:
Okay, we've got three towns.
67. Student: I'm from Miami
too.
68. Professor Ben Polak:
69. Well we'll pretend you're from
somewhere else.
70. Put him in New Orleans or
something.
71. Student: Chris from
Boston.
72. Professor Ben Polak:
From Boston, all right.
73. Student: From Orange,
Connecticut.
74. Professor Ben Polak:
From Orange, Connecticut.
Student: St.
76. Louis, Missouri.
Professor Ben Polak: All
77. right, have we got ten yet?
I'm not quite at ten.
78. One, two, three,
four, five, six,
79. seven.
Student: Saffron,
80. New York.
Professor Ben Polak: All
81. right.
Student: Hong Kong.
82. Professor Ben Polak:
Hong Kong, that's way away.
83. Student: Long Island.
Professor Ben Polak:
84. Long Island.
I think I've probably got ten
85. markets here.
So Ale owns a pizza shop.
86. He's the monopoly pizza shop
owner in each of these ten
87. markets.
And what we're going to see is
88. we're going to see what happens
as, sequentially,
89. these entrants try to enter.
The way that this game's going
90. to work is that they're lined
up--we know the order in which
91. the entrants are going to come.
They're going to start off,
92. the first person who is going
to have to make a decision is-
93. Student: Enter.
Isabella.
94. Professor Ben Polak: Is
Isabella, right?
95. We're going to see how our
monopolist responds.
96. So let's have a look at this.
So Isabella who is in which
97. market again?
Student: Miami.
98. Professor Ben Polak: In
Miami, okay what are you going
99. to do?
Student: Enter.
100. Professor Ben Polak:
What are you going to do?
101. Student: I will fight.
Professor Ben Polak: Oh
102. dear, so you owe me a million
dollars.
103. So one person's down a million
dollars, let's see what happens
104. next.
Student: I'm going to
105. stay out.
Professor Ben Polak:
108. Student: I'm going to
stay out.
109. Professor Ben Polak:
Staying out.
110. So Bridgeport stayed out.
Student: I guess I'll
111. stay out.
Professor Ben Polak:
112. Stayed out again.
Student: Stay out.
113. Professor Ben Polak:
Which market are we up to now,
114. somewhere in Orange County
wasn't it?
115. Where were we?
Student Orange,
116. Connecticut.
Stay out.
117. Professor Ben Polak:
Stay out.
118. Student: I think I'll
stay in.
119. Professor Ben Polak:
You'll come in okay,
120. and which market is this?
Student: St.
121. Louis, Missouri.
Professor Ben Polak: St.
122. Louis, Missouri.
So you owe me a million dollars
123. as well, okay.
A couple of million dollars is
124. a good class.
We're going to have plenty of
125. money for lunch.
Student: I'm also going
126. to fight.
Professor Ben Polak:
127. You're going to fight,
which market is that?
128. Student: Saffron,
New York.
129. Professor Ben Polak:
130. Student: Saffron.
Professor Ben Polak:
131. Saffron, New York.
Where are we at?
132. One, two, three,
four, five, six,
133. seven, eight,
Ale?
134. TA: Fight.
Professor Ben Polak: You
135. fight?
So you owe me a million dollars
136. too.
That was eight, nine?
137. Student: Out.
Professor Ben Polak: Out
138. and ten?
Student: Stay out.
139. Professor Ben Polak:
Stays out, okay.
140. Now let's just notice something
here, which was the tenth
141. market?
What town were you?
142. Student: Long Island.
Professor Ben Polak:
Student: Huntington.
144. Professor Ben Polak: So
if Huntington,
145. Long Island our last market had
come in, suppose you'd said in,
146. what would Ale have said?
TA: I would not have
147. fought.
Professor Ben Polak:
148. Would not have fought,
aha!
149. Okay, so what happened here?
When we analyzed this last time
150. as an individual market,
we argued that each entrant
151. should come in just as our first
entrant came in,
152. and our monopolist should not
fight.
153. That's what we have up on the
board.
154. That's what backward induction
suggests.
155. But in fact, Ale fought.
A whole bunch of people came
156. in, and a whole bunch of them
stayed out, is that right?
157. Now why?
Why was Ale fighting these guys
158. and why were they staying out?
159. and what market were you again?
160. Student: Wisconsin.
Professor Ben Polak: So
Wisconsin stay out?
162. Student: Well we talked
about it last time how he has an
163. incentive to fight now because
there's more that just our
164. analysis up there in terms of
establishing that he'll fight to
165. keep people out.
Professor Ben Polak: All
166. right, so it looks like there
might be some reason for
167. fighting to keep you out.
So let's just talk about it a
168. little bit more,
let's go to the third guy.
169. Which market are you again?
Student: Bridgeport.
170. Professor Ben Polak:
Bridgeport, so why did you stay
171. out?
Student: Because I knew
172. he was going to fight.
Professor Ben Polak: You
173. knew he was going to fight.
Now how did you know he was
174. going to fight?
Student: Because he has
175. an incentive to establish,
he established that he was
176. going to fight for every single
market and so I was going to
177. lose out.
Professor Ben Polak: All
178. right, so we know--we think we
know that Ale is--you know he's
179. this tough Italian pizzeria
owner and we think he's going to
180. try and establish what:
a reputation as being a tough
181. pizzeria owner by fighting these
guys,
182. perhaps fighting a few guys
early on in order to keep these
183. guys out.
In fact, he had to fight the
184. first person but he kept out
2,3, 4,5, 6 and this person came
185. in, so 7 and 8 came in,
but then 9 and 10 he kept out.
186. So he kept a lot of people out
of the market by fighting early
187. on.
Now this argument sounds right:
188. it seems to ring true.
189. It's an argument about reputation, but now I want to
show you that there's a worry
190. with this argument.
The worry is this is a
191. sequential game and like all
sequential games of perfect
192. information we've seen in the
class, we should analyze this
193. game how?
Now that wasn't loud enough,
194. how?
Backward induction.
195. So where's the back?
Where's the back of this game?
196. Way back here--sorry for the
guys in the balcony.
197. Way back here we have the last
market in town--which was the
198. last market?
And if we look at this last
199. market, we in fact saw that if
the last market came in,
200. Ale in fact gave in.
Ale gave in.
201. Now why did Ale give in,
in the last market?
202. Let's have a look back on the
board.
203. So on the board we can see what
that last market looks like.
204. With ten markets this is a very
complicated game.
205. This would be the first market,
and then there's three versions
206. of the second market depending
on what Ale did in the first
207. market,
and so there's nine versions of
208. the third one.
The tree for this game is
209. horrendous.
But nevertheless,
210. once we get to the end of the
game, the tenth market--which
211. was what?
Bridgeport or something,
212. I've forgotten where it was at
now--wherever it was.
213. Once we get to that last market
this tree pretty well describes
214. that last market--is that
correct?
215. There isn't another market
afterwards.
216. There's only ten markets.
So, in this last market,
217. what do we know Ale's going to
do?
218. In this last market if the
entrant enters Ale is going to
219. not fight, which is exactly
what Ale did do.
220. So Ale, is that right?
So when in fact we discussed
221. the tenth guy coming in,
you chose to?
222. TA: I would have chosen
not to fight.
223. Professor Ben Polak:
Would have chosen not to fight.
224. That's exactly what the model
predicts.
225. He has no incentive to
establish a reputation for the
226. eleventh market because there
isn't an eleventh market.
227. He's done at ten.
Is that right?
228. So we know that in the last
market, the tenth market,
229. Ale actually is not going to
fight.
230. Therefore, the person who's the
entrant in the tenth market
231. should know that they can safely
enter and Ale won't fight them.
232. But now we're in trouble.
Why are we in trouble?
233. Well let's go back to the ninth
market, the second to last
234. market.
So I've forgotten where it was.
Who was the second to the last market?
236. Okay, the second to last market
is this guy?
237. You're the tenth market?
So this guy who is in the Hong
238. Kong market, he should know he's
the second to last market.
239. He knows that no matter what he
does the tenth market's going to
240. enter and Ale's going to give in
to the tenth market.
241. Ale's going to let the tenth
entrant in.
242. Is that right?
So the ninth market actually
243. knows that nothing Ale's going
to do here is going to establish
244. a reputation to keep the tenth
guy out.
245. So therefore in fact,
he should what?
246. He should come in, right?
He should come in,
247. and in fact if he had come in,
Ale would have had to give in
248. because there's no way that Ale
can keep the tenth guy out.
249. We just argued the tenth guy's
coming in by backward induction.
250. So since we know that the tenth
guy's coming in anyway,
251. and in fact,
Ale's going to concede to them,
252. there's no point Ale trying to
scare off the tenth guy.
253. So in fact, Ale's going to say
no fight to the ninth guy.
254. But now we go to the eighth guy.
We've just argued that the
255. tenth guy's coming in anyway and
Ale's going to give in to him.
256. We've argued the ninth guy's
coming in, so Ale's going to
257. give in to this guy as well
because you can't put off the
258. tenth guy.
And therefore we know that once
259. we get to the eighth guy,
once again, he can safely come
260. in because Ale knows by backward
induction he can't keep the
261. ninth and the tenth guy out
anyway,
262. and so this guy should come in
as well, and if we do this
263. argument all the way back,
what do we get?
264. He lets everybody in.
Everybody should come in and he
265. should let everybody in.
So we have a problem here.
266. We have a problem.
Backward induction says,
267. even with these ten markets,
Ale in fact should let
268. everybody in.
Everyone should know that,
269. so they should come in.
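(The unraveling argument can be written out mechanically. The sketch below is my framing, not the lecture's notation: solve the ten markets from the last one backwards, noting that in each market, once the later markets are already decided, fighting buys the sane incumbent nothing.)

```python
# A sketch of the backward-induction unraveling with a fully sane
# incumbent. In the last market fighting deters nobody, so the incumbent
# accommodates (1 > 0); by induction the same holds one market earlier,
# since the later markets are already decided; and so on back to market 1.

def unravel(n_markets=10):
    decisions = []
    for market in range(n_markets, 0, -1):
        # Nothing done here can change play in markets already solved,
        # so only this market's payoffs matter: accommodating (1) beats
        # fighting (0), and the entrant, anticipating this, enters.
        decisions.append((market, "enter", "accommodate"))
    return list(reversed(decisions))

for market, entrant, incumbent in unravel():
    print(f"market {market}: entrant {entrant}s, incumbent {incumbent}s")
```

The loop never branches: that is the paradox. The theory predicts entry and accommodation in all ten markets, which is not what happened in class.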
So there's a disconnect here.
270. There's a disconnect between
what the theory is telling
271. us--backward induction is
telling us Ale can't keep people
272. out by threatening to fight,
by establishing a reputation
273. --and what we actually just
saw, what happened,
274. which was Ale did fight and did
keep people out,
275. and we know that other
monopolists do that as well.
276. So how can we make rigorous
this idea of reputation?
277. It's not captured by what we've
done so far in the class.
278. So how can we bring back what
must be true in some sense--the
279. intuition that,
by fighting,
280. Ale could keep people out and
therefore will keep people out.
281. So to make that idea work I
want to introduce a new idea.
282. And the new idea is that,
with very small probability,
283. let's say 1% chance,
Ale is crazy.
284. So stand up a second,
so he looks like a normal kind
285. of guy but there's just 1%
chance that he's really bonkers.
286. There's a 1% chance that he's
actually Rahul.
287. So now let's redo the
analysis--And what do I mean by
288. bonkers?
By bonkers, I mean,
289. with 1%, Ale is the kind of guy
who likes to fight.
290. So with 1% chance,
he's actually not got these
291. payoffs at all;
he's actually got some
292. different payoffs,
which are payoffs of somebody
293. who--okay he'll lose money--but
he so much enjoys a fight he
294. gets +10 here.
That's the bonkers guy's payoff.
295. But there's only 1% chance he's
this bonkers guy.
296. So now what happens?
Let's just walk it through.
297. With 1% chance,
if there was only one market,
298. not the ten markets.
So there's only one market and
this one market was--I've forgotten who it was.
300. Student: Isabella.
Professor Ben Polak:
301. Isabella, who was in which
market, I've forgotten.
302. Student: Miami.
Professor Ben Polak: In
303. Miami, then she doesn't really
much care about the 1% chance
304. that Ale is actually Rahul.
That doesn't really bother her
305. very much, why?
Because with 99% chance Ale's
306. going to give way and that's
good enough odds.
307. With 99% chance she's happy to
come in.
308. So if there was only one market
here, we'd be done.
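(A quick expected-value check, my arithmetic using the lecture's numbers, shows why one market alone settles nothing: a 1% chance of facing the crazy type barely dents the entrant's expected payoff.)

```python
# With a single market and a 1% chance that Ale is the crazy type who
# always fights, the entrant's expected payoff from entering is still
# well above the 0 she gets from staying out.

p_crazy = 0.01
payoff_if_fought = -1        # entrant loses a million if Ale fights
payoff_if_accommodated = 1   # duopoly profit if Ale gives way

expected_entry = (p_crazy * payoff_if_fought
                  + (1 - p_crazy) * payoff_if_accommodated)
print(expected_entry)  # about 0.98 million, versus 0 from staying out
```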
309. But with ten markets things are
a little different.
310. Why? Let's see why.
So suppose in fact that
311. Isabella in Miami thinks that
Ale--and everybody else thinks
312. Ale is a pretty sane guy.
With 99% probability he's a
313. sane guy, and Isabella enters
and everyone sees this.
314. And to everyone's surprise,
rather than doing the sane
315. thing, which is letting Isabella
in and switching to a duopoly in
316. Miami,
what happens,
317. in fact, after Isabella comes
in is that Ale fights.
318. Okay, so now it's too late for
Isabella, she's lost her money,
319. but our second market is,
320. Student: Scott.
Professor Ben Polak:
321. Scott, which market were you?
322. Professor Ben Polak: So
323. Scott saw what happened in Miami and
initially he thought that Ale
324. was the
325. sane, nice, calm,
Italian guy.
326. But on the other hand,
he just saw this sane,
327. calm, Italian guy fight,
as he shouldn't have fought
328. because of backward
induction--fought the entrant in
329. Miami.
So now Scott thinks to himself:
330. hmm, I'm not so sure that Ale
is this sane guy.
331. Maybe I should update my
beliefs in the direction of
332. thinking that Ale might actually
be the insane guy.
333. So maybe--we're up to maybe a
probability 1/3 that Ale's
334. actually insane.
So he thinks:
335. okay, probability 1/3 that's
still not very much,
336. I'll still risk it,
he comes in,
337. and Ale fights him again.
It's a probability 1/3 he's
338. sane.
He's going to give in to me.
339. He comes in--Ale fights him
again.
340. So now we're in the third
market, which was which market?
341. Student: Bridgeport.
Professor Ben Polak:
342. Bridgeport.
And Bridgeport's seen this
343. horrible fight going on in Miami
and this horrible fight going on
and now he's getting pretty
345. sure that this nice,
calm, looking Ale is not nice,
346. calm, looking Ale.
He's crazy Rahul.
347. There's a lot of evidence that
he's crazy Rahul.
348. He's fought the last two
markets making huge losses.
349. It must be that Ale likes to
fight.
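(The belief revision in this story is just Bayes' rule. The numbers below are illustrative assumptions of mine, not the lecture's formal model: the crazy type fights for sure, and the sane type was expected to fight only 2% of the time, which happens to push a 1% prior up to roughly the 1/3 mentioned above after one observed fight, and much higher after two.)

```python
# A hedged sketch of the entrants' belief updating via Bayes' rule.
# Assumptions (mine): P(fight | crazy) = 1, P(fight | sane) = 0.02.

def update_crazy_belief(prior_crazy, p_fight_sane, p_fight_crazy=1.0):
    """Posterior P(crazy | observed a fight), by Bayes' rule."""
    numer = prior_crazy * p_fight_crazy
    denom = numer + (1 - prior_crazy) * p_fight_sane
    return numer / denom

belief = 0.01  # the 1% prior that Ale is bonkers
for fight in (1, 2):
    belief = update_crazy_belief(belief, p_fight_sane=0.02)
    print(f"after fight {fight}: P(crazy) = {belief:.2f}")
```

Each fight Ale wins against his own short-run interest shifts probability mass toward the crazy type, which is exactly why the later entrants stay out.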
350. So what does he do?
He says, I'm going to stay out
351. of here.
I'm convinced that this guy may
352. be crazy, so I'll stay out.
And all the subsequent markets,
353. they think: oh well you know he
fought these first two markets,
354. that means he must be a crazy
guy or at least there's a high
355. probability that he's a crazy
guy,
356. so they all stay out which is
exactly what happened until we
357. got to here.
And even here when they came in
358. Ale acted like crazy Rahul.
359. So what made all of this possible was what?
360. What made it possible was the small
possibility, the 1% possibility
361. that Ale is bonkers.
But you know,
362. how well do you know Ale?
There's a 1% chance he's
363. bonkers.
How many of you think you're
364. really sure that he's a sane
guy?
365. He supports Italian football
teams, he's got to be pretty
366. crazy, right?
So what happened here?
367. This small possibility that Ale
is crazy allowed him to build up
368. a reputation that kept all these
guys out,
369. but actually the argument is
stronger than that.
370. Let's try and push this
argument harder.
371. Suppose, in fact,
that Ale is not crazy.
372. Suppose that Ale is the sane
Ale, the nice,
373. calm, Ale we all know and love.
But we've just argued that if
374. Ale acts as if he's the crazy
guy then you're going to be
375. convinced that he is the crazy
guy,
376. so by acting crazy he might be
able to convince you that he is
377. crazy and therefore keep you
out.
378. So we argued before that,
if there's some chance that
379. Ale's crazy, by acting crazy
early on,
380. he's going to deter these late
entrants from entering the
381. market because they think
they're fighting Rahul and they
382. don't want to fight Rahul.
But we said these early guys,
383. they had probability .99 that
he was sane;
384. and .6 that he was sane;
and maybe even .5 he was sane
385. here--so they thought of coming
in.
386. But now we're arguing that even
if Ale is sane,
387. even if he's that sane guy,
a rational guy,
388. he should behave as if he's
crazy in order to keep these
389. late guys out.
And these early players knowing
390. that even the sane version of
Ale is going to fight them,
391. should also stay out.
Now notice something's happened
392. here.
They're not staying out because
393. they think Ale's crazy,
they're staying out because
394. they know that even the sane
version of Ale is going to fight
395. them in order to seem crazy,
is that right?
396. Everyone see that's a stronger
argument?
397. So now even these early guys
are going to stay out of the
398. market.
Now we're almost there.
399. What we've argued--let's just
make sure we get the two pieces
400. of this on the board.
We've argued that if there's an
401. epsilon chance,
a very small chance,
402. let's call it a 1% chance where
Ale is crazy,
403. then he can deter entry by
fighting, i.e.,
404. seeming crazy.
We argued that what really
405. makes this argument strong is
once we realize that the sane
406. person's going to act crazy,
we really know that everyone's
407. going to act crazy and therefore
we should stay out.
408. Now that argument won't quite
be right.
409. So that's enough of the
argument I want you to have for
410. the purpose of the exam,
but let me just point out that
411. that argument isn't quite
correct.
412. That can't quite be an
equilibrium.
413. Now why can't that be an
equilibrium?
414. We've just argued that even the
sane version of Ale--so this is
415. a sort of slightly more subtle
argument so just pay attention a
416. second.
We've argued that even the sane
417. version of Ale is going to act
like a crazy guy.
418. So if anyone came in,
he's going to act crazy anyway.
419. So you're not going to update
your belief as to whether he's
420. crazy or sane because we know
that the crazy guy is going to
421. fight because he likes fighting
and the sane guy is going to
422. fight because he wants to seem
like a crazy guy.
423. So you're really learning
424. nothing from the fighting now.
But now let's go back to our
425. tenth market,
way back in our tenth market.
426. Our tenth market participant,
whose name was Andy,
427. what has he learned about
Ale?
428. He hasn't learned anything
because whether Ale was sane or
429. crazy he's going to fight.
So observing what his actions
430. early on, if that was really an
equilibrium, our tenth guy
431. wouldn't have updated his belief
at all,
432. and therefore,
would still believe with
433. probability .99 that Ale was
sane, in which case our tenth
434. guy would enter.
Once again, that argument would
435. unravel from the back.
So what I described before
436. can't quite be an equilibrium.
It can't be just as simple as
437. all sane guys are going to act
crazy because then you wouldn't
438. learn anything.
So it turns out that the
439. equilibrium in this model is
actually very subtle,
440. and it involves mixed
strategies,
441. and mixed strategies are
something we did in the first
442. half of the semester,
so we don't want to go back to
443. it now.
So trust me,
444. you can solve this out with
mixed strategies and the basic
445. idea I gave you is right.
The basic idea is sane guys are
446. occasionally going to act like
crazy guys in order to establish
447. a reputation,
and that reputation helps them
448. down the tree.
So this idea that even when
449. there's a chain store,
people will enter--even when
450. Ale has ten monopolies,
people will enter--this is a
451. famous idea.
It's called the Chain Store
452. Paradox, and it's due to a guy
called Selten who actually won
453. the Nobel Prize.
This is the Chain Store Paradox
454. and this idea of establishing
reputation is a big idea.
455. The idea is once again you
might want to behave as if
456. you're someone else in order to
deter people's actions,
457. in order to affect people's
actions down the tree.
458. Okay, so what have we learned
here?
459. Let's just work it out.
So the first thing we learned
460. was kind of a nerdy point,
but let me make it anyway.
461. Introducing just a very,
very small probability,
462. just a tiny probability that
Ale might be someone else--he
463. might be a Rahul,
he might be crazy,
464. he might like fights--that very small probability completely
465. changes the outcome of the game.
If we were all 100% sure he was
466. sane we'd be tied by backward
induction and entry would
467. follow.
He wouldn't be able to keep
468. anybody out.
But that small probability
469. allows us to get a very
different outcome.
470. That's the first point I want
to draw into this.
471. The second point I want to get
out of this, is really this idea
472. of reputation.
There are lots of settings in
473. society where reputation matters
and one of them is a reputation
474. to fight.
How many of you have friends
475. who have somewhat short fuses?
You know people who have short
476. fuses, right?
When you're going out,
477. choosing some movie to go to
with these guys who have short
478. fuses or trying to decide who's
going to order something at a
479. restaurant,
or who's going to get to be
480. which side when you're playing
some game, some video game.
481. I claim, is this true,
that the people who have
482. slightly short fuses,
slightly more often get their
483. way, is that right?
If you have a sibling who has a
484. short fuse, they slightly more
often get their way and that's
485. exactly this idea.
They have short fuses,
486. the fact that they tend to blow
up and get angry at you gives
487. them a little bit of an advantage.
488. And notice that maybe they
don't have a short fuse at all,
489. maybe they're just pretending
to have a short fuse because
490. they know they're going to get
their way over you softies more
491. often.
None of you have short fuses,
492. you're all sane people right?
So this idea,
493. it should be a familiar idea to
you, but it's not just an idea
494. in the sort of trivial world of
bargaining.
495. Notice this idea of reputation
occurs all over the place.
496. So another place it occurs,
somewhat grim place it occurs,
497. is in the subject of hostage
negotiations.
498. In the subject of hostage
negotiations,
499. when some other country has
seized some hostages from the
500. U.S.
or maybe some criminal
501. organization has seized some
members of your family or some
502. members of your community and is
holding the hostages,
503. there's a well known idea which
is what?
504. Which is that you should never
negotiate with hostage takers,
505. is that right?
Everyone's heard that idea?
506. You should never negotiate with
hostage takers.
507. You never give in just because
they have hostages.
508. Why?
It's the same idea:
509. because you want to have a
reputation for being somebody
510. who doesn't give in to hostage
takers in order to deter future
511. potential hostage takers from
taking hostages.
512. This has grim consequences but
sometimes it's worth bearing the
513. cost of having your relatives
come back in pieces in order to
514. deter future relatives from
being taken.
515. So that's a somewhat macabre
version of this.
516. Let me give you one other
version.
517. There are whole areas of the
economy where reputation is
518. crucial, where if people played
according to their
519. backward-induction,
sane incentives,
520. we'd have a disaster.
But having a reputation here
521. isn't necessarily a reputation
as a tough guy.
522. It could be a reputation for
somebody who's a nice guy.
523. Could be that you want to have
a reputation for being the sort
524. of person who derives pleasure
or utility from,
525. (quote) "doing the right
thing," from acting honestly.
526. There are whole professions where the reputation
527. of the person in the profession
is crucial.
528. Doctors, for example:
it's crucial for a doctor that
529. he or she has the reputation of
someone who tells the truth.
530. Otherwise, you'd stop going to
that doctor.
531. Accountants:
accounting firms rely on having
532. a reputation for being honest
and not cheating the books.
533. When they stop having that
reputation for being
534. honest--think of Arthur Andersen
after the events in Enron--they
535. pretty quickly cease to be in business.
536. I gave that example a couple of
years ago, and it was very
537. embarrassing because it turned
out Arthur Anderson was in the
538. class--literally Arthur Anderson
III was in the class.
539. These things happen at Yale.
But nevertheless,
540. Arthur Anderson relied on his
reputation, the firm relied on
541. its reputation as an honest
firm,
542. and it was worth behaving
honestly to maintain that
reputation.
543. Reputation is a huge topic and
544. my guess is that the next time
there's a Nobel Prize in game
545. theory it'll be for this idea.
So that's my prediction.
546. Now having said that,
I want to spend the whole of
547. the rest of today playing one
game and analyzing one game.
548. So we're going to play this
game and for this game I need a
549. couple of volunteers.
So I'm going to pull out some
550. volunteers.
Anyone want to volunteer?
551. I need two volunteers for this
game.
552. How about my guy from the
football team,
553. was that a raised hand?
It wasn't a raised hand,
554. how about my guy from the
football team?
555. Is it football team?
Baseball team,
556. that may be unfair in this
particular game.
557. Maybe I'll take someone who
isn't on the baseball team.
558. Anyone who's on the football
team?
559. These guys, you guys on the
football team?
560. Okay great, so you two guys.
I need you at the front and
Chevy and Patrick.
562. So Chevy and Patrick are going
to be our volunteers.
563. Now the idea of this game is
you guys provided the
564. volunteers--wait down here a
second--you guys provided the
565. volunteers.
This game involves two
566. volunteers that you just
provided, and two wet sponges.
567. I will provide the wet sponges.
568. So I have here a couple of
sponges and in a minute I'm
569. going to wet them,
and I'll tell you what the
570. rules are in a second.
Okay, so I'm going to give one
571. of these sponges each to Chevy
and Patrick, and then going to
572. position Chevy and Patrick at
either end of this central
573. aisle,
of this aisle here,
574. and the game is going to be as
follows.
575. It's important that everyone
listen to the rules here because
576. I'm going to pick two more
volunteers in a moment.
577. So the game they're going to
play is as follows.
578. Each of them has one sponge.
It's crucial they only have one
579. sponge.
And they're going to take turns.
580. And when it's your turn,
as long as you still have your
you face a choice.
582. You can either throw your
583. and if you hit your opponent
you win the game,
584. or you have to take a step
forward.
585. So either you throw the sponge
or you take a step forward.
586. Now there's a crucial rule here.
Each player only has one sponge
587. and, once they've thrown that
sponge, they do not get the
588. sponge back.
Everyone understand that?
589. Once you've thrown the sponge
you do not get the sponge back.
590. So once again,
if you throw your sponge at
591. your opponent and you hit your
opponent then you've won the
592. game.
But if you throw your sponge at
593. your opponent and you miss,
the game continues.
594. So let's make sure we
understand that,
595. if you throw your sponge and
miss, the game continues.
596. You still have to step forward.
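(Looking ahead, the sponge duel can also be solved by backward induction. The toy model below is my own sketch, not the lecture's formal treatment: both players share the same hit probability, assumed here to be 1 at distance 0 and to fall by 0.1 per step, and a throw that misses loses the game, since the opponent can then walk all the way up.)

```python
# A toy backward-induction solution of the duel. At distance d the mover
# either throws -- winning with probability hit(d), losing on a miss --
# or steps forward, after which the opponent moves at distance d - 1.

def hit(d):
    # Hypothetical accuracy: perfect at point blank, decaying with distance.
    return max(0.0, 1.0 - 0.1 * d)

def win_prob(d):
    """Mover's winning probability at distance d under optimal play."""
    if d <= 0:
        return 1.0                   # point blank: throw and hit for sure
    p_throw = hit(d)                 # win on a hit, lose on a miss
    p_step = 1.0 - win_prob(d - 1)   # opponent becomes the mover, one step closer
    return max(p_throw, p_step)

def should_throw(d):
    # Throw exactly when throwing now beats handing the move over.
    return hit(d) >= 1.0 - win_prob(d - 1)

for d in range(10, -1, -1):
    print(d, should_throw(d), round(win_prob(d), 2))
```

With these made-up accuracies the first throw becomes optimal at distance 5, where hit(5) + hit(4) = 0.5 + 0.6 first reaches 1; further out, stepping is better.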
597. So what is your opponent going to do at that point?
598. What's he going to do?
Let's make sure that our
599. football players understand
this.
600. It wasn't meant that way.
They could have been soccer
601. players, come on.
Student: I didn't
602. appreciate that very much
Professor Ben Polak:
603. I'm sorry I didn't mean it
that way.
604. So if your opponent whose name
is Patrick throws and misses,
605. what are you going to do?
Student: I'll walk
606. forward until I slam the sponge
in his face.
607. Professor Ben Polak:
Great, you will walk forward
608. until you politely put it on his face.
609. Everyone understand?
That, if in fact,
610. you throw and miss you've lost
the game, because the other guy
611. can wait until he's standing
right on top of you and just
place the sponge gently on his
face.
613. Now for fairness sake,
it's important that these
614. sponges are equally weighted,
and I'm going to weight
615. them--I'm going to put water in
them now.
616. And--you know nothing but the
best for Yale students--I'm
617. going to provide Yale University
spring water.
618. Who knew that Yale University
made spring water?
619. That's kind of a strange one.
If it makes you feel better you
620. can think of this as American beer.
621. I'm not going to make these too
heavy, partly because it makes
622. it too easy and partly because I
don't want to get sued.
623. So I'm going to squeeze these
out somewhere away from the
624. wires.
We're going to get our judge
625. here to weigh them,
I need a mike here,
626. let me get a mike.
So I'll need you to hold those
627. sponges in your hands and tell
me if they're equally weighted.
628. Pretty equal?
Okay, they're pretty equal,
629. everyone agrees.
So how is this going to work?
630. I'm going to give the blue
sponge to Chevy and the green
631. sponge to Patrick.
And Chevy's going to stand
632. here, and Patrick's going to
stand as far back as I can get
633. him on camera,
which I'm going to be told how
634. far back I can go.
Don't go too far.
635. Okay come back Patrick,
you're too ambitious.
636. Come back.
Keep coming. stop.
637. Okay, we're going to start
here--start quite close
638. actually.
Everyone understand how we're
639. going to play this?
So Chevy is Player 1,
640. Patrick is Player 2.
Chevy has to decide whether to
641. throw or to step.
Student: I'll step.
642. Professor Ben Polak:
Okay, let's just hold the game a
643. second.
Now its Patrick's turn.
644. Does anyone have any advice for
Patrick at this point?
If you think throw,
raise your hand.
646. If you think step,
raise your hand.
647. There are a lot more steps than
throws.
648. I thought the Yale football
team was good this year.
649. Your choice, step or throw.
I should announce two other
650. rules.
It's kind of important I should
651. have said this.
First, a step has to be a
652. proper step, like a yard long;
and second (I think this will
653. work in America):
gentlemen never duck.
654. No dodging the sponge okay.
655. Student: I don't really
trust my arm.
656. I'm going to step.
Professor Ben Polak: All
657. right, so you're stepping again.
Let me go to Patrick.
658. I feel like I'm in the line of
fire here.
659. Patrick what are you going to
do?
660. Student: I'm going to
throw.
661. Professor Ben Polak:
Patrick's going to throw,
662. I'm really going to get out of
the way then.
663. We'll see this in slow motion.
664. Continue, all right so you have
to take a step forward,
665. so Chevy's going to take a step
forward I assume?
666. Patrick's going to take a step
forward.
667. Chevy's going to take a step
forward.
668. Patrick's going to take a step
forward.
669. Good, so a round of applause
for our players,
670. thank you.
So I think we have time to do
671. this once more,
and then we're going to analyze
672. it.
So I want to get two women
673. involved.
It's too sexist otherwise.
674. So can we have two women in the
675. Two volunteers,
come on, you can volunteer.
676. There's a volunteer.
Thank you great.
677. Okay great, thank you.
678. Student: Jessica.
Professor Ben Polak:
679. Jessica and your name is?
Student: Clara-Elise.
680. Professor Ben Polak:
Clara-Elise and Jessica.
681. We'll start at the same
positions, we'll use the same
682. sponges, and I just need to
remind you where that position
683. was.
Just give me a thumbs up when
684. I'm in the right position.
Good, same rules,
685. Clara-Elise and Jessica.
We'll let Jessica be Player 1.
686. So Jessica you can step or
throw, what do you want to do?
687. Student: I'm going to
step.
688. Professor Ben Polak:
Going to step,
689. okay.
You know what might be a good
690. idea: Ale why don't you put the
mike on Clara-Elise.
691. So we have a mike at either end
rather than running to and fro,
692. that's good.
So Clara-Elise what are you
693. going to do?
Student: I'm going to
694. step.
Professor Ben Polak:
695. You're going to step and Jessica
what are you going to do?
696. Student: I'm going to
step.
697. Professor Ben Polak:
You're going to step.
698. Ale and I are in danger here
but never mind.
Do people think that Jessica
700. should throw?
If you think she should throw
There's a large majority for
702. step.
So up to you,
703. what are you going to do?
Student: I'm going to
704. step.
Professor Ben Polak:
705. Going to step okay.
Clara-Elise any decisions?
706. It's a pretty light sponge;
it's pretty hard to throw the
707. sponge far, as we've seen.
Okay, stepping again.
708. Student: I'm going to
throw.
709. Professor Ben Polak:
You're going to throw okay,
710. here we go, let me get out of
the way.
Clara-Elise's turn.
712. Student: I'll step.
Professor Ben Polak:
713. You'll step.
Jessica has to step,
714. Clara-Elise has to step,
all right good.
715. So we've seen how the game
works, everyone understands how
716. the game works.
I want to spend the rest of
717. today analyzing this game.
Before I do so,
718. we should just talk about what
this game is.
719. Let me get some new boards down
here.
720. So one quick announcement:
I'm going to analyze this,
721. and we're going to spend the
rest of today analyzing this,
722. but this is going to be quite
hard.
723. So I'm going to provide you a
handout that I'll put on the web
724. probably tomorrow that goes over
this argument.
725. So you don't have to take
detailed notes now,
726. I want you to pay attention and
see if you can follow the
727. argument.
So this game is called duel,
728. not surprisingly,
and you may wonder what are we
729. doing--as are my colleagues that
are here--what are we doing
730. having a duel in class.
Of course one answer to that
731. is, it's kind of fun watching
people
732. throw wet sponges at each other.
That's probably reason in
733. itself.
But there are other reasons.
734. Duel is a real game.
So those of you who are well
735. versed in Russian literature
will have seen duels before,
or at least read of duels.
There are some famous duels in
737. Russian literature.
Anyone want to tell me some
738. famous duels in Russian
literature?
739. Any Russian majors here?
No, nobody wants to give me a
740. shot at this?
Really nobody.
741. This is Yale, come on.
Well, how about in War and
742. Peace okay.
There's a duel like this in War
743. and Peace, and in War and Peace,
without giving away the
744. ending--actually it's in the
middle of the book and it's 800
745. pages long,
so it isn't exactly the ending.
746. But in War and Peace I think
we're led to believe that the
747. hero, the protagonist Pierre,
shoots his gun--in War and
748. Peace it's a gun and not a
sponge, no surprise--he shoots
749. his gun too early,
we're led to believe.
750. There's a famous one in Eugene
Onegin, in Pushkin's Eugene
751. Onegin, and there are lots of
others actually,
752. so there's lots in literature.
There are also settings which
753. aren't exactly out of
literature.
754. So one example would be in a
bike race.
755. How many of you ever watch the
Tour de France?
756. Everyone know what I mean by
the Tour de France.
757. So this is a bike race that
goes around France.
758. It takes place in stages.
And in the Tour de France,
759. there's a key decision.
There's a game within the game,
760. and the game within the
game--I'm looking at Jake who's
761. a real cyclist here--but the
game within the game is when do
762. you try to break away from the
pack,
763. which is called the peloton.
And if you break away too early
764. from the peloton,
it turns out that you're going
765. to get reeled in.
It turns out that over the long
766. haul the peloton can go much
faster than you.
767. So if you break too early
they're going to catch you up.
768. On the other hand,
if you break too late,
769. then you're going to lose
because there are going to be
770. some people in the peloton who
are just excellent sprinters.
771. So if you break too late the
sprinters are going to win the
772. race.
So you have to decide when to
773. break from the peloton.
This is the second most
774. important game within a game in
the Tour de France,
775. the most important game within
a game is where to hide your
776. steroids.
So let me give you one other
777. example.
Let me give you a more economic
778. example--it's meant to be an
economics class.
779. Imagine there's two firms,
and these two firms are both
780. engaged in R&D,
research and development.
781. And they're trying to develop a
new product, and they're going
782. to launch this new product onto
the market.
783. But the nature of this market
is--maybe it's a network
784. good--the nature of this market
is there's only going to be one
785. successful good out there.
So essentially there's going to
786. be one standard,
let's say of a software or a
787. technological standard,
and only one of them is going
788. to survive.
The problem is you haven't
finished developing the product yet.
If you launch your product too
790. early it may not work,
and then consumers are never
791. going to trust you again.
But if you launch it too late
792. the other side will have
launched first:
793. they will have got a toehold in
the market, and you're toast.
794. So that game--that game about
product launch--is like duel,
795. except you're launching a
product rather than launching a
796. sponge.
Is that right?
797. Now all of these games have a
common feature,
798. and it's a new feature for us.
It's about the nature of the
799. strategic decision.
In most of the games we've
800. looked at in the course so far
the strategic decision has been
801. of the form: where should I
locate,
802. how much should I do,
what price should I set,
803. should I stand for election or
not?
804. Here the strategic decision is
not of the form what should I
805. do.
It's of the form what?
806. When?
It's of the form when am
807. I going to launch the sponge?
We know perfectly well what
808. you're going to do.
You're going to throw the
809. sponge.
The strategic issue in question
810. is when.
So the issue here is when.
811. So to analyze this,
I'm going to need a little bit
812. of notation, and let me put that
notation up now.
813. So in particular,
I want to use the notation,
814. Pi[d]
to be what?
815. Let Pi[d]
be Player i's probability of
816. hitting if i shoots at distance
d.
817. So Pi[d]
is the probability that i will
818. hit if he or she shoots at
distance d.
819. Everyone happy with that?
That's the only notation I'm
820. going to use today.
And I'm going to make some
821. assumptions about the nature of
this probability.
822. Two of the assumptions are
pretty innocent.
823. So let's draw a picture.
So the picture is going to look
824. like this.
Here's a graph,
825. and on the horizontal axis I'm
going to put d.
826. This is the distance apart of
the two players.
827. And on the vertical axis I'm
going to put P which is the
828. probability, Pr the probability.
So here they're at distance 0,
and I'm going to make an
assumption about what the
830. probability of hitting is if
you're at distance 0.
831. What's the sensible assumption?
What's the probability of
832. hitting somebody with your
sponge if you're 0 distance
833. away?
1, okay, I agree, so 1.
834. So the first assumption I'm
going to make is,
835. if they're right on top of each
other, they're going to hit with
836. probability 1.
Now the second assumption I'm
837. going to make is:
as you get further away this
838. probability decreases.
It doesn't have to look exactly
839. like this but something like
that.
840. That also I think--Is that an
okay assumption?
841. As you're further away there's
a lower probability of hitting.
842. Now I'm not going to assume
that these two players have
843. equal abilities.
For example, I don't know;
844. I didn't ask them but one of
our two football players might
845. be a quarterback and the other
one might be a linebacker or a
846. running back.
And I'm assuming the
847. quarterback is probably more
accurate, is that right?
848. So I'm not going to assume that
they're equally good,
849. so it could be that their
abilities look like this.
850. This could be P1[d],
and this could be P2[d].
851. Everyone okay with that?
So shout this out,
852. in this picture,
who is the better shot and who
853. is the less good shot?
Who is the better shot?
854. 1 is the better shot because at
every distance if Player 1 were
855. to throw, Player 1's probability
of hitting is higher than Player
856. 2's probability of hitting as
drawn.
857. Now, I don't even need to
assume this.
858. It could well be that these
probabilities cross.
859. It could be that these curves
cross.
860. So it could be that Player 1 is
better at close distances,
861. but Player 2 is better at far
distances.
862. That's fine.
We'll assume it's like this
863. today but I'm not going to use
that.
864. I could do away with that.
As drawn, Player 1 is the
865. better shot.
Now I'm going to make one
866. assumption that matters,
and it's really a critical
867. assumption.
I'm going to make the
868. assumption because it keeps the
math simple for today.
869. We have enough math to do
anyway.
870. I'm going to assume that these
abilities are known.
871. I'm going to assume that not
only do you know your own
872. ability of hitting your opponent
at any distance.
873. I'm going to assume you also
know the ability of your
874. opponent to hit you.
875. Now let's look at this a second.
We've got a little bit of
876. notation on the board,
let's discuss this a second.
877. What do we think is going to
happen here?
878. In this particular example we
have a good shot and a less good
879. shot.
Who do we think is going to
880. shoot first?
Let's try and cold call some
881. people.
So you sir what's your name?
882. Student: Frank.
Professor Ben Polak:
883. Frank, so who do you think is
going to shoot first,
884. the better shot or the worse
shot?
885. Student: The better shot
but also depends on who steps
886. first.
Professor Ben Polak:
887. Okay, let's assume Player 1 is
going to step first.
888. Student: Player 1.
Professor Ben Polak: So
889. Frank thinks Player 1 is going
to shoot first because Player 1
890. is the better shot.
891. Let's see, so what's your name?
Student: Nick.
892. Professor Ben Polak:
Nick, who do you think is going
893. to shoot first?
Student: I think Player
894. 2 would shoot first.
Professor Ben Polak: All
895. right, so let's talk why.
Why do you think Player 1 was
896. going to shoot first?
Let's do a poll.
897. How many of you think the
better shot's going to shoot
898. first?
How many people think the worse
899. shot's going to shoot first?
How many people of you are
900. being chickens and abstaining?
Quite a few okay.
901. So why do we think the better
shot might shoot first?
902. Student: At equal
distance, he has a better chance
903. of hitting.
Professor Ben Polak:
904. Because he has a better chance
of hitting, but why do you think
905. that the less good shot might
shoot first?
906. Student: He knows that
if P1 gets too close he's going
907. to win anyway,
so he may as well take a shot
908. with lower chance before P1.
Professor Ben Polak: All
909. right, okay so you have two
arguments here,
910. the first argument is maybe the
better shot will shoot first
911. because,
after all, he has a higher
912. chance of hitting.
And the other argument says
913. maybe the worse shot will shoot
first, to what?
914. To pre-empt the better shot
from shooting him.
915. But now we get more
complicated, because after all,
916. if you're the better shot and
you know that the worse shot
917. maybe going to try and shoot
first to try and pre-empt you
918. from shooting him,
you might be tempted to shoot
919. before the worse shot shoots to
preempt the worse shot from
920. pre-empting you from shooting
him.
921. And if you're the worse shot
maybe you're going to try and
922. shoot first, even earlier,
to pre-empt the better shot
923. from pre-empting the worse shot,
from pre-empting the better
924. shot from shooting the worse
shot and so on.
925. So what's clear is that this
game has a lot to do with
926. pre-emption.
Pre-emption's a big idea here,
927. but I claim it's not at all
obvious who's going to shoot
928. first, the better shot or the
worse shot.
929. Is that right?
So those people who abstained
931. before, it seemed like it was a
pretty sensible time to abstain.
932. It's not obvious at all to me
who's going to shoot first here.
933. Are people convinced at least
that it's a hard problem?
934. Yes or no, people convinced?
Yeah okay good.
935. It's a hard problem.
So what I want to do is,
936. as a class, as a group,
what I want us to do is solve
937. this game;
and I want to solve not just
938. who is going to shoot first,
I want to figure out exactly
939. when they're going to shoot.
So we're going to do this in
940. the next half hour and we're
going to do it as a class,
941. so you're going to do it.
So we're going to nail this
942. problem basically,
and we're going to do it using
943. two kinds of arguments.
One kind of argument is an
944. argument we learned the very
first day of the class and
945. that's a dominance argument,
and the second kind of argument
946. is an argument we've been using
a little bit recently,
947. and what kind of argument is
that?
948. What is it?
Backward induction.
949. So we're going to use dominance
arguments and backward
950. induction, and we're going to
figure out not just whether the
951. better shot or the worse shot's
going to shoot,
952. but exactly who's going to
shoot when.
953. Let's keep our picture handy,
get rid of, well I can get this
954. down I guess.
955. Can you still see the picture?
All right, let's proceed with
956. this argument.
So to do this argument,
957. I first of all want to
establish a couple of facts.
958. I want to establish two facts,
and we'll call the first fact:
959. Fact A.
Let's go back to our two
players.
In fact, maybe it would be
961. helpful to have our players.
Can I use our first two players
962. as props.
Can I have you guys on stage?
963. While they're coming up--sorry
guys, I'm exploiting you a bit
964. today.
I hope you both signed your
965. legal release forms.
Why don't you guys sit here a
966. second so I can use you as
props.
967. So imagine that these two guys
still have their sponges.
968. Let's actually set this up.
So suppose that Chevy still has
969. a sponge, and Patrick still has
his sponge.
970. And suppose it's Chevy's turn,
and suppose that Chevy is
971. trying to decide whether he
should throw his sponge or not.
972. Let me give you a mike each so
you have them for future
973. reference.
So Chevy is trying to decide
974. whether to throw his sponge or
not.
975. Now suppose that Chevy knows
that Patrick is not going to
976. shoot next turn when it's his
turn.
977. So Chevy's trying to decide
whether to shoot and he knows
978. that Patrick is not going to
shoot next turn when it's his
979. turn.
What should Chevy do?
980. Chevy what should you do?
Student: Take a step.
981. Professor Ben Polak:
Take a step, that's right.
982. Why should he take a step?
What's the argument?
983. Why he should take a step?
Well let's find out:
984. what's the argument?
Student: Because I'll
985. just be one step closer and I'll
be able to make the same choice
986. next time.
Professor Ben Polak:
987. Good, hold the mike up to you.
You're a rock star now.
988. All right good,
he's correctly saying he should
989. wait, why should he wait?
Because he's going to be closer
990. next time.
So the first fact is:
991. assuming no one has thrown yet,
if Player i knows (at (say)
992. distance d) that j will not
shoot--let me call it tomorrow;
993. and tomorrow he'll be closer,
he'll be at distance d -
994. 1--then Chevy correctly says,
I should not shoot today.
995. Again, recall the argument,
the argument is you'll get a
996. better shot, a closer shot,
the day after tomorrow.
997. Now, let's turn things around.
Suppose, conversely--once again
998. we're picking on Chevy a
second--so Chevy has his sponge.
999. No one has thrown yet,
and suppose Chevy knows that
1000. Patrick is going to throw
tomorrow.
1001. Now what should Chevy do?
Well that's a harder decision.
1002. What should Chevy do?
He knows Patrick's going to
1003. shoot tomorrow.
What should he do?
1004. Should he shoot, or what?
1005. What do you reckon?
Student: It depends.
1006. Professor Ben Polak: It
depends.
1007. I think that's the right
answer.
1008. Good, so the question
is--someone else,
1009. I don't want to pick entirely
on these guys--so what does it
1010. depend on?
It's right that it depends.
1011. What does it depend on?
Student: If the other
1012. guy's chances are greater than
or less than 50%.
1013. Professor Ben Polak: All
right, so it might depend on the
1014. other side's chances being less
than or great than 50%.
1015. It certainly depends on the
other guy's ability and on my
1016. ability.
Everyone clear on that?
1017. Everyone agrees whether I
should shoot now if I know the
1018. other guy is going to shoot
tomorrow depends on our
1019. abilities,
but how exactly does it depend
1020. on our abilities?
Student: It depends on:
1021. if you're probability to hit is
greater than his probability to
1022. miss.
Professor Ben Polak:
Good. What's your name?
Student: Osmont
1024. Professor Ben Polak:
Osmont.
1025. So Osmont is saying--let's be
careful here:
1026. it depends on whether my
probability of hitting if I
1027. throw now is bigger than his
probability of missing
1028. tomorrow.
And why is that the right
1029. comparison?
That's the right comparison
1030. because, if I throw now,
my probability of winning the
1031. game is the probability that I
hit my opponent.
1032. If I wait and take a step,
then my probability of winning
1033. the game is the probability that
he misses me tomorrow.
1034. So I have to compare winning
probabilities with winning
1035. probabilities:
I have to compare apples with
1036. apples, not apples with oranges.
Everyone see that?
1037. Okay, so let's put that up.
So the same assumption:
1038. assuming no one has thrown,
if i knows (at d) that j
1039. will shoot tomorrow (at
d--1),
1040. then i should shoot if--need a
gap here--if i's probability of
1041. hitting at d--and let me leave a
gap here--is bigger than or
1042. equal to--it doesn't really
matter--
1043. greater than or equal to j's
probability of missing tomorrow.
1044. Because this is the probability
that you'll win if you throw,
1045. and this is the probability
that you'll win if you wait.
1046. Okay, so let's put in what
those things are.
1047. So the probability that i will
hit at distance D,
1048. that's not that hard,
that's Pi[d]--everyone happy
1049. with that?
What's the probability that j
1050. will miss tomorrow if j throws?
What's the probability that j
1051. will miss?
Somebody?
1052. Let's be careful,
so it's 1--Pj--but what
1053. distance will they be at--d--1:
so it's (1--Pj[d--1]).
1054. So this is the key rule,
if Chevy knows that Patrick's
1055. going to shoot tomorrow then
Chevy should shoot if his
1056. probability of hitting Pi[d]
is bigger than Patrick's
1057. probability of missing (1 -
Pj[d--1]).
1058. Now I want to do one piece of
math.
1059. This is the only math in this
proof.
1060. So everyone who is math phobic,
which I know there is a lot of
1061. you, can you just hold onto your
seats?
1062. Don't panic:
a little bit of math coming.
1063. This is the math:
I'm going to add Pj[d--1]
1064. to both sides of this
inequality.
1065. That's it, so what does that
tell me?
If I add Pj[d--1] to this side,
1067. I get Pi[d] + Pj[d--1],
everyone happy with that?
1068. On the other side if I add
Pj[d--1], I get just 1.
1069. Everyone happy with that?
So here's our rule,
1070. our rule is,
let's flip it around,
1071. if Patrick hasn't thrown yet
and thinks that Chevy's going to
1072. shoot tomorrow then Patrick
should shoot now if his
1073. probability of hitting now plus
Chevy's probability of hitting
1074. tomorrow is bigger than 1.
Let's call this * and let's put
1075. this stuff up somewhere where we
can use it for future reference.
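For reference, the board work just completed amounts to one line of algebra (same notation as the lecture, with Pi[d] written as $P_i[d]$):

```latex
% Fact B's shooting rule, rearranged by adding P_j[d-1] to both sides:
P_i[d] \;\ge\; 1 - P_j[d-1]
\quad\Longleftrightarrow\quad
P_i[d] + P_j[d-1] \;\ge\; 1 \qquad (*)
```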
1076. Sorry guys, I'll use you again
in a minute.
1077. I know you're feeling self
conscious up there.
1078. Believe me, I'm self conscious
up here too.
1079. So let's look at that *
inequality up there.
1080. Now way out here,
is that * inequality met or not
1081. met?
1082. It's not met because way out
here these two probabilities are
1083. small, so the sum is less than
1.
1084. In here, is the * inequality
met or not met?
1085. Let me pick on you guys,
so Patrick is it met or not met
1086. in here?
1087. Student: It's met.
Professor Ben Polak:
1088. It's met, thank you okay.
So, in here,
1089. the inequality is met:
the sum is bigger than 1;
1090. and out here it's less than 1.
If we put in all the steps
1091. here--here they are getting
closer and closer together.
1092. Here's the steps,
as they get closer and closer
1093. together.
We put these steps in.
1094. There's going to be some step
where, for the first time,
1095. the * inequality is met.
Notice that they start out
1096. here, they get closer and
closer, it's not met,
1097. not met,
not met, not met,
1098. not met, not met,
then suddenly it's going to
1099. met.
Maybe around here.
1100. Let's just try and pick it out,
maybe it's here.
1101. So this might be the first time
that this * inequality is met,
1102. let's call it d*.
Everyone understand what d* is?
1103. At every one of these steps to
the right of d*,
1104. when we take the sum of the
probability Pi[d]
1105. + Pj[d--1], we get something
less than 1.
1106. But to the left of d* or closer
in than d*--the game is
1107. proceeding this way because it's
moving right to left,
1108. they're getting closer and
closer--to the left of d* the
1109. sum of those probabilities is
bigger than 1.
1110. Let's say that again,
d* is the first step at which
1111. the sum of those two
probabilities exceeds 1.
1112. Everyone okay about what d* is?
Anyone want to ask me a
1113. question?
Okay, people should feel free
1114. to ask questions on this,
I want to make sure everyone is
1115. following.
Is everyone following so far?
1116. Yeah?
I need to see all your eyes.
1117. You don't look like you're
stuck in the headlamps like you
1118. were on Monday.
I think we're better off than
1119. we were on Monday.
Good okay.
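Since d* does all the work in what follows, here is a tiny sketch of how one could compute it. The two hit-probability curves are made-up linear examples, not anything from the lecture; they are chosen only so that the probability is 1 at distance 0 and falls as the distance grows:

```python
# Hypothetical hit-probability curves (an assumption for illustration):
# both players hit for sure at distance 0, accuracy decays linearly.
def p1(d):
    return max(0.0, 1.0 - 0.08 * d)  # player 1, the better shot

def p2(d):
    return max(0.0, 1.0 - 0.12 * d)  # player 2, the worse shot

def d_star(p_shooter, p_other, max_d=50):
    """Walking in from far away (d decreasing), return the first
    distance at which the '*' inequality
    p_shooter(d) + p_other(d - 1) >= 1 is met."""
    for d in range(max_d, 0, -1):
        if p_shooter(d) + p_other(d - 1) >= 1:
            return d  # first step where the sum reaches 1
    return 0  # point-blank: the inequality holds trivially there

print(d_star(p1, p2))  # with these made-up curves: 5
print(d_star(p2, p1))  # also 5, whichever player is on the move
```

With these curves the sum of the two probabilities falls as the players move apart, so the first distance (scanning inward) where the sum reaches 1 is exactly the d* of the lecture.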
1120. Okay, so here's our picture,
and I actually want a bit more
1121. space--I'm not going to have
much more space.
1122. So now I'm going to tell you
the solution.
1123. The solution to this game is
this.
1124. I claim that the first shot
should occur at d*.
1125. So that's my claim.
1126. No one should shoot until you
get to d*, and whoever's turn it
1127. is--whether it's Chevy's turn or
Patrick's turn at d*--that
1128. person should shoot.
That's my claim,
1129. and that's what we're going to
prove.
1130. Everyone understand the claim?
It says: nobody shoots,
1131. nobody shoots,
nobody shoots,
1132. nobody shoots,
nobody shoots,
1133. nobody shoots,
shoot.
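Before the in-class proof, here is a hedged numerical check of the claim: a small backward-induction sketch, not anything from the lecture itself. It assumes made-up linear hit-probability curves (certain hit at distance 0, accuracy decaying with distance), alternating turns, and the rule that a miss loses the game, since the opponent can walk up and score for sure:

```python
from functools import lru_cache

# Made-up hit-probability curves (an assumption, not the lecture's data):
P = {0: lambda d: max(0.0, 1.0 - 0.08 * d),   # the better shot
     1: lambda d: max(0.0, 1.0 - 0.12 * d)}   # the worse shot

@lru_cache(maxsize=None)
def value(i, d):
    """(win probability, best action) for player i, on the move at
    distance d with both sponges still unthrown."""
    if d == 0:
        return 1.0, "throw"              # point-blank: a certain hit
    throw = P[i](d)                      # hit now and win; a miss loses
    step = 1.0 - value(1 - i, d - 1)[0]  # opponent then moves at d - 1
    return (throw, "throw") if throw >= step else (step, "step")

def first_throw(start, mover=0):
    """Distance at which the first sponge flies, starting at `start`."""
    i, d = mover, start
    while value(i, d)[1] == "step":
        i, d = 1 - i, d - 1
    return d

print(first_throw(10))  # 5: the first throw comes exactly at d*
```

Under these assumed curves, backward induction says both players step at every distance greater than 5 and whoever moves at distance 5 throws, which is the pattern the claim describes.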
1134. Let's prove it, okay?
1135. Yeah, people should be awake.
If the person next to you is asleep,
1136. nudge them hard.
Good, all right,
1137. let's start this analysis way
out here, miles apart.
1138. These guys are miles apart and
I want to use them as props.
1139. So I've got these guys--stay
where you are Patrick but stand
1140. up, and Chevy over here
somewhere, just where that black
1141. line is.
Maybe they're even further than
1142. this.
They're really far apart.
1143. And here they are miles away,
and let's say it's Chevy's
1144. turn.
He's way out here:
1145. perhaps the first step of the
game.
1146. Imagine it's even further
because it was even further,
1147. and let's think through what
should be going on in Chevy's
There are two possible things
1149. going on.
He's going to think about what
1150. Patrick's going to do.
So this is Chevy's turn.
1151. Here he is.
And he should think:
1152. tomorrow, it's going to be
Patrick's turn,
1153. and there's two possibilities.
One possibility is that Patrick
1154. is not going to shoot tomorrow.
And if we think that Patrick is
1155. not going to shoot tomorrow,
which fact should Chevy use?
1156. Should he use Fact A or Fact B?
Chevy which fact should you
1157. use?
Student: Fact A.
1158. Professor Ben Polak:
Fact A, okay.
1159. So, using Fact A,
he should not shoot.
1160. Alternatively,
he could think that Patrick's
1161. going to shoot tomorrow,
is that right?
1162. He could think Patrick's going
to shoot tomorrow.
1163. If he thinks Patrick's going to
shoot tomorrow,
1164. which fact should he use?
B.
1165. He should use Fact B,
in which case he has to look at
1166. this inequality up here and say,
I'll shoot if my probability of
1167. hitting now plus his probability
of hitting tomorrow is bigger
1168. than 1.
Well let's have a look.
1169. This is Patrick's probability
of hitting today and this is
1170. Chevy's probability
of--sorry--this is Chevy's
1171. probability of hitting today,
and this is Patrick's
1172. probability of hitting tomorrow.
And is the sum of them bigger
1173. than 1 or not?
Is it bigger than 1 or not?
1174. It's not bigger than 1.
So what should Chevy do?
1175. He should step.
So he'd step.
1176. Now it's Patrick's turn,
and once again,
1177. imagine this distance is still
pretty large,
1178. and there's two things Patrick
could think.
1179. Patrick could think that
Chevy's not going to shoot
1180. tomorrow.
So here's Patrick,
1181. he's looking forward to Chevy
tomorrow, and he could think
1182. that Chevy's not going to shoot
tomorrow.
1183. If he thinks Chevy's not going
to shoot tomorrow which fact
1184. should he choose?
Student: A.
1185. Professor Ben Polak:
Should choose Fact A,
1186. okay.
If he's using Fact A,
1187. he should not shoot.
Alternatively,
1188. he could think that Chevy is
going to shoot tomorrow,
1189. in which case he uses Fact B,
and what does he do?
1190. He adds up his probability,
Patrick's probability of
1191. hitting today plus Chevy's
probability of hitting tomorrow:
1192. he asks is that sum bigger than
1,
1193. and he concludes, no.
So we have no shot here and no
1194. shot here, and notice that both
of those arguments were
1195. dominance arguments.
In each case,
1196. whether Chevy thought that
Patrick was going to shoot
1197. tomorrow or not,
in either case,
1198. he concluded he should not
shoot today.
1199. When it was Patrick's turn,
whether Patrick thought that
1200. Chevy was going to shoot
tomorrow or not,
1201. in either case,
he concluded he should not
1202. shoot today.
So he takes a step forward.
1203. This argument continues,
it'll be Chevy's turn next,
1204. and once again he'll look at
these two possibilities.
1205. If he thinks Patrick's not
shooting tomorrow,
1206. he wants to step,
if he thinks Patrick is going
1207. to shoot tomorrow,
he's again going to want to
1208. step the way it's drawn.
And, once again,
1209. we'll conclude step.
We'll go on doing this
1210. argument, and everyone see that
in each case this dominance
1211. argument will apply.
It won't matter whether I think
1212. you should shoot tomorrow or
not, in either case,
1213. it'll turn out that I should
step forward:
1214. whether Fact A applies or
whether Fact B applies,
1215. So we'll go on going forward
and we'll have:
1216. no shot, no shot,
no shot, no shot,
1217. no shot, no shot,
and we'll arrive at d*.
1218. So it turns out that d* is
going to be Chevy's turn again.
1219. At d* we try exactly the same
reasoning.
1220. At d* he says,
if I think Patrick is not going
1221. to shoot tomorrow what should I
do?
1222. What should I do if I think
Patrick's not going to shoot
1223. tomorrow?
Student: Not shoot.
1224. Professor Ben Polak: Not
shoot.
1225. But now something different
occurs.
1226. Now he says,
if I think Patrick is going to
1227. shoot tomorrow,
then when I look at my
1228. inequality up there,
my * inequality,
1229. and add up my probability of
hitting today--which is this
1230. line here--plus Patrick's
probability of hitting
1231. tomorrow--which is this line
here--suddenly he finds it is
1232. bigger than 1.
So now if Chevy thinks that
1233. Patrick is going to shoot
tomorrow, what should Chevy do?
1234. He should shoot.
So up until this point,
1235. a dominance argument has told
us no one should shoot,
1236. but suddenly we have a dilemma.
The dilemma is:
1237. if Chevy thinks Patrick's not
shooting, he should step;
1238. and if Chevy thinks Patrick is
shooting, he should shoot.
1239. Everyone with me so far?
So what have we shown so far?
1240. We've shown that no one should
shoot until d* but we're stuck
1241. because we don't know what to do
at d*;
1242. because we don't know what
Chevy should believe at d*.
1243. We don't know whether Chevy
should believe that Patrick's
1244. going to shoot or whether Chevy
should believe that Patrick's
1245. not going to shoot.
So how do we figure out what
1246. Chevy should believe Patrick's
going to do?
1247. Wait, wake up the guy in orange
there, the guy with the ginger
1248. hair, that's right.
What's the answer to the
1249. question?
Student: Backward induction.
1250. Professor Ben Polak:
Good, backward induction is the
1251. answer to the question.
Round of applause for
1252. getting that right,
especially when you're asleep,
1253. right.
Okay, so now we're going to use
1254. backward induction,
but where does backward
1255. induction start here?
Backward induction starts at
1256. the back of the game,
and what's the back of the game
1257. here?
The back of the game is where?
1258. It's when these two guys,
neither of them have thrown
1259. their sponge,
and they've reached here.
1260. So come here a second,
step, step.
1261. Let's assume it's Patrick's
turn, and they're absurdly
1262. close: they're uncomfortably
close.
1263. If they had longer noses they'd
be touching.
1264. They're at distance 0.
And at distance 0,
1265. at d = 0, let's suppose it's
Patrick's turn.
1266. So at d = 0,
no one has shot,
1267. it's Patrick's turn,
he's got the sponge,
1268. what should Patrick do?
Shout it out Patrick.
1269. Student: I should shoot.
Professor Ben Polak: You
1270. should shoot.
Patrick should shoot, right?
1271. At d = 0, say it's 2's turn,
and the answer is he should
1272. shoot because the probability of
it hitting at distance 0 is 1.
1273. Let's just move you to the side
a bit so that people can see the
1274. board.
I know it's an awkward dance
1275. but here you are--stop there:
that's good.
1276. So at distance 0 they should
certainly shoot.
1277. So now let's go back a step in
the backward induction.
1278. So we're at distance 1,
just take a step back.
1279. So take a step back,
it's Chevy's turn at distance
1280. 1.
And what does Chevy know at
1281. distance 1?
Chevy what do you know?
1282. Shout it out.
Student: That Patrick
1283. will shoot next turn.
Professor Ben Polak:
1284. Right, so now Chevy knows that
Patrick's going to shoot
1285. tomorrow.
So which fact should Chevy use
1286. in deciding whether he should
shoot today?
1287. B.
He should use Fact B,
1288. and that tells us--so 1 knows
that 2 will shoot tomorrow,
1289. so by B, 1 should shoot if his
probability of hitting at
1290. distance 1 plus Patrick's
probability of hitting at
1291. distance 0,
if that is bigger than 1.
1292. Well is it bigger than 1?
Well let's have a look.
1293. So we put a shot in here,
1294. and we're looking at distance
1.
1295. And so we're looking at this
distance here plus this distance
1296. here.
Is it bigger than 1?
1297. Yeah, it's bigger than 1.
Here's a bit more math.
1298. I lied to you before.
1 plus something is bigger than
1299. 1.
So this is bigger than 1.
1300. So shoot.
Let's put it on our chart as
1301. shoot.
Let's go back a
1302. step--sorry--I'll have to have
Chevy do all the going
1303. backwards.
Now we're at distance what?
1304. We're at distance 2,
we're at d = 2,
1305. and it's Patrick's turn,
2's turn, and what does Patrick
1306. know?
Shout it out Patrick.
1307. Student: I know that
Chevy's going to shoot next
1308. turn.
Professor Ben Polak:
1309. Patrick knows that Chevy's going
to shoot next turn,
1310. so Patrick therefore should use
which fact?
1311. Fact B.
And Fact B tells him that he
1312. should shoot--let's just put
this in.
1313. So 2 knows that 1 will shoot
tomorrow, so by B it's all the
1314. same thing.
We know that 2 should shoot if
1315. P2[2]
+ P1[1]
1316. is bigger than 1,
and if we look at it on the
1317. board--here we are--it's 2's
turn,
1318. he's looking at this distance
plus this distance,
1319. and is it bigger than 1?
It is bigger than 1.
1320. So he should shoot.
And we can go on doing this
1321. argument backwards.
We'll find that Chevy should
1322. shoot here because this plus
this is bigger than 1.
1323. And we'll know that here
Patrick once again will know
1324. that Chevy's going to shoot
tomorrow,
1325. so he should use Fact B,
so he should shoot provided
1326. this plus this is bigger than 1,
but it is.
1327. Now, we're back at d* and the
question we had at d*-- the
1328. question we'd left hanging at
d*--was what?
1329. At d* we knew already that
Chevy would not shoot if he
1330. thought Patrick was not going to
shoot tomorrow,
1331. but he should shoot if thinks
Patrick is going to shoot
1332. tomorrow, but what does Chevy
know at d*?
1333. Student: I know Patrick
is going to shoot.
1334. Professor Ben Polak: He
knows Patrick's going to shoot,
1335. so he should shoot.
Is that right?
1336. He knows Patrick's going to
shoot by backward induction so
1337. he should shoot.
So we just solved this.
1338. What did we actually show?
We showed, have seats
1339. gentlemen--sorry to keep you up
here--we know that prior to d*
1340. no one will shoot--will not
shoot--and we know that at d*,
1341. and in fact at any point
further on, we should shoot.
1342. That's horrible writing but it
says shoot.
1343. So we've shown more than we
claimed.
1344. We claimed that the first shot
should occur at d*,
1345. and we've actually shown more
than that: we've shown that even
1346. if you went beyond d*,
if you somehow had failed
1347. to shoot at d*,
at least you should shoot now.
1348. Give me like two more minutes
or three more minutes to finish
1349. this up because we're at a high
point now.
1350. Everyone okay to wait a couple
of minutes?
1351. Okay, so what did we prove here?
We proved that the first shot
1352. occurs at d* whoever's turn it
is at d*.
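[The whole argument can be run as a short backward-induction program. This is a hypothetical model for illustration: the alternating-turn convention and the linear hit probabilities below are my assumptions, not the lecture's numbers.]

```python
def solve_duel(p, max_d):
    """Find d*, the distance of the first shot, by backward induction.

    p[i](d) is player i's probability of hitting at distance d,
    assumed to equal 1 at d = 0 and to fall as d grows.  The players
    alternate turns, with player (d % 2) on the move at distance d.
    """
    will_shoot = [True]  # at d = 0 the player on the move hits for sure
    for d in range(1, max_d + 1):
        mover, other = d % 2, (d - 1) % 2
        # Shoot only if the opponent would shoot tomorrow (Fact B) and
        # my chance today plus his chance tomorrow exceeds 1; otherwise
        # stepping forward dominates (Fact A).
        will_shoot.append(will_shoot[d - 1]
                          and p[mover](d) + p[other](d - 1) > 1)
    # d* is the largest distance at which the mover still shoots.
    return max(d for d in range(max_d + 1) if will_shoot[d])


# Hypothetical abilities: player 0 is the better shot.
p = [lambda d: max(0.0, 1 - 0.10 * d),
     lambda d: max(0.0, 1 - 0.15 * d)]
print(solve_duel(p, max_d=10))  # 4: no one shoots before d = 4
```

[With these numbers the sum of consecutive hit probabilities first exceeds 1 at d = 4, so the first shot comes at d* = 4, whichever player happens to be on the move there.]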
1353. It wasn't that the best guy,
the better shot,
1354. should shoot first,
or the worse shot should shoot
1355. first.
It turned out that,
1356. given their abilities,
there was a critical distance
1357. at which they should shoot.
If you go back to the
1358. eighteenth century military
strategy, you should shoot when
1359. you see the whites of their
eyes, which is at d*.
1360. But we learned something else
on the way.
1361. I claimed we learned that if
you're patient and you go
1362. through things carefully,
that the arguments we've
1363. learned from the course so far,
dominance arguments and
1364. backward induction arguments,
can solve out a really quite
1365. hard problem.
This was hard.
1366. It would have been useful for
the guy in War and Peace,
1367. or in Onegin or the guys
cycling in the Tour de France,
1368. or you guys with your sponges
to know this.
1369. And we can solve this exactly
using backward induction,
1370. and everyone in the room can do
it.
1371. Let me just push the argument a
tiny bit further.
1372. One thing we've always asked in
this class, is okay that's fine
1373. if everyone knows what's going
on in the game:
1374. here we have our smart Yale
football players and they know
1375. how to play this game,
so they're going to shoot at
1376. the right time.
But what happens if,
1377. instead of playing another
smart Yale football player,
1378. they're playing some uneducated
probably simple-minded football
1379. player from,
say, Harvard.
1380. Now that changes things a bit
doesn't it because we know that
1381. the Yale football player is
sophisticated,
1382. has taken my class,
and knows that he should shoot
1383. at d*, but the Harvard guy
doesn't learn anything anymore,
1384. so they're stuck.
So if you're the Yale guy
1385. playing the Harvard guy how does
that change things?
1386. Should you shoot earlier than
d* when you're playing against
1387. the Harvard guy,
or later than d* when you're
1388. playing against the Harvard guy?
Let's try our Yale guys and see
1389. what they think,
what do you think Chevy?
1390. Student: Definitely not
earlier.
1391. Professor Ben Polak:
Definitely not earlier,
1392. that's the key thing right.
Now why?
1393. Why definitely not earlier?
Student: Because if you
1394. miss, the other person has a
probability of 1,
1395. you have a higher chance of
missing.
1396. Professor Ben Polak: All
right, so I claim Chevy's right.
1397. That's good because I've just
claimed that Yale football
1398. players are sophisticated.
Chevy's right,
1399. that even if you're playing
against a Harvard guy you
1400. shouldn't shoot before d*
because it was a dominant
1401. strategy not to shoot before
d*.
1402. It doesn't matter whether you
think the Harvard guy is going
1403. to be dumb enough to shoot early
or not.
1404. If he is dumb enough to shoot
early, so much the better.
1405. You should wait until d*.
Notice that argument doesn't
1406. depend on you playing against
somebody who is sophisticated,
1407. or someone who's less
sophisticated,
1408. like a Harvard football player,
or somebody who's basically a
1409. chair,
like a Harvard football player.
1410. You shouldn't shoot before d*
because it's a dominant strategy
1411. not to shoot before d*.
Now, you might want to wait a
1412. little to see if they're not
going to shoot early,
1413. to see if he's not going to
shoot, but you certainly
1414. shouldn't shoot early.
Let me finish with one other
1415. thing.
Every time when we've played
1416. this game in class,
whether it's here or up in SOM,
1417. people shoot too early.
They miss.
1418. You can do the econometrics on
this, you could figure out that,
1419. on average--average
abilities--I'm sometimes getting
1420. the better shots,
sometimes I'm getting the worse
1421. shots--on average I should see
people hitting about half of the
1422. time over a large sample.
But here I tend to see people
1423. miss, as we did today,
almost all the time.
1424. Why do we see so many misses?
So one problem may be that
1425. people are just overconfident.
They're overconfident on their
1426. ability to throw.
And there's a large literature
1427. in Economics about how people
tend to be overconfident.
1428. But there's another possible
explanation, and let me just
1429. push it past you as the last
thing for today.
1430. I think Americans--I think this
doesn't go for the
1431. Brits--Americans have what I
call a "pro-active bias."
1432. You guys are brought up since
you're in kindergarten--maybe
1433. before--and you're told you have
to be pro-active.
1434. You have "to make the world
come to you."
1435. And my evidence for this is
based on sophisticated empirical
1436. work watching Sports Center.
So on Sports Center when they
1437. interview these sweaty athletes
after the game,
1438. the sweaty athletes say,
it's great I now control my own
1439. destiny.
Well, I'm a Brit.
1440. I think controlling my own
destiny sounds kind of scary to
1441. me;
it doesn't sound like a good
1442. thing at all.
In fact, if I wanted to control
1443. my own destiny,
I wouldn't have got married.
1444. That's going to be edited off
the film, but the point I want
1445. to make is this.
Every time I play this game,
1446. when I ask people why they
shoot early I hear the same
1447. thing, and it's evidence to this
proactive bias.
1448. People say, well at least I
went down swinging and the
1449. problem is: the aim in life is
not to go down swinging,
1450. it's not to go down.
So one lesson to get from this
1451. lecture is, sometimes waiting is
a good strategy.
1452. Alright, and we'll come back to
it on Monday.