12. Evolutionary stability: social convention, aggression, and cycles

  1. All right, so this was the
    definition we saw at the end
  2. last time.
    I've tried to write it a bit
  3. larger, it just repeats the
    second of those definitions.
  4. This is the definition that
    connects the notion of
  5. evolutionary stability,
    for now in pure strategies,
  6. with Nash Equilibrium.
    Basically it says this,
  7. to check whether a strategy is
    evolutionarily stable in pure
  8. strategies,
    first check whether
  9. (Ŝ,Ŝ) is a symmetric
    Nash Equilibrium.
  10. And, if it is,
    if it's a strict Nash
  11. Equilibrium we're done.
    And if it's not strict,
  12. that means there's at least
    another strategy that would tie
  13. with Ŝ against Ŝ,
    then compare how Ŝ
  14. does against this mutation with
    how the mutation does against
  15. itself.
    And if Ŝ
  16. does better against the mutation
    than the mutation does against
  17. itself then we're okay.
    One virtue of this definition
  18. is it's very easy to check,
    so let's try an example to see
  19. that and also to get us back
    into gear and remind
  20. ourselves what we're doing a
    bit.
  21. So in this example--sort of
    trivial game but still--the game
  22. looks like this.
    And suppose we're asked the
  23. question what is evolutionarily
    stable in this game?
  24. So no prizes for finding the
    symmetric Nash Equilibrium in
  25. this game.
    Shout it out.
  26. What's the symmetric Nash
    Equilibrium in this game?
  27. (A,A).
    So (A,A) is a symmetric Nash
  28. Equilibrium.
    That's easy to check.
  29. So really the only candidate
    for an evolutionarily stable
  30. strategy here is A.
    So the second thing you would
  31. check is: is (A,A) a strict Nash
    Equilibrium?
  32. So what does strict Nash
    Equilibrium mean?
  33. It means that if you deviate
    you do strictly worse.
  34. So is (A,A) strict Nash?
    Well is it strict Nash or not?
  35. It's not.
    So if you deviate to B,
  36. you notice very quickly that
    U(A,A) is equal to U(B,A).
  37. Is that right?
    It's just a tie,
  38. both of these get 1.
    So it's not a strict Nash
  39. Equilibrium so we have to check
    our third rule.
  40. What's our third rule?
    So we need to check how does A
  41. do against the possible
    deviation which is here--which
  42. is B here, how does that compare
    with B against itself?
  43. So U(A,B) the payoff to A
    against B is 1 and the payoff to
  44. B against itself is 0:
    1 is indeed bigger than 0,
  45. so we're okay and A is in fact
    evolutionarily stable.
  46. So: just a very,
    very simple example to
  47. illustrate how quick it is to
    check this idea.
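Since that check is completely mechanical, here is a minimal sketch of it in Python (an illustration, not something from the lecture); the payoffs are the ones from the example on the board, with U(A,A) = U(A,B) = U(B,A) = 1 and U(B,B) = 0.

# Row player's payoffs in the example game on the board.
u = {('A', 'A'): 1, ('A', 'B'): 1, ('B', 'A'): 1, ('B', 'B'): 0}
strategies = ['A', 'B']

def is_pure_ess(s):
    """The two-part test from the definition above, pure strategies only."""
    others = [t for t in strategies if t != s]
    # (a) (s,s) must be a symmetric Nash Equilibrium.
    if any(u[(t, s)] > u[(s, s)] for t in others):
        return False
    # (b) If (s,s) is strict we are done; otherwise any strategy that ties
    #     against s must do strictly worse against itself than s does against it.
    for t in others:
        if u[(t, s)] == u[(s, s)] and not u[(s, t)] > u[(t, t)]:
            return False
    return True

print(is_pure_ess('A'))  # True: A is evolutionarily stable
print(is_pure_ess('B'))  # False: (B,B) is not even a Nash Equilibrium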
  48. I want to spend all of today
    going over more interesting
  49. examples, reaping the payoff
    of having invested in this last
  50. time.
    To start off with,
  51. let's get rid of this rather
    trivial example.
  52. I want to think about evolution
    as it's often applied in the
  53. social sciences.
    So one thing you might talk
  54. about in the social sciences is
    the evolution of a social
  55. convention.
    Sometimes you'll read a lot in
  56. sociology or political science
    about the evolution of
  57. institutions or social
    conventions and things like
  58. this--maybe also in
    anthropology--and I want to see
  59. a trivial example of this to see
    how this might work and to see
  60. if we can learn anything.
    The trivial example I'm going
  61. to think about is driving on the
    left or the right;
  62. on the left or right side of
    the road.
  63. So this is a very simple social
    convention.
  64. I think we all can agree this
    is a social convention and let's
  65. have a look at the payoffs in
    this little game.
  66. So you could imagine people
    drive on the left or drive on
  67. the right, and if you drive on
    the left and the other people
  68. are driving on the right,
    you don't do great and nor do
  69. they;
    and if you drive on the right
  70. and they're driving on the left
    you don't do great,
  71. and nor do they.
    If you both drive on the right,
  72. you do fine.
    But if you both drive on the
  73. left you do a little better,
    because you look a little bit
  74. more sophisticated.
    So this is our little game.
  75. We can see that this,
    this could be an evolutionary
  76. game.
    So you could imagine this
  77. emerging--you can imagine a
    government coming in and
  78. imposing a law saying everyone
    has to drive on the left or
  79. everyone has to drive on the
    right,
  80. but you could also imagine at
    the beginning of roads in
  81. different societies in different
    parts of the world,
  82. people just started doing
    something and then they settled
  83. down to one convention or
    another.
  84. You can see how evolutionary
    stability's going to play a role
  85. here.
    Well perhaps we should just
  86. work it through and see what
    happens.
  87. So what are the potential
    evolutionarily stable things
  88. here?
    What are the potentially
  89. evolutionarily stable things?
    Let's get some mikes up.
  90. What's liable to be
    evolutionarily stable in this
  91. setting?
    Anyone?
  92. Student: Left,
    left and right,
  93. right are both candidates.
    Professor Ben Polak:
  94. Good, so our obvious candidates
    here are left,
  95. left and right,
    right.
  96. These are the two candidates.
    More formally,
  97. left is a candidate and right
    is a candidate,
  98. but you're right to say that
    left, left and right,
  99. right are both Nash Equilibria
    in this game.
  100. What's more,
    not only are they Nash
  101. Equilibria but what kind of Nash
    Equilibria are they?
  102. They're strict Nash Equilibria.
    The fact they're strict,
  103. both are strict,
    so indeed left is
  104. evolutionarily stable and right
    is evolutionarily stable.
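To make the strictness check concrete, take hypothetical numbers consistent with the story (the exact payoffs on the board are not given here): say each driver gets 2 if both drive on the left, 1 if both drive on the right, and 0 on a mismatch. Then U(left,left) = 2 > 0 = U(right,left) and U(right,right) = 1 > 0 = U(left,right), so both symmetric equilibria are strict, and by the first part of the definition both are evolutionarily stable.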
  105. Let's just talk through the
    intuition for that and make it
  106. kind of clear.
    So suppose you're in a society
  107. in which everybody drives on the
    left, so this is England and
  108. suppose a mutation occurs and
    the mutation you could think of
  109. as an American tourist.
    An American tourist is dropped
  110. into English society,
    hasn't read the guidebook
  111. carefully,
    starts driving on the right,
  112. and what happens to the
    tourist?
  113. They die out pretty rapidly.
    Conversely, if you drop an
  114. unobservant Brit into America
    and they drive on the left,
  115. they're going to get squashed
    pretty quickly,
  116. so it's kind of clear why
    everyone driving on the left or
  117. why everyone driving on the
    right--why each of these is a
  118. perfectly good evolutionarily
    stable social convention.
  119. But despite that simplicity,
    there's kind of a useful lesson
  120. here.
    So the first lesson here is you
  121. can have multiple evolutionarily
    stable settings,
  122. you could have multiple social
    conventions that are
  123. evolutionarily stable.
    We shouldn't be surprised
  124. particularly if we think there's
    some evolutionary type force,
  125. some sort of random dynamic
    going on that's generating
  126. stable social conventions,
  127. we should not be surprised to
    see different social conventions
  128. in different parts of the world.
    In fact we do,
  129. we do see parts of the world
    like England and Japan,
  130. and Australia where they drive
    on the left and we see parts of
  131. the world like France and
    America where they drive on the
  132. right,
    so we can have multiple
  133. evolutionarily stable
    conventions.
  134. There's another lesson here,
    which is you could imagine a
  135. society settling down to a
    social convention down here.
  136. You could imagine ending up at
    a social convention of right,
  137. right.
    What do we know about the
  138. social convention of everyone
    driving on the right?
  139. We know it's worse than the
    social convention of everyone
  140. driving on the left,
    at least in my version.
  141. So what do we regard,
    what can we see here?
  142. We can see that they're not
    necessarily efficient.
  143. These need not be equally good.
    So it's hard to resist saying
  144. that American society's driving
    habits, if we think about the
  145. alternatives to evolution,
    are a good example of
  146. unintelligent design.
    So when you're talking about
  147. this in a less formal way,
    in your anthropology or
  148. political science,
    or sociology classes you want
  149. to have in the back of your
    mind, what we mean by
  150. evolutionary stability and that
    it doesn't necessarily mean that
  151. we're going to arrive at
    efficiency.
  152. In some sense this is exactly
    analogous to what we discussed
  153. when we looked at games like
    this with rational players
  154. playing, in the context of
    coordination games.
  155. These are essentially
    coordination games.
  156. That was a fairly
    straightforward example.
  157. To leave it up there let me,
    there's another board here
  158. somewhere, here we go,
    let's look at yet another
  159. example, slightly more
    interesting.
  160. So today I'm going to spend
    most of today just looking at
  161. examples and talking about them.
    So here's another example of a
  162. game we might imagine and once
    again it's a two-by-two game,
  163. a nice and simple game,
    and here the payoffs are as
  164. follows.
    So down the diagonal we have
  165. 0,0, 0,0 and off the diagonal we
    have 2,1 and 1,2.
  166. So what is this game
    essentially?
  167. It's very, very similar to a
    game we've seen already in
  168. class, anybody?
    This is essentially Battle of
  169. the Sexes.
    I've taken the Battle of the
  170. Sexes Game, The Dating Game,
    I've just twiddled it around to
  171. make it symmetric,
    so this is a symmetric version
  172. of Battle of the Sexes or our
    dating game.
  173. You can think of this game also
    in the context of driving
  174. around.
    So you could imagine that one
  175. version of this game,
    you sometimes hear this game
  176. referred to as chicken.
    What's the game chicken when
  177. cars are involved?
    Anybody?
  178. What's the game chicken when
    cars are involved?
  179. It's probably a good thing that
    people don't know this.
  180. No one knows this?
    Okay, all right somebody knows
  181. it.
    Can I get the mic way back
  182. there?
    Yeah shout it out.
  183. Right, so you can imagine,
    here we are on a road that
  184. perhaps isn't big enough to have
  184. driving on the left and driving
    on the right.
    These two cars face each other,
  186. they drive towards each other,
    both of them going in the
  187. middle of the road,
    and the loser is the person who
  188. swerves first.
    If you think of A as being the
  189. aggressive strategy of not
    swerving and B as being the less
  190. aggressive,
    more benevolent strategy,
  191. if you like,
    of swerving--so the best thing
  192. for you is for you to be
    aggressive and the other person
  193. to swerve,
    and then at least they remain
  194. alive.
    Conversely, if you're
  195. benevolent and they're aggressive,
    you remain alive but they win
  196. and unfortunately now,
    if you're both aggressive you
  197. get nothing, and the way we've
    written this game,
  198. if you're both benevolent you
    get nothing.
  199. You can imagine making some of
    these more negative here.
  200. So this is a game that seems
    kind of important in nature,
  201. not just in teenage male
    behavior but in animal behavior,
  202. since we're talking about
    aggression and non-aggression.
  203. So what's evolutionarily stable
    in this game?
  204. Well remember our starting
    point is what?
  205. Our starting point is to look
    for symmetric Nash Equilibria.
  206. So are there any symmetric Nash
    Equilibria in this game and if
  207. so what are they?
    There are some Nash Equilibria
  208. in pure strategies,
    there are some Nash Equilibria
  209. in this game,
    for example,
  210. (A,B) is a Nash Equilibrium and
    (B,A) is a Nash Equilibrium,
  211. but unfortunately they're not
    symmetric,
  212. and so far we're focusing on
    games that are symmetric,
  213. this is random matching.
    There's no asymmetry in the
  214. rows--in the row and column
    player, although in the
  215. handout--not the handout,
    in the reading packet I made
  216. for you--they do also look at
    some asymmetric versions of
  217. games,
    but for now we're just looking
  218. at symmetry.
    So neither (A,B) nor (B,A) will
  219. serve our purpose because
    they're not symmetric Nash
  220. Equilibria.
    In fact, there is no symmetric
  221. pure strategy Nash Equilibrium
    in this game.
  222. So it can't be that if this was
    a species, if this was an animal
  223. that came in,
    that had two possible
  224. strategies, aggression or
    passivity, it can't be the case
  225. in this particular game that you
    end up with 100% aggression out
  226. there--100% aggressive genes out
    there--or 100% unaggressive
  227. genes out there.
    In either case,
  228. if you had 100% aggressive
    genes out there,
  229. then it would be doing very,
    very badly and you get an
  230. invasion of passive genes and if
    you had 100% passive genes out
  231. there,
    you'd get an invasion of
  232. aggressive genes.
    You can't have a pure ESS,
  233. a pure evolutionarily stable
    gene mix out there.
  234. So what does that suggest?
    It suggests we should start
  235. looking at mixed strategies.
    There is a mixed strategy,
  236. there is a symmetric mixed
    strategy Nash Equilibrium in the
  237. game and we could go through and
    work it out.
  238. We all know now--probably
    you've been laboring through the
  239. homework assignments--so you all
    know how to find a mixed
  240. strategy Nash Equilibrium in
    this game,
  241. you have to set the mix to make
    the other player indifferent
  242. between her two strategies.
    But this is a game we've seen
  243. already, it's essentially Battle
    of the Sexes,
  244. so you probably remember from a
    week ago what that equilibrium
  245. mix is.
    Can anyone remember what the
  246. equilibrium mix in Battle of the
    Sexes was?
  247. It's (2/3,1/3);
    it turns out that (2/3,1/3) is a
  248. Nash Equilibrium here.
    So if you go back to your notes
  249. a week ago you'll find something
    very much like that was a Nash
  250. Equilibrium in the original
    version of Battle of the Sexes.
  251. A week ago it would have been
    (2/3,1/3), (1/3,2/3) because
  252. things weren't symmetric.
    Now I've made things symmetric,
  253. it's just (2/3,1/3) for both
    players.
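For anyone who wants the quick check: if the other player is aggressive (plays A) with probability P, then playing A earns P(0) + (1-P)(2) = 2(1-P) and playing B earns P(1) + (1-P)(0) = P. Indifference requires P = 2(1-P), which gives P = 2/3, i.e. the (2/3,1/3) mix.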
  254. You can check it out.
    So what's this telling us?
  255. It's telling us that there's at
    least an equilibrium in this
  256. game in which 2/3 of the genes
    are aggressive and 1/3 of the
  257. genes are unaggressive.
    But does that mean anything?
  258. What could that mean?
    In terms of biology what could
  259. that mean?
    Well, so far,
  260. we've been looking at
    evolutionarily stable pure
  261. strategies and the
    evolutionarily stable pure
  262. strategies correspond to
    evolutionarily stable situations
  263. in nature which are what are
    called monomorphic.
  264. Monomorphic means one shape or
    one type out there,
  265. but you can also have
    situations in nature where there
  266. are actually stable mixed types
    and they're called polymorphic.
  267. It's probably not hyphenated,
    it's probably just one word.
  268. So you can have a monomorphic
    population, that's what we've
  269. focused on so far,
    but you could also have a mixed
  270. population.
    Now for this to be a mixed
  271. population we better change the
    definition accordingly.
  272. We'll come back and talk about
    what it means in a second a bit
  273. more, but first we better make
    sure we have an okay definition
  274. for it.
    So what I'm going to do is I'm
  275. going to drag this definition
    down and do something you're not
  276. meant to do usually in teaching,
    I'm going to use the eraser to
  277. correct my definition.
    So here I have my pure strategy
  278. definition and let me just
    change it into a definition that
  279. will allow for these polymorphic
    populations.
  280. So I'm going to change this
    into a P, I'm going to change
  281. this pure into mixed,
    and everywhere you see an
  282. Ŝ I'm going to put a P,
    and over here too, and
  283. everywhere you see an S' I'm
    going to put a P''.
  284. The reason I'm doing this this
    way is I want to emphasize that
  285. there's nothing new here,
    I'm just writing down the same
  286. definitions we had before,
    except I'm now allowing for the
  287. idea of populations being mixed,
    and I'm also,
  288. just to note in passing,
    I'm also allowing for the
  289. possibility that a mutation
    might be mixed.
  290. Did I catch them all?
    So I've just gone through the
  291. definition you have and I've
    switched everything from pure to
  292. mixed.
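Spelled out, the mixed-strategy version of the definition now reads: P is evolutionarily stable in mixed strategies if (a) (P,P) is a symmetric Nash Equilibrium, and (b) either that equilibrium is strict, or else for every mutation P'' that ties against P--that is, with U(P'',P) = U(P,P)--we have U(P,P'') > U(P'',P'').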
  293. So in our example does the mix
    (2/3,1/3) satisfy the definition
  294. above?
    Let's go through carefully.
  295. So (2/3,1/3) is a Nash
    Equilibrium so we've satisfied
  296. part A, it's a symmetric Nash
    Equilibrium, so we're okay
  297. there.
    Is this equilibrium a strict
  298. equilibrium?
    Is this mixed population 2/3
  299. aggressive and 1/3 unaggressive,
    or this mixed strategy,
  300. is it a strict equilibrium?
    How do deviations do against it?
  301. Anybody?
    Go ahead.
  302. Student: Equally well.
    Professor Ben Polak:
  303. Equally well,
    thank you.
  304. So it can't be a strict Nash
    Equilibrium, because if we
  305. deviated to A we'd do as well as
    we were doing in the mix,
  306. or if we deviated to B,
    we'd do as well as we're doing
  307. in the mix.
    Another way of saying it is,
  308. an A mutation does exactly as
    well against this mix as the mix
  309. does against itself,
    and a B mutation does exactly
  310. as well.
    In fact, that's how we
  311. constructed the equilibrium in
    the first place.
  312. We chose a P that made you
    indifferent between A and B,
  313. so in a mixed Nash Equilibrium
    it can't be strict since it is
  314. mixed.
    In a mixed Nash Equilibrium,
  315. a genuinely mixed Nash
    Equilibrium by definition,
  316. you're indifferent between the
    strategies in the mix.
  317. So to show that this is in fact
    evolutionarily stable we'd have
  318. to show rule B,
    so we need to show--we need to
  319. check--let's give ourselves some
    room here--we need to check how
  320. the payoff of this strategy,
    let's call it P,
  321. how P does against all possible
    deviations and compare that with
  322. how those deviations do against
    themselves.
  323. We have to make this comparison
    - how does this mix do against
  324. all other possible mixes versus
    how those mixes do against
  325. themselves?
    We'd have to do this,
  326. unfortunately,
    we'd have to check this for all
  327. possible--and now we have to be
    careful--all possible mixed
  328. mutations P''.
    So that would take us a while,
  329. it's actually possible to do.
    If you do enough math,
  330. it isn't so hard to do.
    So rather than prove that to
  331. you, let me give you a heuristic
    argument why this is the case.
  332. I'm going to try and convince
    you without actually proving it
  333. that this is indeed the case.
    So here we are with this
  334. population, exactly 2/3 of the
    population is aggressive and 1/3
  335. of the population is passive.
    Suppose there is a mutation,
  336. P'', that is more aggressive
    than P, it's a relatively
  337. aggressive mutation.
    For example,
  338. this mutation may be 100%
    aggressive, or at least it may
  339. be very, very highly
    aggressive,
  340. maybe 90% aggressive or
    something like that.
  341. Now I want to argue that that
    aggressive mutation is going to
  342. die out and I'm going to argue
    it by thinking about this rule.
  343. So I want to argue that the
    reason this very aggressive
  344. mutation dies out is because the
    aggressive mutation does very
  345. badly against itself.
    Is that right?
  346. If you have a very aggressive
    mutant, the very aggressive
  347. mutants do very,
    very badly against themselves,
  348. they get 0.
    And that's going to cause them
  349. to die out.
    What about the other extreme?
  350. What about a deviation that's
    very passive?
  351. So a very nice mutation,
    a very passive type,
  352. for example,
    it could be 100% B or you know
  353. 99% B or 98% B,
    how will that do?
  354. Well it turns out,
    in this game again,
  355. it doesn't do very well against
    itself and in addition,
  356. the original mix,
    the mixed P that is more
  357. aggressive than this very
    passive mutation does very well
  358. against the mutation.
    So the mix that's in there,
  359. the mix that's in the
    population already is relatively
  360. aggressive compared to this very
    passive mutation and so the
  361. incumbent,
    the relatively aggressive
  362. incumbents are doing very,
    very well on average against
  363. the mutation,
    and hence once again,
    this inequality holds.
    So just heuristically,
  365. without proving it,
    more aggressive mutations are
  366. going to lose out here because
    they do very badly against
  367. themselves,
    and more passive mutations are
  368. going to do badly because they
    make life easy for P which is
  369. more aggressive.
    So it wasn't a proof but it was
  370. a heuristic argument and it
    turns out indeed to be the case.
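With the numbers in this game you can at least check the two extreme mutations directly. Against a 100% aggressive mutation (all A), the mix P = (2/3,1/3) earns (2/3)(0) + (1/3)(1) = 1/3, while the mutation earns U(A,A) = 0 against itself, and 1/3 > 0. Against a 100% passive mutation (all B), P earns (2/3)(2) + (1/3)(0) = 4/3, while U(B,B) = 0, and 4/3 > 0. The full argument has to cover every mixed mutation as well, which is the part being taken on trust here.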
  371. So in this particular game,
    a game you could imagine in
  372. nature, a game involving
    aggression and passivity within
  373. this species,
    it turns out that in this
  374. example the only equilibrium is
    a mixed equilibrium with 2/3
  375. aggressive and 1/3 unaggressive.
    And this raises the question:
  376. what does it mean?
    What does it mean to have a mix
  377. in nature?
    So it could mean two different
  378. things.
    It could mean that the gene
  379. itself is randomizing.
    It could mean that the strategy
  380. played by the particular ant,
    squirrel, lion,
  381. or spider is actually to
    randomize, right,
  382. that's possible.
    But there's another thing it
  383. could mean that's probably a
    little bit more important.
  384. What's the other thing it could
    mean?
  385. It could mean that in the
    stable mix, the evolutionarily
  386. stable population,
    for this particular spider say,
  387. it could be that there are
    actually two types surviving
  388. stably in these proportions.
    If you go back to what we said
  389. about mixed strategies a week
    ago, we said one of the possible
  390. interpretations of mixed
    strategies is not that people
  391. are necessarily randomizing,
    but that you see a mix of
  392. different strategies in society.
    Again in nature,
  393. one of the possible
    interpretations here,
  394. the polymorphic population
    interpretation,
  395. is that, rather than just have
    all of the species look and act
  396. alike, it could be there's a
    stable mix of behaviors and/or
  397. appearances in this species.
    So let me try and convince you
  398. that that's not an uninteresting
    idea.
  399. So again, with apologies I'm
    not a biologist,
  400. I've spent a bit of time on the
    web this weekend trying to come
  401. up with good examples for you
    and the example I really wanted
  402. to come up with,
    I couldn't find on the web,
  403. which makes me think maybe it's
    apocryphal, but I'll tell you
  404. the story anyway.
    It's not entirely apocryphal,
  405. it may just be that my version
    of it's apocryphal.
  406. So this particular example I
    have in mind,
  407. is to do with elephant seals,
    and I think even if it isn't
  408. true of elephant seals,
    it's definitely true of certain
  409. types of fish,
    except that elephant seals make
  410. a better story.
    So imagine that these elephant
  411. seals--it turns out that there
    are two possible mating
  412. strategies for male elephant
    seals.
  413. By the way, do you all know
    what elephant seals are?
  414. They're these big--people are
    looking blankly at me.
  415. You all have some rough image
    in your mind of an elephant
  416. seal?
    You've all seen enough nature
  417. shows at night?
    Yes, no, yes?
  418. Okay, so there are two male
    mating strategies for the male
  419. elephant seal.
    One successful male mating
  420. strategy is to be the head,
    the dominant,
  421. or a dominant elephant male,
    male elephant seal,
  422. and have as it were,
    a harem of many female elephant
  423. seals with which the male
    mates.
  424. For the males in the room don't
    get too happy,
  425. these are elephant seals,
    they're not you guys.
  426. So one possible successful
    strategy is to be a successful
  427. bull elephant seal and have
    many, many, many potentially
  428. wives, so to be a polygamist.
    Presumably to do that well a
  429. good idea, a thing that would go
    well with that strategy is to be
  430. huge.
    So you could imagine the
  431. successful male elephant seal
    being an enormous animal.
  432. It looks like a sort of a
    linebacker in football and
  433. basically fights off all other
    big elephant seals that show up.
  434. But it turns out,
    I think I'm right in saying if
  435. I did my research correctly,
    this is true among northern
  436. elephant seals but not true
    among southern elephant seals,
  437. so the Arctic not the
    Antarctic, but someone's going
  438. to correct me.
    Once it's on the web I'm going
  439. to get floods of emails saying
    that I've got this wrong.
  440. But never mind.
    So it turns out that this is
  441. not quite evolutionarily stable.
    Why is this not evolutionarily
  442. stable?
    What's the alternative male
  443. strategy that can successfully
    invade the large bull harem
  444. keeper elephant seal?
    Any guesses?
  445. Anyone looking for a successful
    career as an elephant seal,
  446. as a male elephant seal?
    Say that again?
  447. Student: [Inaudible]
    Professor Ben Polak:
  448. Good, so good thank you.
    Did people catch that?
  449. Good, an alternative strategy
    is to be a male elephant seal
  450. who looks remarkably like a
    female elephant seal.
  451. Instead of looking like a
    linebacker, they look like a
  452. wide receiver.
    I'm offending somebody in the
  453. football team.
    You get the idea, right?
  454. What do they do?
    They sneak in among these large
  455. numbers of female elephant seals
    and they just mate with a few of
  456. them.
    So they look like a female
  457. seal, and they can hide among
    the female elephant seals in the
  458. harem,
    and they mate with a few of
  459. them and provided this is
    successful enough it'll be
  460. evolutionarily stable for the
    female elephant seal to want to
  461. mate with that too.
    Now I forget if actually this
  462. is exactly right,
    but it's certainly--I did
  463. enough research over the weekend
    to know it's right at least in
  464. some species,
    and the nicest part of this
  465. story is that at least some
    biologists have a nice technical
  466. name for this strategy that was
    well described by our friend at
  467. the back,
    and the name for this strategy
  468. is SLF, and since we're on film
    I'm going to tell you what the S
  469. and the L are,
    but you'll have to guess the
  470. rest.
    So this is sneaky,
  471. this is little,
    and you can guess what that is.
  472. So this turns out to be
    actually quite a common
  473. occurrence.
    It's been observed in a number
  474. of different species,
    perhaps not with the full added
  475. color I just gave to it.
    So having convinced you that
  476. polymorphic populations can be
    interesting, let's go back to a
  477. case,
    a more subtle case,
  478. of aggression and
    non-aggression,
  479. because that seems to be one of
    the most important things we can
  480. think of in animal behavior.
    So let's go back and look at a
  481. harder example of this,
    where we started.
  482. So as these examples get
    harder, they also get more
  483. interesting.
    So that's why I want to get a
  484. little bit harder.
  485. So the chicken game,
    the Battle of the Sexes game,
  486. is not a particularly
    interesting version of
  487. aggression and non-aggression.
    Let's look at a more general
  488. version of aggression versus
    non-aggression,
  489. and let's look at a game that's
    been studied a lot by biologists
  490. and a little bit by economists
    called hawk-dove.
  491. And again, just to stress,
    we're talking about within
  492. species competition here,
    so I don't mean hawks versus
  493. doves.
    I mean thinking of hawk as
  494. being an aggressive strategy and
    dove as being a dovish,
  495. a passive strategy.
    So here's the game and now
  496. we're going to look at more
    general payoffs than we did
  497. before.
    So this is the hawk strategy,
  498. this is the dove strategy--hawk
    and dove--and the payoffs are as
  499. follows.
    V + C--sorry, start again.
  500. (V-C)/2 and (V-C)/2 and here we
    get V/2 and V/2,
  501. and here we get V and 0 and
    here we get 0 and V.
  502. So this is a generalization,
    a more interesting version of
  503. the game we saw already.
    Let's just talk about it a
  504. little bit.
    So the idea here is there's
  505. some potential battle that can
    occur among these two animals
  506. and the prize in the battle is
    V.
  507. So V is the victor's spoils and
    we're going to assume that V is
  508. positive.
    And unfortunately,
  509. if the animals fight--so if the
    hawk meets another hawk and they
  510. fight one another--then there
    are costs of fighting.
  511. So the costs of fighting are C,
    and again, we'll assume that
  512. they're positive.
    So this is the cost of fighting.
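For reference, the payoffs just described, arranged as the matrix on the board (row player's payoff listed first in each cell):

             Hawk                 Dove
Hawk    (V-C)/2, (V-C)/2         V, 0
Dove         0, V               V/2, V/2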
  513. This more general format is
    going to allow us to do two
  514. things.
    We're going to look and ask
  515. what is going to be
    evolutionarily stable,
  516. including mixtures now.
    And we're also going to be
  517. allowed to ask,
    able to ask,
  518. what happens,
    what will happen to the
  519. evolutionarily stable mix as we
    change the prize or as we change
  520. the cost of fighting?
    Seems a more interesting,
  521. a richer game.
    Okay, so let's start off by
  522. asking could we have an
    evolutionarily stable population
  523. of doves?
    So is D an evolutionarily
  524. stable strategy?
    I'll start using the term ESS
  525. now.
    So ESS means the evolutionarily
  526. stable strategy.
    Is D an ESS?
  527. So in this game,
    could it be the case that we
  528. end up with a population of
    doves?
  529. Seems a nice thing to imagine,
    but is it going to occur in
  530. this game in nature?
    What do people think?
  531. How do we go about checking
    that?
  532. What's the first step?
    First step is to ask,
  533. is (D, D) a Nash Equilibrium?
    If it's evolutionarily stable,
  534. in particular,
    (D,D) would have to be a Nash
  535. Equilibrium.
    That's going to make it pretty
  536. easy to check,
    so is (D, D) a Nash Equilibrium
  537. in this game?
    It's not, but why not?
  538. Because if you had a mutation,
    I keep being tempted to say
  539. deviation, but you want to think
    of it as a mutation.
  540. If I had a mutation of hawks,
    the hawk mutation against the
  541. doves is getting V,
    whereas, dove against dove is
  542. only getting V/2,
    so it's not Nash.
  543. So we can't have an
    evolutionarily stable population
  544. of doves, and the reason is
    there will be a hawk mutation,
  545. an aggressive type will get in
    there and grow,
  546. much like we had last week when
    we dropped Rahul into the
  547. classroom in Prisoner's Dilemma
    and he grew,
  548. or his type grew.
    So second question:
  549. is hawk an evolutionarily
    stable strategy?
  550. So how do we check this?
    What we have to look at once
  551. again--and ask the
    question--this is the first
  552. question to ask is:
    is (H,H) a Nash Equilibrium?
  553. So is it a Nash Equilibrium?
    Well I claim it depends.
  554. I claim it's a Nash Equilibrium
    provided (V-C)/2 is at least as
  555. large as 0.
    Is that right?
  556. It's a Nash Equilibrium--it's a
    symmetric Nash Equilibrium
  557. --provided hawk against hawk
    does better, or does at least as
  558. well as, dove against hawk.
    So the answer is yes,
  559. if (V-C)/2 is at least as big
    as 0.
  560. So now we have to think fairly
    carefully, because there's two
  561. cases.
    So case one is the easy case
  562. which is when V is strictly
    bigger than C.
  563. If V is strictly bigger than C,
    then (V-C)/2 is strictly
  564. positive, is that right?
    In which case,
  565. what kind of a Nash Equilibrium
    is this?
  566. It's strict right?
    So if V is bigger than C then
  567. (H, H) is a strict Nash
    Equilibrium.
  568. The second case is if V is
    equal to C, then (V-C)/2 is
  569. actually equal to 0,
    which is the same as saying
  570. that the payoff of H against H
    is equal to the payoff of dove
  571. against hawk.
    That correct?
  572. So in that case what do we have
    to check?
  573. Well I've deleted it now,
    it'll have to come from your
  574. notes, what do I have to check
    in the case in which there's a
  575. tie like that?
  576. What do I have to check?
    I have to check--in this case I
  577. need to check how hawk does
    against dove,
  578. because dove will be the
    mutation.
  579. I need to compare that with the
    payoff of dove against dove.
  580. So how does hawk do against
    dove?
  581. What's the payoff of hawk
    against dove?
  582. Anybody?
    Payoff of hawk against dove.
  583. It shouldn't be that hard.
    It's on the board:
  584. hawk against dove.
    Shout it out.
  585. V, thank you.
    So this is V and how about the
  586. payoff of dove against dove?
    V/2, so which is bigger V or
  587. V/2?
    V is bigger because it's
  588. positive, so this is bigger so
    we're okay.
  589. So what have we shown?
    We've shown,
  590. let's just draw it over here,
    we've shown,
  591. that if V is at least as big as
    C, then H is an evolutionarily
  592. stable strategy.
    So in this game,
  593. in this setting in nature,
    if the size of the prize to
  594. winning the fight is bigger than
    the cost that would occur if
  595. there is a fight,
    then it can occur that all the
  596. animals in this species are
    going to fight in an
  597. evolutionarily stable setting.
    Let me say it again,
  598. if it turns out in this setting
    in nature that the prize to
  599. winning the fight is bigger,
    or at least as big as,
  600. the cost of fighting,
    then it will turn out that it
  601. will be evolutionarily stable for all
    the animals to fight.
  602. The only surviving genes will
    be the aggressive genes.
  603. What does that mean?
    So what typically do we think
  604. of as the payoffs to fight and
    the costs of fighting?
  605. Just to put this in a
    biological context.
  606. The fight could be about what?
    It could be--let's go back to
  607. where we started from--it could
    be males fighting for the right
  608. to mate with females.
    This could be pretty important
  609. for genetic fitness.
    It could be females fighting
  610. over the right to mate with
    males.
  611. It could also be fighting over,
    for example,
  612. food or shelter.
    So if the prize is large and
  613. the cost of fighting is small,
    you're going to see fights in
  614. nature.
    But we're not done yet,
  615. why are we not done?
    Because we've only considered
  616. the case when V is bigger than
    C.
  617. So we also need to consider the
    case when C is bigger than V.
  618. This is the case where the costs
    of fighting are high relative to
  619. the prize in the particular
    setting we're looking at.
  620. So again let's go back to the
    example, suppose the cost of
  621. fighting could be that the
    animal could lose a leg or even
  622. its life,
    and the prize is just today's
  623. meal, and perhaps there are
    other meals out there.
  624. Then we expect something
    different to occur.
  625. However, we've already
    concluded that even in this
  626. setting it cannot be the case
  626. that we only have doves in the
  627. population.
    We've shown that even in the
  628. case where the costs of fighting
    are high relative to the prizes,
  629. it cannot be evolutionarily
    stable only to have dove genes
  630. around;
    passive genes around.
  631. So in this case,
    it must be the case that if
  632. anything is evolutionarily
    stable it's going to be what?
  633. It's going to be a mix.
    So in this case,
  634. we know that H is not ESS and
    we know that D is not ESS.
  635. So what about a mix?
  636. What about some mix P?
    We could actually put the P in
  637. here.
    We can imagine looking for a
  638. mix (P, 1-P) that will be stable.
    Now how do we go about finding
  639. a possible mixed population that
    has some chance or some hope of
  640. being evolutionarily stable?
    So here we are we're biologists.
  641. We're about to set up an
    experiment.
  642. We're about to either
    experiment, or about to go out
  643. and do some field work out
    there.
  644. And you want to set things up.
    And we're asking the question:
  645. what's the mix we expect to
    see?
  646. What's the first exercise we
    should do here?
  647. Well if it has any hope to be
    evolutionarily stable,
  648. what does it have to be?
    It has to be a symmetric Nash
  649. Equilibrium.
    So the first step is,
  650. step one find a symmetric mixed
    Nash Equilibrium in which people
  651. will be playing (P, 1-P).
    It's symmetric,
  652. so both sides will be playing
    this.
  653. So this is good review for the
    exam on Wednesday.
  654. How do I go about finding a
    mixed equilibrium here?
  655. Shouldn't be too many blank
    faces.
  656. This is likely to come up on
    the exam on Wednesday.
  657. Let's get some cold calling
    going on here.
  658. How do I find the mixed strategy?
    Just find anybody.
  659. How do we find--how do I find a
    mixed strategy equilibrium?
  660. Student: Just use the
    other player's payoff.
  661. Professor Ben Polak: I
    use the other player's payoffs
  662. and what do I do with the other
    person's payoffs?
  663. Student: You set them
    equal.
  664. Professor Ben Polak: Set
    them equal, okay.
  665. So here it's a symmetric game.
    It's really there's only one
  666. population out there.
    So all I need,
  667. I need the payoff of hawk
    against P or (P,
  668. 1-P), I need this to be equal
    to the payoff of dove against
  669. this P.
    So the payoff of hawk is going
  670. to be what?
    Just reading up from up
  671. there--let's use our pointer--so
    hawk P of the time will meet
  672. another hawk and get this
    payoff.
  673. So they'll get--so P of the
    time they'll get a payoff of
  674. (V-C)/2 and 1- P of the time
    they'll meet a dove and get a
  675. payoff of V.
    And dove against this same mix
  676. (P, 1- P): P of the time they'll
    meet a hawk and get nothing and
  677. 1-P of the time they'll meet
    another dove and get V/2.
  678. Everyone happy with the way I
    did that?
  679. That should be pretty familiar
    territory to everybody by now,
  680. is that right?
    So I'm going to set these two
  681. things equal to each other since
    they must be equal,
  682. if this is in fact a mixed
    strategy equilibrium.
  683. And then I'm going to play
    around with the algebra.
  684. So as to save time I did it at
    home, so trust me on this--this
  685. is implication with the word
    trust on top of it--trust me
  686. that I got the algebra right or
    check me at home.
  687. This is going to turn out to
    imply that P equals V/C.
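For anyone checking at home, one way the algebra goes: the indifference condition is P(V-C)/2 + (1-P)V = (1-P)V/2. Multiplying through by 2 gives P(V-C) + 2(1-P)V = (1-P)V, so P(V-C) + (1-P)V = 0, which expands to PV - PC + V - PV = 0, i.e. V = PC, i.e. P = V/C.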
  688. So it turns out that there is
    in fact a mixed Nash
  689. Equilibirum.
    There's a mixed Nash
  690. Equilibrium which is of the
    following form,
  691. V/C and 1-V/C played by both
    players.
  692. Is this a strict Nash
    Equilibrium?
  693. I've found the Nash
    Equilibrium, is it strict?
  694. Everyone should be shouting it
    out.
  695. Is it strict?
    It can't be strict because it's
  696. mixed right?
    By definition it's not.
  697. It can't be strict because we
    know that deviating to H,
  698. or, for that matter,
    deviating to D yields the same
  699. payoff.
    So it can't be a strict Nash
  700. Equilibrium.
    So we need to check something.
  701. So it's not strict;
    so we need to check whether U
  702. of P against P'' is bigger than
    U of P'' against itself,
  703. and we need to check this for
    all possible mutations P''.
  704. Again, that would take a little
    bit of time to do in class,
  705. so just trust me on it and once
    again,
  706. let me give you the heuristic
    argument I gave to you before.
  707. It's essentially the same
    argument.
  708. So the heuristic argument I
    gave to you before was:
  709. imagine a P',
    a mutation, that is more
  710. aggressive than our candidate
    equilibrium.
  711. If it's more aggressive then
    it's going to do very,
  712. very badly against itself
    because C is bigger than V in
  713. this case.
    So it's actually going to get
  714. negative payoffs against itself.
    Since it gets negative payoffs
  715. against itself,
    it turns out that will cause it
  716. to die out.
    Conversely, imagine a mutation
  717. that's relatively soft,
    that's relatively dovish,
  718. this mutation is very good for
    the incumbents because the
  719. incumbents essentially beat up
    on it or score very highly on
  720. it.
    So once again,
  721. the more dovish mutation will
    die out.
  722. So again, that isn't a proof
    but trust the argument.
  723. We need to show this but it
    does in fact turn out to be the
  724. case.
    So what have we shown here?
  725. I didn't prove the last bit,
    but what we've argued is that,
  726. in the case in which the costs
    of fighting in nature are bigger
  727. than the prizes of winning the
    fight,
  728. it is not the case that we end
    up with 100% doves.
  729. So we don't end up with no
    fights, for example:
  730. no fights is not what we would
    expect to observe in nature.
  731. And we don't end up with 100%
    fights: 100% fights is not what
  732. you expect to see in nature.
    What we end up with is a
  733. mixture of hawks and doves,
    such that V/C is the proportion
  734. of hawks.
    So the fights that occur are
  735. essentially (V/C) squared.
    We can actually observe those
  736. fights in nature.
    What lessons can we draw from
  737. this?
    So biology lessons.
  738. So we used a lot of what we've
    learned in the last day or so to
  739. figure out what the ESS was
    there.
  740. We kind of did the nerdy part.
    Now let's try and draw some
  741. lessons from it.
  742. So the first thing we know
    is--I've hidden what we care
  743. about here--so we know that if V
    is smaller than C,
  744. then the evolutionarily stable
    mixed population has V/C hawks.
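Pulling the two cases together, here is a small sketch in the same illustrative spirit as before (not something from the lecture) of what the hawk-dove analysis predicts for given values of V and C:

def hawk_dove_ess(V, C):
    """Evolutionarily stable population in hawk-dove with prize V > 0 and cost C > 0."""
    if V >= C:
        # (H,H) is a symmetric Nash Equilibrium: strict if V > C, and even when
        # V == C hawk beats the dove mutation (V > V/2), so all-hawk is the ESS.
        return {'hawk': 1.0, 'dove': 0.0}
    # If C > V neither pure strategy is an ESS; the stable population is the
    # mix with a fraction V/C of hawks.
    return {'hawk': V / C, 'dove': 1 - V / C}

print(hawk_dove_ess(V=4, C=2))   # prize exceeds cost: all hawks
print(hawk_dove_ess(V=2, C=8))   # costly fighting: 25% hawks, 75% doves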
  745. So let's just see how much of
    this makes sense.
  746. So as V goes up,
    as the prizes go up--if you
  747. took the same species and put
    them into a setting in which the
  748. prizes tended to be larger--what
    would we expect to see?
  749. Would we expect to see the
    proportion of hawks up or down?
  750. Up right: as V goes up,
    we see more hawks.
  751. What else do we see?
    Not so surprisingly as C goes
  752. up--we look at settings where
    the cost of fighting is
  753. higher--we tend to see more
    doves.
  754. Now this is in ESS,
    so more hawks in the
  755. evolutionarily stable mix.
    And more doves in the
  756. evolutionarily stable mix.
    It's possible,
  757. of course that the species in
    question can recognize these two
  758. different situations and be
    coded differently,
  759. to behave differently in these
    two different situations but
  760. that's beyond the class for now.
    Perhaps a more interesting
  761. observation is about the
    payoffs.
  762. Let's look at the actual
    genetic fitness of the species
  763. overall.
    So in this mix what is the
  764. payoff?
    Well how are we going to figure
  765. out what is the payoff?
    So the payoff in this mix,
  766. we can actually construct by
    looking at the payoff to dove.
  767. It doesn't really matter
    whether you look at the payoff
  768. to dove or the payoff to hawk.
    Let's look at the payoff to
  769. dove.
    So the payoff was what?
  770. It was, if you were a dove,
    then (1-V/C) of the time,
  771. you met another dove and in
    that instance you got a payoff
  772. of V/2, and the payoff to being a
    hawk must be the same
  773. since they're mixing.
    So (1-V/C)(V/2) is the payoff.
  774. So what's happening to this
    payoff as we increase the cost
  775. of fighting?
    What happens as C rises?
  776. So just to note out,
    what happens as C goes up?
  777. So you might think naively,
    you might think that if you're
  778. in a setting,
    be it a social evolutionary
  779. setting or a biology,
    a nature evolutionary setting,
  780. you might think that as the
    cost of fighting goes up for you
  781. guys in society,
    or for the ants,
  782. antelopes, or lions we're
    talking about,
  783. you might think the payoffs in
    society go down.
  784. Costs of fighting go up,
    more limbs get lost and so on,
  785. sounds like that's going to be
    bad for the overall genetic
  786. fitness of the species.
    But in fact we don't find that.
  787. What happens as C goes up?
    The payoff goes up.
  788. As C goes up,
    the payoff goes up.
  789. As we take C bigger,
    this gets smaller,
  790. which means this is bigger.
    Everyone see that?
  791. Just look at that term
    (1-V/C)(V/2),
  792. it's actually increasing in C.
    So how does that work?
  793. As the costs of fighting go up,
    it's true that if you do fight,
  794. you're more likely to lose a
    finger,
  795. or a limb, or a claw,
    or whatever those things are
  796. called, or a foot or whatever it
    is you're likely to lose.
  797. But the number of fights that
    actually occur in this
  798. evolutionarily stable mix goes
    down and it goes down
  799. sufficiently much to compensate
    you for that.
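You can see this directly from the expression: the payoff is (1-V/C)(V/2) = V/2 - V^2/(2C), and its derivative with respect to C is V^2/(2C^2), which is positive, so the equilibrium payoff rises as the cost of fighting rises.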
  800. Kind of a remarkable thing.
    So these animals that actually
  801. are going to lose a lot through
    fighting are actually going to
  802. do rather well overall because
    of that mix effect.
  803. I feel like it's one of those
    strategic effects.
  804. Now of course that raises a
    question, which is what would
  805. happen if a particular part of
    this species evolved that had
  806. lower costs of fighting?
    It could regrow a leg.
  807. Sounds like that would do
    pretty well, and that would be
  808. bad news for the species as a
    whole.
  809. Third thing we can observe here
    is what's sometimes called
  810. identification.
  811. So what does identification
    mean here?
  812. It means that by observing the
    data in nature,
  813. by going out and filming these
    animals behaving for hours and
  814. hours,
    or changing their setting in a
  815. lab and seeing how they
    interact, or changing their
  816. setting in the field and seeing
    how they interact,
  817. we can actually observe
    something, namely the proportion
  818. of fights.
    Perhaps we can do better now
  819. and actually look at their
    genetics directly since science
  820. has evolved, and we can actually
    back out the V and the C.
  821. By looking at the proportion of
    hawk genes out there or hawkish
  822. behavior out there,
    we can actually identify what
  823. must be the ratio of V to C.
    We can tell what the ratio V/C
  824. is from looking at data.
    We started off with a little
  825. matrix, I've just written in V's
    and C's, I didn't put any
  826. numbers in there.
    We can't tell what V is,
  827. we can't tell what C is,
    but we can tell what the ratio
  828. is by looking at real world
    data.
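As a purely hypothetical illustration (these numbers are not from the lecture): if field data showed hawkish behavior in 60% of the animals observed, that would pin down V/C = 0.6, and with random matching you would expect actual hawk-hawk fights in about 0.6 x 0.6 = 36% of encounters, even though V and C separately remain unknown.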
  829. So if you spend enough hours in
    front of the TV watching nature
  830. shows you could back this out.
    Not literally,
  831. you need to actually do some
    serious work.
  832. So this is a useful input of
    theory into empirical science.
  833. You want theory to be able to
    set up the science so you can
  834. back out the unknowns in the
    theory: and that's called
  835. identification,
    not just in biology but in
  836. Economics as well.
    Now there's one other thing
  837. you'd like theory to be other
    than identifiable.
  838. You'd like theory to be
    testable.
  839. You'd like theory to make
    predictions that were kind of
  840. outside of the sample you
    started with.
  841. Is what I'm saying familiar to
    everyone in the room?
  842. This is a very familiar idea
    I'm hoping to everybody,
  843. slightly philosophical but a very
    familiar idea.
  844. You have a new theory.
    It's one thing for that theory
  845. to explain the existing facts,
    but you'd like it to predict
  846. new facts.
    Why?
  847. Because it's a little easy--it
    might be a little bit too easy
  848. to reverse engineer a model or
    theory to fit existing facts,
  849. but if it has to deal with new
    facts, that's kind of exciting,
  850. that's a real test.
    Does that make sense?
  851. So you might ask about this
    theory, you might say,
  852. well it's just a whole bunch of
    'just so stories'.
  853. Does anyone know what I mean by
    a 'just so story'?
  854. It's children's stories,
    written by Kipling,
  855. and people sometimes accuse a
    lot of evolutionary theory of
  856. being 'just so stories',
    because you know what the fact
  857. is already, and you reverse
    engineer, you come up with a story
  858. afterwards.
    That doesn't sound like good
  859. science.
    So you'd like this theory,
  860. this theory that matches Game
    Theory with evolution,
  861. to predict something that we
    hadn't seen before.
  862. And then we can go out and look
    for it, and see if it's there,
  863. and that's exactly what we now
    have.
  864. So our last example is a
    slightly more complicated game
  865. again.
  866. The slightly more complicated
    game has three strategies and
  867. the strategies are called,
    I'll tell you what the
  868. strategies are called in a
    second actually,
  869. I'll give you the payoffs first
    of all.
  870. So once again this is a game
    about different forms of
  871. aggression and we'll look at
    other interpretations in a
  872. second.
    Once again, V is going to be
  873. the prize for winning,
    0 is going to be the prize for
  874. losing, and 1 is if it's a tie.
    I'm just short sighted enough
  875. that I can't read my own
    writing.
  876. So I hope I've got this right,
    there we go.
  877. So this is the game,
    and we're going to assume that
  878. the prize V is somewhere between
    1 and 2.
  879. So V you can think of as
    winning, 0 is losing,
  880. and 1 is if it's a tie.
    Does anyone recognize what this
  881. game essentially is?
    It's essentially rock,
  882. paper, scissors.
    And it turns out that when
  883. biologists play rock,
    paper, scissors they give it a
  884. different name,
    they call it "scratch,
  885. bite, and trample."
    Scratch, bite,
  886. and trample is essentially the
    tactics of the Australian
  887. football team.
    So scratch, bite,
  888. and trample are the three
    strategies, and it's a little
  889. bit like rock,
    paper, and scissors.
  890. How do we change it?
    First, we added 1 to all the
  891. payoffs to make sure there are
    no negatives in there and second
  892. we added a little bit more than
    1 to winning.
  893. If we added 1 to
    everything--sorry,
  894. a little bit less than 1 to
    winning.
  895. So if we added 1 to everything
    then V would have been 2,
  896. but we kept V somewhere between
    1 and 2.
  897. So this is certainly a game,
    you can imagine in nature,
  898. there's three possible
    strategies for this species and
  899. the payoff matrix happens to
    look like this.
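Written out, with the row player's payoff shown and with one possible cyclic ordering assumed here (the board isn't visible, so which strategy beats which is an assumption), the matrix is:

           Scratch   Bite   Trample
Scratch       1        V       0
Bite          0        1       V
Trample       V        0       1

where V is the payoff for winning, 0 for losing, and 1 for a tie, with V between 1 and 2.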
  900. So where's my prediction going
    to come from?
  901. Well since this is rock,
    paper, scissors we know that
  902. there's really only one hope for
    an evolutionarily stable
  903. strategy.
    Since it's essentially rock,
  904. paper, scissors what would,
    if there is an evolutionarily
  905. stable strategy or
    evolutionarily stable mix,
  906. what must it be?
    (1/3,1/3,1/3).
  907. So the only hope for an ESS is
    (1/3,1/3,1/3)--let's put that in
  908. here.
    So (1/3,1/3,1/3) and you can
  909. check at home that that indeed
    is a mixed-strategy equilibrium.
  910. And the question is:
    is this evolutionarily stable?
  911. So we know it's a Nash
    Equilibrium that I've given you,
  912. and we know it's not a strict
    Nash Equilibrium.
  913. Everyone okay with that?
    It can't be a strict Nash
  914. Equilibrium because it's mixed.
    So if this is an ESS it must be
  915. the case--we'd have to check
    that--let's call this P again
  916. like we've been doing--we have
    to check that the payoff from P
  917. against any other P' would have
    to be bigger than the payoff
  918. from P' against itself.
    We need that to be the case.
  919. So let P' be scratch.
    So let's compare these things.
  920. So U of P against scratch is
    what?
  921. Well you're playing against
    scratch, you are 1/3 scratch,
  922. 1/3 bite, 1/3 trample.
    So 1/3 of the time you get
  923. 1, 1/3 of the time you get
    nothing, and 1/3 of the time you
  924. get V.
    Is that right?
  925. So your payoff is (1+V)/3.
    How would we do if we're
  926. scratch against scratch?
    The payoff of scratch against
  927. scratch is what?
    No prizes for this,
  928. what's the payoff of scratch
    against scratch?
  929. 1.
    Which is bigger, (1+V)/3 or 1?
  930. Well look, V is less than 2 so
    (1+V)/3 is less than 1.
  931. So 1 is bigger.
    So in this game the only hope
  932. for an evolutionarily stable mix
    was a (1/3,1/3,1/3) and it isn't
  933. stable.
    So here's an example.
  934. In this example there is no
    evolutionarily stable strategy.
  935. There's no evolutionarily
    stable mix.
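As a quick numeric check of that comparison, here is a sketch with a hypothetical value of V in the allowed range (any V strictly between 1 and 2 gives the same conclusion), using the cyclic ordering assumed above:

# Hypothetical prize for winning; the lecture only requires 1 < V < 2.
V = 1.5

# Row player's payoffs: 1 on the diagonal (tie), V for a win, 0 for a loss.
u = {
    ('S', 'S'): 1, ('S', 'B'): V, ('S', 'T'): 0,
    ('B', 'S'): 0, ('B', 'B'): 1, ('B', 'T'): V,
    ('T', 'S'): V, ('T', 'B'): 0, ('T', 'T'): 1,
}
third = {'S': 1/3, 'B': 1/3, 'T': 1/3}

def payoff(row, col):
    """Expected payoff of the row mix (or pure strategy) against the column mix."""
    row = row if isinstance(row, dict) else {row: 1.0}
    col = col if isinstance(col, dict) else {col: 1.0}
    return sum(pr * pc * u[(r, c)] for r, pr in row.items() for c, pc in col.items())

# (1/3,1/3,1/3) is a symmetric Nash Equilibrium: every pure strategy earns (1+V)/3 against it.
print([round(payoff(s, third), 3) for s in 'SBT'])

# But it fails the ESS test against the scratch mutation: (1+V)/3 < 1.
print(payoff(third, 'S'), payoff('S', 'S'))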
  936. Then the obvious question is,
    what does that mean in nature?
  937. Can we find a setting that
    looks like rock,
  938. paper, scissors in nature in
    which nothing is ES?
  939. If nothing is ES what's going
    to happen?
  940. We're going to see a kind of
    cycling around.
  941. You're going to see a lot of
    the scratch strategy,
  942. followed by a lot of the
    trample strategy,
  943. followed by a lot of the bite
    strategy and so on.
  944. So it turns out that's exactly
    what you see when you look at
  945. these, the example I've left you
    on the web.
  946. There's an article in Nature in
    the mid 90s that looked at a
  947. certain type of lizard.
    And these lizards come in three
  948. colors.
    I forget what the colors are,
  949. I know I wrote it down.
    One is orange;
  950. one is yellow;
    and one is blue.
  951. And these lizards have three
    strategies.
  952. The orange lizard is like our
    big bull elephant seal:
  953. it likes to keep a harem of
    many--or a large territory with
  954. many female lizards in it to
    mate with.
  955. But that can be invaded by our
    SLF strategy which turns out to
  956. be the yellow lizard.
    The yellow lizard can invade
  957. and just mate with a few of
    these female lizards.
  958. But when there are too many of
    these sneaky yellow lizards,
  959. then it turns out that they can
    be invaded by a blue lizard:
  960. and the blue lizard has much
    smaller territories,
  961. it's almost monogamous.
    So what happens in nature is
  962. you get a cycle,
    orange invaded by yellow,
  963. invaded by blue:
    harem keeper invaded by sneaky,
  964. invaded by monogamous invaded
    by harem keeper again.
  965. Indeed, the population does
    cycle around exactly as
  966. predicted by the model,
    so here's an example of
  967. evolutionary theory,
    via Game Theory,
  968. making a prediction that we can
    actually go off and test and
  969. find.
    This, for biologists,
  970. was like finding a black hole,
    it's a really cool thing.
  971. We'll leave evolution here:
    mid-term on Wednesday,
  972. we'll come back to do something
    totally different next week.
  973. See you on Wednesday.