10 Game Theory

In this chapter we will learn:

  • What “strategic interaction” means
  • Understand the notion of “non-cooperative behavior”
  • Define and explain the notion of Nash Equilibrium
  • Understand and explain why Nash equilibria are often not optimal/efficient
  • Understand and explain the concept of “backward induction”

10.2 Introduction

Game theory is the study of

Strategic Behavior

Strategic behavior means that my payoff, whatever that is, depends not just on my own actions but also on the actions of other people; the actions I pick will influence the actions of other people. And, of course, vice versa.

Examples of a situation where strategic behavior/interaction matters are:

  • It is nighttime. You are driving up to an intersection and you see another car driving up to the same intersection from the right exactly at the same time. The outcome, whether you make it safely across the intersection, clearly depends upon your own behavior and the behavior of the other driver. And your behavior will depend on the behavior of the other driver and vice versa.
  • In a football game, it is third down and 7 yards to go. The payoff, whether you get first down or not, depends upon your actions, pass or rush, it depends on the actions of the defense. And it depends on how the defense reads you and how you read the defense and how you adjust and how the defense adjusts, etc., etc.
  • October 23, 2021. Mayor de Blasio is facing a difficult situation. 30% of NYC police officers and firefighters have not been vaccinated. Should Mayor de Blasio enforce a vaccine mandate? Apart from any ethical or public health considerations, is it in his interest to impose such a mandate? That will depend on how the firefighters and police officers respond. If those who are not vaccinated refuse to get vaccinated, they will be put on unpaid leave. Other police officers and firefighters may go on a work slowdown, call in sick, etc. Having a large fraction of first responders unavailable for duty could be very costly for the residents of NYC and come with large political costs for the mayor. The problem for the mayor is: he has to make a decision before he can possibly know how the first responders will respond.52

In the markets we have considered up until this chapter there was no strategic interaction. In competitive markets the participants only need to know the price of the product. Buyers only need to know the price, which is their marginal cost, and their own marginal benefit from the product to make a rational decision. For the sellers, the price is their marginal revenue, and rational, that is, profit-maximizing, behavior only requires equating the price to their own marginal cost.

No strategic behavior is required. There is no gaming of the system. That is the beauty of such markets. In the other market structure we have considered, monopoly, there is also no strategic behavior. There is only one firm in the market. There are only two things the firm needs to know to maximize its profit: the market demand and its own marginal cost.

There are, however, many markets where strategic behavior is crucial. Just think of two gas stations that are located at the same intersection. Each morning, the owner of each gas station has to decide what price to charge for gasoline. Surely, the profit each one makes depends on both of the prices charged.

Or think of Coke and Pepsi, two perennial rivals. The success of each one’s advertising strategy will depend on the advertising strategy of the other.

H&M, taking a stand on human rights in China, will have to reckon with the response of the Chinese government. Surely, their bottom line depends upon the Chinese reaction.

We will have to study and understand such markets as well.

There are many other strategic situations. How hard a Democratic/Republican candidate for the White House campaigns in Arizona will influence the chance of the Republican/Democratic candidate to win those electoral votes.

One of the most powerful tools that has been developed to understand such markets and other strategic reactions is

Game Theory

We will see how game theory can help us understand such markets. But we will also see that game theory can help us in many other areas of life as well.

So, what is game theory?

What is a game?

A game is any situation that has several actors, which we call players. Each player has an objective or payoff function. For a firm this can be to maximize profit. For a lawyer it might be to win a court case. For a country, it might be to win an arms race.

Each player has available a set of actions. For a firm, this would be the set of prices they could pick or the quality or type of a product. For a lawyer this could be the set of witnesses to call and how to prepare the client for cross examination. For a country in an arms race, this could be how much of the government budget to spend on new arms or the kinds of arms to develop.

In a game, in general, each player’s payoff will depend on the actions of other players. In an arms race, whether you win or lose, depends on your and your opponent’s actions. In the court room, the outcome depends on the actions of both prosecutor and defense attorney.

Summarizing all this, we can say that a game consists of:

  • A list of players
  • A set of specified actions for each player
  • A payoff function for each player that generally depends on the actions of all players

10.3 Specific Games

An example to play in an online class: Guess a number between 1 and 100

I will pick a number between 1 and 100. You will guess the number. You will get 5 tries to guess the number. Your payoff is $10 if you guess correctly on the first try. Your payoff decreases by $2 with each additional try. After each try, I will tell you whether your guess is high or low.

Your objective is to make as much money as possible.

Guess what my objective is?
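The natural way to play this game is binary search: always guess the midpoint of the remaining range. A minimal sketch in Python (the function name and dollar amounts follow the rules above; the `secret` values are just illustrations):

```python
def guess_number(secret, low=1, high=100, max_tries=5):
    """Binary search: always guess the midpoint of the remaining range.
    Payoff is $10 on the first try and falls by $2 per extra try."""
    for tries in range(1, max_tries + 1):
        guess = (low + high) // 2
        if guess == secret:
            return tries, 10 - 2 * (tries - 1)
        elif guess < secret:   # "your guess is low" -> search the upper half
            low = guess + 1
        else:                  # "your guess is high" -> search the lower half
            high = guess - 1
    return max_tries, 0        # not found within the allowed tries; no payoff

print(guess_number(50))  # (1, 10): found on the first try
print(guess_number(1))   # (5, 0): not found within five tries
```

Since halving the range from 100 can take up to seven guesses, five tries are not always enough; some numbers are simply out of reach. That is a hint about what the number-picker's objective might be.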

10.3.1 Prisoners’ Dilemma

We will begin with what is most likely the most famous and best-known game, the Prisoners’ Dilemma Game, PDG.

In the Prisoners’ Dilemma Game there are two players; we call them Jessie and Frank. Jessie’s actions are listed in the rows. Jessie can choose from the actions: cheat or not cheat. Cheating means cheating on the other player, his brother Frank; cheating on the brother means cooperating with the sheriff. For Frank, the actions are in the columns, and they are also called cheat and not cheat. The payoffs are listed in the matrix below. The first number in each pair of parentheses is the payoff for Jessie and the second number is the payoff for Frank. In this game, payoffs are listed as years in prison (as negative numbers). Fewer years are, of course, better.


Figure 10.1: Prisoners Dilemma Game.

Suppose Jessie picks “cheat” and Frank also picks “cheat”. Then each player gets a payoff of -5. If Jessie picks “cheat” and Frank picks “not cheat”, then Jessie gets 0 and Frank gets -10. The remaining cells are read the same way.

We want to be able to make predictions about what will happen in such a game. We want to be able to predict how players will choose and what the payoffs will be.

The notion of an equilibrium that turns out to be useful in this case is the notion of a

Nash equilibrium

We start with the insight that each player can only choose/control their own actions, that the actions of all the other players are beyond each player’s control. So, each player will take the actions of all the other players as given when choosing their own actions. Given the actions of the other players, each player will pick their own action that is in their own best interest, that maximizes their own payoff.

What should a player expect the actions of the other player to be? Well, if one player chooses actions to maximize their own payoff it is reasonable that the other players will do the same. Then what should one expect others to do? Of course, the expectation should be that they do what is best for them.

So, we will assume that each player picks the action that is best for them, assuming that all the other players do exactly the same.

This is precisely the notion of a Nash Equilibrium.

This may be easier to explain in a two-player game. Call the two players Jennie and Joanie. The Nash equilibrium notion is:

Jennie picks her best action, assuming that Joanie does the same thing.

And

Joanie picks her best action, assuming that Jennie does the same thing.

In general, a Nash equilibrium is a situation in which each player in a game uses a strategy or action that is a best response to the strategies of all the other players; the players’ strategies are best responses to each other.

We can now go to Figure 10.1, reproduced here, and think about how we can determine the Nash Equilibrium in that game.

Suppose Frank picks the strategy “not cheat”. What is Jessie’s best response to that? If Jessie picks “cheat” he gets zero, and if he picks “not cheat” he gets -1. So, Jessie’s best response to Frank picking “not cheat” is “cheat”.

Now put yourself in Frank’s shoes. If Jessie cheats, then Frank gets -5 from picking cheat and -10 from picking “not cheat”. So, Frank’s best response to Jessie picking “cheat” is to pick “cheat”.

If Frank picks “cheat”, then Jessie’s best response is to pick “cheat”.

Putting this all together at the risk of being a bit redundant:

Jessie’s best response to Frank picking “cheat” is “cheat”.

AND:

Frank’s best response to Jessie picking “cheat” is “cheat”.

Each player’s strategy is a best response to the other player’s strategy.

That is exactly the Nash equilibrium.

Note: This is an awful equilibrium, an inefficient equilibrium. Both players could be made better off, getting -1 each, if they somehow could pick “not cheat” and “not cheat” as their respective strategies. But this cannot happen in a Nash equilibrium. The outcome where each player ends up with -1 is clearly efficient, or Pareto optimal. But if one player were to pick “not cheat”, to have a shot at the outcome of -1, the other player would rationally pick “cheat”, destroying the hope of such a favorable, efficient outcome. The punch line of this note is that a Nash equilibrium need not be efficient. In the Prisoners’ Dilemma Game, it clearly is not.
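The best-response reasoning above can be checked mechanically. Here is a minimal sketch in Python, using the payoff numbers from Figure 10.1 as described in the text (the function and variable names are my own):

```python
# Payoffs from Figure 10.1; in each pair the first entry is Jessie's, the second Frank's.
actions = ["cheat", "not cheat"]
payoff = {
    ("cheat", "cheat"): (-5, -5),
    ("cheat", "not cheat"): (0, -10),
    ("not cheat", "cheat"): (-10, 0),
    ("not cheat", "not cheat"): (-1, -1),
}

def nash_equilibria(payoff, actions):
    """Brute force: a profile (a, b) is a Nash equilibrium if neither
    player can raise their own payoff by a unilateral deviation d."""
    found = []
    for a in actions:           # Jessie's action
        for b in actions:       # Frank's action
            jessie_best = all(payoff[(a, b)][0] >= payoff[(d, b)][0] for d in actions)
            frank_best = all(payoff[(a, b)][1] >= payoff[(a, d)][1] for d in actions)
            if jessie_best and frank_best:
                found.append((a, b))
    return found

print(nash_equilibria(payoff, actions))  # [('cheat', 'cheat')] -- the unique equilibrium
```

Note that the efficient profile (“not cheat”, “not cheat”) fails the check: each player would gain by deviating to “cheat”.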

10.3.2 Bach or Stravinsky

The next game we consider is Bach or Stravinsky. The story goes that there are two people, James and Tom, who very much enjoy each other’s company. James is a big Bach fan; Tom loves Stravinsky. They are deciding whether to go to a performance of Bach’s Brandenburg Concertos or of Stravinsky’s Rite of Spring. Both performances are for one night only, and it is the same night.

The game is illustrated in Figure 10.2 below. We can see that both would enjoy being with each other rather than going by themselves to their composer of choice.


Figure 10.2: Bach and Stravinsky

The game is very symmetrical. There are two Nash equilibria.

In the first equilibrium, James chooses Bach and Tom chooses Bach. In the second equilibrium, Tom and James both choose Stravinsky.

To see that these are really Nash equilibria, consider the upper left-hand corner. James does not have an incentive to deviate since that would lower the payoff to 2. The same is true for Tom. So, Bach is Tom’s best response to James picking Bach and Bach is James’ best response to Tom picking Bach.

The same arguments apply to the lower right-hand corner.
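The deviation check can be written out explicitly. A minimal sketch; since Figure 10.2 is not reproduced numerically here, the payoff numbers below are illustrative assumptions with the right structure (coordinating beats being alone, and each player prefers their own composer):

```python
# Illustrative payoffs only -- Figure 10.2 may use different numbers.
payoff = {  # keys are (James's choice, Tom's choice); values are (James, Tom)
    ("Bach", "Bach"): (3, 2),
    ("Bach", "Stravinsky"): (0, 0),
    ("Stravinsky", "Bach"): (0, 0),
    ("Stravinsky", "Stravinsky"): (2, 3),
}

def is_nash(james, tom):
    """Check the two unilateral deviations: James switching composer, then Tom."""
    other = {"Bach": "Stravinsky", "Stravinsky": "Bach"}
    james_gains = payoff[(other[james], tom)][0] > payoff[(james, tom)][0]
    tom_gains = payoff[(james, other[tom])][1] > payoff[(james, tom)][1]
    return not james_gains and not tom_gains

for profile in payoff:
    print(profile, is_nash(*profile))
# Only ('Bach', 'Bach') and ('Stravinsky', 'Stravinsky') come out True.
```

The two coordination profiles survive the check; in either mismatch profile, each player would rather switch and join the other.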

10.3.3 Chicken

The third game we consider is called Game of Chicken. You may have seen versions of this played out in some old James Dean or similar movies about growing up. The payoffs for this game are in Figure 10.3.


Figure 10.3: Game of Chicken

The lower left and the upper right are two Nash equilibria. Imagine you are in the lower left, where player I gets 50 and player II gets 0. If player II were to deviate, her own payoff would fall from 0 to the very low number -100, so she will not. If player I were to deviate, his payoff would fall from 50 as well. Neither player has an incentive to deviate: what we have found is one Nash equilibrium. And then, by symmetry, the upper right must be an equilibrium as well.

10.4 Repeated Games

Sometimes in life we face games that are played just once. The invasion of Normandy in 1944 is one such example. German unification is a second. How to deal with the Russian annexation of the Crimean Peninsula is another. This last example points to repeated interaction between players, since the relationship between the US and Russia is ongoing. How we did (or did not) deal with the Russian interference in the 2016 elections will surely matter for how Russia deals with the 2020 election.

Here we will use a simple example to think about repeated strategic interactions.

Think of the following one-shot game: There are two players, Jill and Abby. Each player chooses a number between 1 and 10, inclusive of 1 and of 10. Without communication of course. The player who picks the lower number will get that amount in dollars. If the two players pick the same number, the money is split evenly.

The Nash equilibrium in this game is simple: Each player picks the number “1”.

Why? If Abby were to pick anything else, like 7, Jill would undercut and pick 6. If Jill picks 6, Abby would undercut and pick 5, etc. etc. etc. This process only stops when both pick 1.
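The undercutting logic can be simulated. A minimal sketch, with a hypothetical `best_response` helper: undercut any number above 1 by one dollar, and match a 1 (undercutting n earns n - 1 dollars versus n/2 from matching, which is strictly better whenever n > 2; against a 1, matching earns $0.50 while any higher number earns nothing):

```python
def best_response(opponent_pick):
    """Undercut by one dollar whenever possible; against a 1, match it."""
    return max(opponent_pick - 1, 1)

# Start from Abby picking 7 and let the players best-respond in turn.
abby = 7
jill = best_response(abby)      # Jill undercuts to 6
while abby != best_response(jill):
    abby = best_response(jill)  # Abby undercuts Jill
    jill = best_response(abby)  # Jill undercuts right back
print(abby, jill)  # 1 1 -- neither wants to change: the Nash equilibrium
```

The loop stops only when each pick is a best response to the other, which happens exactly at (1,1).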

So far this is easy. This is just one particular example of a prisoners’ dilemma game.

Now we do the following. We repeat this very same game for 20 rounds. At the beginning of the first round, both players know that the game will be played for exactly twenty rounds and that after the 20th round the game is over. After each round, the actions of each player are revealed to the other. So, in any round, Jill knows what Abby did in all previous rounds (if she does not forget the earlier rounds), and vice versa. After these 20 rounds they will go their separate ways and never see each other again.

It appears that in this new game the strategy of each player will be a set of 20 numbers, a number to be picked in each round. So, Jill would pick 20 numbers and Abby would pick 20 numbers, and both players’ payoffs over the entire 20 rounds would be determined by these 40 numbers.

It is actually more complicated than that. Much more complicated.

Since Abby knows what Jill did in the past, Abby might actually make her choice in any round contingent on what Jill did in the past. So, in each round, Abby’s strategy would not be a simple number, but a function of what Jill did in the entire past. You see how quickly this sort of thing can get very complicated and involved.

One way to think about what might be an equilibrium in this 20-round game, is to start analyzing this game at the very last round, the 20th round.

At the beginning of the last round, the players both know that there is only one round left. Whatever happened in the first 19 rounds is history; they will “let bygones be bygones”. That is the rational way to proceed. The last round is therefore exactly like the one-shot game described above, and the Nash equilibrium in this last round is (1,1).

Then we (mentally) go back to the beginning of round 19. What do the players know at the beginning of round 19?

They know the entire history up to round 19, and they know that in round 20 the equilibrium will be (1,1) regardless of what happens in round 19. So round 19 is effectively a one-shot game, and therefore the equilibrium in round 19 is (1,1).

Now we go to the beginning of round 18. At the beginning of round 18 the players know the history leading up to round 18, and they know that in each of the last two rounds the equilibrium will be (1,1). So again, the players will view round 18 as a one-shot game, and the equilibrium in round 18 will be, surprise, surprise, (1,1).

Proceeding in this way we can work backwards all the way to the very first round. Doing this we will conclude that in any round the equilibrium is (1,1).

This way of thinking is called

Backward induction
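The backward-induction argument can be written as a short loop that starts at round 20 and works toward round 1. A minimal sketch (the function name is my own; each round collapses to the one-shot equilibrium):

```python
def backward_induction(rounds=20):
    """Work backwards from the last round. The final round is a one-shot
    game, so its equilibrium is (1, 1); given that, play in any earlier
    round cannot affect the future, so each earlier round is effectively
    one-shot as well and also resolves to (1, 1)."""
    stage_equilibrium = (1, 1)                  # one-shot Nash equilibrium
    play = {}
    for round_number in range(rounds, 0, -1):   # 20, 19, ..., 1
        play[round_number] = stage_equilibrium  # bygones are bygones
    # Both pick 1 every round, so they split $1: $0.50 apiece per round.
    total_per_player = 0.50 * rounds
    return play, total_per_player

play, total = backward_induction()
print(play[1], play[20])  # (1, 1) (1, 1): every round resolves the same way
print(total)              # 10.0 -- each player walks away with just $10
```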

This outcome is very disappointing for both Abby and Jill. If they pick 1 each period for 20 rounds, they split a dollar every round, \(\$0.50\) each, and each one of them gets \(\$10\) for the entire game.

Under different circumstances, each could have earned \(\$100\). How?

If somehow each could have picked the number 10 in each round, they would get \(\$5\) per round and walk away from this game with \(\$100\). Much better than the measly \(\$10\).

But how to get there? That is the tricky part. If in any round Jill picks a 10, Abby has an incentive to undercut her and pick 9. It would clearly be in Abby’s interest to do so. Jill knows this and would therefore be very skeptical to pick a 10.

(10, 10) is not a Nash equilibrium. It is, however, the outcome that maximizes joint payoffs.

So, the big tension is:

If I want to have a chance to get rich, my strategies should tend towards 10. But then I run the risk of being undercut and I may end up with nothing. If I play it safe, I pick 1 every time, but then I stay poor. How to play this game?

10.4.1 Tit for Tat

This follows Dixit and Nalebuff, The Art of Strategy, 2008

In the 1980s, a political scientist, Robert Axelrod, invited social scientists from a variety of backgrounds to submit computer algorithms to play repeated PDGs against each other. One algorithm that did well was “tit for tat”, submitted by Anatol Rapoport, a math professor.

Why would “tit for tat” do well? Dixit and Nalebuff point out 4 properties of that strategy:

  • Clarity
  • Niceness
  • Provocability
  • Forgivingness

  1. Clarity: it is clear and very easy to understand. You start out nice and cooperatively and you stay that way until your opponent is non-cooperative. Then you become non-cooperative briefly for one period and then you go back and play nice and cooperate. Simple, easy, clear.
  2. Niceness: You are as nice as the other player. If the other player cooperates, you cooperate. So, cooperation has a chance to get started and it has a chance to be sustained. You are never the first to play nasty.
  3. Provocability: You are not a door mat. When your opponent starts playing non-cooperatively, you respond immediately. You let your opponent know, you have noticed the non-cooperative behavior and you are responding quickly in a measured way with a proportionate response.
  4. Forgivingness: After one period of punishment, you go back to playing cooperatively. You are not holding a grudge forever. One period of punishment is good enough; then we can move on with life.
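These properties can be watched in action. Here is a minimal sketch of a repeated Prisoners’ Dilemma match in the spirit of Axelrod’s tournament, reusing the payoffs from Figure 10.1, with “C” for not cheat (cooperate) and “D” for cheat (defect); the strategy and function names are my own:

```python
# Stage-game payoffs from Figure 10.1: "C" = not cheat, "D" = cheat.
PAYOFF = {
    ("C", "C"): (-1, -1),
    ("C", "D"): (-10, 0),
    ("D", "C"): (0, -10),
    ("D", "D"): (-5, -5),
}

def tit_for_tat(my_history, their_history):
    """Start nice; afterwards copy the opponent's previous move."""
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play_match(strategy_1, strategy_2, rounds=20):
    """Run the repeated game and return each player's total payoff."""
    h1, h2 = [], []
    total_1 = total_2 = 0
    for _ in range(rounds):
        a1 = strategy_1(h1, h2)
        a2 = strategy_2(h2, h1)
        p1, p2 = PAYOFF[(a1, a2)]
        total_1 += p1
        total_2 += p2
        h1.append(a1)
        h2.append(a2)
    return total_1, total_2

print(play_match(tit_for_tat, tit_for_tat))    # (-20, -20): cooperation every round
print(play_match(tit_for_tat, always_defect))  # (-105, -95): loses, but only by 10
```

Against itself, tit for tat sustains full cooperation; against always-defect it loses, but only by the single round it spent being nice. That is the sense in which it never loses too badly.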

There are a few noteworthy things about this strategy:

  1. Notice that “tit for tat” is NOT a Nash equilibrium strategy. You may want to think about why this is true.
  2. Notice also that you can do better than I can, if I play “tit for tat”. You may think about this also.
  3. Notice that “tit for tat” does not beat other strategies in head-on competition. It only loses or ties. The redeeming feature of “tit for tat” is that you never lose too badly. The most important lesson to be learned here is: what can I do to establish cooperation without getting suckered or, worse, dying? Periodically there are remarkable stories of such cooperation, sometimes seemingly against all odds.

10.4.2 Christmas Truce WWI 1914

The Wikipedia entry

The movie: Joyeux Noel

Two historical accounts:

Silent Night: The Story of the World War I Christmas Truce

The Small Peace in the Great War: Western Front 1914: When the Germans, French and British Celebrated Christmas Together

Fall and winter, 1914. Jack, 20 years old from Liverpool, UK, has been fighting the “Huns” in a miserable trench in northern France for months. His fiancée, whom he loves dearly, is back at home waiting for him.

Sepp, 22 years old from Augsburg, Germany is in a miserable trench in northern France, just about 100 or so yards away from Jack, defending his country against the “British imperialists”. He is married to a lovely woman, whom he loves dearly, who is pregnant with their second child.

What is Jack’s over-riding interest?

To make it back home to his fiancée safe and sound.

What is Sepp’s over-riding interest?

To make it back home to his family safe and sound.

Who stands between Jack and his goal?

Sepp, of course!

Who stands between Sepp and his goal?

Jack, of course!

There are lots and lots of Jacks and Sepps.

This sure looks like a repeated prisoners’ dilemma game to me. Sepp can make a small gain by taking out Jack. Jack can make a small gain by taking out Sepp. The big loss comes, of course, when a Jack, it does not matter which one, takes out Sepp. For Sepp that is a big loss.

Or vice versa. One of the many Sepps takes out Jack.

The Jacks and the Sepps have been in these miserable water-logged, rat-infested trenches ever since The Guns of August (by Barbara Tuchman). Over time, the Jacks (Sepps) figure out and learn that Sepp (Jack) is less likely to shoot and kill if Jack (Sepp) acts likewise, especially when one is heeding nature’s call and, with one’s pants down, is very vulnerable.

Cooperation has its benefits. It is a matter of life and death.

Survival chances rise when guns rest. (We still don’t seem to realize that!)

So, what happens on December 24, 1914?

On some parts of the front, there is a truce, an absolute truce. Fighting ceases totally. Gifts are exchanged: brandy and Schnaps. Carols are sung. There are even soccer matches played between the trenches.

You can imagine that establishing such cooperation to cease all hostilities might take some time. Hoping that such a miracle would happen overnight is probably totally unrealistic. Having spent day after day after day in a trench just the length of a soccer field away from the “enemy” probably helped in allowing such cooperation to develop.

According to some historical accounts this kind of cooperation, the Christmas truce, was more likely between the Germans and the British than between the Germans and the French. Why? It turns out that before the war, many young German men had actually lived and worked in England. They spoke English. Being able to communicate helps. Perhaps learning a foreign language, beyond being able to order a good bottle of red wine in a nice restaurant in Barcelona, is useful.

Many had sweethearts in England. They actually used the Christmas truce to send letters to their sweethearts. For many this was the only way to communicate: Cell phones were not invented yet and, in the trenches, there would probably have been terrible service and then all that awful background noise. In addition: the German government and postal service censored all mail and did not deliver letters to the enemy.

Alas, peace was short-lived.

The military leadership found this fraternizing with the enemy, this truce, abhorrent. So, what can you do to disturb the peace?

  1. Vigorously prosecute fraternizers. Obvious.
  2. Rotate troops in and out more frequently. There is a little game theory involved.
  3. Order those who cannot participate in the exchange of brandy and Schnaps to disturb the peace. That would be the artillery. Artillery fire from 2 miles behind the lines could disturb any peace.

Undoubtedly, there are others.

There were no Christmas celebrations on the front in 1915.

The war dragged on until 1918. Some celebrate Armistice Day on November 11. The total number of WWI casualties, military and civilian, has been estimated at 40 million people.

10.5 Glossary of Terms

Backward Induction: A method to solve for the equilibrium in finitely repeated games. It starts by determining the equilibrium in the last period and then works backwards to the first period.

Battle of the Sexes Game: see payoff structure in text

Best Response: A strategy is a best response for a player if it maximizes that player’s payoff GIVEN the vector of particular strategies for all other players.

Dominant strategy: A strategy for a player is called dominant if it is the best strategy that player can use, regardless of the strategies chosen by the other players.

Game: A list of players with their feasible actions/strategies and their payoffs where each payoff may depend on the actions of all other players.

Game of Chicken: see payoff structure in text

Grim trigger: A particular strategy that a player may choose in a PDG. It involves punishment for deviating from a cooperative outcome, and the punishment lasts to the end of the last round of the game.

Nash equilibrium: A particular equilibrium that is obtained when all players in a game use best responses to each other.

Prisoners’ Dilemma Game: see payoff structure in text

Strategic behavior: Behavior that takes the potential reactions of others to one’s own actions into consideration.

Strategy: A feasible action in a game.

Tit-for-Tat Strategy: A particular strategy that a player may choose in a PDG. It involves punishment for deviating from the cooperative outcome; the punishment lasts only one or a very few periods, after which the player returns to cooperative behavior.

10.6 Practice Questions

10.6.1 Discussion

  1. In words, without any appeal to graphs, explain why the players in the PDG find it difficult to cooperate.
  2. Discuss the efficiency properties of the Nash equilibrium in the Game of Chicken.
  3. Explain why “tit-for-tat” in a repeated prisoners dilemma game is not a Nash equilibrium strategy.
  4. Consider a repeated prisoners’ dilemma game between you and me. If you know that I am playing “tit-for-tat”, explain why you can do better than I. What strategy will you or can you use to get a higher payoff than I?

10.6.2 Multiple Choice

  1. In the prisoners’ dilemma game

    A. The two Nash equilibria are efficient
    B. The two Nash equilibria are inefficient
    C. The single Nash equilibrium is efficient
    D. The single Nash equilibrium is inefficient

  2. A dominant strategy is

    A. A strategy that is best for the player regardless of what the other player does
    B. A strategy that is best for the player but only if the other player also uses a dominant strategy
    C. A strategy that guarantees a payoff that is higher than the other player’s payoff
    D. A strategy that guarantees an efficient equilibrium

  3. In the prisoners’ dilemma game

    A. Both players use a dominant strategy
    B. Only one player uses a dominant strategy
    C. No player uses a dominant strategy
    D. There is no dominant strategy

  4. In the Battle of the Sexes game

    A. There is one Nash equilibrium, and it is efficient
    B. There is one Nash equilibrium, and it is inefficient
    C. There are two Nash equilibria and only one of these is efficient
    D. There are two Nash equilibria that are efficient

  5. In the Game of Chicken, the number of efficient outcomes is

    A. 1
    B. 2
    C. 3
    D. 4

  6. In the Game of Chicken there is/are

    A. One efficient Nash equilibrium
    B. Two efficient Nash equilibria
    C. Three efficient Nash equilibria
    D. Four efficient Nash equilibria

  7. Game theory is most useful to study

    A. Competitive markets
    B. Individual behavior
    C. Monopolies
    D. Oligopolies

  8. Imagine that the US substantially increases its military presence in the Pacific Region. We would then expect

    A. South Korea to increase its defense expenditure because of the free rider problem
    B. South Korea to decrease its defense expenditure because of the free rider problem
    C. Overall defense spending by US allies in the Pacific region will increase approximately by the increase in US spending
    D. Overall defense spending by US and its allies will decrease

  9. Imagine that Abby and Jill play the repeated prisoners’ dilemma game for 20 rounds like the one that is described above. They meet regularly at their favorite gym and for dinner afterwards to collude and agree on strategies that generate high payoffs. There is one difference from the game described above: in periods 3, 6, 9, 12, 15 and 18 the highest number that can be picked is 20, not 10 as in all the other periods and as in the game above. Then we would expect

    A. The collusive agreements likely to break down in the periods whose numbers are divisible by 3.
    B. The collusive agreements likely to break down in all other periods
    C. The collusive agreements likely to break down in the first 10 periods
    D. The collusive agreements likely to break in even numbered periods

  10. There is much shipping across the Great Lakes of the US and Canada. There are just a few very large shipping companies. In some years the Great Lakes are frozen, and therefore not “shippable”, for long periods over the winter. Let’s call these years Harsh Winter Years, HWY. Other years are called normal. If the shipping companies have collusive agreements covering the freight rates they charge clients, we would expect

    A. The collusive agreements likely to be unaffected by the weather
    B. The collusive agreements likely to break down in HWY.
    C. The collusive agreements likely to break down in years other than HWY
    D. The collusive agreements likely to break down in winter.

  11. There are just a few large cement companies in Indiana. The market for cement is very localized due to large transportation costs. We would expect any kind of collusive agreement likely to break down in

    A. Recession years
    B. Boom years
    C. Even years
    D. Leap years

  12. There are two Cable TV companies in Indianapolis. They have sizeable monopoly power and are pretty good at carving up the market and setting high collusive prices. The government breaks up these two companies into two companies each, so there is a total of four firms. After the break-up

    A. Collusive agreements are harder to negotiate
    B. Collusive agreements break down more easily
    C. Prices can be expected to be lower
    D. All of the above

  13. Alice plays a repeated prisoners’ dilemma game for 20 rounds like the one described in the text above, alternately against Jill and against Jackie. Both Jackie and Jill play versions of tit for tat, and this is known to Alice. Pretty soon Alice realizes that Jill is more vengeful than Jackie, in the sense that her punishment phases last two periods, while Jackie’s last only one period. We can expect

    A. That Alice is as cooperative when she plays against Jackie as she is when she plays against Jill
    B. That Alice is more cooperative when playing Jackie than when playing Jill
    C. That Alice is less cooperative when playing Jackie than when playing Jill
    D. Alice to ignore her opponents’ actions

  14. There are two private trash haulers in Bloomington. They serve the entire Bloomington market. One trash hauler, in an interview with the local newspaper, says he will be forced to increase prices next year by 10%. Such an announcement can be expected to

    A. Be ignored by the other trash hauler.
    B. Increase the likelihood of collusion in that market
    C. Decrease the likelihood of collusion in that market
    D. Decrease the price by the other trash hauler

  15. Imagine a prisoners’ dilemma game that goes on forever with some very high, close to one, probability. That means that in each period, the probability that the game continues into the next period is very high, close to one. Relative to the same prisoners’ dilemma game that is repeated for only 20 rounds, we would expect

    A. A more efficient outcome for the players
    B. Higher payoffs in period 19
    C. Higher payoffs in period 20
    D. All of the above