Dishonest opponents

Game talk
Jul 08, 2013
 

Another question from Quora. At issue was whether a game can be successful if it relies on players being honest about what they think. The example given was “what number am I thinking of?” If the player with the secret number lies, then the game can be unwinnable. So the poster wondered if there were any examples of successful games that rely on blind trust.

The original question is here. The poster has since updated it to ask about “opponents” rather than “players.” Before the edit, I posted that I was unsure whether I understood the question, because of course there are so many examples of games that rely on blind trust in other players:

  • A player in a team sport relies on his teammates’ cognition all the time. As just one example, passes are executed with the faith that the receiver will be where he is supposed to be, as previously practiced.
  • Team sports rely especially on the coach’s cognition, and there’s a good case to be made that many team games are actually coach vs. coach, using the players as poorly controlled tokens. The players often cannot perceive the overall strategic situation very well.
  • Bridge and many other cooperative games are about building up trust in a partner’s capabilities even though the partners do not share equal access to information.
  • The classic Prisoner’s Dilemma is a game theory example of blind trust.

I could go on. Which led me to conclude that what was being asked was really about whether the opponent is trusted, and specifically as regards the feedback they give to an action in the game. In a game like this, the player makes a move (uses a verb), it feeds into the black box of rules, and the opponent is supposed to be honest about the way in which the game state is updated, and feed back to the player the results of the action.

Obviously, perfect information games render this moot; state is visible to all parties. In chess, when you move, the opponent can only challenge the legality of the move. The state of the board and the results of the move are obvious to all. Same with most “race”-style board games, and so on.

So we’re really talking about imperfect information games. Here we see that many games do rely on honesty. A few examples:

  • Go Fish relies on an opponent being honest when they say they have or do not have a card. Even the Wikipedia article notes that “Go Fish is very much dependent on the honor system; lying about the contents of one’s hand is difficult to prevent.”
  • Hangman relies on the opponent not swapping the word for another one still consistent with the letters guessed so far when it looks like the player is making progress. (I remember once leading a game of Hangman when I was in second grade, and actually messing up my word and forgetting it midstream under all the pressure from the older kids. They were pretty upset with me.)
  • Battleship relies on the opponent not moving their ships and accurately reporting where hits and misses have occurred. Of course, moving the ship can be tricky to pull off, but misreporting a hit is easy.

In all these cases, however, there is a moment where tokens of some sort representing state move from hidden to public information. If the player has lied, it generally becomes apparent at that moment. Since these are all games of gradually revealing hidden information, the lies are exposed as the picture becomes clearer. In Go Fish, if the player puts down a pair that uses a card that they disclaimed, well, they’ll get caught. So in effect, the game has a sort of enforcement mechanism, in that the progress of the game is from imperfect to perfect information.
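
That enforcement can be made concrete with a small sketch of my own, in Go Fish terms (the player names and function names are purely illustrative, and a real referee would also account for cards drawn since the denial): denials made while hands are hidden get logged, and are audited the moment the relevant tokens become public.

    # A minimal sketch of the hidden-to-public enforcement idea, in Go Fish terms.
    # Denials made while hands are hidden are logged, then audited when the
    # relevant cards hit the table as public information.

    denials = []  # (player, rank): "player claimed not to hold that rank"

    def record_denial(player, rank):
        """Player answers 'go fish', i.e. claims not to hold any card of this rank."""
        denials.append((player, rank))

    def audit_meld(player, rank):
        """Player lays down a pair of `rank`; the hidden tokens are now public."""
        if (player, rank) in denials:
            print(f"{player} earlier denied holding any {rank}s; the lie is exposed.")

    record_denial("Alice", "3")  # Alice tells Bob to go fish for threes
    audit_meld("Alice", "3")     # later she melds threes; the denial is contradicted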


That leaves games where state is not revealed. Examples might include the number guessing game, but also games like Diplomacy, Werewolf/Mafia, and even poker.

Many of these games rely instead on a presumption of deception. You’re assumed to be lying, and in fact it is a key strategy for success. But these games are also generally based on repeated turns of the same actions. Given iterative interaction, they also eventually oblige the player to reveal game state. In other words, you can lie your way through Werewolf, but the revelation of the truth is the end of the game. You can bluff your way through poker, but only while the opponents choose to fold. At the point of an actual contest, you can’t win unless you show your winning hand.

It’s possible to “fix” the number guessing game using iterative interaction. Given enough guesses and a bounded range, it becomes cheat-proof (especially if it takes the form of “hot and cold” and is therefore susceptible to a binary search). And in Diplomacy, eventually there’s a winner regardless of how much someone lied, and at the end, that person gets the cold shoulder. 😉
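
Here is a sketch of the number-guessing fix (my own illustration, assuming the opponent answers each guess with “higher,” “lower,” or “correct”): every answer shrinks the interval of numbers consistent with all the answers so far, so the guesser either wins within a logarithmic number of guesses or the interval empties out and the lie is provable.

    # Binary search over a bounded range, auditing the answers for consistency.
    # Either the number is found in about log2(hi - lo) guesses, or the answers
    # contradict each other and the opponent is provably cheating.

    def guess_the_number(answer_fn, lo=1, hi=100):
        """answer_fn(guess) must return 'higher', 'lower', or 'correct'."""
        while lo <= hi:
            guess = (lo + hi) // 2
            reply = answer_fn(guess)
            if reply == "correct":
                return guess
            elif reply == "higher":
                lo = guess + 1   # secret number is above the guess
            else:                # 'lower': secret number is below the guess
                hi = guess - 1
        raise ValueError("Inconsistent answers: no number in range fits them all.")

    # Honest opponent thinking of 37:
    print(guess_the_number(lambda g: "correct" if g == 37 else ("higher" if g < 37 else "lower")))

    # A cheat who always answers 'higher' runs out of room and is caught:
    try:
        guess_the_number(lambda g: "higher")
    except ValueError as err:
        print(err)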

Iteration, in general, is what produces trust in humans. We trust where there is repeated interaction that is successful. (See my series: On Trust, Part I; On Trust, Part II; On Trust, Part III.) Games typically rely on iteration, and therefore can be seen as trust-building machines, in some sense. We often think of this the other way around: that entering into a game is to put ourselves into a position of mutual trust within that famous old magic circle. Within that space, even deception is codified, and we trust that it will be used only in the way the rules dictate.
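
The iterated Prisoner’s Dilemma mentioned above is the textbook version of this. A rough sketch (standard payoff values; the strategies and round count are just illustrative): a strategy like tit-for-tat loses once to a defector and then stops trusting, while two tit-for-tat players settle into stable mutual cooperation, which is trust produced purely by repetition.

    # Iterated Prisoner's Dilemma: 'C' = cooperate, 'D' = defect.
    PAYOFFS = {  # (my move, their move) -> my payoff, standard values
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def tit_for_tat(opponent_history):
        """Cooperate first, then mirror the opponent's previous move."""
        return "C" if not opponent_history else opponent_history[-1]

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=10):
        score_a = score_b = 0
        moves_a, moves_b = [], []  # each strategy only sees the other side's moves
        for _ in range(rounds):
            a, b = strategy_a(moves_b), strategy_b(moves_a)
            score_a += PAYOFFS[(a, b)]
            score_b += PAYOFFS[(b, a)]
            moves_a.append(a)
            moves_b.append(b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (30, 30): sustained cooperation
    print(play(tit_for_tat, always_defect))  # (9, 14): one betrayal, then no more trust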

Any game where there are social contracts surrounding the rules (which is to say all of them) is built on certain sorts of trust. The most basic is “we’re going to follow the rules together,” even if this is never stated (though of course there is always room for the satirical take on this, as seen in Paranoia!). In ludological terms, these sorts of social contract rules are considered as binding and as real as the ones in the code or the instruction booklet. Certainly you’ll get in just as much trouble with fellow players for breaking these informal rules as you will for breaking a rule in the rulebook; yet another example of how games exist not solely in the designed construct, but are drawn on the canvas of the human mind.

 

  6 Responses to “Dishonest opponents”

  1. In the board game Cosmic Encounter, there are a couple of powers (Gambler comes to mind; I think there’s at least one other) that depend on never revealing a key piece of information. (See: http://cosmicencounter.wikia.com/wiki/Gambler ) It’s important that you never show whether or not you lied, because you may wish to try the same tactic again. Of course, if your bluff is called, you have to reveal it. It might be that CE would be less successful if the Gambler was a common type of power, but it isn’t.

  2. Ultima Online had some fascinating elements related to trust and deception in the earlier history of the game.
    Learning who you could trust led to players being more involved with other players. You made friends with those you trusted and worked together to learn who else you could trust. It made for a much more social game.
    Examples:
    -Trusted Blacksmiths at the forges for repairs
    -Knowing the magical properties of a weapon meant you knew that weapon: “I know that weapon, and it looks like mine!”
    -Knowing guilds as friendly or not
    -Buying and selling trickery
    -Scams

    These sorts of things provided clues and info that had meaning to you. It’s just too bad that the game didn’t offer more meaning to it through the social sphere.

  3. This makes me think of Minecraft. Often on smaller servers you are both playing against and with players on an hour-to-hour basis. One minute you are good friends taking down bosses or building something together, but at any time you can turn on each other, and the stakes are rather high. It often seems inevitable that players turn against each other, since you have so much to gain by turning on someone and taking advantage of their trust. Since working together is sometimes required (or fun), you are constantly going between fighting and collaborating in a very Prisoner’s Dilemma style.

  4. Diplomacy is an interesting example, because it’s often talked about (only half-jokingly) as damaging friendships. Backstabbing is very much part of the game; it’s what you do to win, but people can take it very personally.

    There’s a line between carefully managing hidden information (as in poker or Netrunner, for example) and the game actually being about lying to other players. The latter approach can work, but I feel like it should be kept short to limit the level of personal engagement with the game scenario.

  5. Competitive duplicate bridge includes, of course, coded communication with your partner, but also the stipulation that your communication have a standard encrypted set of possible meanings, as laid out by your convention cards. I’m not a bridge expert, but I do remember playing in little clubs as a novice and having to defend myself from accusations of dishonesty – I was in fact merely incompetent.

    It’s an interesting combination of relying on truth from your opponents and also making telling that truth a non-trivial problem.

  6. An interesting example of a kind of game where you could get away with dishonesty is the group of games where the rules themselves form (part of) the hidden state. I’m thinking specifically of Mao [http://en.wikipedia.org/wiki/Mao_(card_game)] and Penultima [http://en.wikipedia.org/wiki/Penultima_(game)]. In such a game, a player could easily claim to have invented a rule without actually devising one, and then invent it incrementally as the game unfolds, thus allowing them to tailor the rule to the other players’ rules, etc. It might even make for better games if this were common practice. I’m not entirely sure that the word “opponent” applies in this situation, though... these games are more about creating a shared experience of reasoning out the rules than an actual competition, in my experience.
