
Effect of Information on Collusion Strategies in Single-winner, multi-agent games

Presentation Transcript


  1. Effect of Information on Collusion Strategies in Single-winner, multi-agent games • December 2, 2010 • Nick Gramsky • Ken Knudsen

  2. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  3. Motivation • Reasons to Collude • Improve position relative to other agent(s) • Self-preservation / Survival • Explicit Collusion • Alliances • Survival • Truces • Implicit Collusion • Minimax against strongest player • Tit-for-tat

  4. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  5. Identification • Find coarse-grained collusive behavior • 1. Offensive-based collusion • Multiple agents attacking a single agent for a fixed number of rounds • In our examples, we limited this to 1 round. • 2. Defensive-based collusion • Multiple agents not attacking each other over a fixed number of rounds. • In our examples, we limited this to 2 rounds.
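
The two window checks above map directly onto a scan of the game's attack log. The following is a minimal sketch of that scan in Python, not the authors' code; the (round, attacker, defender) record format and the helper names are assumptions, while the default window lengths match the slide (1 round offensive, 2 rounds defensive).

```python
from collections import defaultdict

# Assumed log format: each attack is a (round, attacker, defender) tuple.

def offensive_windows(attacks, target, min_attackers=2):
    """Rounds in which at least `min_attackers` distinct agents all attack
    the same target (offensive-based collusion, 1-round window)."""
    attackers_by_round = defaultdict(set)
    for rnd, attacker, defender in attacks:
        if defender == target:
            attackers_by_round[rnd].add(attacker)
    return sorted(r for r, a in attackers_by_round.items()
                  if len(a) >= min_attackers)

def defensive_windows(attacks, pair, window=2):
    """Starting rounds of `window`-round spans in which the two agents in
    `pair` never attack each other (defensive-based collusion)."""
    a, b = pair
    hostile = {rnd for rnd, att, dfn in attacks if {att, dfn} == {a, b}}
    last = max(rnd for rnd, _, _ in attacks)
    return [start for start in range(1, last - window + 2)
            if not any(r in hostile for r in range(start, start + window))]
```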

  6. Identification Offensive-based coalitions

  7. Identification Defensive-based coalitions

  8. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  9. Classification Offensive-based behaviors • Socially inclined behavior • For some predefined time, if the target satisfies the heuristic threshold, then we define the actions of the attacking players as 'socially oriented' • h(x) is a heuristic function for any adversary • vh(x) is the corresponding visual heuristic, used when dealing with different levels of fog • Else: some other collusive behavior
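
Read together with the results slides (where θh = 1.3 corresponds to a residual power of 43.3% in a 4-player game), h(x) appears to be the target's power relative to the mean power of the remaining players. Under that reading, a minimal sketch of the socially inclined test, with vh(x) approximated by the target's visible power under fog (our approximation, not the authors' definition):

```python
def is_socially_oriented(target_power, other_powers,
                         theta_h=1.3, visible_power=None, theta_v=None):
    """Classify an offensive window as 'socially oriented' when the target's
    heuristic value dominates the remaining players by at least theta_h.
    Under fog, the visual heuristic vh(x) -- here the target's *visible*
    power -- is substituted, with its own threshold theta_v."""
    mean_others = sum(other_powers) / len(other_powers)
    if visible_power is not None and theta_v is not None:
        return visible_power / mean_others >= theta_v   # vh(x) under fog
    return target_power / mean_others >= theta_h        # h(x), full information
```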

  10. Classification Offensive-based algorithm

  11. Classification Defensive-based algorithm

  12. Classification Missed opportunities • Classify a missed opportunity by finding players that: • for a predefined period were not attacked above a certain percentage and… • satisfy either their power-heuristic or visual-heuristic threshold
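
Under the same assumed log format and heuristic as above, the missed-opportunity check might look like the sketch below; the 25% attack-share cap stands in for the slide's "certain percentage" and is purely illustrative.

```python
def missed_opportunities(attacks, powers, rounds,
                         theta_h=1.3, attack_share_cap=0.25):
    """Players who, over the rounds in `rounds`, received less than
    `attack_share_cap` of all attacks yet still exceed the power-heuristic
    threshold -- targets the other players 'should' have colluded against."""
    window = [(att, dfn) for rnd, att, dfn in attacks if rnd in rounds]
    total = len(window) or 1
    flagged = []
    for player, power in powers.items():
        share = sum(1 for _, dfn in window if dfn == player) / total
        others = [p for name, p in powers.items() if name != player]
        h = power / (sum(others) / len(others))
        if share < attack_share_cap and h >= theta_h:
            flagged.append(player)
    return flagged
```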

  13. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  14. Implementation • Used Warfish to play games of Risk. • Free website warfish.net • Risk is a zero-sum game where players seek (simulated) world domination!  • Only one winner, the last remaining contestant. • Attacks are made via dice (random number generator) • Amass armies, grow in power, rule the world! • Or at least the world represented on a board...
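
For readers unfamiliar with the mechanic, one combat roll under standard Risk rules looks roughly like the sketch below (an assumption about the game settings used; Warfish also supports rule variants).

```python
import random

def resolve_attack(attacking_armies, defending_armies):
    """One round of standard Risk combat: the attacker rolls up to 3 dice,
    the defender up to 2; the highest dice are compared pairwise and the
    defender wins ties. Returns (attacker_losses, defender_losses)."""
    att_dice = sorted((random.randint(1, 6)
                       for _ in range(min(3, attacking_armies - 1))), reverse=True)
    def_dice = sorted((random.randint(1, 6)
                       for _ in range(min(2, defending_armies))), reverse=True)
    att_loss = def_loss = 0
    for a, d in zip(att_dice, def_dice):
        if a > d:
            def_loss += 1
        else:
            att_loss += 1
    return att_loss, def_loss
```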

  15. Implementation Environment • Reduced resource strategies • Randomized players • Set card trade-in values to be constant (5) • Disabled card capture on elimination • Multiple map types • Larger than original Risk board • Reduces board-specific strategies in analysis
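
These settings can be summarised as a small configuration record; the field names below are ours, while the values come from this slide and slide 18.

```python
from dataclasses import dataclass

@dataclass
class ExperimentConfig:
    """Warfish/Risk settings used across the experiments."""
    card_trade_in_value: int = 5             # constant, not escalating
    card_capture_on_elimination: bool = False
    randomized_players: bool = True
    maps: tuple = ("World", "Europe")        # both larger than the original board
    fog_levels_tested: tuple = (0, 1, 3)     # of the 6 levels Warfish offers
```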

  16. Implementation World Map

  17. Implementation Europe Map

  18. Implementation Fog of War • Varied the amount of information available to all agents via different levels of 'fog of war' • 6 different levels of fog are available in the game • Level 0: No fog (perfect information) • Level 1: See all occupations, neighboring units only • Level 2: See all occupations (no units) • Level 3: Only see neighboring occupations and units • Level 4: See only neighboring occupations • Level 5: Complete fog (only know about self) • Tested with 3 levels of fog • {0, 1, 3}
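
Our encoding of the six fog levels as a per-territory visibility rule (a sketch of the slide's descriptions, not Warfish's implementation):

```python
from enum import IntEnum

class Fog(IntEnum):
    NO_FOG = 0            # perfect information
    NEIGHBOR_UNITS = 1    # all occupations, neighboring units only
    OCCUPATIONS_ONLY = 2  # all occupations, no unit counts
    NEIGHBOR_FULL = 3     # only neighboring occupations and units
    NEIGHBOR_OCC = 4      # only neighboring occupations
    COMPLETE = 5          # only know about yourself

def visibility(fog, is_neighbor, is_own):
    """Return (sees_occupation, sees_units) for one territory under `fog`."""
    if is_own or fog == Fog.NO_FOG:
        return True, True
    if fog == Fog.NEIGHBOR_UNITS:
        return True, is_neighbor
    if fog == Fog.OCCUPATIONS_ONLY:
        return True, False
    if fog == Fog.NEIGHBOR_FULL:
        return is_neighbor, is_neighbor
    if fog == Fog.NEIGHBOR_OCC:
        return is_neighbor, False
    return False, False   # Fog.COMPLETE
```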

  19. Implementation Oracles • Participants who annotated their strategies and behaviors as games were played • Compared oracle annotations to game data • Spot-checked that the analysis found collusion • Though noisy, the analysis and annotations were in line with the game history.

  20. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  21. Results Collusion vs. game length • x-axis: number of turns • y-axis: number of "interesting" windows • θh = 1.3 per 1-turn window

  22. Results Offensive • Players all gang up on Yellow. • Validated by Oracle annotations. • Game: 98478150 • Map: World • Fog Level: 1

  23. Results Offensive • Minimax against Blue • Confirmed by reading through the transcript. • Blue quickly gained power • Challenged the remaining players to team up against him • Game: 97976903 • Map: Europe • Fog Level: 0 • “Right now (Yellow) knows that if he does not get both you (Red) and (Green) on his side, this game will be won by me”

  24. Results Offensive • Games 98478150 (left) and 97976903 (right) • x-axis: number of turns • y-axis: number of "interesting" windows • θh = 1.3 per 1-turn window

  25. Results Offensive & Defensive • Minimax against strongest player • Towards the end of the game, explicit truce between top 2 players • Game: 12069561 • Map: Europe • Fog Level: 0

  26. Results Defensive • Scatter plot of the number of windows classified as defensive-oriented for all games • x-axis: number of turns • y-axis: number of interesting windows • θ = 0.05 • *Game: 12069561

  27. Results Oracle • Oracle self-interest annotations (Blue) • Game: 88318444 • Map: World • Fog Level: 1 • x-axis: number of turns • y-axis: number of "interesting" windows • θh = 1.3 per 1-turn window

  28. Results Fog Level 3 • Typical of the fog level 3 games. • Everything breaks down: players can’t figure out who is in the lead until it is too late. • Game: 67785982 • Map: Europe • Fog Level: 3

  29. Results • Collusion % is the percentage of available windows in which the remaining players direct more than 75% of their attacks towards the target. • Social % is the percentage of available windows meeting the same criterion where the target also satisfies the earlier heuristic thresholds. • θh = 1.3 per 1-turn window • Target’s residual power: 43.3% (4-player), 65% (3-player) • θh = 1.6 per 1-turn window • Target’s residual power: 53.3% (4-player), 80% (3-player)
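
The two percentages could be computed from per-window summaries as sketched below; the window-record fields (total attacks, attacks on the candidate target, and the target's heuristic value) are our assumed format, not the authors'.

```python
def collusion_and_social_pct(windows, theta_h=1.3, attack_share=0.75):
    """Collusion %: share of windows in which the remaining players direct
    more than `attack_share` of their attacks at one target.
    Social %: the subset of those windows whose target also satisfies the
    heuristic threshold `theta_h`."""
    colluding = [w for w in windows
                 if w["attacks_on_target"] > attack_share * w["total_attacks"]]
    social = [w for w in colluding if w["target_heuristic"] >= theta_h]
    n = len(windows) or 1
    return 100.0 * len(colluding) / n, 100.0 * len(social) / n
```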

  30. Results Europe Map • θh = 1.3 • θh = 1.6

  31. Results World Map • θh = 1.3 • θh = 1.6

  32. Contents • 1. Motivation • 2. Identification of Collusion • 3. Classification of Coalitions • 4. Implementation • 5. Results • 6. Conclusions

  33. Conclusions • Presented a basic algorithm to identify and classify collusion • Games with an unusually large number of collusive behaviors tended to run longer than average. • As fog increased (information decreased), collusive behaviors diminished. • Results were consistent across maps. • Level 0 data was consistent between our volunteers and the public. • Analysis was supported by Oracle annotations and in-game conversations.

  34. Conclusions • The visual heuristic does not hold well for fog games • It is based on knowledge of territories and bonuses • Limited data sets • Time limitation: short time-frame for the project • Games averaged 20 days to complete • More experiments with fog levels are required • Data integrity • Games had a large variance in player abilities • Players were involved in multiple simultaneous games and may have forgotten their strategy • Players may have a predefined disposition towards other players (Social Value Orientation)

  35. Conclusions Future Work • Investigate a possible equilibrium between collusion and game length. • Lagged response for social orientation: once the strongest player is removed from power, it can take a few rounds for the coalition to change strategies. • As information decreases, agents tend to collude less. Why? • Fairness • Poor assessment of the board • Mix socially oriented bots with human players
