SIGNALING GAMES: Dynamics and Learning


Presentation Transcript


  1. SIGNALING GAMES: Dynamics and Learning NASSLI 2016 Tuesday

  2. The Flow of Information in Signaling Games

  3. “In the beginning was information. The word came later.” Fred Dretske Knowledge and the Flow of Information

  4. My gloss in Signals (2010) “Dretske was calling for a reorientation in epistemology. He did not think that epistemologists should spend their time on little puzzles or on rehashing ancient arguments about skepticism. Rather, he held that epistemology would be better served by studying the flow of information.”

  5. Sender-Receiver Games • Nature picks a state with some probability • Sender picks a signal with probability conditional on the state observed • Receiver picks an act with probability conditional on the signal received • At any state of the system, equilibrium or not, there is a well-defined joint distribution on state, signal, and act. There are straightforward generalizations to signaling networks.
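
A minimal sketch of that joint distribution, for a two-state, two-signal, two-act game; all the numbers below are illustrative placeholders, not taken from the slides:

```python
import numpy as np

# Minimal sketch of a 2-state / 2-signal / 2-act sender-receiver game.
# All probabilities here are illustrative placeholders.
p_state = np.array([0.6, 0.4])                # Nature: pr(state)
p_sig_given_state = np.array([[0.9, 0.1],     # Sender: pr(signal | state)
                              [0.2, 0.8]])
p_act_given_sig = np.array([[0.8, 0.2],       # Receiver: pr(act | signal)
                            [0.1, 0.9]])

# Joint distribution pr(state, signal, act), well defined at any state of
# the system, in or out of equilibrium.
joint = (p_state[:, None, None]
         * p_sig_given_state[:, :, None]
         * p_act_given_sig[None, :, :])
print(joint.shape, joint.sum())               # (2, 2, 2) 1.0 (up to rounding)
```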

  6. Information

  7. How to measure information about a state? • Key quantity: pr_sig(state)/pr(state) (the numerator being the probability of the state conditional on the signal)

  8. If the signal tells us nothing, the information should be zero: log[pr_sig(state)/pr(state)] Aczél and Daróczy (1975), On Measures of Information and Their Characterizations

  9. Informational Content of a Signal • Informational content is a vector. • Quantity of information is a scalar.

  10. Quantity of information in a signal about the states: I_states(signal) = ∑_i pr_sig(state_i) log[pr_sig(state_i)/pr(state_i)]
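
A small sketch of this quantity in Python, taking logs base 2 (bits); the slides do not fix the base, but base 2 matches the one-bit and two-bit examples that follow:

```python
import numpy as np

def info_about_states(posterior, prior):
    """Quantity of information in a signal about the states:
    sum of pr_sig(state_i) * log2[pr_sig(state_i)/pr(state_i)],
    i.e. the Kullback-Leibler divergence of posterior from prior, in bits."""
    posterior = np.asarray(posterior, dtype=float)
    prior = np.asarray(prior, dtype=float)
    mask = posterior > 0                      # take 0 * log 0 to be 0
    return float(np.sum(posterior[mask] * np.log2(posterior[mask] / prior[mask])))

# Four equiprobable states, a signal sent only in state 2 (slide 17's example):
print(info_about_states([0, 1, 0, 0], [0.25] * 4))   # 2.0 bits
```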

  11. This is the Kullback-Leibler divergence (pictured: Solomon Kullback). Kullback and Leibler (1951), Kullback (1959), Lindley (1956)

  12. [Figure 3.2: Information as a function of the probability of state 1 given the signal; the two states initially have probabilities .6 and .4]
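
The curve in that figure can be recomputed from the slide-10 formula; here is a sketch that prints a few points, again assuming logs base 2:

```python
import numpy as np

# Information about the states as a function of q = pr(state 1 | signal),
# with the prior pr = (.6, .4), in bits.
prior = np.array([0.6, 0.4])
for q in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
    post = np.array([q, 1 - q])
    m = post > 0
    info = np.sum(post[m] * np.log2(post[m] / prior[m]))
    print(f"pr(state 1 | signal) = {q:.1f} -> {info:.3f} bits")
# The minimum, 0 bits, falls at q = 0.6, where the signal leaves the prior unchanged.
```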

  13. Extension to a little Network • Nature chooses one of four states by independently flipping two fair coins. Coin 1 determines up or down – let us say – and coin 2 determines left or right. The four states, up-left etc., are equiprobable. There are now two senders. Sender 1 can only observe whether nature has chosen up or down; sender 2 observes whether it is left or right. Each sends one of two signals to the receiver [(R,G),(B,Y)]. •→•←•

  14. Suppose the senders have deterministic strategies: Sender 1: Up => Red, Down => Green; Sender 2: Left => Blue, Right => Yellow. Each signal carries 1 bit of information; the combination of signals carries two bits.
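
A quick check of those bit counts, with the states ordered up-left, up-right, down-left, down-right and logs taken base 2 (an assumption, as before):

```python
import numpy as np

def info_bits(posterior, prior):
    """Kullback-Leibler divergence in bits, taking 0 * log 0 = 0."""
    post, pri = np.asarray(posterior, float), np.asarray(prior, float)
    m = post > 0
    return float(np.sum(post[m] * np.log2(post[m] / pri[m])))

# States ordered up-left, up-right, down-left, down-right, each with prior 1/4.
prior = [0.25, 0.25, 0.25, 0.25]

# "Red" from sender 1 rules out the two "down" states:
print(info_bits([0.5, 0.5, 0.0, 0.0], prior))   # 1.0 bit
# "Red" plus "Blue" from sender 2 pins down up-left:
print(info_bits([1.0, 0.0, 0.0, 0.0], prior))   # 2.0 bits
```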

  15. Information about the Act • Defined in an entirely parallel manner: I_acts(signal) = ∑_i pr_sig(act_i) log[pr_sig(act_i)/pr(act_i)] Can differ from quantity of information about states.

  16. Informational Content of a Signal Informational content about states: < log[pr_sig(state 1)/pr(state 1)], log[pr_sig(state 2)/pr(state 2)], … > (Likewise for acts)

  17. Example • Suppose that there are four states, initially equiprobable, and signal 2 is sent only in state 2. Then the informational content about states of signal 2 is: • I_states(signal 2) = < -∞, 2, -∞, -∞ > • The -∞ components tell you that those states end up with probability zero.
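
A sketch of that content vector computed componentwise, assuming logs base 2 so the finite component comes out as the 2 on the slide:

```python
import numpy as np

def content_vector(posterior, prior):
    """Informational content of a signal about the states: the vector of
    log2[pr_sig(state_i)/pr(state_i)] values, with -inf for states the
    signal rules out."""
    post, pri = np.asarray(posterior, float), np.asarray(prior, float)
    with np.errstate(divide="ignore"):       # allow log2(0) = -inf silently
        return np.log2(post / pri)

# Four equiprobable states; signal 2 is sent only in state 2.
print(content_vector([0, 1, 0, 0], [0.25] * 4))   # [-inf   2. -inf -inf]
```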

  18. Philosopher’s Objection • “But shouldn’t the content – at least the declarative content – of a signal be a proposition? And isn’t a proposition a set of possible worlds or situations?”

  19. Reply • States may be individuated as finely as you please. • A proposition can be specified by the “possible worlds” ruled out. • That is just what the -∞ components of the information vector do.

  20. Example • 4 states • Signal “tells you” that it is state 2 or state 4 • Content vector: I_states(signal) = < -∞, __, -∞, __ > The content vector gives a richer account of meaning.

  21. Objective and Subjective Information • The information so far is objective. • The probabilities are propensities of nature and of sender and receiver in some state of the system. • If senders and/or receivers have degrees of belief about all this, there are correlative notions of subjective information.

  22. Flow of Information •→•→• Suppose: S1 => R => B => A1 S2 => G => Y => A2 Players 1 and 2 use different languages. The informational content of B (said by player 2) is the same as the informational content of R (said by player 1).

  23. Deception A naturalistic account

  24. Is deception possible? “I can by no means will that lying should be a universal law. For with such a law there would be no promises at all, since it would be in vain to allege my intention in regard to my future actions to those who would not believe this allegation …” - Immanuel Kant Groundwork for the Metaphysics of Morals

  25. Deception in Nature • Female Photuris firefly devours a Photinus

  26. What is Deception? • A signal that raises the probability of a state that is not the true state carries misinformation. • A signal that is systematically sent to the benefit of the sender and the detriment of the receiver is deception.

  27. Deception by “half-truth”: 3 states • Receptive mate • Predator • Nothing shaking. The mating signal raises the probabilities of states 1 & 2 and lowers the probability of state 3.
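
A numerical illustration with made-up probabilities (not from the slides): the content vector has positive components for states 1 and 2 and a negative component for state 3.

```python
import numpy as np

# Illustrative numbers only: the mating signal raises the probabilities of
# state 1 (receptive mate) and state 2 (predator) and lowers the
# probability of state 3 (nothing shaking).
prior = np.array([0.2, 0.1, 0.7])          # pr(state)
posterior = np.array([0.5, 0.3, 0.2])      # pr(state | mating signal)

content = np.log2(posterior / prior)
print(content.round(2))                    # [ 1.32  1.58 -1.81]

# When the true state is 2 (predator), the signal still raises the probability
# of state 1, which is not the true state: misinformation in the sense of
# slide 26, and deception if sent systematically to the sender's benefit and
# the receiver's detriment.
```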

  28. Deception in Equilibrium

  29. Universal Deception in Equilibrium

  30. Universal Deception can be Good for You • Suppose you are half the time in the role of sender and half the time in the role of receiver in the preceding game. • Then you would prefer deception as universal law.

  31. Kant was wrong, wasn’t he?

  32. Where deception is impossible

  33. Deception is Impossible in Equilibrium (Kant’s revenge) but…

  34. You may never get to equilibrium

  35. Chaos (structurally stable) Wagner, BJPS 2012; Sato, Akiyama, and Farmer, PNAS 2002.
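
The point of slides 33–35 is dynamical: the learning and evolutionary dynamics need not reach the equilibria where deception is ruled out. Below is a generic two-population discrete-time replicator sketch; the payoff matrices are random placeholders, not the specific games from Wagner (2012) or Sato, Akiyama, and Farmer (2002), which are the ones shown to cycle or behave chaotically rather than converge.

```python
import numpy as np

# Generic two-population discrete-time replicator dynamic.
# Payoffs are random placeholders for illustration only.
rng = np.random.default_rng(0)
A = rng.random((4, 4))           # payoff to sender pure strategy i against receiver j
B = rng.random((4, 4))           # payoff to receiver pure strategy j against sender i

x = np.full(4, 0.25)             # population mixture over sender strategies
y = np.full(4, 0.25)             # population mixture over receiver strategies
for _ in range(1000):
    fx, fy = A @ y, B.T @ x      # expected payoff to each pure strategy
    x = x * fx / (x @ fx)        # a strategy's share grows with its relative payoff
    y = y * fy / (y @ fy)
print(x.round(3), y.round(3))    # the final mixtures; convergence is not guaranteed in general
```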

  36. Lexical Content? Ruth Millikan; Peter Godfrey-Smith (2012); Jonathan Birch (2014)

  37. Thank you.
