Representation
Franz Brentano 1838-1917
Zoltán Dienes, Philosophy of Psychology
What is a representation?
Something physical (the representational medium/vehicle) that is about something else (the referent). The representation (the medium) portrays the referent as being a certain way (the representational content/the sense).
[Diagram: "The tree is green" written in chalk on a blackboard, with an arrow labelled "refers to" pointing to a tree in the world]
• Content: the way the world is portrayed as being ("The tree is green")
• Referent: the object or situation in the world
• Medium: the chalk on the blackboard
The situation in the world has many aspects to it. Representational content presents the referent under only one (or a few) of those aspects:
• "The tree is green": but the green has various shades
• A picture shows a scene from a certain point of view rather than all points of view
• etc.
Medium, content, and referent are all different (but people often confuse them!)
Representations:
0) have aboutness (intentionality)
1) can be about the referent without the referent being about them (asymmetry)
2) can refer to a unique referent
3) can misrepresent
4) can be about non-existent referents (a representation always has content but not always a referent!)
How? How can things be about something? (How can words mean anything? How can we mean anything?)
Brentano (1874): "aboutness" distinguishes the mental from the physical
• Thoughts are always about something; tables are not
Two questions:
1. Are all mental states representational?
2. Are all representations mental?
1. Are all mental states representational?
Traditional position in philosophy:
a. Propositional attitudes (believing that, wishing that, imagining that…) are representational
(A proposition is something that can be true or false, e.g. "I am in Australia")
b. Qualitative states are not entirely representational
(consider pains, orgasm, free-floating anxiety, happiness, etc.)
e.g. pain is just pain; it is not about anything.
Contrast: even qualitative states are just representational
Tye: pain can be seen as representational:
• Pain is your way of representing damage of a certain sort in a part of your body.
• Mood is your way of representing certain states of your body-brain. Maybe mood is also a certain way of representing the world, e.g. generalised anxiety represents the world in general as threatening.
• Love is a certain way of representing a person.
Is there something left over apart from representational content (i.e. qualia)?
2. Are all representations mental?
Consider pictures, computer programmes, etc.
(Searle: they have intentionality derived from the mental.)
Is having "aboutness" something intrinsic to the mental that cannot be explained? (dualism)
Or can we have naturalistic theories of representation? How did the power to represent evolve? How can anything represent?
How do representations acquire representational power?
i) Intention? A representation means what it does because someone intends it to mean that: I intend this to be a drawing of my house.
But this cannot explain mental representations: must I intend my thought to be about what it is about? Then what makes that intention be about my thought? We would need another intention, and so on forever. So this theory can't apply to mental states.
(Nor to frogs representing something as a bug. How did minds evolve to have aboutness?)
ii) Similarity?
[Image: a picture] is similar to [image: the depicted scene] (Is it?)
But similarity cannot explain asymmetry and misrepresentation.
iii) Correlation?
Neural activity in a cell correlates with the presence of cats. Therefore it codes (represents) cats.
But correlation cannot explain asymmetry (correlation is symmetric) or misrepresentation.
iv) Causal theory?
The presence of cats causes neural activity in a cell. Therefore the neural activity codes (represents) cats.
Now there is asymmetry, but how do you explain misrepresentation? And what sort of causes bring about representation? Not all causes do.
v) Functional role?
Indicative representations: Dretske (1988): if A has the function of indicating B, then A represents (that there is) B.
For example, if a pattern of neuronal activity has the function of indicating cats, then it represents "cat".
(Imperative representations, e.g. "Stand up!", have the function of bringing about their content.)
A. Teleological theory of meaning: function comes from history
e.g. the bee dance (has the function of indicating the location of nectar)
Millikan:
• Producer: the bee that finds some nectar
• Consumer: the other bees that then fly to the nectar
Function arises from an evolutionary and/or learning history: in the evolution of bees, the consumers could only do their function (get to the nectar) for a given bee dance when the nectar was in a certain location.
Selection history -> function
Once something has a function, it can malfunction: misrepresentation is possible.
Problems with historical functional accounts of representation:
1. "Swamp man"
In a cosmically improbable accident, some molecules in a swamp rearrange themselves to form an exact physical replica of you: it behaves like you, talks like you, etc.
Swamp man has no evolutionary or learning history, so nothing about it has a function.
• If its "heart" stops, it didn't malfunction.
• Just so, if it says "red" when it looks at green, it didn't misperceive.
• Therefore it has no intentionality, no mental life?
Does having a mental life depend on having a certain evolutionary/learning history, or does it just depend on you as an individual being in a certain physical state?
2. Meaning ambiguity
The function of the frog's fly-detector is to catch flies; that is why it represents flies.
Fodor (1990): But why represent flies, rather than small, black, moving things? Or why flies rather than stomach-filler? Or nutrient source? Functional theories can't fix the precise meaning.
Dennett: The functional story does not distinguish flies, stomach-filler, nutrient source, etc. But that is its strength: why believe the frog's representational content makes these distinctions? Presumably the frog's representations cut up the world less finely than our concepts (and a theory of representation should imply that). Function specifies meaning as precisely as it should.
An interesting consequence of historical functional theories:
Externalism: the content of a representation is determined by events outside the organism; representations have "broad content".
(Consequence: when you mean something, there is no guarantee you know what you mean!)
Intuitively we feel there is some content to our mental states that depends only on our state, not on the environment.
Narrow content: content that does not depend on a system's environment. (Could Swamp man have narrow content?)
How could we have narrow content?
B. Functional role semantics (conceptual role semantics)
Harman, Johnson-Laird, Block, Kuhn
The content of a representation (e.g. a mental state) depends on its functional role in inference (and in perception or action).
The content of a thought is a matter of how mental states are related to each other (and, on some accounts, to things in the external world, so functional role semantics need not be narrow).
B. Functional role semantics (conceptual role semantics) cont.
Consider a calculator: a given state means "5" because it is causally related to other states ("6", "+", etc.) in appropriate ways.
Similarly, we understand the meaning of a number, say 6, only because our representation of 6 has appropriate functional/causal relations to other representations, like that of 5 (and appropriate functional relations to perception, e.g. in seeing that there are 6 things).
(The way the system is used as a whole makes the representations representations.)
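The calculator point can be made concrete with a toy sketch (my own illustration, not from the lecture): the internal tokens below are deliberately content-free Python objects, and one of them counts as meaning "5" only because of the role it plays in the successor wiring and the addition procedure.

```python
# Toy illustration of functional role semantics (hypothetical construction):
# ten opaque tokens with no intrinsic content; their "numerical meanings"
# come entirely from their roles in the successor relation below.
states = [object() for _ in range(10)]   # arbitrary, unlabeled internal states

successor = {states[i]: states[i + 1] for i in range(9)}   # the "wiring"
predecessor = {after: before for before, after in successor.items()}
zero = states[0]

def add(a, b):
    """Peano-style addition: walk b down to zero while walking a up.

    Works purely by following the relations; it never inspects the tokens.
    """
    while b is not zero:
        a = successor[a]
        b = predecessor[b]
    return a

# The token in the "5" role plus the token in the "1" role yields the token
# in the "6" role: its meaning is fixed by role, not by anything in the token.
assert add(states[5], states[1]) is states[6]
```

Swapping which `object()` sits where in the wiring would swap the meanings, which is exactly the functional-role point.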
But if the system is just defined in terms of causal relationships, how can it malfunction? How is misrepresentation possible?
Summary:
With our mental states we represent the world.
HOW could we gradually evolve to be representational?
Naturalistic theories of meaning, like Millikan's, sketch a story of how representation can come about without mystery.
Given we can get such a theory to work:
• Dualism is not demanded by the intentionality of mental states.
• Representational theories of the mind are possible!
People represent; but can we explain that in terms of inner (sub-personal) representations?
Most of psychology (including cognitive and connectionist approaches): Yes!
Ecological optics, enactive perception, dynamic systems, New Robotics: No!
The "No"-camp argument:
1. We need less rich representations to solve many tasks than we previously thought.
2. Therefore we don't need representations.
1 is a valuable radical insight; but 2 doesn't follow!
Examples: • “Change blindness” • Ecological optics
Example from “ecological optics”: Catching a ball Within half a second of a 3-4 second flight, we know whether to run forwards or backwards, and at what speed, to arrive at the right place at the right time How do we do it??
Do people work out the ball's trajectory in 3-D space, calculate where it will land, and run there?
Alternative: assume the only information used is the angle of elevation of gaze, α, and its rate of change.
To ensure α remains between 0 and 90 degrees, one could:
• Let tan α increase at a constant rate
• As α -> 90 degrees, tan α -> ∞
[Graph: tan α plotted against α from 0 to 90 degrees, rising towards ∞ at 90]
If tan α increases linearly, tan α will not reach ∞, so α will not reach 90 degrees!
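The linearity is not just an empirical accident: for an idealized drag-free parabolic flight watched from the landing point, tan α increases exactly linearly with time. A sketch (the launch values are illustrative assumptions of mine):

```python
# Check (assuming drag-free parabolic flight and an observer standing at the
# landing point): tan(alpha) grows exactly linearly with time.
g = 9.8                 # gravity, m/s^2
vx, vy = 20.0, 25.0     # launch velocities (illustrative values)
T = 2 * vy / g          # total flight time
R = vx * T              # landing distance; the observer stands here

def tan_alpha(t):
    """Tangent of the gaze elevation angle for an observer at the landing point."""
    height = vy * t - 0.5 * g * t ** 2
    horizontal = R - vx * t          # distance from observer to ball
    return height / horizontal

# Sample tan(alpha) over the flight and check linearity via second differences.
ts = [T * i / 100 for i in range(1, 100)]
vals = [tan_alpha(t) for t in ts]
second_diffs = [vals[i + 1] - 2 * vals[i] + vals[i - 1]
                for i in range(1, len(vals) - 1)]
print(max(abs(d) for d in second_diffs))  # ~0: tan(alpha) is linear in t
```

Algebraically, tan α = (vy/(vx·T))·t for this observer, so constant-rate growth of tan α is the signature of standing in the right place.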
The change of tan α with time is very nicely linear! (McLeod & Dienes, 1993; Dienes & McLeod, 1993)
People do not use a rich representation of the world (e.g. the trajectory of the ball).
BUT people do represent: the angle of elevation of their gaze.
Simple, non-conscious, directly linked to action, but it is a representation, and the explanation of how people catch the ball relies on there being in the person's head a (sub-personal) representation as such of this angle.
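The control side of the strategy can be sketched too. In this toy simulation (my own construction, not the model from the cited papers; the launch values, starting position, and unlimited running speed are all hypothetical), a fielder who merely keeps tan α rising at the rate observed early in flight ends up at the landing point without ever computing the trajectory:

```python
# Toy simulation (hypothetical values throughout; running speed unlimited):
# the fielder holds tan(alpha) to the straight line k * t and thereby
# converges on the landing point without representing the trajectory.
g = 9.8
vx, vy = 20.0, 25.0            # ball's launch velocities (illustrative)
T = 2 * vy / g                 # total flight time
landing = vx * T               # landing point (unknown to the fielder)

def ball(t):
    """Ball position (horizontal, vertical) at time t."""
    return vx * t, vy * t - 0.5 * g * t * t

dt = 0.01
x_f = 120.0                    # fielder starts beyond the landing point
xb, yb = ball(dt)
k = (yb / (x_f - xb)) / dt     # initial rate of increase of tan(alpha)

t = dt
while t < T - dt:
    t += dt
    xb, yb = ball(t)
    # Move so that tan(alpha) = height / (horizontal distance to ball)
    # stays on the line k * t:
    x_f = xb + yb / (k * t)

print(abs(x_f - landing))      # small: fielder ends up at the landing point
```

Note the only quantities the fielder uses are the gaze angle's tangent and its rate of change, matching the slide's claim about what is represented.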