Compositionality Wilhelm von Humboldt famously described language as a system that “makes infinite use of finite means.”
Infinite Noun Phrases There are infinitely many noun phrases: you can always make another one by adding another adjective: • Dog • Old dog • Smelly, old dog • Brown, smelly, old dog • Big, brown, smelly, old dog
Infinite Adjective Phrases • Old • Extremely old • Probably extremely old • Invariably probably extremely old • Predictably invariably probably extremely old Used in a sentence: “Residents of nursing homes are predictably invariably probably extremely old.”
Infinitely Many VPs and DPs And of course there are infinitely many verb phrases, and even “determiner phrases”: • John’s mother. • John’s wife’s mother. • John’s wife’s lawyer’s mother. • John’s wife’s lawyer’s dog’s mother.
Infinitely Many Sentences It follows that there are infinitely many sentences, because (for example) each sentence NP + VP can be lengthened to AP + NP + VP, then to AP + AP + NP + VP, and so on.
Infinitely Many Sentences In addition, there are infinitely many sentences because you can take any sentence S and add “so-and-so believes that” to the front: • S • John believes that S • Mary hopes that John believes that S • Sam doubts that Mary hopes that John believes that S
Recursion This is in general possible because language is recursive. Suppose I’m throwing a party. I start writing the invite list: • My friends are invited. • Friends of my friends are invited. • Friends of friends of my friends are invited. • Friends of friends of friends… It seems like I’ll never finish!
Recursive Loop But suppose instead I said: INVITE LIST: • My friends are invited. • If x is a friend of someone who is invited, then x is invited. This captures all the cases by going in a loop: the second rule defines who is invited in terms of who is invited.
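Here is a minimal sketch of that loop in Python (the names and the friendship data are made up for illustration): start with my friends, then keep applying the second rule until it adds no one new.

my_friends = {"Ann", "Bob"}                                  # hypothetical data
friends_of = {"Ann": {"Cara"}, "Bob": {"Dan"}, "Cara": {"Erin"}}

def invite_list(my_friends, friends_of):
    invited = set(my_friends)              # rule 1: my friends are invited
    changed = True
    while changed:                         # rule 2: if x is a friend of someone
        changed = False                    # invited, then x is invited -- applied
        for person in list(invited):       # until nothing new gets added
            for x in friends_of.get(person, set()):
                if x not in invited:
                    invited.add(x)
                    changed = True
    return invited

print(sorted(invite_list(my_friends, friends_of)))   # ['Ann', 'Bob', 'Cara', 'Dan', 'Erin']

Two short rules settle the whole list, however long the chain of friends-of-friends runs.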
Recursion and Language Here's how language might do it: NOUN PHRASE: • "man" is a noun phrase. • If NP is a noun phrase, then "old" + NP is a noun phrase. From this recursive definition, it follows that there are infinitely many noun phrases.
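That two-clause definition can be written out as a recursive Python function (a sketch of the idea only, not a serious grammar): one branch for the base rule, one for the recursive rule.

def is_noun_phrase(words):
    # Base rule: "man" is a noun phrase.
    if words == ["man"]:
        return True
    # Recursive rule: "old" + NP is a noun phrase.
    if words and words[0] == "old":
        return is_noun_phrase(words[1:])
    return False

print(is_noun_phrase("old old old man".split()))   # True, for any number of "old"s
print(is_noun_phrase("old".split()))               # False: no noun phrase inside

Two short rules suffice to recognize noun phrases of any length.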
Infinite Use of Finite Means This is one sense in which language “makes infinite use of finite means.” There are finitely many words, and the rules of grammar are presumably finite. But recursion generates infinitely many complex expressions from a finite “base.”
Understanding But this raises another question. Each of the infinitely many distinct sentences of English has a different meaning. We cannot learn the meaning of each one separately. But we can understand any English sentence, even one we've never heard before.
Novel Utterance “Last week a former Royal Marine who is the boyfriend of the model Kelly Brooks crashed into a bus stop while driving a van carrying a load of dead badgers.”
Compositionality How is it possible for us to understand a potential infinitude of novel utterances? The most common solution in philosophy and linguistics is to maintain that the meanings of complex expressions depend on, and depend only on, the meanings of their simple parts and the ways that those parts are organized (put together by the grammar). This is called compositionality.
Compositionality How does this solution work? Since there are only finitely many simple expressions (words/morphemes) in English (or any other language), each language user only has to learn finitely many meaning facts: what all the simple expressions mean.
Compositionality Then when that user encounters a novel utterance she just uses the already-learnt meaning facts about words and the grammar of the utterance to work out its meaning. Compositionality says that’s all she needs! “The meaning of the whole depends on (and only on) the meanings of the parts and the way they are combined.”
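As a toy illustration (a sketch only, with a made-up lexicon in which a word's meaning is the set of things it applies to): the speaker stores finitely many word meanings plus a single rule for combining an adjective with a noun phrase, and the meaning of any complex phrase, however novel, is computed from just those.

LEXICON = {                                   # made-up word meanings
    "dog":   {"rex", "fido", "rover"},
    "old":   {"rex", "tom"},
    "brown": {"fido", "rex"},
}

def meaning(phrase):
    words = phrase.split()
    if len(words) == 1:
        return LEXICON[words[0]]              # simple expression: look it up
    # Combination rule for [Adjective + NP]: the phrase applies to
    # whatever both of its parts apply to.
    return meaning(words[0]) & meaning(" ".join(words[1:]))

print(meaning("old brown dog"))               # {'rex'} -- a phrase never stored anywhere

Nothing beyond the word meanings and the combination rule gets consulted, which is exactly what compositionality demands.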
Non-Compositionality What does this claim rule out? Here’s an example of a non-compositional arithmetical function #: (n # m) := n + m, if (n # m) appears alone. (n # m) := n + m + x, in the context (x # (n # m)). In this example the value of (n # m) depends not just on the values of n and m, but sometimes on other values (e.g. x).
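To see how # behaves, here is a sketch in Python, writing (n # m) as a nested pair (n, m): the value of an embedded (n # m) picks up the surrounding x, so the very same subexpression is worth different amounts in different contexts.

# An expression is an int, or a pair (left, right) standing for (left # right).
def value(expr, x=None):
    if isinstance(expr, int):
        return expr
    n, m = expr
    extra = x if x is not None else 0          # the surrounding context's x, if any
    # The right-hand part is evaluated in the context (x # ...), where x is
    # this expression's left value -- that is the non-compositional step.
    return value(n) + value(m, x=value(n)) + extra

print(value((3, 4)))        # 7  : (3 # 4) on its own is 3 + 4
print(value((2, (3, 4))))   # 11 : inside (2 # ...), (3 # 4) is worth 3 + 4 + 2 = 9

If # were compositional, (3 # 4) would have to contribute the same value (7) wherever it occurred, and the second result would be 9 rather than 11.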
Locality In this sense, compositionality is local. In the expression [old [brown dog]] what “brown dog” means cannot depend on what “old” means, even though that’s also part of the expression containing “brown dog.”
Non-Compositionality The second thing compositionality rules out is that the meaning of a complex expression depends on more than just the meanings of the parts. For example, consider the count function C, which counts the number of symbols in the expression it is applied to, ignoring their values: C(8) = 1 C(5 + 3) = 3 C(2 + 2 + 2 + 2) = 7
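A sketch of C in Python, representing an expression as a string of symbols (that representation is my choice for illustration): C looks only at how many symbols appear, never at what they stand for.

def C(expr):
    return len(expr.replace(" ", ""))   # count the symbols, ignore their values

print(C("8"))              # 1
print(C("5 + 3"))          # 3
print(C("2 + 2 + 2 + 2"))  # 7

The inputs "8", "5 + 3" and "2 + 2 + 2 + 2" all have the same arithmetical value, yet C treats them differently, so C depends on more than the meanings of its parts and the way they are combined.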
Semantic Closure • “Lois Lane loves Clark Kent.” • “Lois Lane loves Superman.” These two sentences seem to differ in meaning, even though “Clark Kent” and “Superman” name the same person: Lois would assent to one and reject the other.
Semantic Closure We cannot say that the reason they have different meanings is that “Clark Kent” and “Superman” are different words. Compositionality says that different meanings are possible for complex expressions only if their parts have different meanings (not merely different words).
Compositionality The meanings of complex expressions depend on, and depend only on, the meanings of their simple parts and the ways that those parts are combined.
Compositionality Different views regarding what compositionality is result from different views of what meaning is and what dependence is.
Compositionality The meanings of complex expressions depend on the meanings of their simple parts and the ways that those parts are combined.
Functional Dependence The meanings of complex expressions are a function of the meanings of their simple parts and the ways that those parts are combined.
The Substitutability Criterion An equivalent version of functional dependence: “For any sentence S(E) containing some expression E as part, if E and E* have the same meaning, then S(E) and S(E*) have the same meaning.”
Problems with Functional Dependence • Any language that contains no synonyms is automatically compositional. • The fact that the meaning of a complex expression is a function of the meanings of its parts & the way they’re combined does not guarantee that we can calculate the meaning of the complex expression from the meanings of its parts & the way they’re combined.
Problem #3: Systematicity • Le chien aboie. The dog barks. • Le chat aboie. The cat dances. • Le chat pue. The skunk eats. As long as the language contains no synonyms, an assignment of meanings like this still counts as a function of the parts and the way they are combined, yet it is wildly unsystematic: knowing what “Le chien aboie” and “Le chat pue” mean gives you no idea what “Le chat aboie” means.
Computational Dependence The meanings of complex expressions are a computable function of the meanings of their simple parts and the ways that those parts are combined.
The Empirical Conception of Dependence We could understand “dependence” as whatever relation obtains between the meaning of a complex expression and that expression’s syntax and the meanings of its parts that in fact explains our ability to learn and understand a language containing an infinity of expressions.
The Empirical Conception of Dependence That is, some philosophers think: we know that language is compositional, but it is an empirical question as to just what compositionality consists in. (What notion of dependence is correct.)
What’s at Stake? Before we consider arguments for or against compositionality, let’s look at what’s at stake. At various points, compositionality has been used to argue against all of the theories of meaning we have considered in class.
Vs. the Idea Theory According to the idea theory, the meaning of a word is an idea, where ideas are construed as something like “little colored pictures in the mind.” Let’s consider an example: what’s your idea of a pet?
Vs. the Idea Theory Suppose your idea of a pet is a little picture of a furry animal like a dog or a cat, and your idea of a fish is a picture of a typical fish. The idea of a pet fish, something like a goldfish in a bowl, is not any straightforward combination of those two pictures. That clearly doesn't work. Notice that we cannot say that in the context of “____ fish,” “pet” means something other than its usual idea. That would make the meaning of “pet” non-local (dependent on surrounding context), and that is not allowed on any compositional theory. Conclusion: the idea theory violates the principle of compositionality.
Vs. Verificationism Let's suppose that the meaning of an expression is the set of experiences that it probably causes you to have. A cow will probably cause you to hear cow-sounds, so cow-sounds are part of the meaning of “cow.” In other words, the presence of cows raises the probability of cow-sound experiences.
Cows are Safe Let’s suppose that the vast majority of cows are safe. So the meaning of “cow” does not include the experience of bodily harm, because encountering a cow lowers, rather than raises, the chances that you’ll experience bodily harm.
Brown Things are Safe Let’s also suppose that brown things are in general safe. So again, “brown” doesn’t have the experience of bodily harm as part of its meaning either. You’re less likely to experience this around brown things than around other-colored things.
Brown Cows are Dangerous However, suppose that the small number of dangerous cows and the small number of dangerous brown things are all brown cows. Thus the meaning of “brown cow” contains the experience of bodily harm: that experience confirms the presence of brown cows. But then the meaning of “brown cow” contains something that is not part of the meaning of “brown” or of “cow,” so verificationism also violates compositionality.