
June 6: General Introduction and “Framing Event Variables”






Presentation Transcript


  1. Meanings First
Context and Content Lectures, Institut Jean Nicod
June 6: General Introduction and “Framing Event Variables”
June 13: “I-Languages, T-Sentences, and Liars”
June 20: “Words, Concepts, and Conjoinability”
June 27: “Meanings as Concept Assembly Instructions”
SLIDES POSTED BEFORE EACH TALK: terpconnect.umd.edu/~pietro (OR GOOGLE ‘pietroski’ AND FOLLOW THE LINK)
pietro@umd.edu

  2. Reminders of the last two weeks...
Human Language: a language that human children can naturally acquire
(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
(C) each human language is an i-language: a biologically implementable procedure that generates expressions that connect meanings with articulations
(B) each human language is an i-language for which there is a theory of truth that is also the core of an adequate theory of meaning for that i-language

  3. (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Idea: “e-positions” allow for conjunction reductions
Bad Companion Idea: “e-positions” are Tarskian variables that have mind-independent values
Alvin moved to Venice happily.
∃e∃e′∃e″[AL(e′) & MOVED(e, e′) & TO(e, e″) & VENICE(e″) & HAPPILY(e)]
Alvin moved to Venice.
∃e∃e′∃e″[AL(e′) & MOVED(e, e′) & TO(e, e″) & VENICE(e″)]
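The conjunction-reduction entailment on this slide can be checked mechanically in a toy extensional model. This is only an illustrative sketch: the domain, the event name, and all predicate extensions below are invented, not part of the lecture.

```python
from itertools import product

# Toy domain and extensions -- all hypothetical, for illustration only.
domain = ["move1", "alvin", "venice"]

AL      = {"alvin"}                 # AL(e'): e' is Alvin
VENICE  = {"venice"}                # VENICE(e''): e'' is Venice
MOVED   = {("move1", "alvin")}      # MOVED(e, e'): e is a moving of e'
TO      = {("move1", "venice")}     # TO(e, e''): e is directed to e''
HAPPILY = {"move1"}                 # HAPPILY(e): e is a happy event

def moved_to_venice_happily():
    # ∃e∃e'∃e''[AL(e') & MOVED(e, e') & TO(e, e'') & VENICE(e'') & HAPPILY(e)]
    return any(a in AL and (e, a) in MOVED and (e, v) in TO
               and v in VENICE and e in HAPPILY
               for e, a, v in product(domain, repeat=3))

def moved_to_venice():
    # The same formula minus the HAPPILY(e) conjunct.
    return any(a in AL and (e, a) in MOVED and (e, v) in TO and v in VENICE
               for e, a, v in product(domain, repeat=3))

# Dropping a conjunct from an existential claim cannot turn truth into falsity.
assert moved_to_venice_happily()
assert moved_to_venice()
```

The point of the "e-position" analysis is exactly this: once the adverb is just another conjunct on the event variable, the inference from the adverbial sentence to the plain one is trivial conjunct-dropping.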

  4. (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Idea: “e-positions” allow for conjunction reductions
Bad Companion Idea: “e-positions” are Tarskian variables that have mind-independent values
Alvin moved to Venice happily.
Alvin moved Torcello to Venice.
Alvin moved to Venice.
Alvin chased Pegasus.
Alvin chased Theodore happily.
Theodore chased Alvin unhappily.

  5. (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Ideas:
-- “e-positions” allow for conjunction reductions
-- as Foster’s Problem reveals, humans compute meanings via specific operations
-- Liar Sentences don’t preclude meaning theories for human i-languages
Bad Companion Ideas:
-- “e-positions” are Tarskian variables that have mind-independent values
-- the meanings computed are truth-theoretic properties of human i-language expressions
-- Liar T-sentences are true (‘The first sentence is true.’ iff the first sentence is true.)

  6. (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Ideas:
-- “e-positions” allow for conjunction reductions
-- as Foster’s Problem reveals, humans compute meanings via specific operations
-- Liar Sentences don’t preclude meaning theories for human i-languages
Bad Companion Ideas:
-- characterizing meaning in truth-theoretic terms yields good analyses of specific constructions
-- such characterization also helps address foundational issues concerning how human linguistic expressions could exhibit meanings at all

  7. Weeks 3 and 4: Short Form
• In acquiring words, kids use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• Meanings are instructions for how to access and combine i-concepts
-- lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
-- introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'ride fast'   RIDE( )^FAST( )
'fast horse'  FAST( )^HORSE( )
'horses'      HORSE( )^PLURAL( )   PLURAL( ) => COUNTABLE(_)

  8. Weeks 3 and 4: Short Form
• In acquiring words, kids use available concepts to introduce new ones.
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• Meanings are instructions for how to access and combine i-concepts
-- lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
-- introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'fast horses'  FAST( )^HORSES( )
'ride horses'  RIDE( )^[Θ( , _)^HORSES(_)]

  9. Weeks 3 and 4: Short Form
• In acquiring words, kids use available concepts to introduce new ones.
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• Meanings are instructions for how to access and combine i-concepts
-- lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
-- introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'fast horses'  FAST( )^HORSES( )
'ride horses'  RIDE( )^[Θ( , _)^HORSES(_)]
Meaning('fast horses') = JOIN{Meaning('fast'), Meaning('horses')}
Meaning('ride horses') = JOIN{Meaning('ride'), Θ[Meaning('horses')]}
                       = JOIN{fetch@'ride', Θ[Meaning('horses')]}
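The JOIN operation on this slide can be sketched as conjunction of one-place predicates. The lexicon entries, addresses, and extensions below are invented for illustration; only the shape of `fetch` and `join` tracks the slide.

```python
# Monadic concepts modeled as one-place predicates; JOIN conjoins them.
# All lexical entries and extensions here are hypothetical.
LEXICON = {
    "fast":   lambda x: x in {"secretariat"},
    "horses": lambda x: x in {"secretariat", "plodder"},
}

def fetch(word):
    """Meaning of a word: fetch the monadic concept stored at its address."""
    return LEXICON[word]

def join(c1, c2):
    """JOIN{C1, C2}: the conjunctive monadic concept C1( )^C2( )."""
    return lambda x: c1(x) and c2(x)

# Meaning('fast horses') = JOIN{Meaning('fast'), Meaning('horses')}
fast_horses = join(fetch("fast"), fetch("horses"))
assert fast_horses("secretariat")        # fast, and a horse
assert not fast_horses("plodder")        # a horse, but not fast
```

Note that `join` needs neither Tarskian variables nor a Tarskian ampersand: it is a single operation on monadic concepts, which is the slide's point.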

  10. Weeks 3 and 4: Short Form
• In acquiring words, kids use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• Meanings are instructions for how to access and combine i-concepts
-- lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
-- introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'ride horses'       RIDE( )^[Θ( , _)^HORSES(_)]
'ride fast horses'  RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]
'ride horses fast'  RIDE( )^[Θ( , _)^HORSES(_)]^FAST( )

  11. Weeks 3 and 4: Very Short Form
• In acquiring words, kids use available concepts to introduce i-concepts, which can be “joined” to form conjunctive monadic concepts, which may or may not have Tarskian satisfiers.
'fast horses'            FAST( )^HORSES( )
'ride horses'            RIDE( )^[Θ( , _)^HORSES(_)]
'ride fast horses'       RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]
'ride fast horses fast'  RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]^FAST( )
• Some Implications
Verbs do not fetch genuinely relational concepts
Verbs are not saturated by grammatical arguments
The number of arguments that a verb can/must combine with is not determined by the concept that the verb fetches

  12. Words, Concepts, and Conjoinability

  13. What makes humans linguistically special?
(i) Lexicalization: capacity to acquire words
(ii) Combination: capacity to combine words
(iii) Lexicalization and Combination
(iv) Distinctive concepts that get paired with signals
(v) Something else entirely
FACT: human children are the world’s best lexicalizers
SUGGESTION: focus on lexicalization is independently plausible

  14. Constrained Homophony Again
• A doctor rode a horse from Texas
• A doctor rode a horse, and
(i) the horse was from Texas
(ii) the ride was from Texas
why not...
(iii) the doctor was from Texas

  15. Leading Idea (to be explained and defended)
• In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_) & [Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]
RIDE(_) & PAST(_) & [Θ(_, _) & HORSE(_) & [FROM(_, _) & TEXAS(_)]]
RODE(_) & [Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)
∃y[RODE(x, y) & HORSE(y)] & FROM(x, TEXAS)

  16. A doctor rode a horse that was from Texas
∃x{Doctor(x) & ∃y[Rode(x, y) & Horse(y) & From(y, Texas)]}
(one reading of 'A doctor rode a horse from Texas')
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}
(another reading of 'A doctor rode a horse from Texas')

  17. A doctor rode a horse that was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(y, Texas)]}
(one reading of 'A doctor rode a horse from Texas')
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}
(another reading of 'A doctor rode a horse from Texas')

  18. But why doesn’t the structure below support a different meaning:
A doctor both rode a horse and was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(x, Texas)]}
Why can’t we hear the verb phrase as a predicate that is satisfied by x iff x rode a horse & x is from Texas?
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}
(a reading of 'A doctor rode a horse from Texas')

  19. In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_) & [Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]
RODE(_) & [Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)
If 'rode' has a rider-variable, why can’t it be targeted by 'from Texas'?
Verbs don’t fetch genuinely relational concepts. A phrasal meaning leaves no choice about which variable to target.

  20. In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_)^[Θ(_, _)^HORSE(_)^FROM(_, TEXAS)]
RODE(_)^[Θ(_, _)^HORSE(_)]^FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)
Composition is simple and constrained, but unbounded. Phrasal meanings are generable, but always monadic. Lexicalization introduces concepts that can be systematically combined in simple ways.
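The claim that phrasal meanings are always monadic can be sketched concretely: a thematic concept closes off the nominal's participant variable, so the resulting phrase is a predicate of events only. Everything in this sketch (the event names, the theme relation, the extensions) is invented for illustration.

```python
# Why 'from Texas' cannot target the rider: after Θ-closure, the only
# open position in the phrasal meaning is the event position.
# Events, participants, and extensions below are all hypothetical.
THEME = {"ride1": "secretariat"}            # Θ(e, x): x is the theme of e
RIDE = lambda e: e in {"ride1"}             # introduced monadic RIDE(_)
HORSES = lambda x: x in {"secretariat"}

def theta(noun_concept):
    """Θ[C]: close the participant variable; the result is true of events."""
    return lambda e: any(THEME.get(e) == x and noun_concept(x)
                         for x in set(THEME.values()))

def join(c1, c2):
    return lambda e: c1(e) and c2(e)

# 'ride horses' ~ RIDE( )^[Θ( , _)^HORSES(_)]: a concept of events only.
ride_horses = join(RIDE, theta(HORSES))
assert ride_horses("ride1")
assert not ride_horses("stroll1")
# No rider-variable is left open, so a further modifier can only
# conjoin with the event position.
```

The design point: `join` always takes and returns one-place predicates, so composition is unbounded but the output never regains a second open variable.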

  21. In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
• DISTINGUISH
Lexicalized concepts, L-concepts: RIDE(_, _)  GIVE(_, _, _)  ALVIN  HORSE(_)  RIDE(_, _, ...)  MORTAL(_, _)
Introduced concepts, I-concepts: RIDE(_)  GIVE(_)  CALLED(_, Sound('Alvin'))  MORTAL(_)  HORSE(_)
hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

  22. Conceptual Adicity
Two Common Metaphors:
Jigsaw Puzzles
7th Grade Chemistry: H(+1)–O(−2)–H(+1)

  23. Jigsaw Metaphor A THOUGHT

  24. Jigsaw Metaphor
one Dyadic Concept (adicity −2) “filled by” two Saturaters (each adicity +1) yields a complete Thought:
Brutus (1st saturater) + KICK(_, _) (doubly unsaturated) + Caesar (2nd saturater)
one Monadic Concept (adicity −1) “filled by” one Saturater (adicity +1) yields a complete Thought:
Brutus (saturater) + Sang( ) (unsaturated)

  25. 7th Grade Chemistry Metaphor
a molecule of water: H(+1) + [OH](−1), where O itself has valence −2
a single atom with valence −2 can combine with two atoms of valence +1 to form a stable molecule

  26. 7th Grade Chemistry Metaphor
Brutus(+1) + [Kick Caesar](−1)

  27. 7th Grade Chemistry Metaphor
Brutus(+1) + Sang(−1), like Na(+1) + Cl(−1)
an atom with valence −1 can combine with an atom of valence +1 to form a stable molecule

  28. Extending the Metaphor
Brown( ) (−1) + Aggie (+1): Aggie is brown
Cow( ) (−1) + Aggie (+1): Aggie is (a) cow
BrownCow( ), i.e. Brown( ) & Cow( ), + Aggie: Aggie is (a) brown cow

  29. Extending the Metaphor
Conjoining two monadic (−1) concepts can yield a complex monadic (−1) concept:
Brown( ) & Cow( ) + Aggie
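The valence bookkeeping behind the chemistry metaphor can be made explicit with a little arithmetic. This is a toy sketch of the metaphor only; the function names and the "stability" rule are my own framing of the slides.

```python
# Valence bookkeeping for the chemistry metaphor (illustrative only):
# saturaters carry +1; monadic concepts -1; dyadic concepts -2.
def stable(*valences):
    """A combination is 'stable' (a complete Thought) iff valences sum to 0."""
    return sum(valences) == 0

assert stable(+1, -1)          # Brutus + Sang( )
assert stable(+1, -2, +1)      # Brutus + KICK(_, _) + Caesar
assert not stable(+1, -2)      # one saturater short: no complete Thought

def conjoin(v1, v2):
    """Conjoining two monadic (-1) concepts yields another -1 concept."""
    assert v1 == -1 and v2 == -1
    return -1

# Brown( ) & Cow( ) still needs exactly one saturater: Aggie.
assert stable(conjoin(-1, -1), +1)
```

The key feature the metaphor is driving at: conjunction of monadic concepts is valence-preserving, unlike saturation, which consumes valence.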

  30. Conceptual Adicity
TWO COMMON METAPHORS
-- Jigsaw Puzzles
-- 7th Grade Chemistry
DISTINGUISH
Lexicalized concepts, L-concepts: RIDE(_, _)  GIVE(_, _, _)  ALVIN
Introduced concepts, I-concepts: RIDE(_)  GIVE(_)  CALLED(_, Sound(’Alvin’))
hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

  31. A Different (and Older) Hypothesis
(1) concepts predate words
(2) words label concepts
• Acquiring words is basically a process of pairing pre-existing concepts with perceptible signals
• Lexicalization is a conceptually passive operation
• Word combination mirrors concept combination
• Sentence structure mirrors thought structure

  32. Bloom: How Children Learn the Meanings of Words
• word meanings are, at least primarily, concepts that kids have prior to lexicalization
• learning word meanings is, at least primarily, a process of figuring out which existing concepts are paired with which word-sized signals
• in this process, kids draw on many capacities (e.g., recognition of syntactic cues and speaker intentions) but no capacities specific to acquiring word meanings

  33. Lidz, Gleitman, and Gleitman “Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences…”

  34. Lidz, Gleitman, and Gleitman “Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences…”

  35. Why Not...
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is not a function of the number of participants logically implied by the verb meaning. A paradigmatic act of kicking has exactly two participants (kicker and kickee), and yet kick need not be transitive.
Brutus kicked Caesar the ball
Caesar was kicked
Brutus kicked
Brutus gave Caesar a swift kick
Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.
*Brutus put the ball
*Brutus put
*Brutus sneezed Caesar

  36. [Diagrams of two lexical entries]
lexical items like 'kick': Concept of adicity n ==> Perceptible Signal + Concept of adicity n + quirky information
lexical items like 'put': Perceptible Signal + Concept of adicity −1 + quirky information

  37. Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and usually sneeze is intransitive. But it usually takes two to have a kicking; and yet kick can be intransitive. Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

  38. Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and sneeze is typically used intransitively; but a paradigmatic kicking has exactly two participants, and yet kick can be used intransitively or ditransitively. Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

  39. Quirks and Provisos, or Normal Cases?
KICK(x1, x2): The baby kicked
RIDE(x1, x2): Can you give me a ride?
BETWEEN(x1, x2, x3): I am between him and her (why not: I between him her)
BIGGER(x1, x2): This is bigger than that (why not: This bigs that)
MORTAL(…?...): Socrates is mortal; A mortal wound is fatal
FATHER(…?...): Fathers father; Fathers father future fathers
EAT/DINE/GRAZE(…?...)

  40. Lexicalization as Concept-Introduction (not mere labeling)
Concept of type T ==> Perceptible Signal + Concept of type T + Concept of type T*

  41. Lexicalization as Concept-Introduction (not mere labeling)
Number(_), type <e, t> ==> Perceptible Signal + Number(_), type <e, t> + NumberOf[_, Φ(_)], type <<e, t>, <n, t>>

  42. Lexicalization as Concept-Introduction (not mere labeling)
Concept of type T ==> Perceptible Signal + Concept of type T + Concept of type T*

  43. One Possible (Davidsonian) Application: Increase Adicity
ARRIVE(x), adicity −1 ==> Perceptible Signal + ARRIVE(x), adicity −1 + ARRIVE(e, x), adicity −2

  44. One Possible (Davidsonian) Application: Increase Adicity
KICK(x1, x2), adicity −2 ==> Perceptible Signal + KICK(x1, x2), adicity −2 + KICK(e, x1, x2), adicity −3

  45. Lexicalization as Concept-Introduction: Make Monads
KICK(x1, x2), adicity n ==> Perceptible Signal + KICK(e, x1, x2), adicity n + KICK(e), adicity −1
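The "Make Monads" step can be sketched over extensions: from a lexicalized dyadic relation, introduce a monadic concept true of events whose participants stand in that relation. The relation, event names, and participant record below are invented for illustration.

```python
# 'Make Monads' sketched extensionally: from the lexicalized dyadic
# KICK(x1, x2), introduce a monadic KICK(e) true of kicking events.
# All data here are hypothetical.
KICK2 = {("brutus", "caesar")}                 # L-concept: KICK(x1, x2)
PARTICIPANTS = {"kick1": ("brutus", "caesar")} # each event's (agent, patient)

def make_monad(relation, participants):
    """Introduce C(e): true of e iff e's participants stand in the relation."""
    return lambda e: participants.get(e) in relation

KICK1 = make_monad(KICK2, PARTICIPANTS)        # I-concept: KICK(e), adicity -1
assert KICK1("kick1")
assert not KICK1("sneeze1")
```

The introduced `KICK1` is what the word makes available for conjunctive composition; the original dyadic `KICK2` remains accessible but is not what the grammar combines.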

  46. Two Pictures of Lexicalization
Picture 1: Concept of adicity n ==> Perceptible Signal + Concept of adicity n (or n−1) + further lexical information (regarding flexibilities)
Picture 2: Concept of adicity n ==> Perceptible Signal + Concept of adicity −1 + further lexical information (regarding inflexibilities)

  47. [Diagram] The Language Acquisition Device in its Initial State becomes, via Experience and Growth, a Language Acquisition Device in a Mature State (an I-Language): GRAMMAR + LEXICON. Lexicalizable concepts become Lexicalized concepts and yield Introduced concepts; Phonological Instructions feed Articulation and Perception of Signals; Semantic Instructions access Introduced concepts.

  48. Two Pictures of Lexicalization
Picture 1: Concept of adicity n ==> Perceptible Signal + Concept of adicity n (or n−1) + further lexical information (regarding flexibilities)
Picture 2: Concept of adicity n ==> Perceptible Signal + Concept of adicity −1 + further lexical information (regarding inflexibilities)

  49. Subcategorization
A verb can access a monadic concept and impose further (idiosyncratic) restrictions on complex expressions
• Semantic Composition Adicity Number (SCAN)
(instructions to fetch) singular concepts: +1, singular, <e>
(instructions to fetch) monadic concepts: −1, monadic, <e, t>
(instructions to fetch) dyadic concepts: −2, dyadic, <e, <e, t>>
• Property of Smallest Sentential Entourage (POSSE)
zero NPs, one NP, two NPs, …
the SCAN of every verb can be −1, while POSSEs vary: zero, one, two, …
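The independence of SCAN and POSSE can be sketched as two separate lexical properties. The verb entries and POSSE values below are hypothetical assignments for illustration, not claims from the lecture about particular verbs.

```python
# SCAN and POSSE as independent lexical properties; the entries and
# POSSE values below are hypothetical, for illustration only.
VERBS = {
    "sneeze": {"scan": -1, "posse": 1},   # Brutus sneezed
    "kick":   {"scan": -1, "posse": 1},   # Brutus kicked (intransitive OK)
    "put":    {"scan": -1, "posse": 3},   # *Brutus put / *Brutus put the cup
}

def enough_nps(verb, num_nps):
    """An active-voice clause needs at least the verb's POSSE of NPs."""
    return num_nps >= VERBS[verb]["posse"]

assert enough_nps("kick", 1)              # Brutus kicked
assert enough_nps("put", 3)               # Brutus put the cup on the table
assert not enough_nps("put", 2)           # *Brutus put the cup
# Every verb's SCAN can be -1 even though POSSEs vary:
assert all(entry["scan"] == -1 for entry in VERBS.values())
```

The point of separating the two numbers: semantic composition (SCAN) is uniform and monadic, while grammaticality requirements (POSSE) are idiosyncratic and lexically listed.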

  50. POSSE facts may reflect
...the adicities of the original concepts lexicalized
...statistics about how verbs are used (e.g., in active voice)
...prototypicality effects
...other agrammatical factors
• ‘put’ may have a (lexically represented) POSSE of three in part because
-- the concept lexicalized was PUT(_, _, _)
-- the frequency of locatives (as in ‘put the cup on the table’) is salient
• and note: *I put the cup the table   ?I placed the cup
