
Introduction to Natural Language Processing


Presentation Transcript


  1. Introduction to Natural Language Processing A.k.a., “Computational Linguistics”

  2. Recall: Agents and Environment [Diagram: the agent receives percepts from the environment through its sensors and acts on the environment through its actuators.]

  3. Agents and Environments with NLP [Diagram: agents perceive and act on one another through speech, handwriting, printed text, and digital text.] • What do the other agents claim to believe? (NL Understanding) • What do the other agents actually believe or want? (Plan recognition, game theory) • How can I make the other agents believe X? (Planning, NL Generation)

  4. WHAT IS LANGUAGE? Definition with respect to form: Language is a system of speech symbols. It is realized acoustically (sound waves), visually-spatially (sign language) and in written form. Definition with respect to function: Language is the most important means of human communication. It is used to convey and exchange information (informative function). Multiplicity of languages: We know of about 7000 languages, which is estimated to be about 1% of all the languages that ever existed.

  5. LANGUAGE AND THE BRAIN

  6. LANGUAGE AND THE BRAIN

  7. THEORIES OF LANGUAGE Noam Chomsky claims that language is innate. B. F. Skinner claims that language is learned; it is basically a stimulus-response mechanism.

  8. WHAT IS GRAMMAR? When we learn a language we also learn the rules that govern how language elements, such as words, are combined to produce meaningful language. These elements and rules constitute the Grammar of a language. The Grammar is “what we know”; it represents our linguistic competence.

  9. DESCRIPTIVE vs PRESCRIPTIVE GRAMMAR • Descriptive grammar: how language is. • Prescriptive grammar: how language should be.

  10. Areas of Linguistics phonetics - the study of speech sounds; phonology - the study of sound systems; morphology - the rules of word formation; syntax - the rules of sentence formation; semantics - the study of word meanings; pragmatics - the study of discourse meanings; sociolinguistics - the study of language in society; applied linguistics - the application of the methods and results of linguistics to such areas as language teaching, national language policies, lexicography, translation, language in politics, etc.

  11. What is the meaning of ‘meaning’? • Learning a language includes learning the “agreed upon” meanings of certain strings of sounds and, • Learning how to combine these meaningful units into larger units which also convey meaning.

  12. Morphemes • A morpheme is the smallest linguistic unit that has meaning. • A morpheme is a grammatical unit in which there is an arbitrary union of sound and meaning and • which cannot be further analysed (broken down into smaller parts that have meaning).

  13. Morphemes • A morpheme may be represented by a single sound: • e.g. the plural morpheme [s] in cat+s • A morpheme may be represented by a syllable (monosyllabic): • e.g. child+ish

  14. Morphemes A morpheme may be represented by more than one syllable (polysyllabic): • e.g. two syllables: lady, water • three syllables: crocodile • four syllables: salamander

  15. Words • Two basic ways to form words • Inflectional (e.g. English verbs + endings → other English verbs) • Open + ed = opened • Open + ing = opening • Derivational (e.g. adverbs from adjectives, nouns from adjectives) • Happy → happily • Happy → happiness (nouns from adjectives)
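The inflection/derivation examples above can be made concrete with a small sketch. The rule tables and the helper apply_suffix below are my own illustrative assumptions, not part of the slides.

```python
# Toy sketch (illustrative only): inflection vs. derivation as suffix rules.
INFLECTIONAL = {"past": "ed", "progressive": "ing"}   # open -> opened, opening (still verbs)
DERIVATIONAL = {"adverb": "ly", "noun": "ness"}       # happy -> happily, happiness (new word class)

def apply_suffix(stem, suffix):
    """Attach a suffix, handling the y -> i spelling change (happy -> happily)."""
    if stem.endswith("y") and suffix[0] not in "aeiou":
        stem = stem[:-1] + "i"
    return stem + suffix

print(apply_suffix("open", INFLECTIONAL["past"]))     # opened
print(apply_suffix("happy", DERIVATIONAL["adverb"]))  # happily
print(apply_suffix("happy", DERIVATIONAL["noun"]))    # happiness
```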

  16. Syntax The study of classes of words (nouns, verbs, etc.) and the rules that govern how the words can combine to make phrases and sentences.

  17. Basic classes of words • Classes of words aka parts of speech (POS) • Nouns • Verbs • Adjectives • Adverbs • The above classes of words are open class words • We also have closed class words, or function words • Articles, pronouns, prepositions, particles, quantifiers, conjunctions
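As a quick illustration of open- vs. closed-class words, NLTK's off-the-shelf tagger labels each token with its part of speech. This is a sketch, assuming nltk is installed and its tokenizer/tagger data have been downloaded; the example sentence is mine.

```python
# Sketch, assuming: pip install nltk
#                   python -m nltk.downloader punkt averaged_perceptron_tagger
import nltk

tokens = nltk.word_tokenize("The manager of the institute ate her sandwich quickly.")
print(nltk.pos_tag(tokens))
# Open-class tags:   NN* (nouns), VB* (verbs), JJ* (adjectives), RB* (adverbs)
# Closed-class tags: DT (articles), PRP/PRP$ (pronouns), IN (prepositions), CC (conjunctions)
```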

  18. Basic phrases • A word from an open class can be used to form the basis of a phrase • The basis of a phrase is called the head

  19. Examples of phrases • Noun phrases • The manager of the institute • Her worry to pass the exams • Several students from the English Department • Adjective phrases • easy to understand • mad as a dog • glad that he passed the exam

  20. Examples of phrases • Adverb phrases • fast like the wind • outside the building • Verb phrases • ate her sandwich • went to the doctor • believed what I told him

  21. Grammars and parsing • Syntactic parsing: determining the syntactic structure of a sentence • Basic steps • Identify sentence boundaries • Identify the part of speech of each word • Identify pairs of words that form phrases • Identify pairs of phrases that form larger phrases …
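A rough sketch of the first of these steps using NLTK: sentence splitting, per-word tagging, and grouping tagged words into simple noun phrases with a chunk grammar. The chunk pattern is an illustrative assumption, not the slides' grammar, and full parse trees would need a real parser (next slide).

```python
# Rough sketch of the basic parsing steps with NLTK (the chunk pattern below is illustrative).
import nltk

text = "The cat sat on the mat. Time flies like an arrow."
chunker = nltk.RegexpParser("NP: {<DT>?<JJ>*<NN.*>+}")   # group determiner/adjectives/nouns into NPs
for sentence in nltk.sent_tokenize(text):                # 1. identify sentence boundaries
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))  # 2. identify the part of speech of each word
    print(chunker.parse(tagged))                         # 3. identify words that form (noun) phrases
```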

  22. Context Free Grammar • S -> NP VP • NP -> Det (Adj) N • NP -> ProperN • NP -> N • VP -> V • VP -> V PP • VP -> V NP • VP -> V NP PP • VP -> V NP NP • PP -> Prep NP
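These rules can be run directly in NLTK's chart parser. The phrase-structure rules below follow the slide; the lexical rules (the actual words) are my addition so the grammar is runnable.

```python
# The slide's phrase-structure rules plus a small lexicon (the lexicon is my own addition).
import nltk

grammar = nltk.CFG.fromstring("""
  S  -> NP VP
  NP -> Det N | Det Adj N | PropN | N
  VP -> V | V PP | V NP | V NP PP | V NP NP
  PP -> Prep NP
  Det -> 'the'
  Adj -> 'big'
  N -> 'cat' | 'mat'
  V -> 'sat'
  Prep -> 'on'
  PropN -> 'John'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the cat sat on the mat".split()):
    tree.pretty_print()   # prints the parse shown on the next slide
```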

  23. Parses The cat sat on the mat: [S [NP [Det the] [N cat]] [VP [V sat] [PP [Prep on] [NP [Det the] [N mat]]]]]

  24. Parses Time flies like an arrow: [S [NP [N time]] [VP [V flies] [PP [Prep like] [NP [Det an] [N arrow]]]]]

  25. Parses Time flies like an arrow: [S [NP [N time] [N flies]] [VP [V like] [NP [Det an] [N arrow]]]]
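The two trees on slides 24 and 25 come from the same string because "flies" and "like" each belong to more than one word class. A small sketch (the toy grammar below is mine, not the slides'): NLTK's chart parser returns both readings.

```python
# Sketch: an ambiguous toy grammar yields both readings of "Time flies like an arrow".
import nltk

grammar = nltk.CFG.fromstring("""
  S  -> NP VP
  NP -> N | N N | Det N
  VP -> V PP | V NP
  PP -> P NP
  Det -> 'an'
  N -> 'time' | 'flies' | 'arrow'
  V -> 'flies' | 'like'
  P -> 'like'
""")

for tree in nltk.ChartParser(grammar).parse("time flies like an arrow".split()):
    print(tree)
# Reading 1: "time" is the subject and "flies" the verb (slide 24).
# Reading 2: "time flies" is a noun compound and "like" the verb (slide 25).
```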

  26. Semantics and Pragmatics Semantics: the study of meaning that can be determined from a sentence, phrase or word. Pragmatics: the study of meaning, as it depends on context (speaker, situation)

  27. Language to Logic • John went to a book store. ∃s . bookstore(s) ∧ go(John, s) • Every boy loves a girl. ∀b . boy(b) → ∃g . girl(g) ∧ loves(b, g) • Who broke the vase? λx . broke(x, vase17)
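These representations can be written down and manipulated with NLTK's logic package. A sketch, assuming nltk is installed; the syntax below is NLTK's ASCII notation for the same formulas (exists/all/&/->/\ for ∃/∀/∧/→/λ), and the predicate names follow the slide.

```python
# Sketch: the slide's formulas in NLTK's first-order / lambda-calculus notation.
from nltk.sem.logic import Expression

read = Expression.fromstring
print(read(r'exists s.(bookstore(s) & go(John, s))'))               # John went to a book store.
print(read(r'all b.(boy(b) -> exists g.(girl(g) & loves(b, g)))'))  # Every boy loves a girl.
print(read(r'\x.broke(x, vase17)'))                                 # Who broke the vase?
```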

  28. Headlines • Police Begin Campaign To Run Down Jaywalkers • Iraqi Head Seeks Arms • Teacher Strikes Idle Kids • Miners Refuse To Work After Death • Juvenile Court To Try Shooting Defendant

  29. Language Families

  30. NLP tends to focus on: • Syntax • Grammars, parsers, parse trees, dependency structures • Semantics • Subcategorization frames, semantic classes, ontologies, formal semantics • Pragmatics • Pronouns, reference resolution, discourse models

  31. Issues in NLP • Ambiguity • Lack of Knowledge – it’s needed for understanding, but computers don’t have it

  32. Ambiguity • Computational linguists are obsessed with ambiguity • Ambiguity is a fundamental problem of computational linguistics • Resolving ambiguity is a crucial goal

  33. Ambiguity • Find at least 5 meanings of this sentence: • I made her duck

  34. Ambiguity • Find at least 5 meanings of this sentence: • I made her duck • I cooked waterfowl for her benefit (to eat) • I cooked waterfowl belonging to her • I created the (plaster?) duck she owns • I caused her to quickly lower her head or body • I waved my magic wand and turned her into undifferentiated waterfowl • At least one other meaning that’s inappropriate for gentle company.

  35. Ambiguity is Pervasive • I caused her to quickly lower her head or body • Lexical category: “duck” can be a N or V • I cooked waterfowl belonging to her. • Lexical category: “her” can be a possessive (“of her”) or dative (“for her”) pronoun • I made the (plaster) duck statue she owns • Lexical Semantics: “make” can mean “create” or “cook”

  36. Ambiguity is Pervasive • Grammar: Make can be: • Transitive: (verb has a noun direct object) • I cooked [waterfowl belonging to her] • Ditransitive: (verb has 2 noun objects) • I made [her] (into) [undifferentiated waterfowl] • Action-transitive (verb has a direct object and another verb) • I caused [her] [to move her body]

  37. Ambiguity is Pervasive • Phonetics! • I mate or duck • I’m eight or duck • Eye maid; her duck • Aye mate, her duck • I maid her duck • I’m aid her duck • I mate her duck • I’m ate her duck • I’m ate or duck • I mate or duck

  38. Kinds of knowledge needed? • Consider the following interaction with HAL the computer from 2001: A Space Odyssey • Dave: Open the pod bay doors, Hal. • HAL: I’m sorry Dave, I’m afraid I can’t do that.

  39. Knowledge needed to build HAL? • Speech recognition and synthesis • Dictionaries (how words are pronounced) • Phonetics (how to recognize/produce each sound of English) • Natural language understanding • Knowledge of the English words involved • What they mean • How they combine (what is a `pod bay door’?) • Knowledge of syntactic structure • I’m I do, Sorry that afraid Dave I’m can’t

  40. What’s needed? • Dialog and pragmatic knowledge • “open the door” is a REQUEST (as opposed to a STATEMENT or information-question) • It is polite to respond, even if you’re planning to kill someone. • It is polite to pretend to want to be cooperative (I’m afraid I can’t…) • What is `that’ in `I can’t do that’? • Even a system to book airline flights needs much of this kind of knowledge

  41. Computational models of how natural languages work These are sometimes called Language Models or sometimes Grammars Three main types (among many others): • Document models, or “topic” models • Sequence models: Markov models, HMMs, others • Context-free grammar models
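The simplest sequence model on that list is a first-order Markov (bigram) model, which can be estimated just by counting. A minimal sketch (the two-sentence corpus is invented, and there is no smoothing):

```python
# Minimal bigram (first-order Markov) language model; corpus and numbers are toy examples.
from collections import Counter, defaultdict

corpus = ["the cat sat on the mat", "the cat ate the rat"]
bigrams = defaultdict(Counter)
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]     # add sentence-boundary markers
    for prev, word in zip(words, words[1:]):
        bigrams[prev][word] += 1

def prob(word, prev):
    """P(word | prev) by relative frequency (unsmoothed)."""
    total = sum(bigrams[prev].values())
    return bigrams[prev][word] / total if total else 0.0

print(prob("cat", "the"))   # 0.5: "the" is followed by "cat" in 2 of its 4 occurrences
```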

  42. Computational models of how natural languages work Most of the models I will show you are • Probabilistic models • Graphical models • Generative models In other words, they are essentially Bayes Nets. In addition, many (but not all) are • Latent variable models This means that some variables in the model are not observed in data, and must be inferred. (Like the hidden states in an HMM.)
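As a concrete example of a generative latent-variable model, here is a tiny HMM with hand-set (invented) probabilities and the forward algorithm, which computes the probability of the observed words by summing over the unobserved hidden-state sequences.

```python
# Tiny HMM sketch: all probabilities are invented for illustration.
# forward() sums over every hidden state sequence to get P(observed words).
states = ["N", "V"]
start  = {"N": 0.7, "V": 0.3}
trans  = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
emit   = {"N": {"time": 0.5, "flies": 0.3, "arrow": 0.2},
          "V": {"time": 0.1, "flies": 0.6, "arrow": 0.3}}

def forward(observations):
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][obs]
                 for s in states}
    return sum(alpha.values())

print(forward(["time", "flies"]))   # likelihood of "time flies" with the tags left unobserved
```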
