
A spider kills another spider - generic-if and quantification -


Presentation Transcript


  1. A spider kills another spider - generic-if and quantification - Ikumi Imani & Itaru Takarajima, Nagoya Gakuin University, {imani, takaraji}@ngu.ac.jp

  2. (1) A spider kills another spider. (universal/generic) • n × (n - 1) (combinatory possibility) • V((1)) = 1 if you find a spider which survived all killings.
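The n × (n − 1) figure is just the number of ordered (killer, victim) pairs of distinct spiders. A minimal sketch in Python, with a hypothetical toy domain:

```python
from itertools import permutations

# Toy domain of spiders (hypothetical names, not from the slides).
spiders = ["s1", "s2", "s3", "s4"]

# Ordered (killer, victim) pairs with distinct members: n * (n - 1) of them.
pairs = list(permutations(spiders, 2))

n = len(spiders)
assert len(pairs) == n * (n - 1)  # 4 * 3 = 12
```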

  3. Contents 1. Dynamic Semantics 2. E-type analysis (with choice functions) 3. Our approach

  4. Dynamic Semantics 1. DRT 2. File Change Semantics 3. Dynamic Predicate Logic (4. Model-theoretic semantics)

  5. DPL A formula denotes a set of pairs 〈f, g〉, where f is an input assignment and g an output assignment. (1) f[∃xφ]g ⇔ ∃h: f and h differ at most in the value they assign to x, and h[φ]g (1’) f[∃’x]g ⇔ f and g differ at most in the value they assign to x (2) f[∀xφ]g ⇔ f = g & ∀k: f[∃’x]k → ∃j: k[φ]j

  6. (1) A spider kills another spider. • V((1)) = {〈f, g〉 | f = g & ∀k: f[∃’x_s]k → ∃j: 〈k, j〉 ∈ V(kill(x_s, y_s))} • Truth: φ is true with respect to f in M iff ∃g: 〈f, g〉 ∈ [φ]_M • But what (1) says is just that spiders have the habit of killing other spiders.

  7. (2) John bought a convertible. The car-dealer will deliver it to his house next week. Situation: John went to a car-dealer’s office. He looked at a catalogue of cars. He liked a convertible. He signed the check. They promised to deliver it to his house next week. (The next day, he had a stroke. He died.) f[∃xφ]g ⇔ ∃h: f and h differ at most in the value they assign to x, and h[φ]g Which car in the domain D is supposed to have been purchased by John?

  8. E-type analysis with choice functions a. F(t) → F(εxFx) (introduction rule for epsilon terms) b. ∃xFx ⇔ F(εxFx) c. ∃x¬Fx ⇔ ¬F(εx¬Fx) d. ¬∃x¬Fx ⇔ ¬¬F(εx¬Fx) e. ∀xFx ⇔ F(εx¬Fx) [εxFx]^{M,g} = Φ([F]^{M,g}), where Φ is a choice function that is determined by the model M.

  9. A choice function Φ a. Φ([F]^{M,g}) ∈ [F]^{M,g} if [F]^{M,g} ≠ ∅ b. Φ([F]^{M,g}) ∈ D if [F]^{M,g} = ∅ • Uniqueness problem (1) A wine glass broke last night. It had been very expensive. (2) Just one wine glass broke last night.

  10. Two kinds of choice functions • Once Φ chooses an element from D, it keeps referring to that element. (1) A man is walking. He whistles. … A man comes. He does not whistle. a. an F: [ε_k xFx] = Φ_k([F]), with k new b. the F: [ε_c xFx] = Φ_c([F]), with c contextually determined (J. Peregrin and K. von Heusinger 1997)

  11. Salience • Definite NPs refer to their referents according to the given salience ranking of the discourse. Indefinite NPs, on the other hand, do not refer to any salient object but to an arbitrarily chosen object. (J. Peregrin and K. von Heusinger 1997)

  12. (1) Why is a donkey walking in my garden? • Situation: Two men were looking at the garden. They noticed that a donkey was walking in the garden. One of them shouted ... (2) Put your jacket on the desk. • Situation: A man came in. He was soaked. The speaker did not want the room to get wet. She ordered him to put his jacket on the desk. He looked around, wondering where the desk was.

  13. Donald Duck Problem (P. Dekker 2001) (1) If we invite some philosopher, Max will be offended. Problem: The speaker does not know WHO is a philosopher, but there is some (specific) individual x such that, if x is a philosopher we invite, Max will be offended. Now, to make this statement true, it suffices to choose Donald for x: since Donald is not a philosopher, the statement is trivially true. (The use of choice functions can handle this.) Situation: Suppose they are simply not going to invite Jacques Derrida. Then, when a choice function picks Derrida, (1) is trivially true. In this way, (1) becomes equivalent to (2). (2) If we invite all philosophers, Max will be offended. (assuming there to be at least one philosopher)

  14. Model a. The domain of individuals, D b. An interpretation function, I c. Semantic definitions of terms, e.g. I(dog) = a set of dogs

  15. Model-theoretic Semantics A dog is running. Model = 〈D, I〉 I(dog) = {x | dog(x)} I(run) = {y | run(y)} g(x) = d, d ∈ I(dog) ∩ I(run)
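The verification step on this slide is set intersection: the sentence is true iff some individual d can serve as g(x) and lies in both I(dog) and I(run). A sketch with a hypothetical toy model:

```python
# Model M = <D, I>: "A dog is running" is true iff I(dog) ∩ I(run) ≠ ∅,
# i.e. some d in D can be assigned to x as g(x) = d.
D = {"d1", "d2", "c1"}                        # toy individuals
I = {"dog": {"d1", "d2"}, "run": {"d2", "c1"}}

witnesses = I["dog"] & I["run"]               # candidates for g(x)
assert witnesses == {"d2"}                    # the sentence is verified by d2
is_true = bool(witnesses)
assert is_true
```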

  16. What a machine does not do (1) I think a donkey is walking in my garden. Am I wrong? ‘A donkey is walking in my garden’ is true iff for at least one d ∈ D, V_{M, g[x/d]}(walk(x) & donkey(x)) is true…

  17. What a machine does (verification by matching) A dog is running. I(dog) = {x | dog(x)} I(run) = {y | run(y)} g(x) = d verification by matching a_j: dog, run

  18. What a machine does (storage) A dog is running. I(dog) = {x | dog(x)} I(run) = {y | run(y)} g(x) = d a_j: dog, run
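The two modes on slides 17-18 can be sketched as one machine with two operations: "matching" checks an indefinite against objects already witnessed, while "storing" merely records a new indexed object a_j with its predicates, without checking the world. The class and its data layout are illustrative assumptions, not the authors' implementation:

```python
# A sketch of verification-by-matching vs. storage.
class Machine:
    def __init__(self, witnessed):
        self.witnessed = witnessed   # list of predicate-sets of witnessed objects
        self.store = []              # recorded indexed objects

    def match(self, predicates):
        """Verification by matching: is some witnessed object a dog that runs?"""
        preds = set(predicates)
        return any(preds <= obj for obj in self.witnessed)

    def record(self, index, predicates):
        """Storage: record a_j: dog, run without consulting the world."""
        self.store.append((index, set(predicates)))

m = Machine(witnessed=[{"dog", "run"}, {"cat", "sleep"}])
assert m.match(["dog", "run"])       # "A dog is running." verified by matching
assert not m.match(["dog", "fly"])   # no witness: matching fails
m.record("j", ["dog", "run"])        # stored reading: a_j: dog, run
assert m.store == [("j", {"dog", "run"})]
```

On this picture, the matching/storing contrast of slide 19 is a difference in which operation a sentence triggers, not a difference in its truth conditions.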

  19. Verification by matching vs. Storage (1)??Look! A man is a fireman. (matching) (2) If a man is a fireman, he is brave. (storing) (3) Look! A red car is running on the beach. (matching) (4) ?Look! A running car hit the tree. (matching) (5) If a running car hits a tree, it gets damaged. (storing)

  20. A: A man jumped from the bridge last night. (storing) B: He did not jump, but was pushed. A: I think a woman over there is pretty. (matching) B: She is not a woman. She is he. A: #I think a farmer who has a donkey is a woman. (storing: contradiction)

  21. Generic (storing) vs. actual (matching) (1) Most donkeys have a tail. They use it to swat flies. (2) ?Most donkeys have a tail (luckily enough). They are using it to swat flies. (3) *Most donkeys have a tail. It is long.

  22. • Verification by matching • Storing (recording) • What state should a machine have?

  23. Object-setting (1) A man is running. (2) Every man is running. (3) Most men are running. (4) The man is running. (5) a_j: man, run index j: every, generic, most, speaker’s referent, singular, plural…

  24. Example (1) (1) Every donkey is running. a_every: donkey, run (2) Most donkeys are running. a_most: donkey, run (3) Every donkey loves a farmer. a_every: donkey; b_singular: farmer; love(a_every, b_singular)
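Object-setting lends itself to a simple data representation: an indexed object pairs an index (every, most, singular, generic, ...) with the predicates stored for it, and relations hold between indexed objects. The class name and fields below are illustrative assumptions:

```python
# Object-setting as data: indexed objects plus relations between them.
from dataclasses import dataclass, field

@dataclass
class Obj:
    index: str                      # 'every', 'most', 'singular', 'generic', ...
    predicates: list = field(default_factory=list)

# (3) Every donkey loves a farmer.
a = Obj("every", ["donkey"])
b = Obj("singular", ["farmer"])
relations = [("love", a, b)]        # love(a_every, b_singular)

assert a.index == "every" and "donkey" in a.predicates
assert relations[0][0] == "love"
```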

  25. Example (2) (1) If a farmer owns a donkey, he beats it. *He is cruel. (2) Every chess set comes with a spare pawn. It is taped under the box. (P. Sells)

  26. Example (3) (1) Most donkeys have a tail. They use it to swat flies. (2) #Most donkeys have a tail. It is long. (1’) a_most: donkey, b_s: tail, a_most have b_s, a_most use b_s (2’) [a_most: donkey, b_s: tail, a_most have b_s] → checking (no way to identify the referent of ‘it’)

  27. Destructive assignment problem In DPL, ∃x randomly assigns a value to x, so when x already carries some information, ∃x overwrites it. To avoid this, indefinites should always quantify over new variables.
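The problem is easy to see with assignments as dicts: re-quantifying over x throws away whatever the input assignment already recorded about x, while quantifying over a fresh variable preserves it. A minimal sketch with hypothetical values:

```python
# Destructive assignment: ∃x re-randomizes x, overwriting its old value.
f = {"x": "donkey1"}                 # x already carries information
D = ["donkey1", "donkey2", "farmer1"]

# All x-variants of f are possible outputs of ∃x, so the old value is lost.
outputs = [{**f, "x": d} for d in D]
assert {"x": "farmer1"} in outputs   # the stored value of x has been overwritten

# Remedy: let the indefinite quantify over a *new* variable y instead.
outputs_fresh = [{**f, "y": d} for d in D]
assert all(g["x"] == "donkey1" for g in outputs_fresh)
```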

  28. Then, what can we say about spiders?

  29. A strong reading (1) If a farmer owns a donkey, he beats it. In a model-theoretic semantics, (1) is true if each farmer beats every donkey he owns at least once. (1) is false if some farmer dies before beating all the donkeys he owns.
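The strong reading amounts to a subset check over a model: every (farmer, donkey) pair in the ownership relation must also be in the beating relation. The toy relations below are illustrative:

```python
# Strong reading of "If a farmer owns a donkey, he beats it" as own ⊆ beat.
own = {("f1", "d1"), ("f1", "d2"), ("f2", "d3")}
beat = {("f1", "d1"), ("f1", "d2"), ("f2", "d3")}

strong = all(pair in beat for pair in own)
assert strong                        # every owned donkey is beaten

# One unbeaten donkey falsifies the strong reading.
beat_partial = {("f1", "d1"), ("f2", "d3")}
assert not all(pair in beat_partial for pair in own)
```

The weak reading would only require that each donkey-owning farmer beat *some* donkey he owns, which beat_partial still satisfies.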

  30. Is a strong reading different from a weak reading? (1) If a farmer has a donkey, he beats it. (2) If a man has a credit card, he uses it. The difference depends on knowledge about the world. What a machine knows about ‘if A, B’ is: if it witnesses A, then it can infer B.

  31. No quantification in advance (1) If a man has a credit card, he puts it in his wallet. (2) If a man has a credit card, he uses it. A: a_j: man, b_j: card, a_j have b_j B: consequences

  32. Double-bind problem (1) If a theory is classical, then if it is inconsistent, it is usually trivial. Since there is only one input value for the second antecedent, quantification (in terms of ‘usually’) is problematic. A → (B →_usually C)
