
Artificial Intelligence and Computing Science


Presentation Transcript


  1. Artificial Intelligence and Computing Science, October 19, 2012

  2. AI in the Seventies and Eighties. A belated realization: in most situations, intelligent behaviour requires knowledge.

  3. How many things do you know? ``I can’t think of a hundred things that I know a thousand things about’’ (Marvin Minsky, `Semantic Information Processing’, 1968)

  4. AI in the Seventies
Winograd, SHRDLU
Expert Systems
Minsky, `Frames’
Rumelhart and McClelland, `Parallel Distributed Processing’ (return of the Perceptron)

  5. SHRDLU’s World

  6. Sample Dialogue
Winograd: What’s on the green block?
Shrdlu: A pyramid.
Winograd: Put the red block on the green block.
Shrdlu: OK.
Winograd: Why did you move the purple pyramid?
Shrdlu: To clear off the green block.
Winograd: Why did you put the red block on the green block?
Shrdlu: You told me to.

  7. Winograd: Pick up a big red block.
Shrdlu: OK.
Winograd: Grasp the pyramid.
Shrdlu: I don’t understand which pyramid you mean.
Winograd: Find a block which is taller than the one you are holding and put it into the box.
Shrdlu: By `it’, I assume you mean the block which is taller than the one I am holding. OK.

  8. Winograd: What does the box contain?
Shrdlu: The blue pyramid and the blue block.
Winograd: What is the pyramid supported by?
Shrdlu: The box.
Winograd: Can the table pick up blocks?
Shrdlu: No.

  9. Things SHRDLU doesn’t know:
`red’ and `green’ are colours
SHRDLU’s world is flat
A table has legs but no arms
SHRDLU is a computer program

  10. Expert Systems (rule-based). Knowledge can be represented by a number of `if…then’ rules plus an inference engine. E.g., ``IF temperature is high AND rash is present, THEN patient has measles.’’
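
One way to picture this: a rule is just data, and the inference engine checks its conditions against known facts. A minimal Python sketch; the encoding of the rule and the patient facts is invented for illustration, not taken from any real expert system:

    # Hypothetical encoding of the measles rule as a conditions/conclusion pair
    measles_rule = {
        "conditions": {"temperature is high", "rash is present"},
        "conclusion": "patient has measles",
    }

    patient_facts = {"temperature is high", "rash is present"}

    # The engine's basic step: fire the rule when every condition is satisfied
    if measles_rule["conditions"] <= patient_facts:   # subset test
        print(measles_rule["conclusion"])             # -> patient has measles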

  11. We can extract the rules from human experts via interviews. This process is known as `knowledge engineering’:

  12. `If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
This gives us a set of rules that an inference engine (or `expert system shell’) can reason about. Two popular modes of reasoning are forward chaining and backward chaining:

  13. `If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
Forward chaining: given a new fact (`Tweety has feathers’), search for all matching conditionals, draw all possible conclusions, and add them to the knowledge base:
:- Tweety is a bird
:- Tweety can fly
:- Tweety lays eggs
Potential problem: we run into the combinatorial explosion again.
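
Forward chaining is easy to sketch in code: keep firing rules until no new facts appear. A hedged Python sketch of the animal rule base above (the data encoding is my own, not from the slides):

    RULES = [
        ({"has fur"}, "is a mammal"),
        ({"has feathers"}, "is a bird"),
        ({"is a bird"}, "can fly"),
        ({"is a bird"}, "lays eggs"),
        ({"has scales"}, "is a fish"),
        ({"is a fish"}, "can swim"),
        ({"lays eggs", "has fur"}, "is a duck-billed platypus"),
    ]

    def forward_chain(facts):
        # Fire every rule whose conditions all hold; repeat until nothing
        # new is concluded (a fixed point). With a large rule base this
        # blind strategy is where the combinatorial explosion bites.
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(forward_chain({"has feathers"}))
    # -> {'has feathers', 'is a bird', 'can fly', 'lays eggs'}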

  14. Backward chaining: given a query (`Does Tweety lay eggs?’), search for all matching consequents and see if the database satisfies the conditionals:
`If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
`Tweety has feathers’

  15. Backward chaining: `Does Tweety lay eggs?’ (rule base and the fact `Tweety has feathers’ as above)

  16. Backward chaining: `Does Tweety lay eggs?’ reduces to the subgoal `Is Tweety a bird?’ (rule base as above)

  17. Backward chaining: `Does Tweety lay eggs?’ reduces to `Is Tweety a bird?’, which reduces to `Does Tweety have feathers?’ (rule base as above; `Tweety has feathers’ is a known fact)

  18. Backward chaining, conclusion: yes, Tweety does lay eggs. This method is used by Prolog, for example.
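
The same rule base supports backward chaining with a short recursive search: a goal is proved if it is a known fact, or if some rule concludes it and all of that rule's conditions can themselves be proved. A minimal Python sketch, again with my own encoding (real Prolog adds variables, unification, and backtracking over bindings):

    RULES = [
        ({"has fur"}, "is a mammal"),
        ({"has feathers"}, "is a bird"),
        ({"is a bird"}, "can fly"),
        ({"is a bird"}, "lays eggs"),
        ({"has scales"}, "is a fish"),
        ({"is a fish"}, "can swim"),
        ({"lays eggs", "has fur"}, "is a duck-billed platypus"),
    ]
    KNOWN = {"has feathers"}   # `Tweety has feathers'

    def prove(goal):
        # A goal holds if it is a known fact, or if some rule concludes it
        # and every one of that rule's conditions is provable in turn.
        if goal in KNOWN:
            return True
        return any(conclusion == goal and all(prove(c) for c in conditions)
                   for conditions, conclusion in RULES)

    print(prove("lays eggs"))   # True: feathers -> bird -> lays eggs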

  19. `If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
Potential problem: a lot of rules have exceptions. (A penguin is a bird, but it cannot fly.)

  20. Frames (Marvin Minsky, 1974)

  21. A frame allows us to fill in default knowledge about a situation from a partial description. For example, ``Sam was hungry. He went into a McDonald’s and ordered a hamburger. Later he went to a movie.’’ Did Sam eat the hamburger?

  22. So we can economically represent knowledge by defining properties at the most general level, then letting specific cases inherit those properties:
Event → Transaction → Buying something → Buying a hamburger
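
A hedged sketch of how frame-style inheritance might look in code. The frame names follow the slide's hierarchy; the slot names and default values are invented for illustration:

    # Each frame names a parent and supplies some default slot values.
    FRAMES = {
        "Event":              {"parent": None, "has a time and place": True},
        "Transaction":        {"parent": "Event", "participants": 2},
        "Buying something":   {"parent": "Transaction",
                               "money changes hands": True},
        "Buying a hamburger": {"parent": "Buying something",
                               "item": "hamburger",
                               "buyer eats item": True},  # the default that
                                                          # answers the Sam story
    }

    def lookup(frame, slot):
        # Walk up the parent chain until some frame supplies the slot.
        while frame is not None:
            if slot in FRAMES[frame]:
                return FRAMES[frame][slot]
            frame = FRAMES[frame]["parent"]
        return None

    # Inherited from `Buying something', one level up:
    print(lookup("Buying a hamburger", "money changes hands"))   # -> True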

  23. Return of the perceptron (now called a `neural net’). Changes since 1969:
Hidden layers
Non-linear activation function
Back-propagation allows learning
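
Those three changes fit in a few lines of code. A minimal NumPy sketch (a toy example of my own, not from the slides): a one-hidden-layer net with sigmoid activations, trained by back-propagation on XOR, the function a 1969-style single-layer perceptron famously cannot learn:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets

    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)        # hidden layer
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)        # output layer

    def sigmoid(z):                                      # non-linear activation
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for _ in range(10000):
        h = sigmoid(X @ W1 + b1)                         # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)              # back-propagate the
        d_h = (d_out @ W2.T) * h * (1 - h)               # squared-error gradient
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

    print(out.round(2))   # typically close to [[0], [1], [1], [0]]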

  24. Rumelhart and McClelland, `Parallel Distributed Processing’: use neural nets to represent knowledge by the strengths of associations between different concepts, rather than as lists of facts, yielding programs that can learn from example.

  25. [Diagram: conventional computer memory as discrete registers, each holding a separate bit pattern; Register One = 01100110, Register Two = 11100110, Register Three = 00101101, and so on]

  26. AI: 1979-2000
Douglas Lenat, `CYC’
Douglas Hofstadter, `Fluid Analogies’
Patrick Hayes, `Naïve Physics’

  27. CYC’s data are written in CycL, which is a descendant of Frege’s predicate calculus (via Lisp). For example,
(#$isa #$BarackObama #$UnitedStatesPresident)
or
(#$genls #$Mammal #$Animal)

  28. The same language gives rules for deducing new knowledge:
(#$implies
  (#$and
    (#$isa ?OBJ ?SUBSET)
    (#$genls ?SUBSET ?SUPERSET))
  (#$isa ?OBJ ?SUPERSET))
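
In plain terms: if an object is an instance of a collection, and that collection is a subset (genls) of a wider collection, then the object is an instance of the wider collection too. A hedged Python rendering of just this one rule; the collections above UnitedStatesPresident are invented for illustration and are not actual CYC content:

    isa = {("BarackObama", "UnitedStatesPresident")}
    genls = {("UnitedStatesPresident", "Person"),   # illustrative hierarchy,
             ("Person", "Agent")}                   # not real CycL constants

    # Close `isa' under the `genls' hierarchy, exactly as the rule dictates.
    changed = True
    while changed:
        changed = False
        for obj, sub in list(isa):
            for s, sup in genls:
                if s == sub and (obj, sup) not in isa:
                    isa.add((obj, sup))
                    changed = True

    print(sorted(isa))
    # BarackObama is now an instance of UnitedStatesPresident, Person, and Agent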

  29. What Cycorp says CYC knows about `intangible things’: Intangible things are things that are not physical, not made of, or encoded in, matter. These include events, like going to work, eating dinner, or shopping online. They also include ideas, like those expressed in a book or on a website: not the physical books themselves, but the ideas expressed in those books. It is useful for a software application to know that something is intangible, so that it can avoid commonsense errors; for example, asking a user the color of next Tuesday's meeting.

  30. Questions CYC couldn’t answer in 1994 (Prof. Vaughan Pratt):
What colour is the sky?
What shape is the Earth?
If it’s 20 km from Vancouver to Victoria, and 20 km from Victoria to Sydney, can Sydney be 400 km from Vancouver?
How old are you?

  31. Hofstadter: Fluid Analogies Human beings can understand similes, such as ``Mr Pickwick is like Christmas’’

  32. Example: Who is the Michelle Obama of Canada?

  33. Michaëlle Jean, Governor-General

  34. [Diagram: the analogy mapped via the relations `Head of government’ and `Spouse’]

  35. [Diagram: the alternative mapping, via the relations `Head of State’ and `Spouse’]

  36. One of Hofstadter’s approaches to solving these problems is `Copycat’, a collection of independent competing agents.
If efg becomes efw, what does ghi become?
If aabc becomes aabd, what does ijkk become?

  37. Inside Copycat: [diagram of the workspace: ijkk parsed as ij(kk) or (ijk)k, yielding the answers ij(ll) and (ijk)l; the mapping aabc:ijkk with rival answers aabd:jjkk and aabd:hjkk]

  38. If efg becomes efw, what does ghi become? COPYCAT suggests whi and ghw.
If aabc becomes aabd, what does ijkk become? COPYCAT suggests ijll, ijkl, jjkk, and hjkk.
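
To see why several answers are plausible, here is a deliberately naive sketch, nothing like Copycat's architecture of stochastic, competing agents: it spots the single letter that changed in the example pair and transfers that change to the target in a few literal-minded ways. The function and its strategies are my own invention, yet even this toy reproduces some of COPYCAT's suggestions:

    def analogize(src, dst, target):
        # Find the one position where src and dst differ.
        diffs = [i for i, (a, b) in enumerate(zip(src, dst)) if a != b]
        if len(diffs) != 1:
            return set()
        i, old, new = diffs[0], src[diffs[0]], dst[diffs[0]]
        answers = set()
        if i < len(target):
            # Strategy 1: substitute the new letter at the same position.
            answers.add(target[:i] + new + target[i+1:])
            # Strategy 2: if the example took the successor letter, do that.
            if ord(new) == ord(old) + 1:
                answers.add(target[:i] + chr(ord(target[i]) + 1) + target[i+1:])
        # Strategy 3: change the target's occurrence of the changed letter.
        j = target.find(old)
        if j != -1:
            answers.add(target[:j] + new + target[j+1:])
        return answers

    print(sorted(analogize("efg", "efw", "ghi")))     # ['ghw', 'whi']
    print(sorted(analogize("aabc", "aabd", "ijkk")))  # ['ijkd', 'ijkl']

Copycat's richer answers, like ijll, come from seeing kk as a group and changing the whole group; no fixed list of strategies captures that.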

  39. Hofstadter: ``What happens in the first 500 milliseconds?”

  40. Find the O

  41. X X X X X X X X X X X
X X X X X X X X X X X
X X X X X X X X X X X
X X X X X X X X O X X
X X X X X X X X X X X

  42. Find the X

  43. X X X X X X X X X X X
X X X X X X X X X X X
X X X X X X X X X X X
X X X X X X X X X X X
X X X X X X X X X X X

  44. Find the O

  45. X X O X X X O X X O X
X X X X X O X X O X X
X X X O X X O X O X X
X X O X X O X X O X X
O X X X X X O X X O X
