
Knowledge Representation


Presentation Transcript


  1. Knowledge Representation

  2. Outline • General ontology • Categories and objects • Events and processes • Reasoning systems • Internet shopping world • Summary

  3. Ontologies • An ontology is a "vocabulary" and a "theory" of a certain "part of reality" • Special-purpose ontologies apply to restricted domains (e.g. electronic circuits) • General-purpose ontologies have wider applicability across domains, i.e. • Must include concepts that cover many subdomains • Cannot use special "short-cuts" (such as ignoring time) • Must allow unification of different types of knowledge • GP ontologies are useful in widening the applicability of reasoning systems, e.g. by including time

  4. Ontological engineering • Representing a general-purpose ontology is a difficult task called ontological engineering • Existing GP ontologies have been created in different ways: • By a team of trained ontologists • By importing concepts from database(s) • By extracting information from text documents • By inviting anybody to enter commonsense knowledge • Ontological engineering has only been partially successful, and few large AI systems are based on GP ontologies (they use special-purpose ontologies instead)

  5. Elements of a general ontology • Categories of objects • Measures of quantities • Composite objects • Time, space, and change • Events and processes • Physical objects • Substances • Mental objects and beliefs

  6. Top-level ontology of the world (tree, with the most general category, Anything, at the top) • Anything: AbstractObjects, Events • AbstractObjects: Sets, Numbers, RepresentationObjects • RepresentationObjects: Categories, Sentences, Measurements • Measurements: Times, Weights • Events: Intervals, Places, PhysicalObjects, Processes • Intervals: Moments • PhysicalObjects: Things, Stuff • Things: Animals, Agents (with Humans below both) • Stuff: Solids, Liquid, Gas

  7. Upper Ontology • The general framework of concepts is called an upper ontology because of the convention of drawing graphs with the general concepts at the top and the more specific concepts below them • Of what use is an upper ontology? • Consider the ontology for circuits that we studied earlier • It makes many simplifying assumptions: time is omitted completely; signals are fixed and do not propagate; the structure of the circuit remains constant • A more general ontology would consider signals at particular times, and would include the wire lengths and propagation delays • This would allow us to simulate the timing properties of the circuit, and indeed such simulations are often carried out by circuit designers.

  8. Categories and objects • Categories are used to classify objects according to common properties or definitions: ∀x x ∈ Tomatoes ⇒ Red(x) ∧ Round(x) • Categories can be represented by • Predicates: Tomato(x) • Objects: the constant Tomatoes represents the set of tomatoes (reification) • Roles of category representations: • Instance relations (is-a): Tomato1 ∈ Tomatoes • Taxonomical hierarchies (Subset): Tomatoes ⊂ Fruit • Inheritance of properties • (Exhaustive) decompositions • A small sketch of these ideas follows below
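To make the reification idea concrete, here is a minimal Python sketch (not from the slides): categories are plain sets of member names, a separate subcategory relation gives the taxonomic hierarchy, and properties attached to a category are inherited by its subcategories. The names (Tomatoes, Fruit, Tomato1, edible, color) follow the slide's example or are illustrative.

```python
# Sketch: categories as sets, plus a subcategory relation and property inheritance.
members = {"Tomatoes": {"Tomato1", "Tomato2"}, "Fruit": set()}
subcategory = {("Tomatoes", "Fruit")}                 # Tomatoes ⊂ Fruit
properties = {"Fruit": {"edible": True}, "Tomatoes": {"color": "red"}}

def is_a(obj, category):
    """Instance relation: obj is a member of category or of one of its subcategories."""
    if obj in members.get(category, set()):
        return True
    return any(is_a(obj, sub) for (sub, sup) in subcategory if sup == category)

def lookup(category, prop):
    """Property inheritance: use the value asserted on the category itself,
    otherwise look it up on a supercategory."""
    if prop in properties.get(category, {}):
        return properties[category][prop]
    for (sub, sup) in subcategory:
        if sub == category:
            value = lookup(sup, prop)
            if value is not None:
                return value
    return None

print(is_a("Tomato1", "Fruit"))      # True, via Tomatoes ⊂ Fruit
print(lookup("Tomatoes", "edible"))  # True, inherited from Fruit
```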

  9. Categories using FOL

  10. Properties of categories • We say that two or more categories are disjoint if they have no members in common. • An exhaustive decomposition of a category is a set of subcategories whose union contains every member of the category. • A disjoint exhaustive decomposition is known as a partition. • The predicates used to define these concepts, together with an example, are sketched after this slide. • For example, a bachelor is an unmarried adult male: Bachelor(x) ⇔ Unmarried(x) ∧ Adult(x) ∧ Male(x)
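A minimal Python sketch of these three properties over finite sets; the category contents below are made up for illustration.

```python
# Disjointness, exhaustive decomposition, and partition over Python sets.
def disjoint(categories):
    """No two distinct categories share a member."""
    cats = list(categories)
    return all(not (cats[i] & cats[j])
               for i in range(len(cats)) for j in range(i + 1, len(cats)))

def exhaustive_decomposition(categories, whole):
    """Together the categories cover exactly the members of the whole."""
    return set().union(*categories) == set(whole)

def partition(categories, whole):
    """A partition is a disjoint exhaustive decomposition."""
    return disjoint(categories) and exhaustive_decomposition(categories, whole)

animals = {"cat", "cow", "hen"}
males, females = {"cat"}, {"cow", "hen"}
print(partition([males, females], animals))   # True
```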

  11. Objects and substance • Need to distinguish between substances and discrete objects • Substance ("stuff"): • Mass nouns - not countable • Intrinsic properties • Part of a substance is (still) the same substance • Discrete objects ("things"): • Count nouns - countable • Extrinsic properties • Parts are (generally) not of the same category

  12. Composite objects • A composite object is an object that has other objects as parts • The PartOf relation defines object containment, and is transitive and reflexive: PartOf(x,y) ∧ PartOf(y,z) ⇒ PartOf(x,z), and PartOf(x,x) • Objects can be grouped in PartOf hierarchies, similar to Subset hierarchies • The structure of the composite object describes how the parts are related

  13. Composite objects • For example, a biped has two legs attached to a body. • We might also want to say "The apples in this bag weigh two pounds." The weight belongs to the collection of apples rather than to any single apple, so we need a new concept, which we will call a bunch. • For example, if the apples are Apple1, Apple2, and Apple3, then BunchOf({Apple1, Apple2, Apple3}) denotes the composite object whose parts are exactly the three apples: ∀x x ∈ s ⇒ PartOf(x, BunchOf(s)) and ∀y [∀x x ∈ s ⇒ PartOf(x, y)] ⇒ PartOf(BunchOf(s), y) • BunchOf is thus defined by logical minimization, which means defining an object as the smallest one satisfying certain conditions. • A concrete sketch of PartOf and BunchOf follows below.
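A minimal Python sketch of the two ideas on this slide, using made-up data: the part hierarchy (Biped1, Leg1, Body1) and the individual apple weights are illustrative assumptions.

```python
# PartOf as a reflexive, transitive relation over a small part hierarchy,
# and BunchOf as the composite whose weight is the sum of its parts' weights.
direct_part_of = {("Leg1", "Biped1"), ("Leg2", "Biped1"), ("Body1", "Biped1")}

def part_of(x, y):
    """PartOf(x, y): reflexive, and follows direct-part links transitively."""
    if x == y:
        return True
    return any(part_of(whole, y) for (part, whole) in direct_part_of if part == x)

weight_lb = {"Apple1": 0.7, "Apple2": 0.6, "Apple3": 0.7}   # made-up weights

def bunch_weight(apples):
    """The bunch has the apples as parts; its weight is the sum of theirs."""
    return sum(weight_lb[a] for a in apples)

print(part_of("Leg1", "Biped1"))                        # True
print(bunch_weight({"Apple1", "Apple2", "Apple3"}))     # 2.0 pounds
```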

  14. Measurements • Need to be able to represent properties like height, mass, cost, etc. Values for such properties are measures • Unit functions represent and convert measures: Length(L1) = Inches(1.5) = Centimeters(3.81), and ∀l Centimeters(2.54 × l) = Inches(l) • Measures can be used to describe objects: Mass(Tomato1) = Kilograms(0.16), ∀d d ∈ Days ⇒ Duration(d) = Hours(24) • Non-numerical measures can also be represented, but normally there is an order (e.g. >). Used in qualitative physics
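A small Python sketch of unit functions: each function expresses a measure in one base unit, so that Inches(1.5) and Centimeters(3.81) denote the same length. This is a sketch only; the unit names follow the slide.

```python
import math

# Unit functions: each returns the measure expressed in a base unit (cm, kg).
def centimeters(x):
    return x                  # base unit for length

def inches(x):
    return 2.54 * x           # Centimeters(2.54 * l) = Inches(l)

def kilograms(x):
    return x                  # base unit for mass

length_L1 = inches(1.5)                              # Length(L1) = Inches(1.5)
print(math.isclose(length_L1, centimeters(3.81)))    # True: same measure
mass_tomato1 = kilograms(0.16)                       # Mass(Tomato1) = Kilograms(0.16)
```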

  15. Measurements • Comparative difficulty: e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Wrote(Norvig, e1) ∧ Wrote(Russell, e2) ⇒ Difficulty(e1) > Difficulty(e2) • e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Difficulty(e1) > Difficulty(e2) ⇒ ExpectedScore(e1) < ExpectedScore(e2)

  16. Objects: Things and stuff • The real world can be seen as consisting of primitive objects (e.g., atomic particles) and composite objects built from them. • There is, however, a significant portion of reality that seems to defy any obvious individuation, i.e. division into distinct objects. We give this portion the generic name stuff. • Language distinguishes count nouns, such as aardvarks, holes, and theorems, from mass nouns, such as butter, water, and energy. • To represent stuff properly, we begin with the obvious. We need to have as objects in our ontology at least the gross "lumps" of stuff we interact with: b ∈ Butter ∧ PartOf(p, b) ⇒ p ∈ Butter, and b ∈ Butter ⇒ MeltingPoint(b, Centigrade(30)) • Intrinsic properties and extrinsic properties

  17. Event calculus • Event calculus: how to deal with change, based on representing points of time • Reifies fluents and events • A fluent: At(Bilal, Berkeley) • The fluent is true at time t: T(At(Bilal, IQRA), t) • Events are instances of event categories: E1 ∈ Flyings ∧ Flyer(E1, Bilal) ∧ Origin(E1, SF) ∧ Destination(E1, KHI) • Event E1 took place over interval i: Happens(E1, i) • Time intervals represented by (start, end) pairs: i = (t1, t2)

  18. Event calculus predicates • T(f, t): fluent f is true at time t • Happens(e, i): event e happens over interval i • Initiates(e, f, t): event e causes fluent f to start at t • Terminates(e, f, t): event e causes f to cease at t • Clipped(f, i): fluent f ceases to be true in interval i • Restored(f, i): fluent f becomes true in interval i

  19. Events • We assume a distinguished event, Start , that describes the initial state by saying which fluents are initiated or terminated at the start time. • We define T by saying that a fluent holds at a point in time if the fluent was initiated by an event at some time in the past and was not made false (clipped) by an intervening event. • A fluent does not hold if it was terminated by an event and not made true (restored) by another event. • Happens(e, (t1, t2)) ∧ Initiates(e, f, t1) ∧ ¬Clipped(f, (t1, t)) ∧ t1 < t ⇒ T(f, t) • Happens(e, (t1, t2)) ∧ Terminates(e, f, t1)∧ ¬Restored (f, (t1, t)) ∧ t1 < t ⇒ ¬T(f, t) where Clipped and Restored are defined by • Clipped(f, (t1, t2)) ⇔ ∃ e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Terminates(e, f, t) • Restored (f, (t1, t2)) ⇔ ∃ e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Initiates(e, f, t)
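A minimal executable sketch of these axioms over discrete time points, using made-up event records; the events and fluents below are illustrative, loosely following the Bilal flight example from the earlier slide.

```python
# Each event is (name, start, end); initiates/terminates map events to fluents.
events = [("Start", 0, 0), ("Fly1", 1, 3)]
initiates = {("Start", "At(Bilal,Berkeley)"), ("Fly1", "At(Bilal,KHI)")}
terminates = {("Fly1", "At(Bilal,Berkeley)")}

def clipped(f, t1, t2):
    """Clipped(f, (t1, t2)): some event that terminates f starts within [t1, t2)."""
    return any((e, f) in terminates and t1 <= s < t2 for (e, s, _) in events)

def holds(f, t):
    """T(f, t): f was initiated by an earlier event and not clipped since then."""
    return any((e, f) in initiates and s < t and not clipped(f, s, t)
               for (e, s, _) in events)

print(holds("At(Bilal,Berkeley)", 2))   # False: clipped by Fly1, which starts at 1
print(holds("At(Bilal,KHI)", 4))        # True: initiated by Fly1, never terminated
```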

  20. Processes • The events we have seen so far are what we call discrete events: they have a definite structure, and a sub-interval of one is generally not an event of the same kind • Categories of events with the property that any sub-interval of a member is also a member (e.g. any sub-interval of an interval of raining is still raining) are called process categories or liquid event categories

  21. Time intervals • Time intervals are partitioned into moments (zero duration) and extended intervals: Partition({Moments, ExtendedIntervals}, Intervals) • ∀i i ∈ Intervals ⇒ (i ∈ Moments ⇔ Duration(i) = 0) • Functions Start and End delimit intervals: ∀i Interval(i) ⇒ Duration(i) = (Time(End(i)) − Time(Start(i))) • May use e.g. January 1, 1900 as an arbitrary time 0: Time(Start(AD1900)) = Seconds(0)

  22. Relations between time intervals • The slide illustrates Meet(i,j), Before(i,j) / After(j,i), During(i,j), and Overlap(i,j) / Overlap(j,i) with paired timeline diagrams • The relations can be expressed logically, e.g. ∀i,j Meet(i,j) ⇔ Time(End(i)) = Time(Start(j))
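A small Python sketch of these relations on (start, end) pairs; the strict/non-strict boundary choices below are one reasonable reading, not necessarily the slide's exact definitions.

```python
# Interval relations on (start, end) pairs.
def meet(i, j):      # i ends exactly where j starts
    return i[1] == j[0]

def before(i, j):    # i ends before j starts; After(j, i) is the same relation
    return i[1] < j[0]

def during(i, j):    # i lies strictly inside j
    return j[0] < i[0] and i[1] < j[1]

def overlap(i, j):   # i starts first, j starts before i ends, i ends before j ends
    return i[0] < j[0] < i[1] < j[1]

breakfast, lunch = (8, 9), (12, 13)
print(before(breakfast, lunch))   # True
print(meet((9, 12), lunch))       # True
```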

  23. Mental events and mental objects • Need to represent beliefs in self and other agents, e.g. for controlling reasoning, or for planning actions that involve others • How are beliefs represented? • Beliefs are reified as mental objects • Mental objects are represented as strings in a language • Inference rules for this language can be defined • Rules for reasoning about how logical agents use their beliefs: ∀a,p,q LogicalAgent(a) ∧ Believes(a, p) ∧ Believes(a, "p ⇒ q") ⇒ Believes(a, q) • ∀a,p LogicalAgent(a) ∧ Believes(a, p) ⇒ Believes(a, "Believes(Name(a), p)") • A small sketch of this string-based treatment follows below
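As a minimal illustration of beliefs reified as strings, the Python sketch below stores an agent's beliefs as a set of strings and applies the first rule above (modus ponens over the belief strings). The agent name and the belief sentences are made up.

```python
# Beliefs reified as strings; one inference rule over those strings.
beliefs = {"Bilal": {"Rainy", "Rainy => WetStreets"}}

def apply_modus_ponens(agent):
    """Believes(a, "p => q") and Believes(a, p) together yield Believes(a, q)."""
    current = beliefs[agent]
    derived = set()
    for sentence in current:
        if " => " in sentence:
            p, q = sentence.split(" => ", 1)
            if p in current:
                derived.add(q)
    beliefs[agent] |= derived

apply_modus_ponens("Bilal")
print("WetStreets" in beliefs["Bilal"])   # True
```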

  24. Mental events • Propositional attitudes that an agent can have toward mental objects: attitudes such as Believes, Knows, Wants, Intends, and Informs • For example, suppose we try to assert that Lois knows that Superman can fly: Knows(Lois, CanFly(Superman)) • If it is true that Superman is Clark Kent, then ordinary first-order logic forces us to conclude that Lois knows that Clark can fly: (Superman = Clark) ∧ Knows(Lois, CanFly(Superman)) |= Knows(Lois, CanFly(Clark)) • This property is called referential transparency; for propositional attitudes we want the opposite, referential opacity, since Lois may not know that Clark is Superman

  25. Modal Logic • Modal logic is designed to address this problem. • Regular logic is concerned with a single modality, the modality of truth, allowing us to express “P is true.” • Modal logic includes special modal operators that take sentences (rather than terms) as arguments. • For example, “A knows P” is represented with the notation KAP, where K is the modal operator for knowledge. It takes two arguments, an agent (written as the subscript) and a sentence.

  26. Semantic networks • Graph representation of categories, objects, relations, etc. (i.e. essentially FOL) • Natural representation of inheritance and default values: ∀x x ∈ Persons ⇒ [∀y HasMother(x, y) ⇒ y ∈ FemalePersons] • ∀x x ∈ Persons ⇒ Legs(x, 2) • A sketch of default inheritance follows below
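A minimal Python sketch of semantic-network style inheritance with default values; the node names and the one-legged exception are illustrative, not from the slide.

```python
# Nodes with an is-a link and local default values; lookups walk up the links.
network = {
    "Persons":          {"isa": None,               "defaults": {"legs": 2}},
    "OneLeggedPersons": {"isa": "Persons",          "defaults": {"legs": 1}},
    "John":             {"isa": "Persons",          "defaults": {}},
    "Pete":             {"isa": "OneLeggedPersons", "defaults": {}},
}

def value(node, attr):
    """The most specific default along the is-a chain wins."""
    while node is not None:
        entry = network[node]
        if attr in entry["defaults"]:
            return entry["defaults"][attr]
        node = entry["isa"]
    return None

print(value("John", "legs"))   # 2, the inherited default from Persons
print(value("Pete", "legs"))   # 1, overridden by OneLeggedPersons
```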

  27. Semantic Network • (Figure: an example network with nodes Human Being, Boy, Woman, Joe, Kay, School, and Food, connected by "is a", "needs", "goes to", and "has a child" links.)

  28. Other reasoning systems for categories • Description logics • Derived from semantic networks, but more formal • Support subsumption, classification and consistency • Circumscription and default logic • Formalize reasoning about default values • Assume the default in the absence of other input; must be able to retract the assumption if new evidence occurs • Truth maintenance systems • Support belief revision in systems where retracting beliefs is permitted

  29. Internet shopping world • An agent that understands and acts in an internet shopping environment • The task is to shop for a product on the Web, given the user's product description • The product description may be precise, in which case the agent should find the best price • In other cases the description is only partial, and the agent has to compare products • The shopping agent depends on having product knowledge, incl. category hierarchies

  30. PEAS specification of shopping agent • Performance goal • Recommend product(s) to match the user's description • Environment • All of the Web • Actions • Following links • Retrieve page contents • Sensors • Web pages: HTML, XML

  31. Outline of agent behavior • Start at home page of known web store(s) • Must have knowledge of relevant web addresses, such as www.amazon.com etc. • Spread out from the home page, following links to relevant pages containing product offers • Must be able to identify page relevance, using product category ontologies, as well as parse page contents to detect product offers • Having located one or more product offers, the agent must compare and recommend a product • Comparisons range from simple price ranking to complex tradeoffs in several dimensions

  32. Following links • The agent will have knowledge of a number of stores, for example: Amazon ∈ OnlineStores ∧ Homepage(Amazon, "amazon.com"), Ebay ∈ OnlineStores ∧ Homepage(Ebay, "ebay.com"), ExampleStore ∈ OnlineStores ∧ Homepage(ExampleStore, "example.com") • A page is relevant to the query if it can be reached by a chain of zero or more relevant category links from a store's home page, and then from one more link to the product offer: Relevant(page, query) ⇔ ∃ store, home store ∈ OnlineStores ∧ Homepage(store, home) ∧ ∃ url, url2 RelevantChain(home, url2, query) ∧ Link(url2, url) ∧ page = Contents(url) • RelevantChain(start, end, query) ⇔ (start = end) ∨ (∃ u, text LinkText(start, u, text) ∧ RelevantCategoryName(query, text) ∧ RelevantChain(u, end, query))
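A tiny executable sketch of the RelevantChain idea over a made-up link graph; the pages, link texts, and the crude keyword-based category matching below are all illustrative assumptions, not the slide's ontology lookup.

```python
# url -> list of (link_text, target_url); a made-up three-page store.
links = {
    "example.com":           [("computers", "example.com/computers")],
    "example.com/computers": [("laptops", "example.com/computers/laptops")],
    "example.com/computers/laptops": [("ThinkBook970 offer", "example.com/offer34356")],
}

def relevant_category_name(query, text):
    """Crude stand-in for the ontology-based RelevantCategoryName predicate."""
    return any(word in text.lower() for word in query.lower().split())

def relevant_chain_ends(start, query):
    """All urls reachable from start via zero or more relevant category links."""
    yield start
    for text, url in links.get(start, []):
        if relevant_category_name(query, text):
            yield from relevant_chain_ends(url, query)

def relevant_urls(home, query):
    """One more link from the end of a relevant chain reaches a candidate offer page."""
    for end in relevant_chain_ends(home, query):
        for _text, url in links.get(end, []):
            yield url

print(sorted(set(relevant_urls("example.com", "laptop computer"))))
```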

  33. Following Links

  34. Comparing offers ∃ c, offer c ∈ LaptopComputers ∧ offer ∈ ProductOffers ∧ Manufacturer(c, IBM) ∧ Model(c, ThinkBook970) ∧ ScreenSize(c, Inches(14)) ∧ ScreenType(c, ColorLCD) ∧ MemorySize(c, Gigabytes(2)) ∧ CPUSpeed(c, GHz(1.2)) ∧ OfferedProduct(offer, c) ∧ Store(offer, GenStore) ∧ URL(offer, "example.com/computers/34356.html") ∧ Price(offer, $(399)) ∧ Date(offer, Today)
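A small Python sketch of the comparison step: offers are stored as attribute dictionaries and the agent recommends the cheapest offer matching the user's (possibly partial) description. The first offer mirrors the slide; the second offer and its price are made up for illustration.

```python
offers = [
    {"model": "ThinkBook970", "memory_gb": 2, "cpu_ghz": 1.2,
     "store": "GenStore", "url": "example.com/computers/34356.html", "price": 399},
    {"model": "ThinkBook970", "memory_gb": 2, "cpu_ghz": 1.2,
     "store": "ExampleStore", "url": "example.com/34356b.html", "price": 449},
]

def matches(offer, description):
    """An offer matches if it satisfies every attribute in the partial description."""
    return all(offer.get(k) == v for k, v in description.items())

def recommend(offers, description):
    """Recommend the cheapest matching offer, or None if nothing matches."""
    candidates = [o for o in offers if matches(o, description)]
    return min(candidates, key=lambda o: o["price"], default=None)

best = recommend(offers, {"model": "ThinkBook970", "memory_gb": 2})
print(best["store"], best["price"])   # GenStore 399
```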

  35. Summary • An ontology is an encoding of vocabulary and relationships. Special-purpose ontologies can be effective within limited domains • A general-purpose ontology needs to cover a wide variety of knowledge, and is based on categories and an event calculus • It covers structured objects, time and space, change, processes, substances, and beliefs • The general ontology can support agent reasoning in a wide variety of domains, including the Internet shopping world
