
Strategies for ontology negotiation: Finding the right level of generality



Presentation Transcript


  1. Strategies for ontology negotiation: Finding the right level of generality AC Workshop 9 May 2006 Utrecht University, the Netherlands Jurriaan van Diggelen, Edwin de Jong, Marco Wiering

  2. introduction: Layout of the presentation
  • Introduction
  • Approach
  • What is the right level of generality? Why is it important for ontology negotiation?
  • Experiments
  • Conclusion

  3. introduction: The right level of generality is found in 3 steps
  3-step dialogue:
  A: What's your PhD topic?
  B: I study agents.
  A: I know agents; what kind of agents?
  B: I'm working on electronic institutions.
  A: What do you study in electronic institutions?
  B: I study norm enforcement in electronic institutions.
  A: Ok, that's interesting, thanks!
  The concepts exchanged descend in generality: agents, electronic institutions, norm enforcement in electronic institutions.
  Alternatives shown on the slide: a 4-step dialogue that starts one level higher, at computer science; a 2-step dialogue; and a 1-step dialogue that goes straight to the very specific concept, norm enforcement by contextualizing abstract norms in ISLANDER.
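The stepwise narrowing above can be sketched as a walk down a generality hierarchy. A minimal sketch; the list and function names are illustrative, not from the paper:

```python
# Illustrative generality hierarchy, from most general to most
# specific, following the concept names in the example dialogue.
HIERARCHY = [
    "computer science",
    "agents",
    "electronic institutions",
    "norm enforcement in electronic institutions",
]

def dialogue_steps(start_level, target_level):
    """Concepts B reveals when the dialogue starts at start_level
    and ends at the intended target_level."""
    return HIERARCHY[start_level:target_level + 1]

# Starting at "agents" yields the 3-step dialogue from the slide;
# starting at "computer science" would yield the 4-step alternative.
steps = dialogue_steps(1, 3)
```

Starting lower in the hierarchy shortens the dialogue but risks skipping past the level the hearer needed.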

  4. introduction: Project: Ontology reconciliation in MAS
  Problem: in heterogeneous MASs, communication is hampered by the lack of shared ontologies. How do we solve the Babylonian confusion of tongues?
  Solution: ontology negotiation
  • Enable the agents to resolve ontology problems themselves.
  • Add an ontology alignment layer to the agent communication protocol.
  • The ontology reconciliation problem is solved in a decentralized way.

  5. approach: Ontology negotiation in ANEMONE
  • NCP: Normal Communication Protocol
  • CDP: Concept Definition Protocol
  • CEP: Concept Explication Protocol

  6. approach: ANEMONE in a nutshell
  • NCP: the sender sends a message; the receiver interprets it. If the message is understood and not overgeneralized, communication proceeds normally; otherwise the agents move to CDP.
  • CDP: the sender sends a definition; the receiver interprets it. If the definition is understood, the agents return to NCP; if it is inadequate, they move to CEP.
  • CEP: the sender sends an explication; the receiver interprets it.
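This control flow can be sketched minimally, with booleans standing in for the hearer's actual checks (the real protocol exchanges messages, definitions, and explications):

```python
def anemone_protocols(understood, overgeneralized, definition_ok):
    """Return the protocols visited in one exchange, following the
    NCP -> CDP -> CEP layering on the slide. The three booleans
    abstract away the hearer's interpretation steps."""
    visited = ["NCP"]                  # send message, interpret message
    if (not understood) or overgeneralized:
        visited.append("CDP")          # send definition, interpret it
        if not definition_ok:          # definition inadequate
            visited.append("CEP")      # send explication, interpret it
    return visited
```

An exchange that is understood at once never leaves NCP; only an inadequate definition escalates all the way to CEP.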

  7. approach: Message interpretation
  Ag-1's ontology: m1~a, m2~b, m4~c, m8~d, m9~e. Ag-2's ontology: m1~f, m2~g, m5~h, m11~i, m12~j. Learned alignments: equiv(a,f); subset(c,g); disj(c,h); subset(d,g); disj(d,h).
  • A message is not understood if Ag-2 has no mapping for the concept, e.g. when Ag-1 conveys e using e.
  • A message is overgeneralized if Ag-2 has an information gap.
  • Ag-1 conveys d using d: not understood by Ag-2.
  • Ag-1 conveys d using c: not considered overgeneralized by Ag-2.
  • Ag-1 conveys d using a: considered overgeneralized by Ag-2.
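The two failure cases can be sketched as follows. The dictionary mirrors the example mappings, but the explicit `information_gap` set (meanings the hearer can still refine) is an assumption of this sketch:

```python
def interpret(hearer_map, information_gap, sign):
    """Hearer-side check: 'not understood' if there is no mapping
    for the sign, 'overgeneralized' if the mapped meaning leaves
    the hearer an information gap, otherwise the meaning."""
    if sign not in hearer_map:
        return "not understood"
    meaning = hearer_map[sign]
    if meaning in information_gap:
        return "overgeneralized"
    return meaning

# Ag-2's vocabulary from the slide, plus the learned equiv(a, f).
ag2_map = {"f": "m1", "g": "m2", "h": "m5", "i": "m11", "j": "m12",
           "a": "m1"}
# Ag-2 distinguishes meanings below m1 (e.g. m2, m5), so a message
# at m1 leaves it an information gap.
ag2_gap = {"m1"}
```

With these tables, conveying d using d is not understood (no mapping), while conveying d using a is considered overgeneralized.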

  8. approach: Message sending
  Which sign do I choose?
  • One that is most likely to be understood by the other agent.
  • One that has most frequently been used by other agents.
  Which meaning do I choose? One that is at the right level of generality:
  • one that is not considered overgeneralized by the hearer, and
  • the most general meaning that satisfies the first condition.
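The sender's two choices can be sketched like this; the inputs are assumed pre-sorted, which is a simplification of whatever bookkeeping the agents actually do:

```python
def choose_sign(signs_by_usage):
    """Pick the sign most frequently used by other agents
    (list assumed pre-sorted, most frequent first)."""
    return signs_by_usage[0]

def choose_meaning(meanings_by_generality, overgeneralized):
    """Pick the most general meaning that the hearer does not
    consider overgeneralized (list assumed sorted from most to
    least general)."""
    for m in meanings_by_generality:
        if m not in overgeneralized:
            return m
    return meanings_by_generality[-1]  # fall back to the most specific
```

For instance, if m1 is considered overgeneralized by the hearer, the sender skips it and settles on the next most general meaning.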

  9. experiments: Model: random ontologies
  • The shared meaning space is a hierarchy of meanings m1 ... m16.
  • Individual meaning spaces are selected from it according to (1,1,2,0).
  • Ontologies are labelled meaning spaces, e.g. one agent holds m1~a, m2~b, m4~c, m8~d, m9~e; another holds m1~f, m2~g, m5~h, m11~i, m12~j; a third holds m1~k, m3~i, m6~l, m13~m, m14~n.
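One plausible way to instantiate this model in code; the (1,1,2,0) selection scheme is simplified here to uniform sampling, which is an assumption of this sketch:

```python
import random
import string

def random_ontology(meaning_space, n_concepts, rng):
    """Sample an individual meaning space from the shared one and
    attach fresh labels, giving a labelled meaning space as on the
    slide (e.g. {'a': 'm1', 'b': 'm2', ...})."""
    meanings = rng.sample(meaning_space, n_concepts)
    labels = rng.sample(string.ascii_lowercase, n_concepts)
    return dict(zip(labels, meanings))

rng = random.Random(0)
shared = [f"m{i}" for i in range(1, 17)]
onto = random_ontology(shared, 5, rng)
```

Each agent gets its own labels, so two agents can map different signs to the same meaning, which is exactly what ontology negotiation must bridge.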

  10. experiments: Experiments
  We used a group of 15 agents with random ontologies consisting of 46 concepts. An experiment consists of t steps; in each step, a random speaker-hearer pair is selected.

  11. experiments: Integration measures
  • UR (Understanding Rate): ratio of conversations that proceed without visiting CDP and CEP.
  • PUR: UR w.r.t. a specific Pair of agents.
  • MPUR: UR w.r.t. a Meaning and a Pair.
  Expected values: what is the expected (M)PUR after a concept has been taught? Agents calculate expected values using:
  • prior knowledge of the probability that a concept is overgeneralized, or
  • acquired knowledge of these probabilities.
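UR can be computed directly once each conversation records which protocols it visited (a representation chosen for this sketch):

```python
def understanding_rate(conversations):
    """Ratio of conversations that proceeded without visiting CDP
    or CEP, i.e. that stayed in normal communication (NCP).
    Each conversation is a list of protocol names."""
    smooth = sum(1 for c in conversations
                 if not ({"CDP", "CEP"} & set(c)))
    return smooth / len(conversations)

# PUR and MPUR would apply the same ratio restricted to the
# conversations of one agent pair, or of one meaning and pair.
ur = understanding_rate([["NCP"], ["NCP", "CDP"], ["NCP"]])
```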

  12. experiments: Short term strategy (STS)
  Using STS, the speaker's only goal is to teach a concept that gets across the meaning it currently intends to convey: the speaker maximizes expected MPUR. With STS, the agents exchange specific concepts to ensure that messages are not overgeneralized.
  For example: Ag-1 (m1~a, m2~b, m4~c, m8~d, m9~e) intends to convey d, and speaks d.

  13. experiments: Results STS
  • UR increases slowly (-)
  • Initial number of CEP visits is low (+)
  • Average dialogue length is low (+)

  14. experiments: Long term strategy (LTS)
  Using LTS, the speaker's goal is to teach a concept that is widely applicable: the speaker maximizes expected PUR. With LTS, the agents exchange general concepts that can be used to convey a wide range of meanings.
  For example: Ag-1 (m1~a, m2~b, m4~c, m8~d, m9~e) intends to convey d, and speaks a.

  15. experiments: Results LTS
  • UR increases fast (+)
  • Initial number of CEP visits is high (-)
  • Average dialogue length is high (-)

  16. experiments: Medium term strategy (MTS)
  Using MTS, the speaker's goal is to provide the hearer with a widely applicable concept that still gets the current meaning across: the speaker maximizes the sum of expected MPUR and PUR. With MTS, the agents exchange general concepts that are specific enough not to be overgeneralized.
  For example: Ag-1 (m1~a, m2~b, m4~c, m8~d, m9~e) intends to convey d, and speaks c.
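The three strategies differ only in the score the speaker maximizes. A hedged sketch, with illustrative expected-value numbers rather than measured data:

```python
def choose_concept(candidates, strategy):
    """Pick the concept to teach under one of the three strategies:
    STS maximizes expected MPUR, LTS expected PUR, and MTS the sum
    of the two. Each candidate carries its expected values."""
    score = {
        "STS": lambda c: c["mpur"],
        "LTS": lambda c: c["pur"],
        "MTS": lambda c: c["mpur"] + c["pur"],
    }[strategy]
    return max(candidates, key=score)

# Illustrative candidates mirroring the running example: d is
# specific, a is general, c sits in between.
candidates = [
    {"name": "d", "mpur": 0.9, "pur": 0.2},
    {"name": "a", "mpur": 0.3, "pur": 0.9},
    {"name": "c", "mpur": 0.7, "pur": 0.7},
]
```

With these numbers, STS teaches d, LTS teaches a, and MTS teaches c, matching the examples on the three strategy slides.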

  17. experiments: Results MTS
  • UR increases fast (+)
  • Initial number of CEP visits is low (+)
  • Average dialogue length is low (+)
  MTS achieves the best of both worlds.

  18. experiments: Comparison
  Using LTS, fewer concepts have to be taught, as many conversations remain abstract.

  19. experiments: Learning the ontology model
  In the next experiments, we make the agents acquire knowledge about the ontology model.
  • Let N1 be the number of agents that regarded the meaning as overgeneralized.
  • Let N2 be the number of agents that did not regard the meaning as overgeneralized.
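From the counts N1 and N2, a relative-frequency estimate is one natural way to learn the overgeneralization probability. The slide defines the counts but not the estimator, so the formula below is an assumption:

```python
def p_overgeneralized(n1, n2):
    """Estimated probability that a meaning is considered
    overgeneralized: n1 agents regarded it as overgeneralized,
    n2 did not. Falls back to 0.5 when there is no evidence."""
    total = n1 + n2
    return n1 / total if total else 0.5
```

As agents accumulate observations, this acquired estimate can replace the prior knowledge used in the earlier expected-value calculations.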

  20. experiments: Results with acquired knowledge
  The differences between the strategies remain significant.

  21. conclusion: Conclusion
  • Because ontology negotiation works pairwise, attention must be paid to the overall goal of establishing a globally, semantically integrated system.
  • Strategies for finding the right level of generality are important for fast integration.
  • Future research: incorporate tasks. What is considered overgeneralized depends on the tasks the agents are involved in.
