CROC, a Representational Ontology for Concepts, aims to enhance the Semantic Web by making web content comprehensible for intelligent agents. It addresses the interoperability challenge through a shared identification mechanism and efficient communication, enabling agents to understand and represent concepts effectively. The ontology facilitates the identification, classification, and communication of concepts while accommodating different world-views and individual agent interests. By leveraging lexical representations and allowing for private conceptions, CROC sets the stage for advanced knowledge representation and agent collaboration.
Contents • Introduction • Semantic Web • Conceptuology • Language • CROC — a Representational Ontology for Concepts
Semantic Web • Making Web content understandable for intelligent agents • RDF/RDFS/OWL ontologies (the current state of the art) that define classes • The interoperability problem: how to merge different world-views?
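For concreteness, a minimal sketch of such a class definition using the rdflib Python library; the ex: namespace and the Swan/Bird classes are invented for this example:

```python
# Defining an OWL class hierarchy in RDF, as current Semantic Web
# ontologies do; all names in the ex: namespace are hypothetical.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/zoo#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Swan, RDF.type, OWL.Class))       # ex:Swan is an owl:Class
g.add((EX.Swan, RDFS.subClassOf, EX.Bird))  # every swan is a bird

print(g.serialize(format="turtle"))
```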
Classification • Different classifications: “world-views” • Classification needs identification
Communication (I) • Communication: • (1) the sender expresses using symbols • (2) the receiver reads what is expressed • An interoperability problem arises when an agent doesn't know the symbols
CYC: a shared classification? • Cyc (cyc.com): developing one big classification • One world-view • Complete "not soon", or never • Agents have their own interests and pick up other ideas ("autonomy") • Conceptions may differ from agent to agent
Mapping world-views? • Should we map classifications to solve the interoperability problem? • Rather: think about the identification mechanism (for a Semantic Web!).
Communication (II) • Communication: • (1) the sender represents • (2) the receiver identifies and classifies • A problem arises when the receiving agent cannot identify the representation
Identification: conceptuology • A concept = • a (fuzzy / partial) definition? • prototyping? • an ability to reidentify for a purpose [1: Millikan, On Clear and Confused Ideas: An Essay on Substance Concepts] • Most concepts are not classes
Concept for dogs [image: a puppy, http://en.wikipedia.org/wiki/Image:Pupppppy.jpg]
Common sense • Computers usually don't have much common sense: they cannot hear, see, taste, touch, etc. • Do they need it for having concepts?
Language • Same concepts, different conceptions • Having concepts entirely through language: "It is common [to] have a substance concept entirely through the medium of language. It is possible to have it, that is, while lacking any ability to recognize the substance in the flesh." [1, Ch. 6]
CROC — a Representational Ontology for Concepts (I) • Lexical representations for concepts • Concepts have names (so they can be shared through language) • Where the name fails, CROC uses induction or deduction over the knowledge related to the concept • Representation using other concepts • Descriptions instead of definitions
Examples (I)
A: "Swans are white."
OWL B: (OK, I'll take that into the class definition.)
CROC B: (OK, nice to know.)
A: "There is a black swan."
CROC B: (OK, nice to know.)
OWL B: (Error in [1], or unalignable classes for "swan".)
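A minimal Python sketch of this contrast, with hypothetical agent classes that are not CROC's actual implementation: the OWL-style agent turns statements into class constraints and breaks on the counterexample, while the CROC-style agent merely stores descriptions.

```python
# Hypothetical sketch: definitions vs descriptions for "swan".
class DefinitionalAgent:
    """OWL-style: statements become class constraints."""
    def __init__(self):
        self.constraints = {}                  # property -> required value

    def tell(self, prop, value):
        if self.constraints.get(prop, value) != value:
            raise ValueError(f"unalignable classes: {prop} is "
                             f"{self.constraints[prop]}, not {value}")
        self.constraints[prop] = value

class DescriptiveAgent:
    """CROC-style: statements are stored as descriptions."""
    def __init__(self):
        self.statements = []                   # (property, value) pairs

    def tell(self, prop, value):
        self.statements.append((prop, value))  # "OK, nice to know."

owl_b, croc_b = DefinitionalAgent(), DescriptiveAgent()
owl_b.tell("colour", "white")                  # "Swans are white."
croc_b.tell("colour", "white")
croc_b.tell("colour", "black")                 # "There is a black swan." -- fine
try:
    owl_b.tell("colour", "black")
except ValueError as err:
    print("OWL B:", err)                       # contradiction with the definition
```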
CROC — a Representational Ontology for Concepts (II) • Concepts for every unit of representation • Subjects, subdivided into Kinds (like 'a dog'), Individuals (like 'Oscar'), and Stuffs (like 'gold') • Substances • Properties (like 'colour') • Happenings (events, situations) • Predicates (like 'poor', 'eager') • Relations (like 'of', 'in', 'at')
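These units of representation could be rendered as a small type hierarchy; the following Python sketch is a hypothetical illustration, not CROC's actual code:

```python
# Hypothetical rendering of CROC's units of representation.
class Concept: ...
class Subject(Concept): ...        # substance concepts
class Kind(Subject): ...           # like 'a dog'
class Individual(Subject): ...     # like 'Oscar'
class Stuff(Subject): ...          # like 'gold'
class Property(Concept): ...       # like 'colour'
class Happening(Concept): ...      # events, situations
class Predicate(Concept): ...      # like 'poor', 'eager'
class Relation(Concept): ...       # like 'of', 'in', 'at'
```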
CROC — a Representational Ontology for Concepts (III) • Abilities to gather, store and query representational information for reidentification • Storage of statements (happenings) about concepts • Subject templates to gather information • Semantic tableaux for reasoning about statements
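As a hedged illustration of statement storage and subject templates (the store layout and the 'human' template below are invented for this example):

```python
# Hypothetical sketch: a statement store plus a subject template
# that tells the agent which questions still gather information.
from collections import defaultdict

statements = defaultdict(list)      # concept name -> [(property, value)]

def tell(concept, prop, value):
    statements[concept].append((prop, value))

# Subject template for the Kind 'human': properties considered
# relevant for (re)identifying an instance of that kind.
HUMAN_TEMPLATE = ("kind", "born in", "lived in")

def open_questions(concept, template):
    known = {prop for prop, _ in statements[concept]}
    return [prop for prop in template if prop not in known]

tell("Cicero", "kind", "human")
tell("Cicero", "born in", "Arpinum")
print(open_questions("Cicero", HUMAN_TEMPLATE))   # ['lived in']
```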
Examples (II)
A: "I like Cicero's De Oratore."
B: (I don't know that word.) "Cicero??"
A: (I will answer what I know is relevant for humans.) "Cicero is a human. He was born in Arpinum."
B: (I have other relevant questions about humans.) "Where did he live?"
A: "In Rome."
Examples (III) (continued)
B: (I see someone who matches all inductive properties.) "Cicero is Marcus Tullius?"
A: "Yes."
B: (I will merge the two concepts.)
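A minimal sketch of this merging step, continuing the hypothetical store from the previous sketch; the matching rule here is a deliberately crude stand-in for CROC's induction:

```python
# Hypothetical sketch: propose identifying two concepts when all
# properties known of both agree (crude stand-in for induction).
def matches(stmts_a, stmts_b):
    a, b = dict(stmts_a), dict(stmts_b)
    shared = a.keys() & b.keys()
    return bool(shared) and all(a[p] == b[p] for p in shared)

cicero  = [("kind", "human"), ("born in", "Arpinum"), ("lived in", "Rome")]
tullius = [("kind", "human"), ("lived in", "Rome")]

if matches(cicero, tullius):
    print('"Cicero is Marcus Tullius?"')       # ask before merging
    merged = cicero + [s for s in tullius if s not in cicero]
```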
CROC — a Representational Ontology for Concepts (IV) • Our goal is not primarily knowledge representation, but agent communication and understanding • Agents have their own conceptuology • No need for division of linguistic labour (where only experts ‘own’ the concept) • Private concepts and conceptions are welcome (“autonomy”) • Easy learning of new concepts
Conclusions • Identification by name can solve the interoperability problem to a great extent • Concepts for every part of the representation • Agents can have their own conceptuologies • Concepts may be grounded entirely in lexical representations
Future work • Higher-order reasoning: about what other agents believe, etc. • A temporal logic for reasoning with statements • Integrating classification systems (efficient knowledge representation) • The language-thought partnership [Millikan, Language: A Biological Model, Ch. 5]
Thank you for your attention http://sourceforge.net/projects/croc