
The Evaluation of Ontologies Editorial Review vs. Democratic Ranking



  1. The Evaluation of Ontologies: Editorial Review vs. Democratic Ranking. Barry Smith

  2. The need for ontology evaluation. Ontologies are expensive: $8 million has been invested in the Gene Ontology thus far. Has this investment been worthwhile? Are some ontologies more useful than others? What are ontologies useful for?

  3. Uses of ‘ontology’ in PubMed abstracts

  4. In the olden days people measured lengths using inches, ulnas, perches, king’s feet, Swiss feet, kanejaku, shaku, whale shaku, etc., etc.

  5. then, on June 22, 1799, everything changed

  6. we now have the International System of Units

  7. Through the SI system, science becomes a cumulative, distributed endeavor: my measuring equipment can be calibrated against your measuring equipment; my hypotheses can be checked against your data

  8. When should a new unit be included in the SI system? The work of the CIPM (International Committee for Weights and Measures) rests on an editorial process

  9. Obvious benefits of a peer review process for scientific work: creating an environment which rewards better work; helping people to find better work ...

  10. Proposal: Evaluate ontologies via an editorial process of the sort used for scientific journals and scientific research projects, that is, a process of peer review by human experts

  11. Peer review process • appropriate when evaluating science • (not, e.g., when evaluating poetry, or fairy tales, or Chinese mythology ...)

  12. Peer review process appropriate in the domain of biomedical ontology – here ontology evaluation is a particularly acute concern

  13. The OBO Foundry

  14. THE OBO FOUNDRY a suite of reference ontologies in the biomedical domain satisfying certain basic criteria and subject to an on-going process of peer review

  15. OBO FOUNDRY CRITERIA • The ontology is open and available to be used by all. • The ontology is in, or can be instantiated in, a common formal language. • The developers of the ontology agree in advance to collaborate with developers of other OBO Foundry ontologies where domains overlap.

  16. CRITERIA • UPDATE: The developers of each ontology commit to its maintenance in light of scientific advance, and to soliciting community feedback for its improvement. • ORTHOGONALITY: They commit to working with other Foundry members to ensure that, for any particular domain, there is community convergence on a single controlled vocabulary.

  17. CRITERIA • IDENTIFIERS: The ontology possesses a unique identifier space within OBO. • VERSIONING: The ontology provider has procedures for identifying distinct successive versions. • The ontology includes textual definitions for all terms.
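The identifier and definition criteria can be made concrete with a term stanza in the OBO flat-file format. The `DEMO` prefix and all content below are hypothetical, shown only to illustrate what an identifier space and a textual definition look like in practice:

```
[Term]
id: DEMO:0000001  ! the "DEMO" prefix is this ontology's unique identifier space
name: example structure
def: "A placeholder definition, illustrating the requirement that every term carry a textual definition." [DEMO:curator]
```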

  18. CRITERIA • CLEARLY BOUNDED: The ontology has a clearly specified and clearly delineated content. • DOCUMENTATION: The ontology is well-documented. • USERS: The ontology has a plurality of independent users.

  19. CRITERIA • COMMON ARCHITECTURE: The ontology uses relations which are unambiguously defined following the pattern of definitions laid down in the OBO Relation Ontology

  20. The OBO Foundry provides guidelines (traffic laws) to new groups of ontology developers in ways which can • help to ensure interoperability through prospective synchronization • counteract dispersion of effort • prevent some common types of nonsense

  21. How the editorial process works. Example: the Foundry seeks orthogonality. This brings division of labor and other benefits. Foundry editors adjudicate in areas of overlap.

  22. How the editorial process works Foundry editors balance • the flexibility that is indispensable to scientific advance • the institution of principles that is indispensable to successful coordination

  23. Peer review is a top-down approach, relying on authority

  24. An alternative, bottom-up approach: democratic ranking

  25. If we build it, will they come? Social engineering of new technology to disseminate biomedical ontologies Mark A. Musen and the BioPortal Team Stanford University

  26. If we build it, will they come? Social engineering of new technology to disseminate biomedical ontologies presentation to Ontolog Forum July 6, 2007

  27. With thanks to Mark Musen and Natasha Noy

  28. In biology, lots of ontology developers are almost hobbyists • Nearly always, ontologies are created to address pressing practical needs • Biologists ... may have little appreciation for metaphysics, principles of knowledge representation, or computational logic • There simply aren’t enough good ontologists to go around

  29. Issues in assuring ontology quality • Unlike the case with journal submissions, it makes no sense for ontologies to be peer-reviewed by just a handful of experts • Open, community-based review of ontologies may be haphazard and chaotic • Top-down solutions may offer rigid review criteria at the expense of scalability • There is a pressing need for empirical evaluation of methods for ontology evaluation

  30. OBO Foundry must address lots of questions • Can the top-down approach scale? How many ontologies can be managed by a small panel of curators? • Who gets to reject an ontology on the basis of form or content? What is the appeals process? How do we know whom to believe? • Who will curate the curators?

  31. NCBO will offer • Technology for uploading, browsing, and using biomedical ontologies • Methods to make the online “publication” of ontologies more like that of journal articles • Tools to enable the biomedical community to put ontologies to work on a daily basis

  32. http://bioportal.bioontology.org

  33. Browsing/Visualizing Ontologies Local Neighborhood view

  34. Hierarchy-to-root view

  35. Goals for BioPortal • Web-accessible repository of ontologies for the biomedical community, archived locally or anywhere in cyberspace • Support for ontology peer review, annotation (marginalia), versioning, alignment, and search

  36. Ontologies are not like journal articles • It is difficult to judge methodological soundness simply by inspection • We may wish to use an ontology even though some portions are not well designed, or make distinctions that are different from those that we might want

  37. Ontologies are not like journal articles • The utility of ontologies depends on the task and may be highly subjective • The expertise and biases of reviewers may vary widely with respect to different portions of an ontology • Users should want the opinions of more than 2–3 hand-selected reviewers • Peer review needs to scale to the entire user community

  38. Community-Based Annotation • Makes ontology evaluation a democratic process • Assumes users’ applications of ontologies will lead to insights not achievable by inspection alone • Assumes end-users will be motivated to comment on and engage in dialog about ontologies in the repository

  39. Solution Snapshot

  40. Open ratings for ontologies • Any user can rate an ontology or add a “marginal note” • Ontology evaluation becomes a community-based initiative • A web of trust can enable users to filter comments or ratings to avoid “noise”
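The "web of trust" idea in the last bullet can be sketched as a trust-weighted aggregation of ratings. Everything below (the function name, the trust weights, the sample data) is a hypothetical illustration, not BioPortal's actual mechanism:

```python
# Minimal sketch of web-of-trust rating aggregation: each user's rating of
# an ontology is weighted by how much the viewing user trusts that rater,
# so ratings from untrusted sources are filtered out as "noise".

def trusted_score(ratings, trust, default_trust=0.0):
    """Return the trust-weighted mean rating, or None if no rater is trusted.

    ratings -- dict mapping rater name to a numeric rating (e.g. 1-5)
    trust   -- dict mapping rater name to a trust weight in [0, 1]
    """
    weighted = [(trust.get(rater, default_trust), score)
                for rater, score in ratings.items()]
    total_weight = sum(w for w, _ in weighted)
    if total_weight == 0:
        return None  # every rater was filtered out as noise
    return sum(w * s for w, s in weighted) / total_weight

ratings = {"expert_a": 5, "expert_b": 4, "drive_by_user": 1}
trust = {"expert_a": 1.0, "expert_b": 0.8, "drive_by_user": 0.0}
print(trusted_score(ratings, trust))  # the zero-trust rating contributes nothing
```

A real deployment would derive the trust weights from the social graph (who vouches for whom) rather than from a hand-written dictionary, but the aggregation step has this shape.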

  41. Arguments in favor of the top-down approach in the scientific domain • marginalia will contain a great deal of irrelevantalia • scientists need ontologies, but are normally not experts in ontology; they are looking for authoritative guidelines • the Foundry process is yielding guidelines on how to build ontologies compatible with those which already exist

  42. Arguments against the top-down approach • ontologies are not like journal articles, and it is difficult to judge methodological soundness simply by inspection • the evaluation process does not yield a quantifiable result – but scientific journals face similar problems, and peer review works well there

  43. In defense of democratic rankings • ranking by large numbers of users will tend to counteract individual reviewers’ biases (but will the ranking service in fact attract users?) • ranking by large numbers of users has a greater opportunity to scale up as ontologies proliferate

  44. We have common goals • Both approaches seek quality assurance to support ontology selection. • Both approaches need to address the fact that the expertise and biases of reviewers may vary widely with respect to different ontologies or to different portions of an ontology.

  45. One big difference. For Musen et al. there are no restrictions on entry: the bottom-up approach seeks community-based annotation of ontologies, with no distinction drawn between experts and non-experts.

  46. One big difference In the OBO Foundry reviews are created precisely by the peers of the ontology authors themselves—by persons with established and recognized expertise and with a demonstrated willingness to invest due diligence in ontology development, use, and evaluation.

  47. Both are needed • in domains such as refrigerators • but in science?
