
Probabilistic Logic Learning
„Application of Probabilistic ILP II“, FP6-508861, www.aprill.org
James Cussens, University of York, UK
Kristian Kersting, University of Freiburg, Germany


Presentation Transcript


  1. „Application of Probabilistic ILP II“, FP6-508861, www.aprill.org. Probabilistic Logic Learning: Probability, Logic, Learning. James Cussens, University of York, UK; Kristian Kersting, University of Freiburg, Germany

  2. Special thanks to the APrIL II consortium • „Application of Probabilistic ILP“ • 3-year EU project • 5 institutes • www.aprill.org • Heikki Mannila • Stephen Muggleton, Mike Sternberg (subcontractor: James Cussens) • Luc De Raedt (subcontractor: Manfred Jaeger) • François Fages • Paolo Frasconi

  3. ... special thanks for discussions, materials, and collaborations to Alexandru Cocura, Uwe Dick, Pedro Domingos, Peter Flach, Thomas Gaertner, Lise Getoor, Martin Guetlein, Bernd Gutmann, Tapani Raiko, Reimund Renner, Richard Schmidt, Ingo Thon, ...

  4. Tutorial's Aims • Introductory survey • Identification of important probabilistic, relational/logical, and learning concepts

  5. Objectives One of the key open questions of AI concerns Probabilistic Logic Learning: the integration of probabilistic reasoning (Probability) with first-order / relational logic representations (Logic) and machine learning (Learning).

  6. Why do we need PLL? Probabilistic logical models (PLMs) appear across many domains: text classification, computer troubleshooting, economics, robotics, medicine, web mining, computational biology. The tasks include diagnosis, prediction, classification, decision-making, and description. Let's look at an example.

  7. Web Mining / Linked Bibliographic Data / Recommendation Systems / … [illustration inspired by Lise Getoor: real-world entities such as books, authors, and publishers, connected by links]

  8. Web Mining / Linked Bibliographic Data / Recommendation Systems / … [diagram: books B1–B4, authors A1–A2, publishers P1–P2, and series (Fantasy, Science Fiction), connected by author-of and publisher-of relations]

  9. Why do we need PLL? Real-world applications raise three demands: • Structured Domains: not flat but structured representations (multi-relational, heterogeneous, and semi-structured) • Uncertainty: dealing with noisy data, missing data, and hidden variables • Machine Learning: knowledge-acquisition bottleneck, data is cheap. Let's look at some more examples.

  10. Blood Type / Genetics / Breeding • 2 alleles: A and a • Probability of genotypes AA, Aa, aa? [pedigree diagram: founder genotypes drawn from a prior; each offspring genotype (AA, Aa, or aa) depends on its father's and mother's genotypes] CEPH Genotype DB, http://www.cephb.fr/
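The pedigree on this slide can be sketched as a tiny probabilistic model. The following is a minimal illustration, not from the tutorial; the founder prior is made up, not taken from the CEPH database:

```python
# Sketch of Mendelian inheritance as a tiny probabilistic model.
# The founder prior below is illustrative, not from the CEPH database.
founder_prior = {"AA": 0.25, "Aa": 0.50, "aa": 0.25}

def allele_dist(genotype):
    """Probability of passing each allele on to an offspring."""
    a1, a2 = genotype[0], genotype[1]
    return {a1: 1.0} if a1 == a2 else {a1: 0.5, a2: 0.5}

def offspring_dist(father, mother):
    """P(offspring genotype | parent genotypes) under Mendelian segregation."""
    dist = {}
    for fa, pf in allele_dist(father).items():
        for ma, pm in allele_dist(mother).items():
            g = "".join(sorted(fa + ma))  # normalize 'aA' to 'Aa'
            dist[g] = dist.get(g, 0.0) + pf * pm
    return dist

def marginal_offspring(prior):
    """Marginal genotype distribution of a child of two founders."""
    dist = {}
    for f, pf in prior.items():
        for m, pm in prior.items():
            for g, p in offspring_dist(f, m).items():
                dist[g] = dist.get(g, 0.0) + pf * pm * p
    return dist

print(offspring_dist("Aa", "Aa"))  # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```

The same conditional distribution `offspring_dist` is reused for every parent-child triple in the pedigree; sharing parameters across all such triples is exactly the kind of lifting that PLL formalisms make explicit.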

  11. Others • Social networks • Protein secondary structure • Data cleaning • Scene interpretation • Phylogenetic trees • Metabolic pathways

  12. Why do we need PLL? (also known as Statistical Relational Learning, SRL) • Statistical Learning (SL): + soft reasoning, learning; − attribute-value representations: some learning problems cannot (elegantly) be described using attribute-value representations • Probabilistic Logics: + soft reasoning, expressivity; − no learning: too expensive to handcraft models • Inductive Logic Programming (ILP) / Multi-Relational Data Mining (MRDM): + expressivity, learning; − crisp reasoning: some learning problems cannot (elegantly) be described without explicit handling of uncertainty. PLL combines all three to meet the demands of real-world applications: uncertainty, structured domains, and machine learning.

  13. Why do we need PLL? • Rich probabilistic models • Comprehensibility • Generalization (similar situations/individuals) • Knowledge sharing • Parameter reduction / compression • Learning • Reuse of experience (training one RV might improve prediction at another RV) • More robust • Speed-up

  14. When to apply PLL? • When it is impossible to elegantly represent your problem in attribute-value form • variable number of objects in examples • relations among objects are important • Background knowledge can be defined intensionally: • define 'benzene rings' as view predicates
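As an illustration of intensional background knowledge, here is a hypothetical sketch: an extensional predicate stored as explicit facts, and a derived "view predicate" computed from them rather than stored. The slide's benzene-ring predicate over atom/bond facts would be defined analogously; the family relations here are just a compact stand-in.

```python
# Extensional predicate: parent(X, Y), stored as explicit ground facts.
parent = {("ann", "bob"), ("bob", "carl"), ("bob", "dora")}

# Intensional "view" predicate, in the spirit of the Prolog clause
#   grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
# Its extension is derived on demand, never stored.
def grandparent():
    return {(x, z) for (x, y1) in parent for (y2, z) in parent if y1 == y2}

print(sorted(grandparent()))  # [('ann', 'carl'), ('ann', 'dora')]
```

Because the view is defined once over the facts, adding a new `parent` fact automatically extends `grandparent` without touching its definition.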

  15. Overview • Introduction to PLL • Foundations of PLL • Logic Programming, Bayesian Networks, Hidden Markov Models, Stochastic Grammars • Frameworks of PLL • Independent Choice Logic, Stochastic Logic Programs, PRISM, Bayesian Logic Programs, Probabilistic Logic Programs, Probabilistic Relational Models • Logical Hidden Markov Models • Applications
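As a taste of one of the foundations listed above, here is a minimal sketch of the forward algorithm for a hidden Markov model, which computes the probability of an observation sequence by summing over all hidden state paths. The weather model and all probabilities are made-up illustrative numbers, not from the tutorial:

```python
# Forward algorithm: P(observations) = sum over hidden state sequences.
def forward(obs, states, start, trans, emit):
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[t] * trans[t][s] for t in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

# Illustrative two-state weather HMM (hypothetical numbers).
states = ["rain", "sun"]
start = {"rain": 0.5, "sun": 0.5}
trans = {"rain": {"rain": 0.7, "sun": 0.3},
         "sun":  {"rain": 0.4, "sun": 0.6}}
emit  = {"rain": {"walk": 0.1, "umbrella": 0.9},
         "sun":  {"walk": 0.8, "umbrella": 0.2}}

print(forward(["umbrella", "walk"], states, start, trans, emit))  # ≈ 0.1915
```

Logical hidden Markov models, covered later in the tutorial, generalize exactly this recursion from atomic states like `rain` to logical atoms with arguments.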
