
Knowledge Engineering for Bayesian Networks


Presentation Transcript


  1. Knowledge Engineering for Bayesian Networks Ann Nicholson School of Computer Science and Software Engineering Monash University

  2. Overview • Representing uncertainty • Introduction to Bayesian Networks • Syntax, semantics, examples • The knowledge engineering process • Open research questions

  3. Sources of Uncertainty • Ignorance • Inexact observations • Non-determinism • AI representations • Probability theory • Dempster-Shafer • Fuzzy logic

  4. Probability theory for representing uncertainty • Assigns a numerical degree of belief between 0 and 1 to facts • e.g. “it will rain today” is T/F • P(“it will rain today”) = 0.2 prior probability (unconditional) • Posterior probability (conditional) • P(“it will rain today” | “rain is forecast”) = 0.8 • Bayes’ Rule: P(H|E) = P(E|H) P(H) / P(E)
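
A minimal Python sketch of Bayes' rule applied to the rain example above. The prior 0.2 and the posterior 0.8 come from the slide; the two likelihood terms are assumed values chosen so the numbers work out.

# Bayes' rule on the rain example. P(rain) = 0.2 is the slide's prior;
# the two likelihoods below are illustrative assumptions, not slide values.
p_rain = 0.2
p_forecast_given_rain = 0.8       # assumed P("rain is forecast" | rain)
p_forecast_given_no_rain = 0.05   # assumed P("rain is forecast" | no rain)

# Total probability: P(E) = P(E|H) P(H) + P(E|~H) P(~H)
p_forecast = (p_forecast_given_rain * p_rain
              + p_forecast_given_no_rain * (1 - p_rain))

# Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
p_rain_given_forecast = p_forecast_given_rain * p_rain / p_forecast
print(round(p_rain_given_forecast, 2))    # 0.8 with these assumed numbers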

  5. Bayesian networks • Directed acyclic graphs • Nodes: random variables • R: “it is raining”, discrete values T/F • T: temperature, continuous or discrete variable • C: colour, discrete values {red, blue, green} • Arcs indicate dependencies (can have causal interpretation)

  6. Bayesian networks [figure: a three-node network Flu → Te → Th] • Conditional Probability Distribution (CPD) • Associated with each variable • probability of each state given parent states • “Jane has the flu”: P(Flu=T) = 0.05 • Models causal relationship: “Jane has a high temp”, P(Te=High|Flu=T) = 0.4, P(Te=High|Flu=F) = 0.01 • Models possible sensor error: “Thermometer temp reading”, P(Th=High|Te=H) = 0.95, P(Th=High|Te=L) = 0.1
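
As a concrete reading of slide 6, here is a sketch that stores the three CPDs as plain Python dictionaries and applies the chain rule to get a joint probability; the temperature variable is reduced to two states (High/Low) purely for brevity, an assumption beyond the slide.

# CPDs for the Flu -> Te -> Th network, using the numbers on the slide.
p_flu_true = 0.05                                  # P(Flu=T)
p_te_high = {True: 0.4, False: 0.01}               # P(Te=High | Flu)
p_th_high = {'High': 0.95, 'Low': 0.1}             # P(Th=High | Te)

# Chain rule for this structure: P(Flu, Te, Th) = P(Flu) P(Te|Flu) P(Th|Te)
def joint(flu, te, th):
    p1 = p_flu_true if flu else 1 - p_flu_true
    p2 = p_te_high[flu] if te == 'High' else 1 - p_te_high[flu]
    p3 = p_th_high[te] if th == 'High' else 1 - p_th_high[te]
    return p1 * p2 * p3

print(round(joint(True, 'High', 'High'), 3))   # P(Flu=T, Te=High, Th=High) = 0.019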

  7. BN inference [figure: small variants of the Flu/Te/Th network, one adding a second cause TB of Te, illustrating diagnostic inference, causal inference and intercausal inference] • Evidence: observation of specific state • Task: compute the posterior probabilities for query node(s) given evidence.
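
A small sketch of diagnostic inference by enumeration on the same three-node network: observe Th=High, sum out the unobserved Te, and normalise to get the posterior on Flu. Exact enumeration is used here only because the network is tiny; the packages on the next slide use more efficient algorithms.

# Diagnostic inference: P(Flu | Th=High) by enumerating the hidden Te.
p_flu_true = 0.05
p_te_high = {True: 0.4, False: 0.01}      # P(Te=High | Flu)
p_th_high = {'High': 0.95, 'Low': 0.1}    # P(Th=High | Te)

def posterior_flu(th_obs='High'):
    scores = {}
    for flu in (True, False):
        p_f = p_flu_true if flu else 1 - p_flu_true
        total = 0.0
        for te in ('High', 'Low'):
            p_t = p_te_high[flu] if te == 'High' else 1 - p_te_high[flu]
            p_h = p_th_high[te] if th_obs == 'High' else 1 - p_th_high[te]
            total += p_t * p_h
        scores[flu] = p_f * total          # unnormalised P(Flu, Th=th_obs)
    z = sum(scores.values())
    return {flu: s / z for flu, s in scores.items()}

# A High thermometer reading raises belief in flu from 0.05 to about 0.18
print(posterior_flu('High'))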

  8. BN software • Several commercial packages • Netica, Hugin, Analytica (all with demo versions) • Free software: Smile, Genie, JavaBayes, … • [Add Almond and Murphy BN info sites] • http://HTTP.CS.Berkeley.EDU/~murphyk/Bayes/bnsoft.html • Examples

  9. Decision networks • Extension to basic BN for decision making • Decision nodes • Utility nodes • EU(Action) = Σo P(o|Action,E) U(o) • choose the action with highest expected utility • Example
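
A sketch of the expected-utility calculation on this slide, EU(Action) = Σo P(o|Action,E) U(o). The action set, outcome probabilities, and utilities below are made-up illustrative numbers, not from the slides.

# Choose the action with highest expected utility (all numbers assumed).
p_outcome = {                                   # P(outcome | action, E)
    'take_umbrella':  {'stay_dry': 1.0, 'get_wet': 0.0},
    'leave_umbrella': {'stay_dry': 0.7, 'get_wet': 0.3},
}
utility = {'stay_dry': 20, 'get_wet': -100}     # U(outcome)

def expected_utility(action):
    # EU(Action) = sum over outcomes o of P(o | Action, E) * U(o)
    return sum(p * utility[o] for o, p in p_outcome[action].items())

best = max(p_outcome, key=expected_utility)
print(best, expected_utility(best))             # take_umbrella, EU = 20.0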

  10. Elicitation from experts • Variables • important variables? values/states? • Structure • causal relationships? • dependencies/independencies? • Parameters (probabilities) • quantify relationships and interactions? • Preferences (utilities)

  11. Knowledge Engineering Process • These stages are done iteratively • Stops when further expert input is no longer cost effective • Process is difficult and time consuming • As yet, not well integrated with methods and tools developed by the Intelligent Decision Support community.

  12. Knowledge discovery • There is much interest in automated methods for learning BNs from data • parameters, structure (causal discovery) • Computationally complex problem, so current methods have practical limitations • e.g. limit number of states, require variable ordering constraints, do not specify all arc directions • Evaluation methods
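
To make the parameter-learning half concrete, here is a sketch that estimates one CPT by maximum-likelihood counting from complete data, given a fixed Flu → Te structure; the six-record data set is invented for illustration, and real learners also handle priors, missing values, and structure search.

# Estimate P(Te | Flu) from complete data by counting (maximum likelihood).
from collections import Counter

data = [                       # (flu, te) records, invented for illustration
    (True, 'High'), (True, 'Low'),
    (False, 'Low'), (False, 'Low'), (False, 'High'), (False, 'Low'),
]

pair_counts = Counter(data)
flu_counts = Counter(flu for flu, _ in data)

def p_te_given_flu(te, flu):
    return pair_counts[(flu, te)] / flu_counts[flu]

print(p_te_given_flu('High', True))    # 1/2 = 0.5 on this toy data
print(p_te_given_flu('High', False))   # 1/4 = 0.25 on this toy data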

  13. The knowledge engineering process 1. Building the BN • variables, structure, parameters, preferences • combination of expert elicitation and knowledge discovery 2. Validation/Evaluation • case-based, sensitivity analysis, accuracy testing 3. Field Testing • alpha/beta testing, acceptance testing 4. Industrial Use • collection of statistics 5. Refinement • Updating procedures, regression testing
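
One way to read the sensitivity-analysis step in stage 2: sweep a single parameter and watch how the posterior of interest responds. The sketch below does this for the Flu → Te → Th numbers from slide 6, varying P(Te=High|Flu=T); the sweep values themselves are arbitrary.

# Sensitivity analysis: vary P(Te=High | Flu=T) and recompute P(Flu=T | Th=High).
def posterior_flu_given_th_high(p_te_high_if_flu):
    p_flu_true = 0.05
    p_te_high = {True: p_te_high_if_flu, False: 0.01}
    p_th_high = {'High': 0.95, 'Low': 0.1}
    scores = {}
    for flu in (True, False):
        p_f = p_flu_true if flu else 1 - p_flu_true
        scores[flu] = p_f * sum(
            (p_te_high[flu] if te == 'High' else 1 - p_te_high[flu]) * p_th_high[te]
            for te in ('High', 'Low'))
    return scores[True] / sum(scores.values())

for p in (0.2, 0.4, 0.6, 0.8):   # arbitrary sweep of one CPT entry
    print(p, round(posterior_flu_given_th_high(p), 3))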

  14. Case Study: Seabreeze prediction • 2000 Honours project, joint with the Bureau of Meteorology (PAKDD’2001 paper, TR) • BN built based on an existing simple expert rule • Several years of data available for Sydney seabreezes • CaMML and Tetrad-II programs used to learn BNs from data • Comparative analysis showed automated methods gave improved predictions.

  15. Case Study: Intelligent tutoring [figure: system architecture] • Inputs: information about the student, e.g. age (optional), classroom diagnostic test results (optional), decimal comparison test (optional), item answers • Adaptive Bayesian Network (generic BN model of the student): diagnose misconception, predict outcomes, identify most useful information • System Controller Module (sequencing tactics): select next item type, decide to present help, decide change to new game, identify when expertise gained • Computer games: Flying photographer, Decimaliens, Hidden number, Number between, … • Outputs: items, feedback, help, report on the student, classroom teaching activities for the teacher

  16. Consulting experiences • In 1999/2000, Kevin Korb and myself • Clients: NAB, North Ltd • Process • approached by technical person interested in the technology • gave workshops on BN technology • brainstorming for BN elicitation (iterative) • technical person satisfied with preliminary results • BN technology not “sold” to managers

  17. Open Research Questions • Tools needed to support expert elicitation • reduce reliance on BN expert • example - visualisation of explanatory methods • Combining expert elicitation and automated methods • Evaluation measures and methods • Industry adoption of BN technology

  18. Visit to UniMelb • March-June (away some of April/May) • Work on BN textbook (joint with Kevin Korb) • Continue ongoing research projects • Talk with DIS academics who share common interests.
