
Last Lecture: CSI 661 - Uncertainty in A.I., Lecture 25




Presentation Transcript


  1. Last Lecture
     • Expected Fisher information calculations
     • Assignments and term projects: due Wednesday 12/18/02, 5pm
     • Office hours:
       • Friday 12/13/02, 10am to 12pm
       • Monday 12/15/02, 10am to 12pm
     • Bringing it all together without focusing on the maths

  2. Modelling Uncertainty
     • Model behaviors/situations using random variables
     • Use prior knowledge to specify the structure between the random variables
     • Place prior distributions over the parameters of the random variables
     • Why is there uncertainty? (see the sketch after this slide)
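A minimal sketch of this recipe, assuming a single binary behavior modelled as a Bernoulli random variable with a Beta(2, 2) prior over its parameter; both the variable and the prior are illustrative choices, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: model the behavior as a random variable, X ~ Bernoulli(theta).
# Step 2: encode prior knowledge about theta as a prior distribution.
#         Beta(2, 2) says "theta is probably near 0.5, but we are unsure".
theta_draws = rng.beta(2.0, 2.0, size=5)  # draws from the prior over theta

# Why is there uncertainty?  It enters twice: we do not know theta
# (parameter uncertainty), and even a fixed theta only gives
# probabilities for X (inherent randomness in the behavior).
for theta in theta_draws:
    x = rng.binomial(n=1, p=theta)
    print(f"theta ~ Beta(2,2): {theta:.3f} -> simulated behavior X = {x}")
```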

  3. Learning with Uncertainty
     • Use Bayes' theorem to update our beliefs in these values given the data
       • Learn parameters
       • Learn model structure
     • The EM algorithm can find the "best" model
       • But there is uncertainty about whether this is truly the best model
       • Use optimal Bayesian learning to remove this uncertainty (see the sketch after this slide)
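Continuing the Bernoulli example, a minimal sketch of the Bayesian update, contrasting a single point estimate (in the spirit of EM's one "best" model) with averaging over the posterior as optimal Bayesian learning does; the observations below are made up for illustration:

```python
import numpy as np

data = np.array([1, 0, 1, 1, 1, 0, 1, 1])  # hypothetical observations of X
heads = int(data.sum())
tails = len(data) - heads

# Bayes' theorem with a conjugate pair: a Beta(2, 2) prior over theta
# plus Bernoulli data yields a Beta posterior in closed form.
a_post = 2.0 + heads
b_post = 2.0 + tails

# A single point estimate commits to one theta ...
theta_map = (a_post - 1.0) / (a_post + b_post - 2.0)

# ... while optimal Bayesian learning averages predictions over the
# whole posterior, retaining the uncertainty about which theta is right.
p_next_is_one = a_post / (a_post + b_post)  # posterior predictive

print(f"posterior: Beta({a_post:.0f}, {b_post:.0f})")
print(f"MAP estimate of theta:         {theta_map:.3f}")
print(f"Bayes-averaged P(next X = 1):  {p_next_is_one:.3f}")
```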

  4. Making Decisions
     • Suppose we find the best model or collection of models: how can we make a decision?
     • Try to minimize our risk
       • Minimize the maximum risk over many situations
       • If we were to encounter the same situation again and again, minimizing the expected risk would minimize our loss in the long run (see the sketch after this slide)
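A minimal sketch of choosing between two actions by risk, reusing the posterior predictive from the previous sketch; the action set and the loss table are hypothetical:

```python
# Posterior predictive P(X = 1) carried over from the previous sketch.
p_event = 0.667

# loss[action][outcome]: hypothetical cost of each action given the outcome.
loss = {
    "act":      {1: 0.0,  0: 5.0},   # acting is wasted effort if X = 0
    "dont_act": {1: 10.0, 0: 0.0},   # failing to act is costly if X = 1
}

# Expected (Bayes) risk: average the loss over outcomes, weighted by belief.
bayes_risk = {a: p_event * l[1] + (1.0 - p_event) * l[0]
              for a, l in loss.items()}

# Minimax risk: guard against the worst single outcome instead.
minimax_risk = {a: max(l.values()) for a, l in loss.items()}

print("Bayes risks:         ", bayes_risk)
print("Bayes-optimal action:", min(bayes_risk, key=bayes_risk.get))
print("Minimax action:      ", min(minimax_risk, key=minimax_risk.get))
```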

  5. Learning by Compact Encoding
     • A form of Bayesian learning: P(H,D) = 2^-(ML(H) + ML(D|H))
     • Why?
       • Compares models of different complexity
       • Invariant to non-linear data transformations
       • Consistency
       • Quantification of Occam's razor (see the sketch after this slide)
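A minimal sketch of the two-part message length ML(H) + ML(D|H) = -log2 P(H,D) from the formula above, comparing a "fair coin" hypothesis against a biased one on the same made-up data; the prior masses P(H) assigned to the two hypotheses are illustrative assumptions:

```python
import numpy as np

data = np.array([1, 0, 1, 1, 1, 0, 1, 1])  # same made-up observations
heads, n = int(data.sum()), len(data)

def message_length(prior_h, likelihood):
    """Two-part code length in bits: ML(H) + ML(D|H) = -log2 P(H, D)."""
    return -np.log2(prior_h) - np.log2(likelihood)

# H1: "fair coin" -- a simple hypothesis that is cheap to state.
ml_fair = message_length(prior_h=0.5, likelihood=0.5 ** n)

# H2: "biased coin with theta = 6/8" -- stating the extra parameter
# costs bits (here a made-up prior mass of 0.05), which is how the
# two-part code quantifies Occam's razor.
theta = heads / n
likelihood = theta ** heads * (1.0 - theta) ** (n - heads)
ml_biased = message_length(prior_h=0.05, likelihood=likelihood)

print(f"ML(fair coin):   {ml_fair:.2f} bits")
print(f"ML(biased coin): {ml_biased:.2f} bits")
print("Shorter total message length = preferred model")
```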

  6. Further Reading
     • Bayesian belief networks
     • Uncertainty in A.I. Conference
     • Bayesian learning
     • Neural Information Processing Systems (NIPS) Conference
     • Various journals: Machine Learning, A.I., A.I. Research, Machine Learning Research, Experimental A.I., Computational Intelligence, …
     • Journal club …
