
CS 461: Machine Learning Lecture 4



  1. CS 461: Machine Learning Lecture 4 Dr. Kiri Wagstaff wkiri@wkiri.com CS 461, Winter 2009

  2. Plan for Today • Solution to HW 2 • Support Vector Machines • Neural Networks • Perceptrons • Multilayer Perceptrons

  3. Review from Lecture 3 • Decision trees • Regression trees, pruning, extracting rules • Evaluation • Comparing two classifiers: McNemar’s test • Support Vector Machines • Classification • Linear discriminants, maximum margin • Learning (optimization): gradient descent, QP

  4. Neural Networks (Chapter 11) It Is Pitch Dark

  5. Perceptron: math and graphical views [Alpaydin 2004 © The MIT Press]
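
The slide's math, reconstructed in Alpaydin's Chapter 11 notation (the standard perceptron; the slide's figure itself did not survive extraction):

    y = \sum_{j=1}^{d} w_j x_j + w_0 = \mathbf{w}^\top \mathbf{x}

where \mathbf{x} is augmented with x_0 = 1 so the bias w_0 folds into \mathbf{w}.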

  6. “Smooth” Output: Sigmoid Function • Why? • Converts output to probability! • Less “brittle” boundary
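
For reference, the sigmoid squashes the linear output into (0, 1) (standard definition, added here rather than taken from the slide):

    y = \mathrm{sigmoid}(\mathbf{w}^\top \mathbf{x}) = \frac{1}{1 + \exp(-\mathbf{w}^\top \mathbf{x})}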

  7. K outputs: linear for regression, softmax for classification [Alpaydin 2004 © The MIT Press]
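
The slide's equations in their standard Chapter 11 form (a reconstruction, since the originals were lost): regression uses the linear outputs directly, while K-class classification pushes the K linear outputs through softmax so they are positive and sum to 1:

    o_i = \mathbf{w}_i^\top \mathbf{x}, \qquad y_i = \frac{\exp(o_i)}{\sum_{k=1}^{K} \exp(o_k)}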

  8. Training a Neural Network • Randomly initialize weights • Update = Learning rate * (Desired - Actual) * Input
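
Written out, the slide's update rule is the usual delta rule, with learning rate η, desired output r, actual output y, and input component x_j:

    \Delta w_j = \eta \, (r - y) \, x_j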

  9. Learning Boolean AND: perceptron demo [Alpaydin 2004 © The MIT Press]
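
A minimal Python sketch of what such a demo does (assumed code, not the lecture's actual demo): a single sigmoid perceptron trained on the AND truth table with the update rule above.

    import math
    import random

    # Truth table for Boolean AND: inputs (x1, x2) -> desired output r
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    def sigmoid(a):
        return 1.0 / (1.0 + math.exp(-a))

    random.seed(0)
    w = [random.uniform(-0.1, 0.1) for _ in range(3)]  # [w0 (bias), w1, w2]
    eta = 0.5  # learning rate

    for epoch in range(5000):
        for (x1, x2), r in data:
            x = [1, x1, x2]  # x0 = 1 carries the bias
            y = sigmoid(sum(wj * xj for wj, xj in zip(w, x)))
            # Update = learning rate * (desired - actual) * input
            for j in range(3):
                w[j] += eta * (r - y) * x[j]

    for (x1, x2), r in data:
        y = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
        print(f"AND({x1},{x2}) = {y:.3f} (target {r})")

AND is linearly separable, so the outputs approach 0 for the first three rows and 1 for (1,1) after enough epochs.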

  10. Multilayer Perceptrons = MLP = ANN [Alpaydin 2004 © The MIT Press]
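
The standard two-layer forward pass in Alpaydin's notation (added for reference): each hidden unit z_h applies a sigmoid to its own weighted sum of the inputs, and each output y_i is a weighted sum of the hidden values:

    z_h = \mathrm{sigmoid}(\mathbf{w}_h^\top \mathbf{x}), \qquad y_i = \mathbf{v}_i^\top \mathbf{z}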

  11. x1 XOR x2 = (x1 AND ~x2) OR (~x1 AND x2) [Alpaydin 2004 © The MIT Press]
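
A hand-wired Python sketch of this decomposition (weights picked by hand for illustration, not learned): two hidden threshold units compute x1 AND ~x2 and ~x1 AND x2, and the output unit ORs them. No single perceptron can separate XOR, which is the point of the slide.

    def step(a):
        return 1 if a >= 0 else 0  # threshold unit

    def xor(x1, x2):
        h1 = step(x1 - x2 - 0.5)    # fires only for x1 AND ~x2
        h2 = step(x2 - x1 - 0.5)    # fires only for ~x1 AND x2
        return step(h1 + h2 - 0.5)  # OR of the two hidden units

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, "XOR", x2, "=", xor(x1, x2))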

  12. Examples • Digit Recognition • Ball Balancing

  13. ANN vs. SVM • SVM with sigmoid kernel = 2-layer MLP • Parameters • ANN: # hidden layers, # nodes • SVM: kernel, kernel params, C • Optimization • ANN: local minimum (gradient descent) • SVM: global minimum (QP) • Interpretability? About the same… • So why SVMs? • Sparse solution, geometric interpretation, less likely to overfit data

  14. Summary: Key Points for Today • Support Vector Machines • Neural Networks • Perceptrons • Sigmoid • Training by gradient descent • Multilayer Perceptrons • ANN vs. SVM

  15. Next Time • Midterm Exam! • 9:10 – 10:40 a.m. • Open book, open notes (no computer) • Covers all material through today • Neural Networks (read Ch. 11.1–11.8) • Questions to answer from the reading • Posted on the website (calendar) • Three volunteers?
