CSC 9010

Presentation Transcript

  1. CSC 9010 ANN Lab. Paula Matuszek, Spring 2011

  2. Knowledge-Based Systems and Artificial Neural Nets
  • Knowledge Representation means capturing human knowledge in a form a computer can reason about
  • Is this relevant to ANNs?
  • Consider the question we’ve been modeling all semester: what class should I take?
  • What would be involved in representing this as a neural net?

  3. Knowledge for ANNs
  • What human knowledge do we still need to capture?
  • Input variables
  • Output variables
  • Training and test cases
  • Let’s look at some: http://www.aispace.org/neural/

  4. Example from AISpace
  • Mail
  • Examples
  • Set properties
  • Initialize the parameters
  • Solve
  • How do we use it? Calculate output

  5. Our “Which class to take” problem
  • Inputs?
  • Outputs?
  • Sample data (one possible layout is sketched below)
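The slides leave the inputs and outputs as open questions, so the layout below is purely a hypothetical illustration of how such sample data could be written down as numeric feature vectors plus a binary decision. The feature names are made up for illustration; they are not taken from the actual AIspace example files.

```python
# Hypothetical layout for the "which class to take" problem.
# Feature names are illustrative, not the actual AIspace example files.
feature_names = ["required", "workload_high", "interesting"]  # inputs
target_name = "take_class"                                    # output

# Each training case: one vector of input values plus the observed decision.
sample_cases = [
    ([1, 0, 1], 1),
    ([0, 1, 0], 0),
    ([1, 1, 1], 1),
]

for inputs, decision in sample_cases:
    print(dict(zip(feature_names, inputs)), "->", target_name, "=", decision)
```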

  6. Some Examples
  • Example 1: 3 inputs, 1 output, all binary (see the training sketch below)
  • Example 2: same inputs, output inverted
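The actual Example 1 file is not reproduced in the slides, so the sketch below invents six binary cases and trains a single sigmoid unit by gradient descent, just to show what "3 inputs, 1 output, all binary" looks like in code. Example 2 ("output inverted") would be the same data with every label flipped.

```python
import math, random

# Made-up training cases: 3 binary inputs -> 1 binary output.
# (Here the target happens to equal the first input, so it is learnable.)
X = [[1, 0, 1], [0, 1, 0], [1, 1, 1], [0, 0, 1], [1, 0, 0], [0, 1, 1]]
y = [1, 0, 1, 0, 1, 0]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(3)]   # one weight per input
b = 0.0
rate = 0.5

def predict(x):
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Plain gradient descent on squared error, one case at a time.
for step in range(2000):
    for x, target in zip(X, y):
        out = predict(x)
        err = target - out
        for i in range(3):
            w[i] += rate * err * out * (1 - out) * x[i]
        b += rate * err * out * (1 - out)

print([round(predict(x)) for x in X])  # should match y after training
# Example 2 ("output inverted") is just the same data with labels 1 - y.
```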

  7. Getting the right inputs
  • Example 3
  • Same inputs as 1 and 2
  • Same output as 1
  • Outcomes reversed for half the cases

  8. Getting the right inputs
  • Example 3
  • Same inputs as 1 and 2
  • Same output as 1
  • Outcomes reversed for half the cases
  • The network is not converging
  • The output here cannot be predicted from these inputs
  • Whatever is determining whether to take the class, we haven’t captured it (a consistency check is sketched below)
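One way to see why Example 3 cannot converge: if the same input vector occurs with both outcomes, no set of weights can fit the data. A quick consistency check along those lines, using made-up cases rather than the slide's actual data:

```python
# Hypothetical data: the last two cases have identical inputs but
# contradictory outcomes, so no network can predict both correctly.
cases = [
    ([1, 0, 1], 1),
    ([0, 1, 0], 0),
    ([1, 1, 0], 1),
    ([1, 1, 0], 0),   # same inputs as the previous case, opposite label
]

seen = {}
for inputs, label in cases:
    key = tuple(inputs)
    if key in seen and seen[key] != label:
        print("Contradictory case:", key, "labeled both", seen[key], "and", label)
    seen[key] = label
```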

  9. Representing non-numeric values
  • Example 4
  • Required is represented as “yes” or “no”

  10. Representing non-numeric values
  • Example 4
  • Required is represented as “yes” or “no”
  • The actual model still uses 1 and 0; the transformation is done by the applet (a minimal sketch follows)
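The applet's own code is not shown, but the transformation it performs amounts to something like this trivial sketch:

```python
# Map a yes/no text value to the 1/0 the network actually uses.
def encode_binary(value):
    return 1 if value.strip().lower() == "yes" else 0

print(encode_binary("yes"), encode_binary("No"))  # 1 0
```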

  11. More non-numeric values
  • Example 5
  • Workload is low, med, high: text values, but they can be ordered

  12. More non-numeric values
  • Example 5
  • Workload is low, med, high: text values, but they can be ordered
  • The applet asks us to assign values; 1, 0.5, 0 is typical (see the encoding sketch below)
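A hand-rolled version of that ordinal assignment might look like the sketch below. The particular numbers and their direction are a choice; the slide's 1, 0.5, 0 works just as well as 0, 0.5, 1, as long as the ordering of the categories is preserved.

```python
# Ordinal encoding: the categories have a natural order, so one number each.
WORKLOAD_SCALE = {"low": 0.0, "med": 0.5, "high": 1.0}

def encode_workload(value):
    return WORKLOAD_SCALE[value.strip().lower()]

print([encode_workload(v) for v in ["low", "med", "high"]])  # [0.0, 0.5, 1.0]
```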

  13. Unordered values
  • Example 6
  • Input variables here include professor
  • Non-numeric, can’t be ordered

  14. Unordered values
  • Example 6
  • Input variables here include professor
  • Non-numeric, can’t be ordered
  • We still need numeric values
  • The solution is to treat n possible values as n separate binary values
  • Again, the applet does this for us (a one-hot sketch follows)
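Treating n unordered values as n separate binary inputs is usually called one-hot encoding. A minimal sketch, with a hypothetical set of professor names:

```python
# One-hot encoding: n possible values become n binary inputs.
PROFESSORS = ["Matuszek", "Lee", "Smith"]   # hypothetical value set

def encode_professor(name):
    return [1 if name == p else 0 for p in PROFESSORS]

print(encode_professor("Lee"))    # [0, 1, 0]
print(encode_professor("Smith"))  # [0, 0, 1]
```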

  15. Variables with more values
  • Example 7
  • GPA and number of classes taken are integer values
  • Takes considerably longer to solve
  • Looks for a while like it’s not converging

  16. And Reals
  • Example 8
  • GPA is a real
  • Takes about 20,000 steps to converge (a scaling note follows below)
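The slides only report that the real-valued input takes longer to converge; a common remedy they do not mention is to rescale such inputs into [0, 1] before training, for example with min-max scaling (GPA range assumed here to be 0.0 to 4.0):

```python
# Min-max scaling of a real-valued input such as GPA.
def scale(value, lo, hi):
    return (value - lo) / (hi - lo)

gpas = [2.7, 3.9, 3.1]
print([round(scale(g, 0.0, 4.0), 3) for g in gpas])  # [0.675, 0.975, 0.775]
```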

  17. And multiple outputs
  • Small Car database from AIspace
  • For any given input case, you will get a value for each possible outcome
  • Typical of, for instance, character recognition (a decision sketch follows)
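With one output unit per possible outcome, the usual way to turn the outputs into a decision is to pick the outcome with the largest value. A sketch with made-up class names and output values, not the actual Small Car results:

```python
# One output value per possible outcome; the decision is the largest one.
outcomes = ["unacceptable", "acceptable", "good"]   # hypothetical class names
outputs  = [0.07, 0.12, 0.81]                       # made-up network outputs

best = max(range(len(outputs)), key=lambda i: outputs[i])
print("Predicted outcome:", outcomes[best])          # good
```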

  18. Training and Test Cases
  • The basic training approach will fit the training data as closely as possible
  • But we really want something that will generalize to other cases; this is why we have test cases
  • The training cases are used to compute the weights
  • The test cases tell us how well those weights generalize
  • Both training and test cases should represent the overall population as well as possible (a split sketch follows)
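A minimal sketch of holding out test cases from a pool of examples; this shows the general idea, not how the applet itself does it:

```python
import random

# Shuffle the full set of cases, then hold some out as a test set.
cases = [([1, 0, 1], 1), ([0, 1, 0], 0), ([1, 1, 1], 1),
         ([0, 0, 1], 0), ([1, 0, 0], 1), ([0, 1, 1], 0)]

random.seed(1)
random.shuffle(cases)
split = int(0.75 * len(cases))                       # 75% train, 25% test
train_cases, test_cases = cases[:split], cases[split:]

print(len(train_cases), "training cases,", len(test_cases), "test cases")
```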

  19. Representative Training Cases
  • Example 9
  • Training cases and test cases are similar

  20. Representative Training Cases
  • Example 9
  • Training cases and test cases are similar (actually identical...)
  • Training error and test error are comparable

  21. Non-Representative Training Cases
  • Example 10
  • Training cases and test cases represent different circumstances
  • We’ve missed including any cases involving Lee in the training set
  • Training error goes down, but test error goes up
  • In reality these are probably bad training AND test cases; neither seems representative (a distribution check is sketched below)
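A simple way to catch the "no Lee cases in training" problem before training at all is to compare value counts between the two sets. The case lists below are made up for illustration:

```python
from collections import Counter

# Made-up professor values for the training and test cases.
train_profs = ["Matuszek", "Smith", "Matuszek", "Smith"]
test_profs  = ["Lee", "Lee", "Matuszek"]

train_counts = Counter(train_profs)
for prof in sorted(set(train_profs) | set(test_profs)):
    if train_counts[prof] == 0:
        print(prof, "appears in the test cases but never in training")
```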

  22. So:
  • Getting a good ANN still involves understanding your domain and capturing knowledge about it: choosing the right inputs and outputs, and choosing representative training and test sets
  • Beware “convenient” training sets
  • You can represent any kind of variable: numeric or not, ordered or not
  • Not every set of variables and training cases will produce something that can be trained

  23. Once it’s trained...
  • When your ANN is trained, you can feed it a specific set of inputs and get one or more outputs (see the sketch below)
  • These outputs are typically interpreted as some decision: take the class; this is probably a “5”
  • The network itself is a black box
  • If the situation changes, the ANN should be retrained: new variables, new values for some variables, new patterns of cases
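Using the trained network is just a forward pass over a new input vector, with the output turned into a decision by a threshold (or an argmax when there are multiple outputs). A self-contained sketch, with hypothetical weights standing in for a trained model:

```python
import math

# Hypothetical weights and bias, as if produced by an earlier training run.
w, b = [2.1, -0.7, 0.4], -0.9

def predict(inputs):
    activation = sum(wi * xi for wi, xi in zip(w, inputs)) + b
    return 1 / (1 + math.exp(-activation))        # sigmoid output in (0, 1)

new_case = [1, 0, 1]                              # a fresh set of inputs
output = predict(new_case)
decision = "take the class" if output >= 0.5 else "skip it"
print(round(output, 3), "->", decision)
```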

  24. One last note
  • These have all been simple cases, as examples
  • Most of my examples could in fact be predicted much more easily and cleanly with a decision tree, or even a couple of IF statements
  • A more typical use for any connectionist system has many more inputs and many more training cases
