
Software Agents: Completing Patterns and Constructing User Interfaces Jeffrey C. Schlimmer


Presentation Transcript


  1. Software Agents: Completing Patterns and Constructing User Interfaces • Jeffrey C. Schlimmer • Leonard A. Hermens • School of Electrical Engineering & Computer Science, Washington State University, Pullman • Presented by Marko Puljic • Motivation • Learning • Prediction

  2. Introduction • Note-taking system: actively predicts what the user is going to write, based on previously taken notes. • Motivation for building a note-taking system: • People like to record information for later retrieval (fast data structures and algorithms make this practical) • It speeds up information entry and reduces errors • Physical storage is excessive, while duplication and distribution are inexpensive (due to high-density devices and high-speed networks)

  3. Motivation in General for the Pattern Recognition Problem: 'Given some examples of complex signals and the correct decisions for them, make decisions automatically for a stream of future examples.' Examples: identifying fingerprints, highlighting potential tumors on a mammogram, handwriting recognition, visual inspection of manufactured products for quality control, speech recognition.

  4. User’s Perspective + the software has to improve the speed and accuracy with which the user enters notes about various domains of interest + the agent continuously predicts a likely completion as the user writes + a small completion button ranges in color saturation from 1 (green), when the agent is confident, to 0 (white), when the agent has no confidence.
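A minimal sketch of how a confidence value in [0, 1] could drive the button's saturation, assuming a simple linear blend from white to green; the function name and RGB convention are illustrative rather than taken from the paper.

    def completion_button_color(confidence):
        """Blend the completion button's color from white (no confidence)
        to fully saturated green (full confidence)."""
        saturation = max(0.0, min(1.0, confidence))
        # White is (255, 255, 255); fully saturated green is (0, 255, 0).
        red = blue = int(round(255 * (1.0 - saturation)))
        return (red, 255, blue)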

  5. Agent • learns to assist users by watching them complete tasks • helps to capture and organize information • Goal and Drives • predict the input that the user will give • learn the pattern • prompt the prediction based on the input • Environment • the hardware and the user’s input string • Perception • the input string • Behind the interface, the software acts on behalf of the user, helping to capture and organize the information
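A rough sketch of the agent's sense-predict-learn loop described on this slide; the class and method names are assumptions made for illustration only.

    class NoteTakingAgent:
        """Watches the user's input string, prompts predictions, and
        learns from each completed note (interface sketch only; learning
        and prediction are detailed in the slides that follow)."""

        def __init__(self, learner, predictor):
            self.learner = learner      # builds FSMs and embedded classifiers
            self.predictor = predictor  # parses partial notes, proposes completions

        def on_keystroke(self, partial_note):
            """Perception: the input string so far.  Returns a
            (completion, confidence) pair for the completion button."""
            return self.predictor.predict(partial_note)

        def on_note_finished(self, note):
            """Update the learned syntax once the user finishes a note."""
            self.learner.update(note)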

  6. Learning a Syntax + to characterize the syntax, the agent learns finite-state machines (FSMs) + to generate predictions, the agent learns decision-tree classifiers situated at states within the FSMs + FSMs are well understood and relatively expressive: Angluin (1982) and Berwick and Pilato (1987) present a straightforward algorithm for learning a specific subclass of FSMs called k-reversible FSMs.
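A minimal sketch of one way the learned syntax could be represented: states with token-labelled arcs, arc counts, and a slot for a classifier at branching states. All class and field names are assumptions; the k-reversible state-merging step itself is not shown.

    from collections import defaultdict

    class State:
        def __init__(self, state_id):
            self.id = state_id
            self.transitions = {}            # token -> next State
            self.counts = defaultdict(int)   # token -> times this arc was taken
            self.terminal_count = 0          # times a note ended at this state
            self.classifier = None           # decision tree, used when the state branches

    class FSM:
        def __init__(self):
            self.start = State(0)
            self.states = [self.start]

        def add_note(self, tokens):
            """Thread one tokenized note through the machine, creating
            states as needed; merging states into k-reversible form
            would follow as a separate step."""
            state = self.start
            for token in tokens:
                if token not in state.transitions:
                    new_state = State(len(self.states))
                    self.states.append(new_state)
                    state.transitions[token] = new_state
                state.counts[token] += 1
                state = state.transitions[token]
            state.terminal_count += 1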

  7. Learning (tokenization, merging, classifiers) • Tokenization example: the note "4096 K PowerBook 170, 1.4MB and 120MB Int. Drives, FPU, 2400/9600 Baud" is split into the tokens :NULL, "4096", " K", " PowerBook", "170, ", ...
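The slide does not spell out the tokenization rule; the regular expression below is an assumed approximation that keeps each run of non-space characters together with the whitespace preceding it, which reproduces the leading-space tokens shown above.

    import re

    def tokenize(note):
        """Split a note into tokens, keeping leading whitespace attached
        to the token that follows it (an assumed approximation of the
        paper's tokenizer, which the slide does not fully specify)."""
        return re.findall(r"\s*\S+", note)

    tokenize("4096 K PowerBook 170, 1.4MB and 120MB Int. Drives, FPU, 2400/9600 Baud")
    # -> ['4096', ' K', ' PowerBook', ' 170,', ' 1.4MB', ' and', ' 120MB', ...]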

  8. Merging Strings (e.g. adding a second pattern)

  9. 3 patterns merged

  10. Example of many notes merged

  11. Notes merged with classifiers (figure showing states 1-7)

  12. Learning Embedded Classifiers + a classifier is embedded at each FSM state that has more than one outgoing transition + the classifier gives advice about which transition to take or whether to terminate: it must decide whether to 1) terminate or 2) continue predicting, and, if continuing, which transition to predict + the classifiers are updated incrementally after the user finishes each note + a classifier predicts based on the transitions taken at previous states and the frequency of the current state’s transitions.
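A sketch of how a state's advice might be computed, reusing the State fields from the slide 6 sketch; the feature encoding (which token was used to exit each earlier state on this parse) follows the slide, but the function and argument names are assumptions.

    def advise(state, exit_tokens_so_far):
        """Return the transition token to predict at `state`, or None to
        terminate.  Prefer the embedded decision tree when one exists;
        otherwise fall back to the most frequent choice at this state."""
        if state.classifier is not None:
            # Features: the tokens used to exit the previously visited states.
            return state.classifier.predict(exit_tokens_so_far)
        choices = dict(state.counts)
        choices[None] = state.terminal_count   # None encodes "terminate here"
        return max(choices, key=choices.get)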

  13. Learning Embedded Classifiers • A decision tree is embedded in each classifier. Examples: • Decision tree embedded in state 3: If state 1 was exited with "2048", then predict " 20"; else if with "4096", then predict " 40"; else if with "6144", then predict " 40"; else if with "8192", then predict " 40" • Decision tree embedded in state 7: If state 7 has not been visited, then predict " FAX"; else if state 7 was exited with " Fax", then predict " Modem"
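The two example trees above, written out as plain functions for readability; the state numbers refer to the merged-FSM figure from the earlier slides.

    def state3_tree(state1_exit_token):
        """Decision tree embedded in state 3 (example from the slide)."""
        if state1_exit_token == "2048":
            return " 20"
        if state1_exit_token in ("4096", "6144", "8192"):
            return " 40"
        return None   # no advice for values not yet seen

    def state7_tree(state7_visited, state7_exit_token=None):
        """Decision tree embedded in state 7 (example from the slide)."""
        if not state7_visited:
            return " FAX"
        if state7_exit_token == " Fax":
            return " Modem"
        return None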

  14. Parsing (how to predict) e.g. the sequence {:NULL, "12288", "K", "PB"} + identify the state requiring the minimum number of insertions, omissions, and replacements necessary to parse the new sequence: "12288" is a novel token, "K" is OK, and "PowerBook" is replaced by "PB" + the initial state has a transition for the first token + state 1 does not have a transition for the next token, "12288", so a greedy search is started to find a state that accepts "12288", "K", or "PB"; the state before state 2 accepts "K" + another greedy search starts from the state that accepted "K", looking for a state that accepts "PB"; "PB" cannot be found, so parsing assumes it should skip to the next transition, "PowerBook" + the system generates a prediction from state 2 to prompt the user.
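A deliberately simplified parse sketch against the FSM/State classes from the slide 6 sketch: it follows exact matches and counts everything else as skipped, whereas the procedure described above also searches ahead over FSM transitions to handle replaced or omitted tokens.

    def parse(fsm, tokens):
        """Greedily parse a tokenized note against an FSM, returning the
        state reached and the number of tokens that had to be skipped."""
        state, skipped = fsm.start, 0
        for token in tokens:
            if token in state.transitions:
                state = state.transitions[token]   # exact match: follow the arc
            else:
                skipped += 1                       # novel or replaced token: skip it
        return state, skipped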

  15. Notes merged with classifiers (figure showing states 1-7)

  16. Contextual Prompting • Calculation of the confidence: confidence = F(prediction) / (F(total) × (1 + skipped)) • F(prediction): the frequency of the predicted arc (the number of times this choice was taken while parsing previously observed notes) • F(total): the total frequency of all arcs (and of terminating) • skipped: the number of tokens skipped during parsing • Stopping Criteria (stop predicting when any of these holds): • the next prediction is to terminate • at least one token has been predicted and the confidence of the prediction is lower • the next prediction is the same as the last prediction • more than 10 tokens have already been predicted
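The formula and stopping tests from this slide, written against the State sketch from slide 6. The slide leaves implicit what the confidence is "lower" than; a drop relative to the previous prediction's confidence is assumed here.

    def confidence(state, predicted_token, skipped):
        """F(prediction) / (F(total) * (1 + skipped)), as on the slide."""
        f_prediction = state.counts.get(predicted_token, 0)
        f_total = sum(state.counts.values()) + state.terminal_count  # all arcs plus terminate
        if f_total == 0:
            return 0.0
        return f_prediction / (f_total * (1 + skipped))

    def should_stop(prediction, last_prediction, tokens_predicted, conf, prev_conf):
        """Stopping criteria from the slide (argument names are illustrative)."""
        return (prediction is None                               # next prediction is to terminate
                or (tokens_predicted >= 1 and conf < prev_conf)  # assumed: confidence dropped
                or prediction == last_prediction                 # prediction repeats itself
                or tokens_predicted > 10)                        # more than 10 tokens predicted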

  17. Multiple FSMs + it may be necessary to learn a separate syntax for each domain, which raises the problem of deciding which notes should be clustered together to share an FSM + tactic: a new note is grouped with the FSM that skips the fewest of its tokens. For example, a new FSM is constructed only if all existing FSMs skip more than half of the new note’s tokens.
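A sketch of this clustering tactic, reusing parse() and FSM from the earlier sketches; the function name and the mutable fsms list are illustrative.

    def choose_fsm(fsms, tokens):
        """Group a new note with the FSM that skips the fewest of its
        tokens; build a new FSM only if every existing one skips more
        than half of the note's tokens."""
        best, best_skipped = None, None
        for fsm in fsms:
            _, skipped = parse(fsm, tokens)
            if best_skipped is None or skipped < best_skipped:
                best, best_skipped = fsm, skipped
        if best is None or best_skipped > len(tokens) / 2:
            best = FSM()          # no existing machine fits well enough
            fsms.append(best)
        return best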
