
Learning on the Fly: Rapid Adaptation to the Image


Presentation Transcript


  1. Learning on the Fly: Rapid Adaptation to the Image Erik Learned-Miller with Vidit Jain, Gary Huang, Laura Sevilla Lara, Manju Narayana, Ben Mears

  2. “Traditional” machine learning • Learning happens from large data sets • With labels: supervised learning • Without labels: unsupervised learning • Mixed labels: semi-supervised learning, transfer learning, learning from one (labeled) example, self-taught learning, domain adaptation

  3. Learning on the Fly • Given: • A learning machine trained with traditional methods • A single test image (no labels) • Learn from the test image!

  4. Learning on the Fly • Given: • A learning machine trained with traditional methods • A single test image (no labels) • Learn from the test image! • Domain adaptation where the “domain” is the new image • No covariate shift assumption • No new labels

  5. An Example in Computer Vision • Parsing Images of Architectural Scenes, Berg, Grabler, and Malik, ICCV 2007. • Detect easy or “canonical” stuff. • Use easily detected stuff to bootstrap models of harder stuff.

  6. Claim • This is so easy and routine for humans that it’s hard to realize we’re doing it. • Another example…

  7. Learning on the fly…

  8. Learning on the fly…

  9. Learning on the fly…

  10. What about traditional methods… • Hidden Markov Model for text recognition: • Appearance model for characters • Language model for labels • Use Viterbi to do joint inference
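
Slide 10's joint inference is standard Viterbi decoding over an HMM. Below is a minimal numpy sketch; the array shapes, variable names, and the bigram language model are illustrative assumptions, not the talk's implementation:

```python
import numpy as np

def viterbi(log_appearance, log_transition, log_prior):
    """Most likely label sequence for an HMM over characters.

    log_appearance: (T, K) log P(observed glyph t | label k) -- appearance model
    log_transition: (K, K) log P(label j follows label i)    -- language model
    log_prior:      (K,)   log P(first label is k)
    """
    T, K = log_appearance.shape
    delta = log_prior + log_appearance[0]          # best score ending in each label
    back = np.zeros((T, K), dtype=int)             # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_transition   # (prev label i) x (next label j)
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(K)] + log_appearance[t]
    path = [int(np.argmax(delta))]                 # trace the best path backwards
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```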

  11. What about traditional methods… • Hidden Markov Model for text recognition: • Appearance model for characters • Language model for labels • Use Viterbi to do joint inference • DOESN’T WORK! Prob(character image | Label = A) cannot be well estimated, fouling up the whole process.

  12. Lessons • We must assess when our models are broken, and use other methods to proceed… • Current methods of inference assume probabilities are correct! • “In vision, probabilities are often junk.” • Related to similarity becoming meaningless beyond a certain distance.

  13. 2 Examples • Face detection (CVPR 2011) • OCR (CVPR 2010)

  14. Preview of results: finding false negatives (comparison figures: Viola-Jones vs. Learning on the Fly)

  15. Eliminating false positives (comparison figures: Viola-Jones vs. Learning on the Fly)

  16. Eliminating false positives (comparison figures: Viola-Jones vs. Learning on the Fly)

  17. Run a pre-existing detector...

  18. Run a pre-existing detector... (figure key: face, non-face, close to boundary)

  19. Gaussian Process Regression • Learn a smooth mapping from appearance to score, using confident positive and negative patches • Apply the mapping to borderline patches
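
A minimal sketch of the idea on slide 19, assuming an RBF kernel and hypothetical score thresholds t_lo/t_hi for picking confident patches (the paper's actual kernel, features, and threshold choices may differ):

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_rescore(X_conf, y_conf, X_border, noise=1e-2):
    """GP posterior mean: re-score borderline patches from confident ones.

    X_conf:   features of patches far from the decision boundary
    y_conf:   their labels (+1 face, -1 non-face) or detector scores
    X_border: features of patches close to the boundary
    """
    K = rbf_kernel(X_conf, X_conf)
    alpha = np.linalg.solve(K + noise * np.eye(len(X_conf)), y_conf)
    return rbf_kernel(X_border, X_conf) @ alpha

# Hypothetical usage with detector scores from slides 17-18:
# conf = (scores > t_hi) | (scores < t_lo)          # confident face / non-face
# labels = np.where(scores[conf] > t_hi, 1.0, -1.0)
# new_scores = gp_rescore(feats[conf], labels, feats[~conf])
```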

  20. Major Performance Gains

  21. Comments • No need to retrain original detector • It wouldn’t change anyway! • No need to access original training data • Still runs in real-time • GP regression is done for every new image.

  22. Noisy Document → Initial Transcription (OCR errors preserved from the noisy input): “We fine herefore tlinearly rolatcd to thewhen this is calculated equilibriurn. In short,on the null-hypothesis:”

  23. Premise • We would like to find confident words to build a document-specific model, but it is difficult to estimate Prob(error). • However, we can bound Prob(error). • Now, select words with Prob(error) < ε.
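
As a sketch, the selection rule is a one-line filter; error_bound here is a hypothetical stand-in for the bound derived in the CVPR 2010 paper, which is not reproduced:

```python
def clean_set(words, error_bound, epsilon=0.01):
    """Keep only words whose bounded error probability is below epsilon."""
    return [w for w in words if error_bound(w) < epsilon]
```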

  24. “Clean Sets”

  25. Document-specific OCR • Extract clean sets (error-bounded sets) • Build document-specific models from clean-set characters • Reclassify other characters in the document • 30% error reduction on 56 documents.
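
A minimal sketch of this pipeline, with a nearest-mean classifier standing in for the paper's document-specific character models (the features and the clean-set mask are assumed to be given):

```python
import numpy as np

def reclassify_with_document_models(feats, labels, in_clean_set):
    """Build per-character templates from clean-set characters, then
    reclassify the remaining characters against those templates."""
    templates = {}
    clean_idx = [i for i, c in enumerate(in_clean_set) if c]
    # One document-specific template per label seen in the clean set.
    for lab in {labels[i] for i in clean_idx}:
        rows = [feats[i] for i in clean_idx if labels[i] == lab]
        templates[lab] = np.mean(rows, axis=0)
    out = list(labels)
    # Re-label every non-clean character by its nearest template.
    for i, clean in enumerate(in_clean_set):
        if not clean and templates:
            out[i] = min(templates,
                         key=lambda lab: np.linalg.norm(feats[i] - templates[lab]))
    return out
```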

  26. Summary • Many applications of learning on the fly. • Adaptation and bootstrapping new models is more common in human learning than is generally believed. • Starting to answer the question: “How can we do domain adaptation from a single image?”
