
Transfer for Supervised Learning Tasks


Presentation Transcript


  1. Transfer for Supervised Learning Tasks HAITHAM BOU AMMAR MAASTRICHT UNIVERSITY

  2. Motivation • Assumptions: • Training and test data come from the same distribution • Training and test data are in the same feature space • Not true in many real-world applications (Diagram: a model is trained on the training data and applied to the test data)

  3. Examples: Web-document Classification (Diagram: a model classifies web documents into Physics, Machine Learning, Life Science)

  4. Examples: Web-document Classification (Physics, Machine Learning, Life Science) • Content change! Assumption violated! • The old model no longer applies, so learn a new model

  5. Learn a new model: • Collect new labeled data • Build a new model • Alternative: reuse and adapt the already learned model!

  6. Examples: Image Classification (Diagram: Task One data yields Features for Task One, which feed Model One)

  7. Examples: Image Classification (motorcycles vs. cars) (Diagram: the features from Task One are reused for Task Two to build Model Two)

  8. Traditional Machine Learning vs. Transfer (Diagram: in traditional machine learning, each task, source and target, gets its own learning system; in transfer learning, knowledge from the source task's learning system is passed to the target task's learning system)

  9. Questions ?

  10. Transfer Learning Definition • Notation: • Domain: D = {𝒳, P(X)} • Feature space: 𝒳 • Marginal probability distribution: P(X), with X = {x_1, …, x_n} ⊂ 𝒳 • Given a domain D, a task is T = {𝒴, P(Y|X)}, where 𝒴 is the label space and P(Y|X) the predictive distribution

  11. Transfer Learning Definition Given a source domain D_S and source learning task T_S, and a target domain D_T and target learning task T_T, transfer learning aims to help improve the learning of the target predictive function f_T(·) using the source knowledge, where D_S ≠ D_T or T_S ≠ T_T

  12. Transfer Definition • Therefore, transfer is in play if either: • Domain differences: D_S ≠ D_T, i.e. 𝒳_S ≠ 𝒳_T or P(X_S) ≠ P(X_T) • Task differences: T_S ≠ T_T, i.e. 𝒴_S ≠ 𝒴_T or P(Y_S|X_S) ≠ P(Y_T|X_T) (a LaTeX restatement follows below)
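For reference, a compact restatement of slides 10 to 12. This is a reconstruction in the standard transfer-learning survey notation; the script symbols are assumptions, since the slide images with the original formulas are not part of the transcript.

```latex
% Reconstruction of the definition on slides 10--12 in standard survey notation.
A domain is $\mathcal{D} = \{\mathcal{X},\, P(X)\}$ and, given a domain, a task
is $\mathcal{T} = \{\mathcal{Y},\, P(Y \mid X)\}$. Given a source pair
$(\mathcal{D}_S, \mathcal{T}_S)$ and a target pair $(\mathcal{D}_T, \mathcal{T}_T)$,
transfer learning aims to improve the learning of the target predictive function
$f_T(\cdot)$ using the knowledge in $\mathcal{D}_S$ and $\mathcal{T}_S$, where
\[
  \mathcal{D}_S \neq \mathcal{D}_T
  \;\;\bigl(\mathcal{X}_S \neq \mathcal{X}_T \ \text{or}\ P(X_S) \neq P(X_T)\bigr)
  \quad \text{or} \quad
  \mathcal{T}_S \neq \mathcal{T}_T
  \;\;\bigl(\mathcal{Y}_S \neq \mathcal{Y}_T \ \text{or}\ P(Y_S \mid X_S) \neq P(Y_T \mid X_T)\bigr).
\]
```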

  13. Examples: Cancer Data (Diagram: one data set with features Smoking and Age, another with features Smoking, Age, and Height)

  14. Examples: Cancer Data • Source task: classify into cancer or no cancer • Target task: classify into cancer level one, cancer level two, or cancer level three

  15. Quiz When doesn’t transfer help ? (Hint: On what should you condition?)

  16. Questions ?

  17. Questions to answer when transferring • When to transfer? In which situations? • What to transfer? Instances? Features? A model? • How to transfer? Weight instances? Unify features? Map the model?

  18. Algorithms: TrAdaBoost • Assumptions: • Source and target tasks have the same feature space: 𝒳_S = 𝒳_T • Marginal distributions are different: P(X_S) ≠ P(X_T) • Consequently, not all source data might be helpful!

  19. Algorithms: TrAdaBoost (Quiz) • What to transfer? Instances • How to transfer? Weight instances

  20. Algorithm: TrAdaBoost • Idea: Iteratively reweight the source samples so as to: • reduce the effect of “bad” source instances • encourage the effect of “good” source instances • Requires: • Source task labeled data set • Very small target task labeled data set • Unlabeled target data set • Base learner

  21. Algorithm: TrAdaBoost • Weights initialization • Hypothesis learning and error calculation • Weights update (the last two steps are iterated; a sketch follows below)
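The slides do not reproduce the update formulas, so here is a minimal Python sketch of the TrAdaBoost loop as published by Dai et al. (2007), assuming binary labels in {0, 1} and a decision stump as the base learner. The data arrays, variable names, and hyperparameters are illustrative assumptions, not the author's implementation.

```python
# Minimal sketch of the TrAdaBoost reweighting loop (Dai et al., 2007).
# Assumes binary labels in {0, 1}; the base learner is an illustrative choice.
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # any learner accepting sample_weight

def tradaboost(X_src, y_src, X_tgt, y_tgt, n_iters=20):
    n, m = len(X_src), len(X_tgt)
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])

    w = np.ones(n + m)                          # weights initialization (slide 21)
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n) / n_iters))
    learners, betas = [], []

    for t in range(n_iters):
        p = w / w.sum()
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=p)
        pred = h.predict(X)

        # error is measured on the (small) labeled target set only
        err_tgt = np.abs(pred[n:] - y_tgt)
        eps = np.sum(p[n:] * err_tgt) / p[n:].sum()
        eps = min(max(eps, 1e-10), 0.499)       # keep beta_t well defined
        beta_t = eps / (1.0 - eps)

        # weights update: down-weight misclassified ("bad") source instances,
        # up-weight misclassified target instances
        err_src = np.abs(pred[:n] - y_src)
        w[:n] *= beta_src ** err_src
        w[n:] *= beta_t ** (-err_tgt)

        learners.append(h)
        betas.append(beta_t)

    def predict(X_new):
        # final hypothesis: weighted vote over the second half of the iterations
        half = n_iters // 2
        score = sum(-np.log(betas[t]) * learners[t].predict(X_new)
                    for t in range(half, n_iters))
        thresh = 0.5 * sum(-np.log(betas[t]) for t in range(half, n_iters))
        return (score >= thresh).astype(int)

    return predict
```

The design choice to note: the error (and hence beta_t) is computed on the small labeled target set only, while the fixed factor beta_src steadily shrinks the weight of source instances the current hypothesis gets wrong, which is exactly the "reduce bad, encourage good" idea of slide 20.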

  22. Questions ?

  23. Algorithms: Self-Taught Learning (images: motorcycles, cars, natural scenes) • Problem targeted: • Little labeled data • A lot of unlabeled data • Goal: build a model on the labeled data

  24. Algorithms: Self-Taught Learning • Assumptions: • Source and target tasks have different feature spaces: 𝒳_S ≠ 𝒳_T • Marginal distributions are different: P(X_S) ≠ P(X_T) • Label spaces are different: 𝒴_S ≠ 𝒴_T

  25. Algorithms: Self-Taught Learning (Quiz) • What to transfer? Features • How to transfer? 1. Discover features 2. Unify features

  26. Algorithms: Self-Taught Learning • Framework: • Source unlabeled data set: natural scenes • Target labeled data set: cars and motorbikes • Goal: build a classifier for cars and motorbikes

  27. Algorithms: Self-Taught Learning • Step One: Discover high-level features from the source data by solving minimize over bases b_j and activations a_i: Σ_i ||x_i − Σ_j a_i^j b_j||² + β ||a_i||_1, subject to ||b_j|| ≤ 1 for all j • The first term is the re-construction error, the β-weighted term is the regularization term, and the norm bound is the constraint on the bases (a code sketch follows below)
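As a concrete illustration of Step One, here is a rough sketch using scikit-learn's dictionary learning as a stand-in for the sparse-coding objective above (reconstruction error plus an L1 sparsity penalty, with norm-constrained bases). The data shapes, sizes, and parameter values are placeholder assumptions.

```python
# Step One sketch: learn high-level bases from unlabeled source data.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
X_unlabeled = rng.standard_normal((5000, 64))   # stand-in for natural-scene patches

dico = MiniBatchDictionaryLearning(
    n_components=128,               # number of high-level bases b_j
    alpha=1.0,                      # beta: weight of the L1 regularization term
    transform_algorithm="lasso_lars",
    random_state=0,
)
dico.fit(X_unlabeled)               # learns the bases (the "high-level features")
bases = dico.components_            # shape (128, 64)
```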

  28. Algorithm: Self-Taught Learning (Diagram: high-level features discovered from the unlabeled data set)

  29. Algorithm: Self-Taught Learning • Step Two: Project the target data onto the attained features by computing â(x) = argmin_a ||x − Σ_j a^j b_j||² + β ||a||_1 • Informally, find the activations in the attained bases such that: • the re-construction error is minimized • the attained activation vector is sparse
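Continuing the sketch from Step One, Step Two keeps the learned bases fixed and computes sparse activations for the labeled target data; `dico` and `rng` come from the previous block, and `X_target` is a placeholder assumption.

```python
# Step Two sketch (continues the Step One block): with the bases fixed,
# compute sparse activations for the labeled target data.
X_target = rng.standard_normal((200, 64))   # stand-in for car / motorbike images
A_target = dico.transform(X_target)         # sparse codes, shape (200, 128)
# transform() solves a lasso problem per example: minimize the reconstruction
# error while keeping the activation vector sparse, matching the slide's
# informal description.
```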

  30. Algorithms: Self-Taught Learning (Diagram: the high-level features)

  31. Algorithms: Self-Taught Learning • Step Three: Learn a classifier with the new features (Diagram: source task → learn new features (Step 1); target task → project target data (Step 2) → learn model (Step 3); a sketch of this last step follows below)
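Completing the sketch, Step Three trains an ordinary classifier on the sparse activations from Step Two; the labels and the choice of logistic regression are illustrative assumptions.

```python
# Step Three sketch (continues the previous blocks): train a standard
# classifier on the sparse activations.
from sklearn.linear_model import LogisticRegression

y_target = rng.integers(0, 2, size=200)     # placeholder car / motorbike labels
clf = LogisticRegression(max_iter=1000).fit(A_target, y_target)

# A new target example x_new would be classified by encoding it first:
#   clf.predict(dico.transform(x_new.reshape(1, -1)))
```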

  32. Conclusions • Transfer learning re-uses source knowledge to help a target learner • Transfer learning is not the same as generalization • TrAdaBoost transfers instances • Self-Taught Learning transfers features learned from unlabeled data

  33. hmm
