
Recognizing Complex Human Activities – From Top Down to Bottom Up and Back


Presentation Transcript


  1. Recognizing Complex Human Activities – From Top Down to Bottom Up and Back. Dr.-Ing. Ulf Blanke, Wearable Computing Lab | ETH Zürich. Samsung, Jul 25, 2014

  2. Vita. 2001-2006: Dipl. (M.Sc.) Informatik, TU Darmstadt. 2007-2011: PhD, Multimodal Interactive Systems Group, TU Darmstadt - 3-year scholarship, German Research Foundation - Prof. Dr. Bernt Schiele. Post-Doc, Max Planck Institute for Informatics, Saarbrücken - Computer Vision and Multimodal Computing. 2011-2012: Senior Researcher at AGT International (R&D Division) - integrated safety and security solutions - headquarters: Switzerland, R&D: Darmstadt. 2012-...: Senior Scientist and Pioneer Fellow at ETH-Z - Wearable Computing Lab, Prof. Gerhard Tröster. Ulf Blanke | Recognizing complex human activities – From Top Down to Bottom Up and Back | 2

  3. Overview. Composite activities - challenges; discovering and combining relevant events (LoCA09); transferring and recombining relevant events (ISWC10). Overview of current projects.

  4. Composite Activities. Recognizing composite activities by decomposition into isolated activity events. • Excellent work addresses isolated activity recognition • Only little work on composite activities. (figure: data from wearable sensors over time)

  5. Challenges: Atomic activity events. Intra-class variability and inter-class similarity; a large corpus of irrelevant and ambiguous data. Similarity across different activities, variability within an activity. (figure: drilling vs. screwing over time)

  6. Challenges: Composite activities. Variation (duration; irrelevant data, e.g. caused by interruptions); changing order of the underlying activity events. Other challenges (later): towards less supervision.

  7. Recognizing Composite Activities: One way of doing it. (figure: hierarchy from sensor data through layers L1, 2, and 3 up to the composite activity)

  8. Recognizing Composite Activities: Learning relevant events. Which low-level events are important for composite activities? Learning (automatic selection). (figure: sensor data → low-level events such as walking, picking up food, eating, doing dishes, preparing food → composite activities lunch and dinner)

  9. Recognizing Composite Activities: Spotting and combining relevant events. Is recognizing composite activities by activity spotting feasible? (figure: sensor data → activity spotting of walking, picking up food, eating, doing dishes, preparing food → composite activities lunch and dinner)

  10. Research Questions: Activity spotting for composite activities. (1) Can we learn distinctive parts of composite activities? (2) Can we gain computational efficiency by reducing to the data important for recognition?

  11. Approach. Bottom-up pipeline: sensor data → feature calculation → k-means clustering → histogram calculation → joint boosting → composite activities.
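The middle of this pipeline turns low-level feature vectors into cluster-occurrence histograms that the boosting stage consumes. A minimal sketch of that step, assuming precomputed k-means centroids and hard assignments (function and variable names are illustrative, not from the talk):

```python
import numpy as np

def cluster_histogram(feats, centroids, hist_win):
    """Hard-assign each feature vector to its nearest k-means centroid,
    then count cluster occurrences over blocks of hist_win vectors.
    feats: (n, d) low-level features; centroids: (k, d) cluster centres.
    Returns (n_blocks, k) normalized cluster histograms."""
    # Squared distance of every feature vector to every centroid.
    d2 = ((feats[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = d2.argmin(axis=1)                 # nearest-centroid index
    n = (len(labels) // hist_win) * hist_win   # drop a trailing partial block
    return np.array([np.bincount(b, minlength=len(centroids)) / hist_win
                     for b in labels[:n].reshape(-1, hist_win)])
```

Each histogram summarizes which movement clusters occurred in one long window, so the boosting stage can reason about routines rather than raw samples.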

  12. Low-Level Activity Selection: (Joint) Boosting. (1) Combination of low-level activities to infer high-level activities (e.g. lunch, dinner, others). (2) Automatic selection of the most discriminative low-level activities - boosting (Friedman 2000). (3) Sharing features (i.e. low-level activities) across high-level activities - JointBoosting (Torralba 2004).
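What one boosting round does here can be sketched as exhaustively picking the most discriminative decision stump under the current sample weights. This is a plain binary round for illustration only; the JointBoosting of Torralba et al. additionally shares the selected stumps across several high-level classes:

```python
import numpy as np

def best_stump(X, y, w):
    """One boosting round: pick the decision stump (feature, threshold,
    polarity) with the lowest weighted classification error.
    X: (n, d) features; y: labels in {-1, +1}; w: (n,) sample weights."""
    best = (None, None, None, np.inf)          # (feat, thr, sign, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):         # candidate thresholds
            for sign in (+1, -1):              # both polarities
                pred = np.where(X[:, j] > thr, sign, -sign)
                err = w[pred != y].sum()       # weighted error
                if err < best[3]:
                    best = (j, thr, sign, err)
    return best
```

In the talk's setting, each "feature" would be a cluster-histogram bin, so the selected stumps directly name the low-level activities most discriminative for a routine.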

  13. Experimental Setup: Dataset. • 2 acceleration sensors (pocket and wrist) • 7 days of the life of a single person (Huynh08) • Two layers of annotation • 4 high-level routines (commuting, working, lunch, dinner), more than 20 low-level activities (e.g. walking, eating, having a coffee, standing in line)

  14. Experimental Setup: Fixed parameters. • Mean and variance over a 0.4 s window on (x,y,z)-acceleration of pocket and wrist • K-means clusters on these features • Histograms over a 30 min window • Joint boosting for the high-level activities (figure example: doing dishes)
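The fixed low-level features (per-axis mean and variance over 0.4 s windows) can be sketched as follows, assuming a sampling rate of 50 Hz, which the slide does not state:

```python
import numpy as np

def window_features(acc, fs=50, win_s=0.4):
    """Mean and variance per axis over non-overlapping windows.
    acc: (n_samples, 3) array of (x, y, z) acceleration.
    fs:  sampling rate in Hz (an assumption, not given on the slide).
    Returns (n_windows, 6): three per-axis means, then three variances."""
    win = int(fs * win_s)                # samples per 0.4 s window
    n = (len(acc) // win) * win          # drop a trailing partial window
    w = acc[:n].reshape(-1, win, 3)      # (n_windows, win, 3)
    return np.hstack([w.mean(axis=1), w.var(axis=1)])
```

The same function would be applied once per sensor (pocket and wrist), and the resulting vectors fed to k-means.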

  15. Results. (figure: precision and recall, and the amount of data used, in %, vs. the number of classifiers, for cluster hard assignments and cluster soft assignments) → Observed data is reduced dramatically at superior performance.

  16. Results: Which low-level activities are used? Distribution of low-level labels per selected cluster (for the routines dinner, commute, lunch, and work): • Cluster 36: walking (99.23%) • Cluster 6: driving car (21.71%), sitting/desk activities (47.24%), driving bike (16.76%) • Cluster 42: sitting/desk activities (97.86%) • Cluster 48: driving car (32.90%), sitting/desk activities (31.20%) • Cluster 29: walking (96.09%) • Cluster 13: driving bike (47.86%), walking (22.51%), picking up food (16.81%) • Cluster 53: queuing in line (43.86%), picking up food (14.59%)

  17. Overview. Composite activities - challenges; discovering and combining relevant events (LoCA09); transferring and recombining relevant events (ISWC10). Overview of current projects.

  18. Composite Activities: Knowledge transfer. Can knowledge from composites C1 and C2 be reused for a new composite? Work on knowledge transfer: (Zheng09), (Kasteren10), (Banos12), … - different aspects of transferring knowledge. Here: “Partonomy” (Miller & Johnson-Laird 76, Tversky 90, …) - borrowed from object perception - the relationship between sub-parts.

  19. Research Questions. For composite activity recognition… (1) Does a partonomy approach improve the state of the art? (2) Can we transfer knowledge of activity events to learn and recognize new activities with minimal training? (3) Can we use composition knowledge to improve recognition of the underlying activity events?

  20. Partonomy-Based Activity Recognition: Ln-composite activity modeling. Bottom-up “construction” of a hierarchy with multiple layers: data → layer L1 (event scores x from step 1) → L2-composites (layer 2) → L3-composites (layer 3) → … → Ln-composite (layer n).

  21. Spotting atomic activities: Pipeline. Sensors deliver a raw data stream with ground-truth segmentation → (1) feature calculation (e.g. mean, var, FFT) → (2) classifier. Output per segment s: the central time t of the segment and the normalized confidences of the event classes U.
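The slide leaves the normalization of the per-segment confidences unspecified; a softmax over the class scores is one common choice, assumed here as a sketch:

```python
import numpy as np

def normalized_confidences(raw_scores):
    """Softmax-normalize per-segment classifier scores so that the
    confidences over the event classes U sum to one for each segment s.
    raw_scores: (n_segments, n_classes) array of raw classifier scores."""
    # Subtract the row maximum for numerical stability before exponentiating.
    z = np.exp(raw_scores - raw_scores.max(axis=1, keepdims=True))
    return z / z.sum(axis=1, keepdims=True)
```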

  22. Step 2: Ln-Composite Activity Modeling with a Conditional Random Field. Unary potentials: scores of the individual L1 activity events. Pairwise potentials: co-occurrence of relevant events (temporal distance and class of the events). The probability for the composite model: the right events, combined at the right time. (figure: factor graph linking the composite label y, event variables z0…z3, and observations x0…x3 over time t)
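How the unary and pairwise potentials combine can be illustrated with the unnormalized log-score of a single composite hypothesis. A real CRF normalizes exp(score) over all labelings; this sketch only shows the "right events at the right time" structure, with illustrative names not taken from the paper:

```python
import itertools

def composite_score(events, unary, pairwise):
    """Unnormalized log-score of one composite-activity hypothesis.
    events:   list of (time, class) spotted L1 events.
    unary:    dict (time, class) -> spotted-event confidence (log-space).
    pairwise: f(dt, c1, c2) -> compatibility of classes c1, c2 occurring
              dt apart (assumed learned from co-occurrence statistics)."""
    s = sum(unary[e] for e in events)                    # unary potentials
    for (t1, c1), (t2, c2) in itertools.combinations(events, 2):
        s += pairwise(t2 - t1, c1, c2)                   # pairwise potentials
    return s
```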

  23. Output of Step 2: before non-maximum suppression.
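Non-maximum suppression over spotted events can be sketched greedily: keep the strongest detection and discard weaker ones closer than a minimum temporal distance. The exact variant used in the talk is not specified; this is a common baseline:

```python
def nms_1d(detections, min_dist):
    """Greedy 1-D non-maximum suppression over spotted events.
    detections: list of (time, score) candidates for one event class.
    min_dist:   minimum temporal separation between surviving events.
    Keeps locally strongest detections so each true event is reported once."""
    kept = []
    for t, s in sorted(detections, key=lambda d: -d[1]):  # strongest first
        if all(abs(t - tk) >= min_dist for tk, _ in kept):
            kept.append((t, s))
    return sorted(kept)  # back in temporal order
```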

  24. Experiments: Bookshelf Dataset. • 5 Xsens IMUs at the upper body • 10 subjects • 6 L2-composite activities (figure: L2-composites such as “join 2 parts”, “make back part”, “assemble box”, decomposed into L1 events)

  25. Experiments: Results on the Bookshelf Dataset. • Partonomy vs. single-layer for L2-composites • Joint boosting, sliding window • Leave-one-subject-out cross-validation. (figure: avg. EER for all 6 classes (83%, 85%, 72%, 62%) vs. the number of training samples) • Reduction with fewer training samples: single-layer -10%, partonomy -2%
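The equal error rate (EER) reported here is the operating point where false-positive and false-negative rates coincide. A threshold-sweep approximation, with illustrative names:

```python
import numpy as np

def equal_error_rate(scores, labels):
    """Approximate a detector's EER by sweeping the decision threshold
    and returning the mean error where FPR and FNR are closest.
    scores: detector confidences; labels: 1 = positive, 0 = negative."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    candidates = []
    for thr in np.unique(scores):
        fnr = (pos < thr).mean()     # positives rejected at this threshold
        fpr = (neg >= thr).mean()    # negatives accepted at this threshold
        candidates.append((abs(fnr - fpr), (fnr + fpr) / 2))
    return min(candidates)[1]        # error at the most balanced threshold
```

In a leave-one-subject-out protocol, this would be computed per held-out subject and per class, then averaged, as on the slide.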

  26. Research Questions. For composite activity recognition… (1) Does a partonomy approach improve the state of the art? (2) Can we transfer knowledge of activity events to learn and recognize new activities with minimal training? (3) Can we use composition knowledge to improve recognition of the underlying activity events?

  27. Experiments: Bookshelf Dataset. • 5 Xsens IMUs at the upper body • 10 subjects • 6 L2-composite activities (figure: L2-composites such as “join 2 parts”, “make back part”, “assemble box”, decomposed into L1 events)

  28. Experiments: Target Dataset (mirror). • 5 Xsens IMUs at the upper body • 6 subjects • 10 L1 events, 6 L2-composites and 4 L3-composites (figure: L3-composites such as “finish back part” and “prepare frames”, built from L2-composites such as “join 2 parts” (4x) and “prepare backside”, which decompose into L1 events)

  29. Experiments: L2-composite activities. (figure: 1 - equal error rate per L1 event (mark, drill, screw, hammer, hang up) for the composites “fix side frame” and “hanging up on wall”; values range from 26% to 100%) • With transferred activity event detectors, composite activity recognition is possible with minimal training.

  30. Contribution and Conclusion. Discovering and combining events (LoCA09): • automatic discovery of relevant parts using joint boosting • efficient method that outperforms the approach using all data. Transferring and recombining events (ISWC10): • outperforms the direct approach • knowledge transfer is possible • improves lower-level recognition.

  31. Today… Selected Publications.
User Independent, Multi-Modal Spotting of Subtle Arm Actions with Minimal Training Data. G. Bauer, U. Blanke, P. Lukowicz, and B. Schiele (10th PerCom Workshop, CoMoRea 2013), IEEE.
South by South-East or Sitting at the Desk: Can Orientation be a Place? U. Blanke, R. Rehner and B. Schiele (ISWC 2011), IEEE.
Remember and Transfer what you have Learned - Recognizing Composite Activities based on Activity Spotting. U. Blanke and B. Schiele (ISWC 2010), IEEE.
Towards Human Motion Capturing using Gyroscopeless Orientation Estimation. U. Blanke and B. Schiele (ISWC 2010), IEEE.
Visualizing Sleeping Trends from Postures. M. Borazio, U. Blanke and K. Van Laerhoven (ISWC 2010), IEEE.
All for One or One for All? - Combining Heterogeneous Features for Activity Spotting. U. Blanke, B. Schiele, M. Kreil, P. Lukowicz, B. Sick and T. Gruber (CoMoRea in conj. with PerCom 2010), IEEE.
An Analysis of Sensor-Oriented vs. Model-Based Activity Recognition. A. Zinnen, U. Blanke and B. Schiele (ISWC 2009), IEEE.
Daily Routine Recognition through Activity Spotting. U. Blanke and B. Schiele (LoCA 2009), Springer.
Sensing Location in the Pocket. U. Blanke and B. Schiele (Ubicomp 2008, adjunct proceedings).
Scalable Recognition of Daily Activities with Wearable Sensors. Tâm Huynh, U. Blanke and B. Schiele (LoCA 2007), Springer.

  32. Overview. Composite activities - challenges; discovering and combining relevant events (LoCA09); transferring and recombining relevant events (ISWC10). Overview of current projects.

  33. Collective crowd behavior: GPS, 1M visitors.

  34. Supervised projects (towards less supervision): activities, travel purposes, places; Parkinson's disease.

  35. Past projects: place detection.

  36. Past projects: sleep studies.

  37. Co-Supervision of PhD Students. PhD students: Sinziana Mazilu, Zack Zhu, Long-Van Nguyen-Dinh. Development team (project Züri Fäscht): Robin Guldener, Sascha Negele, William Ross, David Bannach, Dominik Riehm, Tobias Franke, Kelly Streich, Enes Poyarez, Torben Schnuchel.

  38. Thank you for your kind attention.
