Mobile devices can now observe daily activities continuously, creating a need for models that recognize interleaved activities. This study improves such recognition using Interleaved Hidden Markov Models (IHMMs) and compares their effectiveness against Conditional Random Fields (CRFs). Through a detailed exploration of the Interleaved HMM, the researchers demonstrate an approach that enhances multi-task activity recognition and achieves notable performance with minimal training data. A comparison of results obtained with Interleaved HMMs versus CRFs highlights the advantages of the former in practical scenarios. Overall, Interleaved HMMs show promise as a reliable tool for accurate activity recognition in real-time environments with limited training resources.
Improving the Recognition of Interleaved Activities Joseph Modayil, Tongxin Bai, Henry Kautz
Background • With the availability of cheaper and more ubiquitous sensors, mobile devices can observe people’s activities continuously • This supports many emerging applications and makes the problem a meaningful area of research
Goal • People often multitask as they perform activities of daily living, switching between many different activities • Learn a model that recognizes which activities are being performed, given a sequence of observations
Markov Model • The probabilistic description is truncated to the current and the predecessor state: P(q_t | q_{t-1}, q_{t-2}, …, q_1) = P(q_t | q_{t-1})
A Sample Markov Model • Consider a simple 3-state Markov model of the weather
A Sample Markov Model • Given that the weather on day 1 is sunny (state 3), what’s the probability that the weather for the next 7 days will be “sun-sun-rain-rain-sun-cloudy-sun”?
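The computation behind this question is just a product of transition probabilities along the sequence. A minimal sketch, using a hypothetical 3-state transition matrix (the slide's actual numbers were not preserved):

```python
# Hypothetical weather transition matrix (illustrative values, not the
# slide's): states 0 = rain, 1 = cloudy, 2 = sun; row i gives P(next | i).
A = [
    [0.4, 0.3, 0.3],  # rain   -> rain, cloudy, sun
    [0.2, 0.6, 0.2],  # cloudy -> rain, cloudy, sun
    [0.1, 0.3, 0.6],  # sun    -> rain, cloudy, sun
]

def sequence_probability(start, states, A):
    """P(states | day 1 = start) under the first-order Markov assumption:
    multiply the transition probability of each consecutive pair."""
    p = 1.0
    prev = start
    for s in states:
        p *= A[prev][s]
        prev = s
    return p

# "sun-sun-rain-rain-sun-cloudy-sun" given that day 1 is sunny (state 2):
p = sequence_probability(2, [2, 2, 0, 0, 2, 1, 2], A)
print(p)  # 0.6*0.6*0.1*0.4*0.3*0.3*0.2, approximately 2.59e-4
```

Note how each factor depends only on the previous state, which is exactly the truncation the Markov property provides.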
Hidden Markov Models • Markov Model: each state corresponds to an observable event. • Hidden Markov Model: a doubly embedded stochastic process with an underlying stochastic process that is not observable (hidden state)
Hidden Markov Models • Characterized by the following: State transition probability distribution A = {a_ij}, a_ij = P(q_{t+1} = S_j | q_t = S_i) Observation symbol probability distribution in state j: B = {b_j(k)}, b_j(k) = P(o_t = v_k | q_t = S_j) The initial state distribution π = {π_i}, π_i = P(q_1 = S_i)
Interleaved HMM • Activities: a_t ∈ A Observations: object readings o_t ∈ O Transition probabilities: P(a_t | a_{t-1}) Emission probabilities: P(o_t | a_t)
Interleaved HMM • The probability of the most likely state sequence: max over a_1 … a_T of ∏_t P(a_t | a_{t-1}) P(o_t | a_t)
Interleaved HMM • Hypothesis space: H • For a normal HMM, H = A, with transition probabilities P(a_t | a_{t-1}) and emission probabilities P(o_t | a_t) • Why is this not enough? When activities interleave, a plain activity state carries no memory of where each interrupted activity left off
Interleaved HMM • Each state consists of the current activity and a record of the last object observed while performing each activity • The state space is S = A × L, where L is a Cartesian product of |A| copies of O • The hypothesis at time t is given by h_t = (a_t, l_t) ∈ S
Interleaved HMM • We denote an element by (a, l_1, …, l_|A|), where l_i indicates the last object observed in activity i • The emission probability is then conditioned on the current activity and its last recorded object: P(o_t | a_t, l_{a_t}), with id(x, y) = 1 if x = y and 0 otherwise • The transition probability is P(a' | a) times id(·,·) terms enforcing that the record l' equals l with only the emitting activity's slot updated to the observed object; it is 0 for any other record
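A toy sketch of this state structure may help. The activity and object sets below are invented for illustration; the assumed structure (a state pairs the current activity with the last object seen in each activity, and only one record slot changes per step) follows the slide:

```python
from itertools import product

# Hypothetical activity and object vocabularies (not from the paper).
activities = ["cook", "clean"]
objects = ["pan", "soap", "none"]

# S = A x O^|A| : every (activity, record) combination.
states = [(a, rec) for a in activities
          for rec in product(objects, repeat=len(activities))]
print(len(states))  # |A| * |O|^|A| = 2 * 3^2 = 18

def step(state, new_activity, observed_object):
    """Deterministic record update: only the slot of the activity that
    emitted the object changes; every other activity's slot is copied."""
    _, rec = state
    i = activities.index(new_activity)
    new_rec = tuple(observed_object if j == i else rec[j]
                    for j in range(len(rec)))
    return (new_activity, new_rec)

# Switching to "clean" and observing "soap" updates only the second slot:
print(step(("cook", ("none", "none")), "clean", "soap"))
```

This is why interrupted activities can be resumed: the record preserves each suspended activity's last object while another activity is active.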
Interleaved HMM • The number of parameters is approximately |A|² + |A|·|O|² (activity transitions plus object emissions conditioned on activity and last object) But the size of the state space is |A|·|O|^|A|, exponential in the number of activities So the state space can NOT be explored completely at each time step. How to solve the problem?
Interleaved HMM • Use a beam search to define a likelihood update equation over a beam in the search space This method effectively approximates the full state space at little added complexity
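One likelihood-update step of such a beam search can be sketched as follows. This is a generic beam update, not the paper's exact equation; the probability lookup functions are placeholders for whatever learned parameters a model supplies:

```python
import heapq
from math import log

def beam_update(beam, obs, states, trans_p, emit_p, width):
    """Expand every hypothesis in `beam` (a dict state -> log-likelihood),
    score each successor against the new observation, and keep only the
    `width` highest-scoring states. `trans_p(prev, s)` and `emit_p(s, obs)`
    are hypothetical probability lookups, not the paper's parameters."""
    scores = {}
    for prev, lp in beam.items():
        for s in states:
            cand = lp + log(trans_p(prev, s)) + log(emit_p(s, obs))
            if s not in scores or cand > scores[s]:
                scores[s] = cand
    best = heapq.nlargest(width, scores.items(), key=lambda kv: kv[1])
    return dict(best)
```

Because only `width` hypotheses survive each step, the per-step cost stays bounded even when the full state space |A|·|O|^|A| is far too large to enumerate.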
Defining the best state sequence • The most likely activity sequence is derived by the Viterbi algorithm • We can also estimate the current state on the most likely path from the evidence seen up to time t This option is desirable in real-time environments.
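The Viterbi recursion mentioned above can be sketched as below. The probability tables are plain dicts with illustrative values, not the model learned in the paper:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for `obs` (standard Viterbi):
    dynamic programming over the best path probability ending in each
    state, with back-pointers for recovering the path."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            back[t][s] = prev
    last = max(V[-1], key=V[-1].get)      # best final state
    path = [last]
    for t in range(len(obs) - 1, 0, -1):  # follow back-pointers
        path.append(back[t][path[-1]])
    return list(reversed(path)), V[-1][last]
```

For the real-time option on the slide, one would instead report argmax over V[t] at each step, using only the evidence seen so far rather than waiting for the full sequence.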
Results • Lab data: HMM 66%, IHMM 100% • Real data collected by Patterson
Question • How does this method compare with a CRF? Which one is better?
Conclusion • IHMM provides a simple but effective way to improve multi-task activity recognition • The model performs well using only a small amount of training data