
智慧型家庭網路之技術與應用 (Technologies and Applications of Smart Home Networks)






Presentation Transcript


  1. 智慧型家庭網路之技術與應用 (Technologies and Applications of Smart Home Networks) Professor Yau-Hwang Kuo, Director, Center for Research of E-life Digital Technology (CREDIT), National Cheng Kung University, Tainan, Taiwan

  2. Outline • Introduction • Structure of Smart Home Network • Realization of Device & Network Layers • Agent-based Platform • Affective HCI • Integrated Perception • Cognition Layer • Smart Home Services • Conclusion

  3. Trend of Digital Home • House_n (MIT), Aware Home (Georgia Tech.), Interactive Workspace (Stanford Univ.), MavHome (UTA). • Digital Home Working Group: HP, Intel, IBM, ... • ECHONET: Energy Conservation and Homecare Network. • CELF: Consumer Electronics Linux Forum. • OSGi: Open Service Gateway Initiative. • Easy Living: Microsoft.

  4. Scenarios of Digital Life • smart digital housekeeper. • ubiquitous digital nursing agent. • affective digital tutor. • ubiquitous home security monitor. • ubiquitous home content service. • universal cyber circles. • ubiquitous universal messaging service. • personal knowledge warehouse/navigation. • nomadic personal digital secretary. • secure traffic navigator.

  5. Microsoft’s View for Digital Home Solution • Total connectivity • No more islands of functionality • Personalized experiences • Customized entertainment, communications, and control • Ubiquitous access • Your PCs, devices, and content, securely accessible everywhere

  6. Microsoft’s View for Digital Home Solution • Technology “by invitation only”, not imposed • Highly personal and personalized space • Virtually random, unmanaged “build out” • Complex mix of products and services

  7. Issues of Digital Home • Can human-machine interaction be made humanized? • Robustness, adaptability, and multi-modal collaboration → the qualities of humanized interaction. • Perception, cognition, emotion, coordination, and cooperation → the technical elements for realizing humanized interaction. • Ubiquitous multi-modal affective human-machine interaction → the humanized-interaction requirement of the digital home.

  8. Issues of Digital Home (cont.) • Can interpersonal interaction be enhanced and extended? • Removing the constraints of space, time, tools, and security. • Can the collaboration capability among home appliances be improved? • Connectivity among appliances, autonomous collaboration of appliances, interoperability of appliances.

  9. Issues of Digital Home (cont.) • Is the user's freedom in the digital living space improved? • Mobility, transferability, adjustability. • Ubiquitous integrated home network, location-awareness, universal access, multi-modal human-machine interaction.

  10. Issues of Digital Home (cont.) • Is the user's convenience in the digital living space improved? • Completeness of living functions, seamless integration of devices and networks, availability of living functions, degree of user intervention, ease of operation, privacy and security, etc. • Can the life-assistance functions provided by the digital living space be improved? → A smart home network is necessary!

  11. Goals: infrastructure & applications • Create a new life space supported by a smart home service network and attached digital appliances. • Develop e-services over the smart home network and digital appliances to realize a new life style. • Develop a service modeling and execution environment over the smart home network to realize various e-services.

  12. Goals: technologies • Develop nomadic HCI technology • Speech, vision, physiology, sensors. • Develop affective HCI technology • Develop agent-based home service network middleware. • Develop embedded platform & SoC for smart appliances.

  13. Layered Structure of Smart Home Service Network • Application Layer: Applications (health care, entertainment, surveillance, etc.); Service Model Execution Platform (script translation, scheduling, QoS) • Cognition / Affection Layer: Emotion / Semantics / Behavior / Intention Understanding; Corpus of Knowledge (Ontology); Inference Engine; Natural Language Processing (text, spoken) • Perception Layer: Integrated Perception; Speech; Vision (face); Vision (gesture); Physiology; Smell • Agent Layer: Mobile Agent Platform • Network Layer: Home Network (802.11, Bluetooth, HomePlug) + Mobile Internet (SIP + 3G) • Device Layer: Home Comm. Gateway; Home Perception Server; Home Media Center; Networked Physiology & Environment Monitoring Appliances; Networked Microphones, Cameras, Speakers; Wireless A/V Streaming Appliances

  14. Device & Network Layers: types of digital appliances • Client-type devices • 802.11g-based multifunctional audio/voice adaptor • 802.11g/MPEG-4-based multifunctional video adaptor • 802.11g/MPEG-4-based smart IP camera • Bluetooth-based ECG device • Gateway-type devices • Multimedia communication gateway • Server-type devices • House control server • Human-machine interaction server • Content server • Application server

  15. Device & Network Layers: relationship among server appliances [Diagram: the communication server links the home to the outside world (Internet/WWW, telephony, CO, FTTH/3G/WiMAX); over the in-home WiFi/HomePlug network it interconnects the house control server (house control & housekeeping devices), the application server (A/V devices), the content server (data store), and the user's client-side devices.]

  16. Architecture of agent platform [Diagram: user requests arrive as XML at the service server, which uses the scenario server, script DB and scheduling algorithm to decide "what to do?"; service agents, a location server ("where to do?") and dispatched task agents carry out "how to do?" on the registered subsystem instances (ASI, BIS, PMS, LKN, FEA) through the Common API.]

  17. Agent-based Runtime Environment • Execution environment: IBM Aglets system • Common API
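
As a rough sketch of how a task agent might sit on that runtime: the base class com.ibm.aglet.Aglet and its onCreation/run/dispatch methods are the real Aglets API, while the CommonAPI class, the "task@atp://host/" init format and the migration flag are illustrative assumptions rather than the project's actual Common API.

    import com.ibm.aglet.Aglet;
    import java.net.URL;

    // Hypothetical stand-in for the platform's Common API (not part of Aglets).
    class CommonAPI {
        static void invoke(String task) { System.out.println("executing " + task); }
    }

    // Minimal task-agent sketch: created on the service server, it migrates once to the
    // target subsystem host and then executes its assigned task through the Common API.
    public class TaskAgent extends Aglet {
        private String task;       // functionality to execute (taken from the XML scenario)
        private URL target;        // subsystem host, e.g. "atp://hci-server:4434/"
        private boolean migrated;  // serialized with the agent state, so it survives dispatch

        @Override
        public void onCreation(Object init) {
            // Assumed init format: "task@atp://host:port/"
            String[] parts = ((String) init).split("@", 2);
            task = parts[0];
            try {
                target = new URL(parts[1]);   // the Aglets runtime registers the "atp" protocol
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }

        @Override
        public void run() {
            try {
                if (!migrated) {
                    migrated = true;
                    dispatch(target);          // run() is entered again after arrival
                } else {
                    CommonAPI.invoke(task);    // execute the functionality on the local subsystem
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }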

  18. Adaptive Service Provider: architecture

  19. Adaptive Service Provider: functionalities • Functionalities • Registry mechanism for subsystems, devices and functionalities • Service provider for user requests • Load-balanced service scheduling algorithm according to system resources • Agent cooperation mechanism

  20. Adaptive Service Provider: components • Service server • Registration of subsystem and device functionalities • Service portal for users • Monitoring of each subsystem and device • Service agents • Provide service for each user request • Service composition • Task assignment and task-agent dispatch according to predefined XML-based scenarios

  21. Adaptive Service Provider: components (cont.) • Task agent • Executes each functionality on each subsystem • Common API • Service scheduling algorithm • Provides a task list for the service agent according to the registry and the predefined scenarios in the database • A Petri-net-based & load-balanced scheduling algorithm for adaptive service paths in each subsystem and device
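
As a rough illustration of the load-balancing half of that algorithm only (the Petri-net service-path modelling is not shown), the sketch below assigns each task of a scenario to the registered device that offers the required functionality with the lowest current load; all class, task and device names are illustrative.

    import java.util.*;

    // Simplified sketch of the load-balancing step only: every task of a scenario is
    // assigned to the registered device that offers the functionality and currently
    // reports the lowest load.
    class LoadBalancedScheduler {

        record Device(String id, Set<String> functions, double load) {}

        /** Returns task -> device id, or throws if no registered device can serve a task. */
        static Map<String, String> schedule(List<String> scenarioTasks, List<Device> registry) {
            Map<String, String> plan = new LinkedHashMap<>();
            for (String task : scenarioTasks) {
                Device best = registry.stream()
                        .filter(d -> d.functions().contains(task))
                        .min(Comparator.comparingDouble(Device::load))
                        .orElseThrow(() -> new IllegalStateException("no device offers " + task));
                plan.put(task, best.id());
            }
            return plan;
        }

        public static void main(String[] args) {
            List<Device> registry = List.of(
                    new Device("camera-1", Set.of("capture"), 0.7),
                    new Device("camera-2", Set.of("capture"), 0.2),
                    new Device("hci-server", Set.of("speech", "vision"), 0.5));
            // Prints {capture=camera-2, speech=hci-server}: the least-loaded capable device wins.
            System.out.println(schedule(List.of("capture", "speech"), registry));
        }
    }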

  22. Agent-based Middleware: mobility management • Location detection • Device-followed type: mobile IP; signal analysis • Device-free type: speech interaction; vision monitoring. • Seamless handoff and transcoding for ubiquitous service following • Roaming path tracking and prediction

  23. Agent-based Middleware: appliance collaboration management • Collaboration among homogeneous appliances: data fusion, task migration. • Collaboration among heterogeneous appliances: multi-modal HCI. • Scheduling, concurrency control & synchronization of collaborative tasks. • Self-organization for service deployment

  24. Agent-based Middleware: interoperability management • Device bridge • Protocol bridge • Transcryption • Transcoding • Content translation & adaptation

  25. Agent-based Middleware: remote access management • Remote service deployment • Remote service access • Remote service management • Auto-configuration • Service re-direction • Service aggregation • UI remoting

  26. Agent-based Middleware: other management functions • Load management: • Client-server load partition • Server load sharing → load scheduling of the appliance farm • Availability management • Fault tolerance • Just-in-time activation of appliances • Service quality management

  27. Dialog System [Diagram: affective conversation — ASR converts input speech to text plus an emotion estimate; the dialog system produces response text and an emotion; speech synthesis turns them into affective speech.]

  28. Emotional Speech Synthesis [Diagram: text input → text analysis and syntactic analysis → unit selection → speech smoothing → output speech; the user's action drives emotion selection and database selection over an emotional speech database (sad, happy, neutral, angry) built by speech segmentation.]

  29. Behavior Understanding by Vision • High-level behavior understanding from videos • State machine • Human activity recognition • Two-stage recognition process • Accident/abnormal behavior detection • Combination of context & domain knowledge

  30. System Architecture

  31. Method – Activity Recognition • Activity recognition • Level 1 – postures: posture sequence • Level 2 – motion/history: history map matching

  32. Method – Behavior Understanding • Behavior • Normal behavior: state machine (activity + contexts) • Abnormal behavior: normal behavior + domain knowledge • Accident: unreasonable activity + domain knowledge
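
A compact sketch of this two-stage mapping, with the caveat that the classes, rules and thresholds below are invented placeholders rather than the state machines and history-map matching actually used here:

    import java.util.List;

    // Illustrative two-stage sketch: (1) a posture sequence is mapped to an activity,
    // (2) the activity plus context / domain knowledge is mapped to a behavior class.
    class BehaviorUnderstanding {

        enum Activity { WALKING, SITTING, LYING }
        enum Behavior { NORMAL, ABNORMAL, ACCIDENT }

        // Stage 1: recognize the activity from a short posture sequence (toy rules).
        static Activity recognizeActivity(List<String> postures) {
            long lying = postures.stream().filter("lie"::equals).count();
            if (lying > postures.size() / 2) return Activity.LYING;
            return postures.contains("sit") ? Activity.SITTING : Activity.WALKING;
        }

        // Stage 2: combine the activity with context and domain knowledge (toy rules):
        // lying down anywhere other than the bedroom is flagged as a possible accident,
        // sitting motionless for a very long stretch is merely abnormal.
        static Behavior understand(Activity activity, String location, int minutesStill) {
            if (activity == Activity.LYING && !location.equals("bedroom")) return Behavior.ACCIDENT;
            if (activity == Activity.SITTING && minutesStill > 180) return Behavior.ABNORMAL;
            return Behavior.NORMAL;
        }

        public static void main(String[] args) {
            Activity a = recognizeActivity(List.of("stand", "lie", "lie", "lie"));
            // Prints "LYING in the living room -> ACCIDENT"
            System.out.println(a + " in the living room -> " + understand(a, "living room", 5));
        }
    }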

  33. Results – Facial Expression Analysis [Diagram: face acquisition (image sequence, YCbCr color space, region of interest, segmentation), facial feature extraction (eye/mouth regions and points, key-frame selection, optical flow, displacement vectors, invariant moments, deformation and motion extraction, representation), and facial expression classification (recognition with a fuzzy neural network).]

  34. Integrated Perception: fuzzification of reference perceptual models • Manipulate all kinds of perception in a uniform process to ease perceptual integration. • Because perception is highly vague, a fuzzy-logic-based approach is a good choice for establishing the reference models of perception. • The reference models, which fuzzify perceptual attributes and perceptual decision subspaces, will be embedded into the integrated perception model.

  35. FL-based Acoustic Reference Model for Emotion Recognition [Diagram: speech corpus → feature extraction → SVM clustering for each emotion type (emotion 1 … emotion V) → fuzzification of acoustic features (AFs) and construction of acoustic action units (AAUs), yielding the AAU1 … AAUS models.]

  36. FL-based Acoustic Reference Model for Emotion Recognition (cont.) • Adopt an SVM clustering approach in the subspace of each emotion type to gather clusters of acoustic training patterns. • Inspect all produced SVM clusters in the whole feature space and merge the highly overlapping clusters. • Each cluster is modeled as an AAU represented by its fuzzy cluster center, where each feature is a fuzzy set whose membership function is determined by least-square curve fitting on the feature values of the training samples included in the cluster.
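
The slides leave the shape of these membership functions open; one concrete reading (an assumption, not something stated in the deck) is a Gaussian fuzzy set per feature, fitted by least squares to the normalized histogram of that feature over the cluster's training samples:

    \mu(x) = \exp\!\left(-\frac{(x - m)^2}{2\sigma^2}\right),
    \qquad
    (m, \sigma) = \arg\min_{m,\sigma} \sum_{b} \bigl(\mu(x_b) - h_b\bigr)^2

with x_b the histogram bin centers and h_b the normalized bin counts of the cluster's samples.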

  37. FL-based Acoustic Reference Model for Emotion Recognition (cont.) • The mapping between AAUs and emotion types depends on the SVM clustering result of each emotion type. • Each emotion type is associated with a set of clusters of acoustic samples. The weight of each cluster is the ratio of the number of samples it contains to the total number of samples of the same emotion.
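
Restating that weighting rule in symbols: if C_{e,c} denotes the c-th SVM cluster obtained for emotion type e, its weight is

    w_{e,c} = \frac{|C_{e,c}|}{\sum_{c'} |C_{e,c'}|}

so the cluster weights of each emotion type sum to one.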

  38. FL-based Facial Reference Model for Emotion Recognition [Diagram: a graphical head model and a morphological process simulate FACS AUs; a FACS AU identification process and a feature point (FP) extraction process establish the correspondence, producing a fuzzy-logic-based reference model for FACS in which membership grades of FAUs are defined over FP values (e.g. FAU1/FAU2 over FP1, FAUi/FAUj/FAUk over FP2).]

  39. FL-based Facial Reference Model for Emotion Recognition (cont.) • Intend to construct a computational reference model for FACS action units based on the measurable features of facial expression. • An approach similar to the construction of acoustic reference model is adopted. • The training samples are generated from a generic head model with necessary morphological manipulation.

  40. FL-based Facial Reference Model for Emotion Recognition (cont.) • The membership functions will be determined by the least-square curve fitting approach according to the sample patterns produced from the morphological process. • Each AU may just represent a partial facial expression and relate to more than one emotion.

  41. Fuzzy Neural Network for Integrated Emotion Recognition [Diagram: primary feature layer (facial feature points FP1 … FPn from the facial expression, acoustic features AF1 … AFm) → scaled feature layer → representative concept layer (FAU1 … FAUK, AAU1 … AAUS) → emotion type layer (e.g. fear, anger, surprise, per modality) → fuzzy group decision process producing {<total ordering of emotion types>, group level of agreement}.]

  42. Fuzzy Neural Network for Integrated Emotion Recognition (cont.) • All kinds of perceptual information are fused by the FNN model to realize emotion recognition. • Each appliance will have an instance of the corresponding FNN to join the emotion recognition job. • A two-layer BP learning algorithm (over the emotion type & concept layers) is adopted, using the training samples from the construction of the reference models. The fuzzy group decision process does not take part in the learning. • Scaling input values to [0,1] in the second layer is realized by the membership function of the corresponding fuzzy set.

  43. Fuzzy Neural Network for Integrated Emotion Recognition (cont.) • The links between AUs and scaled features are not fully connected. • The FAU/AAU nodes compute a normalized weighted sum of the membership grades of their input features, weighted by the respective link strengths. • Each emotion type node determines its output value by the normalized weighted sum of its inputs from the representative concept layer.
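
In symbols, with \mu_j(x_j) the membership grade of scaled feature j, \mathcal{I}_k the (not fully connected) set of scaled features linked to concept node k, and w_{kj}, v_{ek} the learned link strengths, the forward pass described above is

    o_k = \frac{\sum_{j \in \mathcal{I}_k} w_{kj}\,\mu_j(x_j)}{\sum_{j \in \mathcal{I}_k} w_{kj}},
    \qquad
    o_e = \frac{\sum_{k} v_{ek}\,o_k}{\sum_{k} v_{ek}}

for each FAU/AAU node k and each emotion-type node e.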

  44. Cognition Layer: understanding and response • Understand the semantics of multi-modal expression. • Classify and recognize the intention/need/emotion of the semantic expression. • Summarize the semantics of the multi-modal expression according to the classified result.

  45. Cognition Layer: understanding and response (cont.) • Predict the user behavior sequence according to the classified result. • Schedule the response sequence according to the prediction result. • Determine the instant response.

  46. Stimulus – Perception – Cognition [Diagram: stimuli (spoken language, text, gestures, facial expression, physiological signals) pass through speech, vision, video and signal processing; in the perception stage, semantic feature extraction, conceptualization (ontology, contextual rules) and emotion recognition yield concepts, events, emotion attributes and emotion types; in the cognition stage, a neural-network-based event detector, a personal event/emotion log, event- and emotion-sequence case bases, emotion episode discovery and episode-based user behavior prediction produce a prediction result and a response roadmap; semantic summary extraction, stimulus-response templates, response scheduling and instant response determination drive the application control.]

  47. Smart Home Services • nomadic content services • health care by integrated perception • smart home surveillance • smart e-mail and calendar arrangement

  48. Conclusion • The life style of human beings will be heavily affected by ICT, but the technological gap is still large. • Ubiquitous HCI and OCI technologies will be important to realize the digital life style. • Cognitive computing and affective computing are important to improve the effectiveness of HCI technology.

  49. Description of Context-Aware Middleware [Diagram: middleware components — user profile, admission control, personal agent, context reasoning, context aggregator, resource management, and a service agent reaching devices and services through wrappers.]
