
The EyesWeb Open Platform – The DIST – InfoMus Lab Staff




Presentation Transcript


  1. InfoMus Lab – DIST – University of Genova • www.infomus.org • The EyesWeb Open Platform • The DIST – InfoMus Lab Staff • www.eyesweb.org

  2. The EyesWeb Projectwww.eyesweb.org • Research: • Computational models of expressive content, emotional, and KANSEI communication in movement, music, visual media. • Integration of music, visual, movement (incl. dance) languages from a multimodal perspective. • Open h/w and s/w Architecture: • modular h/w and s/w real-time platform to support research, enable experiments, support artistic productions and in general multimodal applications • Applications: • music, theatre, interactive art, multimodal interfaces, museums, edutainment, multimedia ...

  3. EyesWeb... • Fast development process of experimental Expressive Media applications: • Fully open and easy to extend • SDK for development of user’s new modules • Industry standards compliance: MS COM/DCOM, Steinberg VST and ASIO (support to audio multi-channel h/w boards), OSC, MIDI, tcp/ip, serial, firewire, USB webcam, wide range of sensors and frame grabbers, ...

  4. ...EyesWeb • A number of specific libraries and extensions available: • Motion analysis lib for real-time extraction of cues from video streams • Low-level: e.g. trackers (e.g. Multiple Coloured Blobs Tracking; LK; SMI/MEI; ...) • Expressive gesture processing library • High-Level: expressive cues, General Space, Space Maps and Shapes, trajectories analysis. • Pro features to create commercial apps • Specific extensions available on demand • Third-party libraries and apps available

  5. EyesWeb Users Community • Growing users community (>10,000 users) • Software platform freely available: • http://www.infomus.org • http://www.eyesweb.org • Public newsgroups • Constant software and material updates from the web sites and the EyesWeb public users newsgroup

  6. Requirements for EyesWeb XMI • Defined starting from: • Requests and suggestions by users in the EyesWeb newsgroups • Requests and suggestions by partners in EU and national projects • Our own experience in EU and national research projects and in collaborations with industry • Concrete issues that emerged while designing and developing interactive multimedia installations and artistic performances • Led to a very deep revision of the system, with redesign and re-implementation from scratch of the most critical modules • These revisions deal with usability, performance, robustness, interoperability, and optimization.

  7. Support for Cross-Modality and Multimodality • The same module can operate on different datatypes: e.g., the same “segmentation” EyesWeb module can be used for different streams (audio, movement cues, sensor data ...) • Support for resampling, interpolation, and adaptation for the simultaneous processing of different streams • Data conversion, synchronization, etc. • Extension toward object-oriented programming • These features are not exposed to the programmer or the user unless requested, but are provided (and extensible) in the EyesWeb visual language
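To make the cross-modality idea concrete, here is a minimal, hypothetical sketch (not actual EyesWeb code) of a single "segmentation" operation that works unchanged on any numeric stream, whether it comes from audio, movement cues, or sensors:

```python
# Hypothetical sketch: one "segmentation" module reused across modalities.
# The shared behaviour is modelled as a function over any numeric stream.

def segment(stream, threshold):
    """Return (start, end) index pairs where the stream exceeds threshold.

    Works unchanged for an audio envelope, a movement cue such as
    Quantity of Motion, or a raw sensor channel.
    """
    phases, start = [], None
    for i, value in enumerate(stream):
        if value > threshold and start is None:
            start = i                      # phase begins
        elif value <= threshold and start is not None:
            phases.append((start, i))      # phase ends
            start = None
    if start is not None:
        phases.append((start, len(stream)))
    return phases

# The same call serves two different modalities:
audio_envelope = [0.0, 0.2, 0.9, 0.8, 0.1, 0.0]
motion_cue     = [0.0, 0.0, 0.4, 0.6, 0.5, 0.0]
print(segment(audio_envelope, 0.5))  # [(2, 4)]
print(segment(motion_cue, 0.3))      # [(2, 5)]
```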

  8. Usability • Integration into the EyesWeb language of aspects related to the multimodal processing of different data streams. • E.g., users do not have to care explicitly about the synchronization of streams; the language supports this automatically. • Subpatching mechanism fully supported • First model: different subpatches of the same type are disjoint; modifying one subpatch does not alter the other instances • Second model: different instances of subpatches of the same type share the subpatch class; modifying a subpatch acts on all instances of that subpatch.
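The two subpatch models can be illustrated with an everyday programming analogy (a hypothetical sketch, not EyesWeb internals): independent deep copies versus a shared object:

```python
import copy

# First model: each placement gets an independent deep copy; editing one
# instance leaves the others untouched.
template = {"blocks": ["smooth", "threshold"]}
a = copy.deepcopy(template)
b = copy.deepcopy(template)
a["blocks"].append("plot")
print(b["blocks"])   # ['smooth', 'threshold'] -- unaffected

# Second model: instances share the subpatch "class"; editing it through
# one reference is visible through all the others.
shared = {"blocks": ["smooth", "threshold"]}
c = shared
d = shared
c["blocks"].append("plot")
print(d["blocks"])   # ['smooth', 'threshold', 'plot'] -- shared edit
```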

  9. Usability • Hiding inner implementation details • E.g., users do not have to explicitly care about threads, clocks, active and passive blocks: these are automatically handled by the system • Improved management of available blocks • Blocks grouped in catalogs • Mechanisms for finding a block easily, for filtering out blocks belonging to a given catalog etc. • Redesign of the Graphical User Interface • Improved editing of patches (e.g., in terms of links among blocks, pins, organization of blocks in a patch) • Improved access to parameters

  10. Performance and optimization • Optimization of the kernel for the management of audio and video streams • Support for multiprocessor systems • The new version of the EyesWeb kernel is built in such a way that dual- (or multi-) processor computers can be fully exploited. • Kernel objects • Have privileged access to the kernel • The kernel can optimize access to and use of kernel objects • Basic datatypes (e.g., images, sound buffers) are built into the kernel as kernel objects • Kernel objects also include control-flow blocks (switch, for or while loops, conditional operations, etc.), basic clocks (multimedia timer), and devices (e.g., keyboard, serial, or mouse)

  11. Robustness • Complete separation of the kernel from the user interface • The kernel is not affected by possible bugs in the user interface • Different user interfaces can be developed and employed, depending on the specific application • Standalone applications which do not need a user interface and have to operate for days in unsupervised environments can run with a minimal GUI (or without one) • The interaction between the interface and the kernel has been greatly simplified

  12. Interoperability • Simplified connection of the EyesWeb platform with other software platforms • Simplified hosting of standard plugins • EyesWeb already hosts standard plugins (e.g., Steinberg VST, FreeFrame) • The simplification of the communication paradigm between the EyesWeb kernel and the EyesWeb interface allows fast development of hosts for further plugin systems. • Interaction with hardware managed by a new level of abstraction: devices • Available physical hardware resources can be mapped to virtual devices used by blocks. • Patches are more portable among different computers, even if the hardware configurations are different

  13. Other features • Multiple versions allowed on a single PC • Avoids using the system registry and centralized configuration • Uses configuration files instead of the registry • A configuration file only affects one folder; different configurations can coexist on the same computer • Multiple instances allowed at the same time • Portable patches • Patches are saved as XML files • Support for XML is embedded in the kernel • Patches can be edited with standard text/XML editors • Portable among versions and OSs • Can be managed via source-control systems
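Because patches are plain XML, they can be read and manipulated with any standard XML tooling. The snippet below sketches the idea with an invented patch format (the element and attribute names are illustrative only, not the actual EyesWeb XMI schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical patch file: block and link names are made up for
# illustration, not taken from the real EyesWeb schema.
patch_xml = """
<patch name="demo">
  <block id="cam" type="VideoInput"/>
  <block id="smi" type="SilhouetteMotionImage"/>
  <link from="cam" to="smi"/>
</patch>
"""

root = ET.fromstring(patch_xml)
blocks = [b.attrib["type"] for b in root.findall("block")]
print(root.attrib["name"], blocks)  # demo ['VideoInput', 'SilhouetteMotionImage']
```

A text editor, a diff tool, or a source-control system sees nothing but ordinary XML, which is what makes the patches portable and versionable.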

  14. Other features • Better audio support • The clock may be provided by devices • The clocks of renderer modules take priority over other clocks • The system tries to use a single clock for the whole patch (avoiding clicks/jitter) • It warns if different clocks must be used • Parameters no longer constrained to a limited set of predefined types • New SDK no longer COM-dependent • C++ pure virtual interfaces • Easier access to inputs/outputs (no need to lock/unlock) • No multithreading issues
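The clock-selection policy above can be sketched as a small decision rule (a hypothetical simplification of whatever the kernel actually does): prefer a renderer's clock, fall back to any available clock, and warn when several renderer clocks would compete:

```python
# Hypothetical sketch of the clock-selection policy: renderer clocks
# take priority; multiple competing renderer clocks trigger a warning.

def choose_master_clock(clocks):
    """clocks: list of (name, is_renderer) tuples present in a patch."""
    renderers = [name for name, is_renderer in clocks if is_renderer]
    if renderers:
        master = renderers[0]          # renderer clock wins
    else:
        master = clocks[0][0]          # fall back to any clock
    if len(renderers) > 1:
        print("warning: multiple renderer clocks, expect clicks/jitter")
    return master

print(choose_master_clock([("timer", False), ("audio_out", True)]))  # audio_out
```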

  15. Future enhancements • Multiplatform (Linux) • Embedded systems • Distributed systems

  16. Expressive Gesture in Human Movement • Human full-body movement as a main non-verbal communication channel • Dance, the artistic expression of movement, as a testbed for studying expressive gesture mechanisms • Development of • a multimodal conceptual framework • a collection of algorithms for gesture processing • experiments for testing and validation

  17. Conceptual Framework [Diagram: a layered framework going from Physical Signals, via motion detection and tracking, to Cues (e.g., energy, contraction, stability, kinematics), via motion segmentation and representation, to Gestures, and finally to Concepts (high-level expressive information).]

  18. Motion tracking: LK tracking • The Lucas–Kanade (LK) algorithm is used to track features in the input image. • Trajectories are then analyzed (e.g., to extract information about motion directness and kinematics).
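Once a tracker (LK or otherwise) has produced a per-frame position list, the kinematic analysis is straightforward. A minimal sketch, assuming a trajectory given as (x, y) pixel coordinates per frame:

```python
import math

# Hypothetical sketch: kinematics of a tracked feature's trajectory,
# as produced by an LK-style tracker (one (x, y) position per frame).

def speeds(trajectory, fps=25.0):
    """Frame-to-frame speed of a 2D trajectory [(x, y), ...]."""
    return [math.hypot(x1 - x0, y1 - y0) * fps
            for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:])]

track = [(0, 0), (3, 4), (3, 4), (6, 8)]
v = speeds(track, fps=1.0)
print(v)                 # [5.0, 0.0, 5.0]
print(max(v))            # peak velocity: 5.0
```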

  19. Motion tracking by colors

  20. Tracking of subregions

  21. Tracking of multiple dancers

  22. Frequency analysis E.g., analysis of peaks in different bands can help in distinguishing movements of different body parts
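The band-analysis idea can be sketched with a plain discrete Fourier transform of a movement cue: slow oscillations (e.g., torso sway) concentrate energy in low-frequency bins, faster ones (e.g., hand movements) in higher bins. This is an illustrative sketch, not the library's actual implementation:

```python
import cmath
import math

# Hypothetical sketch: magnitude spectrum of a movement cue via a
# plain DFT, used to locate the dominant frequency band.

def dft_magnitudes(signal):
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

n = 32
slow = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]   # 2 cycles
mags = dft_magnitudes(slow)
peak_bin = mags.index(max(mags))
print(peak_bin)   # 2 -- energy concentrated at the slow band
```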

  23. Contour and convex hull

  24. Expressive Cues • Global measures depending on full-body movement (e.g., body orientation, overall motion direction) • Measures from psychological research (e.g., Boone & Cunningham’s amount of upward movement) • Cues from Rudolf Laban’s Theory of Effort (e.g., directness, impulsiveness) • Cues derived from analogies with audio analysis (e.g., Inter Onset Intervals, frequency analysis) • Kinematical measures such as velocity, acceleration, and average and peak velocity and acceleration.

  25. SMI and Quantity of Motion • SMIs (Silhouette Motion Images) carry information on variations of the silhouette in the last few frames. • SMI’s area (Quantity of Motion) is a measure of the detected amount of motion.
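A toy sketch of the two quantities, with silhouettes modelled as sets of (x, y) pixels: the SMI keeps the silhouette variations over the last few frames, and QoM is its area (here normalised by the current silhouette area, one common choice; the exact normalisation is an assumption):

```python
# Hypothetical sketch of SMI and Quantity of Motion on pixel sets.

def smi(frames):
    """Union of recent silhouettes minus the current one."""
    recent = set().union(*frames)
    return recent - frames[-1]

def quantity_of_motion(frames):
    return len(smi(frames)) / len(frames[-1])

still = {(x, y) for x in range(4) for y in range(4)}
moved = {(x + 2, y) for x in range(4) for y in range(4)}  # shifted right
print(quantity_of_motion([still, moved]))   # 0.5: half the area changed
```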

  26. Silhouette Shape • Ellipse approximating the body silhouette (analogy between image and mechanical moments). • Eccentricity related to contraction/expansion; orientation of the axes related to body orientation.
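The image/mechanics analogy boils down to central image moments. A minimal sketch of the ellipse-axis orientation (the standard moment formula; eccentricity follows similarly from the same moments):

```python
import math

# Hypothetical sketch: orientation of the best-fit ellipse of a
# silhouette, from central image moments.

def orientation(pixels):
    """Axis orientation (radians) of the best-fit ellipse of a pixel set."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels)
    mu02 = sum((y - cy) ** 2 for _, y in pixels)
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels)
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

upright = [(0, y) for y in range(10)]          # vertical bar of pixels
print(round(orientation(upright), 3))          # pi/2: axis along y
```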

  27. Contraction Index A measure of how the dancer’s body uses the space surrounding it.

  28. Upward Movement Identified by Boone and Cunningham (1998) as a primary indicator of positive expressive intentions (e.g., joy).

  29. Asymmetry Index Asymmetry of a body posture computed with respect to the axes of its bounding box.
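A toy version of the idea (the exact formula is an assumption): compare the silhouette mass on either side of the vertical axis of its bounding box, giving 0 for a symmetric posture and values toward 1 for a strongly one-sided one:

```python
# Hypothetical sketch of an asymmetry index with respect to the
# vertical axis of the silhouette's bounding box.

def asymmetry_index(pixels):
    xs = [x for x, _ in pixels]
    axis = (min(xs) + max(xs)) / 2
    left = sum(1 for x, _ in pixels if x < axis)
    right = sum(1 for x, _ in pixels if x > axis)
    return abs(left - right) / len(pixels)

balanced = {(0, 0), (4, 0)}
leaning  = {(0, 0), (1, 0), (1, 1), (4, 0)}
print(asymmetry_index(balanced))   # 0.0
print(asymmetry_index(leaning))    # 0.5
```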

  30. Gesture Segmentation • Motion phases: the dancer is moving • Pause phases: the dancer does not appear to move • The values of cues can be associated with each phase.
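A minimal sketch of the motion/pause alternation, assuming a Quantity of Motion time series is thresholded frame by frame (the threshold value is illustrative):

```python
# Hypothetical sketch: label each run of frames as a motion or pause
# phase, so expressive cues can later be attached to every phase.

def phases(qom, threshold=0.1):
    out = []
    for i, value in enumerate(qom):
        label = "motion" if value > threshold else "pause"
        if out and out[-1][0] == label:
            out[-1] = (label, out[-1][1], i)     # extend current phase
        else:
            out.append((label, i, i))            # open a new phase
    return out

qom = [0.0, 0.05, 0.4, 0.5, 0.3, 0.02, 0.0]
print(phases(qom))
# [('pause', 0, 1), ('motion', 2, 4), ('pause', 5, 6)]
```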

  31. Gesture Representation • Rhythm: movements emphasized through pauses • Languages for choreography • Each phase described symbolically by duration, peak velocity, pauses, etc.: expansion( 26 , 8 , 0.0324653 , 0.0242855 , 2 , 0.0391145 , -0.407343 , 0.342657 ). pause( 34 , 3 ). contraction( 37 , 8 , 0.0462466 , 0.0494994 , 6 , 0.111976 , 0.297261 , 0.490809 ).

  32. Gesture Representation • Gestures as trajectories in semantic spaces • Clustering of such trajectories for classification Fluency Quantity of Motion
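A toy sketch of classification in such a space: a gesture, represented as a trajectory in a 2D semantic space (here Quantity of Motion vs. fluency), is summarised by its mean point and assigned to the nearest of two illustrative cluster centroids. Centroid names and values are invented for the example:

```python
import math

# Hypothetical sketch: nearest-centroid classification of a gesture
# trajectory in a 2D semantic space (QoM, fluency).

def mean_point(trajectory):
    xs, ys = zip(*trajectory)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def classify(trajectory, centroids):
    p = mean_point(trajectory)
    return min(centroids, key=lambda name: math.dist(p, centroids[name]))

centroids = {"energetic": (0.8, 0.2), "fluent": (0.2, 0.8)}
gesture = [(0.7, 0.1), (0.9, 0.3), (0.8, 0.2)]   # high QoM, low fluency
print(classify(gesture, centroids))   # energetic
```

In practice the centroids would come from clustering recorded trajectories rather than being fixed by hand.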

  33. Directness Index • A measure of how a trajectory is direct or indirect. • The trajectory can be the output of a (physical) motion tracker or a trajectory in a semantic space.
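One standard way to quantify this (an assumption about the exact formula): the ratio between the straight-line distance from start to end and the actual path length, giving 1.0 for a perfectly direct trajectory and lower values as it becomes more indirect:

```python
import math

# Hypothetical sketch of a directness index: chord length over path
# length of a 2D trajectory (physical or in a semantic space).

def directness(trajectory):
    path = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    chord = math.dist(trajectory[0], trajectory[-1])
    return chord / path if path else 1.0

straight = [(0, 0), (5, 0), (10, 0)]
detour   = [(0, 0), (5, 5), (10, 0)]
print(directness(straight))          # 1.0
print(round(directness(detour), 3))  # 0.707
```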

  34. EyesWeb Expressive Gesture Processing Library • MotionAnalysis: motion trackers (e.g., LK), movement expressive cues (QoM, CI, ...) • SpaceProcessing: processing of (physical or abstract) spaces; use of metaphors from physics: hysteresis, abstract potentials, the superposition principle, ... • TrajectoryProcessing: processing of 2D (physical or abstract) trajectories (e.g., kinematics, directness, …) • Machine learning techniques (e.g., SVMs, clustering, neural networks, …)

  35. Experiments • For understanding and analyzing the expressive content that expressive gestures convey • For evaluating and validating the developed models and algorithms • For identifying cross-modal interactions among expressive gestures (e.g., in movement and in music)

  36. For more info… • www.eyesweb.org • www.infomus.org • news://infomus.dist.unige.it • info@infomus.org
