

  1. Interactive Visualization with NUI and Game Engines • Yingcai Xiao

  2. Interactive Visualization Allows the user to control how data are represented as graphics and how the graphics are viewed. Human-Computer Interaction (HCI) is critical to Interactive Visualization (IV).

  3. HCI • Three types of HCI: • CLI: command-line interface (keyboard) • GUI: graphical user interface (mouse) • NUI: natural user interface with audio/video input (e.g., Kinect)

  4. NUI • Three parts of NUI: • Hardware: e.g., Kinect • Software: drivers (OpenNI), middleware • Application: integration of the hardware-enabling software with applications.

  5. OpenNI

  6. OpenNI • Production Nodes: • A set of components that have a productive role in the data-creation process required for Natural Interaction based applications. • The API of the production nodes only defines the language; the logic of data generation must be implemented by the modules that plug into OpenNI. • E.g., for a production node that represents the functionality of generating hand-point data, the logic of hand-point data generation must come from an external middleware component that is plugged into OpenNI and knows how to produce such data.
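As a hedged illustration of the production-node idea, the sketch below creates a context and requests a depth production node through a .NET binding; OpenNI only defines the node's interface, while the data itself comes from whichever sensor module is plugged in. The class names (Context, DepthGenerator) assume the OpenNI .NET wrapper and should be checked against the binding you actually use.

```csharp
using OpenNI;  // assumed: the .NET wrapper shipped with OpenNI (OpenNI.net.dll)

class ProductionNodeDemo
{
    static void Main()
    {
        // The context owns every production node the application creates.
        Context context = new Context();

        // Requesting a depth production node: OpenNI itself only defines this
        // interface; the actual depth data is produced by whichever sensor
        // module (e.g., SensorKinect) is registered with OpenNI.
        DepthGenerator depth = new DepthGenerator(context);

        System.Console.WriteLine("Depth production node created.");
    }
}
```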

  7. OpenNI • [Slide images: (1) body imaging, (2) joint recognition, (3) hand waving]

  8. OpenNI: Sensor-Related Production Nodes •  Device: represents a physical device (a depth sensor, or an RGB camera). Its main role is to enable device configuration. •  Depth Generator: generates a depth-map. Must be implemented by any 3D sensor that wishes to be certified as OpenNI compliant. •  Image Generator: generates colored image-maps. Must be implemented by any color sensor that wishes to be certified as OpenNI compliant. •  IR Generator: generates IR image-maps. Must be implemented by any IR sensor that wishes to be certified as OpenNI compliant. •  Audio Generator: generates an audio stream. Must be implemented by any audio device that wishes to be certified as OpenNI compliant.

  9. OpenNI: Middleware-Related Production Nodes •  Gestures Alert Generator: Generates callbacks to the application when specific gestures are identified. •  Scene Analyzer: Analyzes a scene, including the separation of the foreground from the background, identification of figures in the scene, and detection of the floor plane. The Scene Analyzer’s main output is a labeled depth map, in which each pixel holds a label that states whether it represents a figure or is part of the background. •  Hand Point Generator: Supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning, a palm) is detected, and when a hand point currently being tracked changes its location. •  User Generator: Generates a representation of a (full or partial) body in the 3D scene.
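A sketch of how an application might consume the Hand Point Generator's callbacks. The event and argument names below (HandCreate, HandUpdate, Position) are assumptions modeled on the C++ hand callbacks and may differ in your OpenNI binding.

```csharp
using System;
using OpenNI;  // assumed: the OpenNI .NET wrapper

class HandTrackingDemo
{
    static void Main()
    {
        Context context = new Context();

        // Middleware-backed production node (the hand-point logic comes from
        // a plugged-in middleware component such as NITE).
        HandsGenerator hands = new HandsGenerator(context);

        // Assumed event names mirroring the C++ hand callbacks.
        hands.HandCreate += (sender, e) =>
            Console.WriteLine("Hand detected at ({0}, {1}, {2})",
                e.Position.X, e.Position.Y, e.Position.Z);
        hands.HandUpdate += (sender, e) =>
            Console.WriteLine("Hand moved to ({0}, {1}, {2})",
                e.Position.X, e.Position.Y, e.Position.Z);

        context.StartGeneratingAll();
        while (true)
        {
            // Callbacks fire while the context updates its nodes.
            context.WaitAndUpdateAll();
        }
    }
}
```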

  10. OpenNI: Recording Production Nodes •  Recorder: Implements data recordings. •  Player: Reads data from a recording and plays it. •  Codec: Used to compress and decompress data in recordings.

  11. OpenNI: Capabilities • Supports the registration of multiple middleware components and devices. OpenNI is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports. • Currently supported capabilities: •  Alternative View: Enables any type of map generator to transform its data to appear as if the sensor is placed in another location. •  Cropping: Enables a map generator to output a selected area of the frame. •  Frame Sync: Enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.

  12. OpenNI: Capabilities • Currently supported capabilities: •  Mirror: Enables mirroring of the data produced by a generator. •  Pose Detection: Enables a user generator to recognize when the user is posed in a specific position. •  Skeleton: Enables a user generator to output the skeletal data of the user. This data includes the location of the skeletal joints, the ability to track skeleton positions and the user calibration capabilities. •  User Position: Enables a Depth Generator to optimize the output depth map that is generated for a specific area of the scene.

  13. OpenNI: Capabilities • Currently supported capabilities: •  Error State: Enables a node to report that it is in "Error" status, meaning that on a practical level, the node may not function properly. •  Lock Aware: Enables a node to be locked outside the context boundary. •  Hand Touching FOV Edge: Alert when the hand point reaches the boundaries of the field of view.
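A sketch of querying and using one capability (Mirror) before relying on it; the capability name string and the MirrorCapability member are assumptions based on the C++ capability API and should be verified against your binding.

```csharp
using System;
using OpenNI;  // assumed: the OpenNI .NET wrapper

class CapabilityDemo
{
    static void Main()
    {
        Context context = new Context();
        DepthGenerator depth = new DepthGenerator(context);

        // Capabilities are optional per module, so check support first
        // (the "Mirror" name is an assumption).
        if (depth.IsCapabilitySupported("Mirror"))
        {
            // Mirror the depth map so the on-screen image matches the user's
            // left/right orientation (assumed wrapper member name).
            depth.MirrorCapability.SetMirror(true);
        }
        else
        {
            Console.WriteLine("Mirror capability not supported by this module.");
        }
    }
}
```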

  14. OpenNI: Generating and Reading Data • Production nodes that also produce data are called Generators. • Once these are created, they do not immediately start generating data, so that the application can set the required configuration first. • The xn::Generator::StartGenerating() function is used to begin generating data. • The xn::Generator::StopGenerating() function stops it. • Data Generators "hide" new data internally until explicitly requested to expose the most updated data to the application, using the UpdateData request function. • OpenNI enables the application to wait for new data to be available and then update it, using the xn::Generator::WaitAndUpdateData() function.
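Assuming a .NET binding that mirrors the xn:: calls named above (StartGenerating, WaitOneUpdateAll, GetMetaData), a minimal read loop might look like the sketch below; the member names are assumptions to verify against your wrapper.

```csharp
using System;
using OpenNI;  // assumed: the OpenNI .NET wrapper

class DepthReadDemo
{
    static void Main()
    {
        Context context = new Context();
        DepthGenerator depth = new DepthGenerator(context);

        // Nothing is generated yet, so configuration can happen here.
        depth.StartGenerating();

        DepthMetaData depthMD = new DepthMetaData();
        for (int frame = 0; frame < 30; frame++)
        {
            // Block until new depth data is available, then expose it to the
            // application (the "wait and update" pattern described above).
            context.WaitOneUpdateAll(depth);

            depth.GetMetaData(depthMD);
            Console.WriteLine("Frame {0}: {1} x {2} depth map",
                depthMD.FrameID, depthMD.XRes, depthMD.YRes);
        }

        depth.StopGenerating();
    }
}
```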

  15. Interactive Visualization with a Game Engine

  16. Video Game • Interactive animation: user -> interface -> game object action -> feedback (A/V, haptic). • Game objects can represent data.
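In Unity, that loop usually lives in a short C# script attached to a game object: poll the input, change the object, and give feedback. A minimal sketch (the component and the feedback field are illustrative, not from the course project):

```csharp
using UnityEngine;

// Attach to any game object that represents a piece of data.
public class InteractionLoop : MonoBehaviour
{
    public AudioSource feedbackSound;  // optional audio feedback, assigned in the Inspector

    void Update()
    {
        // user -> interface: poll the input device
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // -> game object action: change the object's state
            transform.Rotate(0f, 45f, 0f);

            // -> feedback: audio/visual response to the user
            if (feedbackSound != null)
            {
                feedbackSound.Play();
            }
        }
    }
}
```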

  17. Video Game • [Slide diagram: User -> Controller -> Game (Software) -> Display]

  18. Video Game • [Slide diagram: Input Device Driver -> Game (Software) -> Display Device Driver (GDI) -> Display]

  19. Software for Kinect-based game development • OpenNI: a general-purpose framework for obtaining data from 3D sensors • SensorKinect: the driver for interfacing with the Microsoft Kinect • NITE: a skeleton-tracking and gesture-recognition library • Unity3D: a game engine • ZigFu: Unity package for Kinect (assets and scripts)

  20. Unity3D • Engine • IDE • Assets • Tutorial • Examples

  21. Game Assets for Interactive Visualization • Game assets: game objects and animation scripts. • Game objects can be real-world objects, artistic virtual objects, or data objects. • Scripts can be used to make them interactive (see the sketch below). • For interactive visualization, we just create assets; the game engine takes care of everything else (including physics).
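For example, a data object (here, a cube whose height encodes one data value) needs only a short script to become interactive; the class, field names, and mapping below are illustrative, not part of any asset package.

```csharp
using UnityEngine;

// Attach to a cube that visualizes one data value as its height.
public class DataBar : MonoBehaviour
{
    public float dataValue = 1.0f;     // the value to visualize (illustrative)
    public float rotateSpeed = 90.0f;  // degrees per second while dragging

    void Start()
    {
        // Represent the data as graphics: map the value to the bar's height.
        transform.localScale = new Vector3(1f, dataValue, 1f);
    }

    void Update()
    {
        // Let the user control how the graphics are viewed: drag to rotate.
        if (Input.GetMouseButton(0))
        {
            float dx = Input.GetAxis("Mouse X");
            transform.Rotate(0f, -dx * rotateSpeed * Time.deltaTime, 0f);
        }
    }
}
```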

  22. Game Asset Creators • Blender: blender.org • Maya: AutoDesk.com • 3ds Max: AutoDesk.com • MotionBuilder: AutoDesk.com • Visualization and Animation at the AutoDesk Student Center • Jmol: http://jmol.sourceforge.net/ (for visualization of chemical and biological structures) • All free for students.

  23. Unity 3D IDE • IDE: Integrated Development Environment • Project: directory and files for a specific game project. • C:\Users\xiao\Documents\New Unity Project 1 • \Assets (anything you can reuse) • \Library (binary files)

  24. Unity 3D: Assets • C:\Users\xiao\Documents\New Unity Project 1\Assets (anything you can reuse) • \Standard Assets • \OpenNI • \Scripts • \_Scenes • \Materials • \Artwork

  25. Unity 3D: Standard Assets • C:\Users\xiao\Documents\New Unity Project 1\Assets\Standard Assets • Objects: (Look) \Tree \Terrain \Character • Lights: (Look) \Light Flares \Light Cookies • Code: (Feel: control, interaction, animation, …) \Scripts

  26. Unity 3D: Objects • C:\Users\xiao\Documents\New Unity Project 1\Assets\Standard Assets\Character: • Prefab: (predefined objects) • First Person, 3rd Person • \Source: • \Prototype (Look) • Constructor.FBX • \Materials (properties) • \Textures (images) • \Scripts (Feel: actions) • JavaScript: ThirdPersonController.js • C#: MouseLook.cs (a simplified stand-in is sketched below)
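As an illustration of an object-attached C# script, here is a simplified mouse-look sketch; it is a stand-in for the idea, not the actual MouseLook.cs shipped with the Standard Assets.

```csharp
using UnityEngine;

// Attach to the camera (or player) object to look around with the mouse.
public class SimpleMouseLook : MonoBehaviour
{
    public float sensitivity = 5.0f;
    private float yaw;
    private float pitch;

    void Update()
    {
        yaw   += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        pitch  = Mathf.Clamp(pitch, -80f, 80f);  // keep the view from flipping over

        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```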

  27. Unity 3D: Scripts • Languages: • Interpreted: JavaScript • Compiled: C# • Usages: • General: under Project\Scripts • ExitOnEscape.cs (sketched below) • Objects: attached to objects • ThirdPersonController.js
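A guess at what a general-purpose script such as ExitOnEscape.cs might contain (a minimal sketch, not the file from the course project):

```csharp
using UnityEngine;

// General-purpose script: quit the application when Escape is pressed.
public class ExitOnEscape : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            Application.Quit();  // no effect in the editor; works in built players
        }
    }
}
```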

  28. Unity 3D: Library • cache: • for speeding up processing • metadata: • data that describes data • previews: • for previewing scenes • ScriptAssemblies: • compiled assemblies for the project's scripts

  29. Unity 3D: GUI • Start Unity • File->Create Project • Select Assets (Character, Lights, Scripts, Sky, Terrain, Tree) • Assets->Import Package->Custom Package • UnityOpenNIBindings-v1.4.unitypackage • File->New Scene • File->Save Scene

  30. Unity 3D: GUI • Terrain->Create Terrain • Terrain->Set Resolution: width = 300; height = 300; length = 300 • GameObject->Create Other->Directional Light • Adjust the light direction toward the terrain (a scripted equivalent is sketched below)
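The same setup can also be scripted, which is useful when scenes are generated from data. A rough code equivalent of the menu steps above, using the slide's terrain size (the component name and light angle are illustrative):

```csharp
using UnityEngine;

// Scene setup done in code: a 300 x 300 x 300 terrain plus a directional light.
public class SceneSetup : MonoBehaviour
{
    void Start()
    {
        // Terrain -> Create Terrain / Set Resolution
        TerrainData terrainData = new TerrainData();
        terrainData.size = new Vector3(300f, 300f, 300f);
        Terrain.CreateTerrainGameObject(terrainData);

        // GameObject -> Create Other -> Directional Light
        GameObject lightObject = new GameObject("Directional Light");
        Light sun = lightObject.AddComponent<Light>();
        sun.type = LightType.Directional;

        // Aim the light down toward the terrain.
        lightObject.transform.rotation = Quaternion.Euler(50f, -30f, 0f);
    }
}
```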

  31. Unity 3D: Game Objects • Background Objects: Terrain and Sky • Terrain: elevation grid, adjustable height, texture • Sky: texture, static view • Add-ons: trees, stones, … • Foreground Objects: these objects can be animated.

  32. Unity 3D: Game Objects • Rigid Objects: non-deformable, with physical properties (gravity, inertia); see the sketch below. • Non-rigid Objects: • Deformable: changeable geometry. • Breakable: changeable topology. • Intangible Objects: no predefined shape (fire, clouds, …).
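In Unity, a rigid object is typically a mesh plus a Collider and a Rigidbody; a minimal sketch with illustrative values:

```csharp
using UnityEngine;

// Turn a plain game object into a rigid object with physical properties.
public class MakeRigid : MonoBehaviour
{
    void Start()
    {
        // The collider defines the shape the physics engine collides with.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // The rigidbody gives the object mass, gravity, and inertia.
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.mass = 2.0f;        // illustrative value
        body.useGravity = true;  // let it fall and rest on the terrain
    }
}
```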

  33. Summary • Interaction • NUI • Kinect • Visualization • Game Object • Game Engine
