EEC-693/793 Applied Computer Vision with Depth Cameras, Lecture 15. Wenbing Zhao, wenbing@ieee.org
Outline
Approaches to gesture recognition:
  Rule-based
  Single-pose based (this lecture)
  Multiple-poses based
  Machine-learning based
Gesture Recognition Engine
The recognition engine typically performs the following tasks:
  It accepts user actions in the form of skeleton data
  It matches the data points against predefined logic for a specific gesture
  It executes actions if the gesture is recognized
  It responds to the user
Gesture Recognition Engine
GestureType, RecognitionResult, and GestureEventArgs:

public enum GestureType
{
    HandsClapping,
    TwoHandsRaised
}

public enum RecognitionResult
{
    Unknown,
    Failed,
    Success
}

public class GestureEventArgs : EventArgs
{
    public GestureType gsType { get; internal set; }
    public RecognitionResult Result { get; internal set; }

    public GestureEventArgs(GestureType t, RecognitionResult result)
    {
        this.Result = result;
        this.gsType = t;
    }
}
Gesture Recognition Engine

public class GestureRecognitionEngine
{
    public GestureRecognitionEngine() { }

    public event EventHandler<GestureEventArgs> GestureRecognized;
    public Skeleton Skeleton { get; set; }
    public GestureType GestureType { get; set; }

    public void StartRecognize(GestureType t)
    {
        this.GestureType = t;
        switch (t)
        {
            case GestureType.HandsClapping:
                this.MatchHandClappingGesture(this.Skeleton);
                break;
            case GestureType.TwoHandsRaised:
                this.MatchTwoHandsRaisedGesture(this.Skeleton);
                break;
            default:
                break;
        }
    }
Gesture Recognition Engine

    float previousDistance = 0.0f;

    private void MatchHandClappingGesture(Skeleton skeleton)
    {
        if (skeleton == null)
        {
            return;
        }
        if (skeleton.Joints[JointType.HandRight].TrackingState == JointTrackingState.Tracked &&
            skeleton.Joints[JointType.HandLeft].TrackingState == JointTrackingState.Tracked)
        {
            float currentDistance = GetJointDistance(skeleton.Joints[JointType.HandRight],
                skeleton.Joints[JointType.HandLeft]);
            // A clap is detected when the hands cross from more than 10 cm apart
            // to less than 10 cm apart between consecutive frames
            if (currentDistance < 0.1f && previousDistance > 0.1f)
            {
                if (this.GestureRecognized != null)
                {
                    this.GestureRecognized(this,
                        new GestureEventArgs(GestureType.HandsClapping, RecognitionResult.Success));
                }
            }
            previousDistance = currentDistance;
        }
    }
Gesture Recognition Engine

    private void MatchTwoHandsRaisedGesture(Skeleton skeleton)
    {
        if (skeleton == null)
        {
            return;
        }
        // Both hands must be at least 30 cm above the head
        float threshold = 0.3f;
        if (skeleton.Joints[JointType.HandRight].Position.Y >
                skeleton.Joints[JointType.Head].Position.Y + threshold &&
            skeleton.Joints[JointType.HandLeft].Position.Y >
                skeleton.Joints[JointType.Head].Position.Y + threshold)
        {
            if (this.GestureRecognized != null)
            {
                this.GestureRecognized(this,
                    new GestureEventArgs(GestureType.TwoHandsRaised, RecognitionResult.Success));
            }
        }
    }
Gesture Recognition Engine

    // Euclidean distance between two joints in skeleton space (meters)
    private float GetJointDistance(Joint firstJoint, Joint secondJoint)
    {
        float distanceX = firstJoint.Position.X - secondJoint.Position.X;
        float distanceY = firstJoint.Position.Y - secondJoint.Position.Y;
        float distanceZ = firstJoint.Position.Z - secondJoint.Position.Z;
        return (float)Math.Sqrt(Math.Pow(distanceX, 2) + Math.Pow(distanceY, 2) + Math.Pow(distanceZ, 2));
    }
} // closes GestureRecognitionEngine
Vectors, Dot Product, Angles
Body segments can be represented as vectors
You can create your own Vector3 class/struct or use the Unity Vector3 type
Vector for the segment between two joints:

Vector3 seg;
seg.X = Joint1.Position.X - Joint2.Position.X;
seg.Y = Joint1.Position.Y - Joint2.Position.Y;
seg.Z = Joint1.Position.Z - Joint2.Position.Z;

Dot product of two vectors:

Vector3 seg1, seg2;
float dotproduct = seg1.X * seg2.X + seg1.Y * seg2.Y + seg1.Z * seg2.Z;

Angle formed by two vectors, in degrees:

float seg1magnitude = (float)Math.Sqrt(seg1.X * seg1.X + seg1.Y * seg1.Y + seg1.Z * seg1.Z);
float seg2magnitude = (float)Math.Sqrt(seg2.X * seg2.X + seg2.Y * seg2.Y + seg2.Z * seg2.Z);
float angle = (float)(Math.Acos(dotproduct / (seg1magnitude * seg2magnitude)) * 180 / Math.PI);
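These snippets can be collected into a small helper type. The following is a minimal sketch of such a struct; the FromJoints, Dot, Magnitude, and AngleDeg helpers are illustrative names, not part of the Kinect SDK (Unity's Vector3 type provides equivalents).

// A minimal Vector3 sketch with the dot-product and angle math from above.
// Names are illustrative; adapt or replace with Unity's Vector3 as needed.
public struct Vector3
{
    public float X, Y, Z;

    public Vector3(float x, float y, float z) { X = x; Y = y; Z = z; }

    // Build a segment vector pointing from joint "from" to joint "to"
    public static Vector3 FromJoints(Joint from, Joint to)
    {
        return new Vector3(to.Position.X - from.Position.X,
                           to.Position.Y - from.Position.Y,
                           to.Position.Z - from.Position.Z);
    }

    public static float Dot(Vector3 a, Vector3 b)
    {
        return a.X * b.X + a.Y * b.Y + a.Z * b.Z;
    }

    public float Magnitude()
    {
        return (float)Math.Sqrt(X * X + Y * Y + Z * Z);
    }

    // Angle between two vectors, in degrees
    public static float AngleDeg(Vector3 a, Vector3 b)
    {
        return (float)(Math.Acos(Dot(a, b) / (a.Magnitude() * b.Magnitude())) * 180.0 / Math.PI);
    }
}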
Build a Gesture Recognition App
The app can recognize two gestures: hands clapping, and two hands raised
Setting up the project:
  Create a new C# WPF project named GestureRecognitionBasic
  Add the Microsoft.Kinect reference and import the namespace, etc. (a sketch of the using directives follows)
  Add the GUI components
  Create a new C# file named GestureRecognitionEngine.cs, and copy the engine code into this class
  In Solution Explorer, right-click the project, then Add => New Item
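For reference, the using directives needed by the code on the following slides would look roughly like this (a sketch; the exact list depends on the WPF project template):

using System;                       // Math, EventArgs
using System.Linq;                  // FirstOrDefault
using System.Windows;               // RoutedEventArgs
using System.Windows.Media;         // PixelFormats
using System.Windows.Media.Imaging; // WriteableBitmap
using Microsoft.Kinect;             // KinectSensor, Skeleton, Joint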
Build a Gesture Recognition App
User interface components: a TextBox (displays the recognized gesture), a Canvas (draws the skeleton), and an Image (shows the color stream)
Build a Gesture Recognition App
Add member variables:

KinectSensor sensor;
private WriteableBitmap colorBitmap;
private byte[] colorPixels;
Skeleton[] totalSkeleton = new Skeleton[6];
Skeleton skeleton;
GestureRecognitionEngine recognitionEngine;

Modify the constructor:

public MainWindow()
{
    InitializeComponent();
    Loaded += new RoutedEventHandler(WindowLoaded);
}
Build a Gesture Recognition App

private void WindowLoaded(object sender, RoutedEventArgs e)
{
    if (KinectSensor.KinectSensors.Count > 0)
    {
        this.sensor = KinectSensor.KinectSensors[0];
        if (this.sensor != null && !this.sensor.IsRunning)
        {
            this.sensor.Start();
            this.sensor.ColorStream.Enable();
            this.colorPixels = new byte[this.sensor.ColorStream.FramePixelDataLength];
            // The bitmap must be assigned to colorBitmap, not colorPixels
            this.colorBitmap = new WriteableBitmap(this.sensor.ColorStream.FrameWidth,
                this.sensor.ColorStream.FrameHeight, 96.0, 96.0, PixelFormats.Bgr32, null);
            this.image1.Source = this.colorBitmap;
            this.sensor.ColorFrameReady += this.colorFrameReady;
            this.sensor.SkeletonStream.Enable();
            this.sensor.SkeletonFrameReady += skeletonFrameReady;
            recognitionEngine = new GestureRecognitionEngine();
            recognitionEngine.GestureRecognized += gestureRecognized;
        }
    }
}
Build a Gesture Recognition App
Gesture recognized event handler; colorFrameReady(), DrawSkeleton(), drawBone(), and ScalePosition() are the same as before:

void gestureRecognized(object sender, GestureEventArgs e)
{
    textBox1.Text = e.gsType.ToString();
}
Build a Gesture Recognition App
Handle the skeleton frame ready event:

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        if (skeletonFrame == null)
        {
            return;
        }
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        // Pick the first tracked skeleton, if any
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        if (skeleton == null)
            return;
        DrawSkeleton(skeleton);
        // Run both gesture matchers on every frame
        recognitionEngine.Skeleton = skeleton;
        recognitionEngine.StartRecognize(GestureType.HandsClapping);
        recognitionEngine.StartRecognize(GestureType.TwoHandsRaised);
    }
}
Challenge Tasks
Add recognition of two more gestures: right hand raised, and left hand raised (a starting-point sketch follows)
Add GestureRecognitionEngine.cs to a Unity+Kinect app, and add visual feedback for the recognized gestures
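As a starting point for the first task, here is a minimal sketch of a right-hand-raised matcher, written in the same style as MatchTwoHandsRaisedGesture; the method name, the RightHandRaised enum value, and the 0.3 m threshold are assumptions, not part of the lecture code.

// Hypothetical matcher for the "right hand raised" challenge gesture.
// Assumes a RightHandRaised value has been added to the GestureType enum
// and a matching case to the switch in StartRecognize().
private void MatchRightHandRaisedGesture(Skeleton skeleton)
{
    if (skeleton == null)
    {
        return;
    }
    float threshold = 0.3f;  // assumed: hand at least 30 cm above the head
    if (skeleton.Joints[JointType.HandRight].Position.Y >
        skeleton.Joints[JointType.Head].Position.Y + threshold)
    {
        if (this.GestureRecognized != null)
        {
            this.GestureRecognized(this,
                new GestureEventArgs(GestureType.RightHandRaised, RecognitionResult.Success));
        }
    }
}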