Help Getting started

Aug 1, 2011 at 3:31 PM
Edited Aug 1, 2011 at 3:33 PM

Morning all,

Please forgive me if this is not the right place to ask for help with this. If anyone knows of another place where this code is being discussed, please point me in that direction!

Anyway, I'm having real trouble getting a working example of gesture recognition up and running. I think I've integrated it into my code OK, but I can't get a gesture to trigger. I'm only an on-and-off coder, so it's entirely possible that I'm missing something very simple, or indeed totally misunderstanding some basic principle!

So after defining the detector...

 

Private _runtime As New Nui.Runtime()
Private _SwipeGestureDector As New SwipeGestureDetector

 

and specifying the handlers...

 

AddHandler _runtime.SkeletonFrameReady, AddressOf _runtime_SkeletonFrameReady
AddHandler _SwipeGestureDector.OnGestureDetected, AddressOf _SwipeGestureDector_OnGestureDetected

 

Within the SkeletonFrameReady event I've got...

Private Sub _runtime_SkeletonFrameReady(ByVal sender As Object, ByVal e As SkeletonFrameReadyEventArgs)
    For Each sd As SkeletonData In e.SkeletonFrame.Skeletons
        _SwipeGestureDector.Add(sd.Position, _runtime.SkeletonEngine)
    Next
End Sub

Does this seem OK as far as it goes? I couldn't really work out if I was passing the correct things to SwipeGestureDetector.Add or not.

Ideally, if anyone has a working example of this method that I can use as a jumping off point, it would be a real help to me and, I'm sure, all those others like me whose enthusiasm outstrips their actual ability ;)

Thanks in advance,

Tom

Coordinator
Aug 2, 2011 at 7:13 AM

Hi!

You have to pass the position of a joint, not the skeleton :)

So for each SkeletonData, you have to iterate through its collection of joints to find the one you're looking for, and then pass that joint's position to the swipe detector.
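For example, here is a minimal C# sketch of that loop (the field names and the choice of JointID.HandRight are illustrative - use whichever joint your gesture should track):

foreach (SkeletonData sd in e.SkeletonFrame.Skeletons)
{
    // Pass the position of one joint, not sd.Position (the skeleton's own position)
    Joint hand = sd.Joints[JointID.HandRight];
    swipeGestureDetector.Add(hand.Position, runtime.SkeletonEngine);
}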

David.

Aug 2, 2011 at 1:31 PM

Ahhh - of course, that makes sense. It's working a treat now - so thanks a lot!

Cheers,

Tom

Sep 21, 2011 at 12:47 PM

Hello,

I followed the same code as written above, but I am not able to detect the gesture. How can I use the OnGestureDetected event to display a message, for example?

For the event handler I used

 sg.OnGestureDetected += OnGestureDetected;

and then in the SkeletonFrameReady event I used

foreach (SkeletonData data in skeletonFrame.Skeletons)
{
    if (SkeletonTrackingState.Tracked == data.TrackingState)
    {
        foreach (Joint joint in data.Joints)
        {
            sg.Add(joint.Position, nui.SkeletonEngine);
        }
    }
}

and for the OnGestureDetected event

void OnGestureDetected(string gesture)
{
    System.Windows.MessageBox.Show(gesture);
}

I don't know what the problem is here. Would it be possible for someone to provide a simple working example to start from? It would be really helpful.

Thanks in advance,

Abhishek

Coordinator
Sep 22, 2011 at 5:40 AM

AFAIK, your code should work. Can you send me a complete sample?
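In the meantime, here is a minimal end-to-end sketch assembled only from calls that already appear in this thread (the beta SDK's Runtime plus the toolbox's SwipeGestureDetector). It feeds the detector a single joint, per the advice in the first exchange above; the class name and JointID.HandRight are illustrative:

using System.Windows;
using Microsoft.Research.Kinect.Nui;
using Kinect.Toolbox;

public class SwipeSketch
{
    Runtime nui = new Runtime();
    SwipeGestureDetector sg = new SwipeGestureDetector();

    // Call once at startup (e.g., from Window_Loaded)
    public void Start()
    {
        nui.Initialize(RuntimeOptions.UseSkeletalTracking);
        nui.SkeletonFrameReady += nui_SkeletonFrameReady;
        sg.OnGestureDetected += OnGestureDetected;
    }

    void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        foreach (SkeletonData data in e.SkeletonFrame.Skeletons)
        {
            if (data.TrackingState != SkeletonTrackingState.Tracked)
                continue;

            // Track one joint's trajectory only; feeding every joint into the
            // same detector can mix their paths together
            sg.Add(data.Joints[JointID.HandRight].Position, nui.SkeletonEngine);
        }
    }

    void OnGestureDetected(string gesture)
    {
        MessageBox.Show(gesture);
    }
}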

Sep 22, 2011 at 10:13 AM

Hello,

 

Here is my complete code. There are no compile errors, but it still doesn't detect a gesture - for example, a simple hand swipe.


using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using Microsoft.Research.Kinect.Nui;
using BeTheController;
using Kinect.Toolbox;
namespace SkeletalViewer
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
       
        //BTCRunTime BTC = new BTCRunTime();
       
        public MainWindow()
        {

            InitializeComponent();
           

        }

        Runtime nui;
        BTCRunTime BTC = new BTCRunTime();
      
        int totalFrames = 0;
        int lastFrames = 0;
        DateTime lastTime = DateTime.MaxValue;

        // We want to control how depth data gets converted into false-color data
        // for more intuitive visualization, so we keep 32-bit color frame buffer versions of
        // these, to be updated whenever we receive and process a 16-bit frame.
        const int RED_IDX = 2;
        const int GREEN_IDX = 1;
        const int BLUE_IDX = 0;
        byte[] depthFrame32 = new byte[320 * 240 * 4];
        SwipeGestureDetector sg = new SwipeGestureDetector();
       
       
        Dictionary<JointID,Brush> jointColors = new Dictionary<JointID,Brush>() {
            {JointID.HipCenter, new SolidColorBrush(Color.FromRgb(169, 176, 155))},
            {JointID.Spine, new SolidColorBrush(Color.FromRgb(169, 176, 155))},
            {JointID.ShoulderCenter, new SolidColorBrush(Color.FromRgb(168, 230, 29))},
            {JointID.Head, new SolidColorBrush(Color.FromRgb(200, 0,   0))},
            {JointID.ShoulderLeft, new SolidColorBrush(Color.FromRgb(79,  84,  33))},
            {JointID.ElbowLeft, new SolidColorBrush(Color.FromRgb(84,  33,  42))},
            {JointID.WristLeft, new SolidColorBrush(Color.FromRgb(255, 126, 0))},
            {JointID.HandLeft, new SolidColorBrush(Color.FromRgb(215,  86, 0))},
            {JointID.ShoulderRight, new SolidColorBrush(Color.FromRgb(33,  79,  84))},
            {JointID.ElbowRight, new SolidColorBrush(Color.FromRgb(33,  33,  84))},
            {JointID.WristRight, new SolidColorBrush(Color.FromRgb(77,  109, 243))},
            {JointID.HandRight, new SolidColorBrush(Color.FromRgb(37,   69, 243))},
            {JointID.HipLeft, new SolidColorBrush(Color.FromRgb(77,  109, 243))},
            {JointID.KneeLeft, new SolidColorBrush(Color.FromRgb(69,  33,  84))},
            {JointID.AnkleLeft, new SolidColorBrush(Color.FromRgb(229, 170, 122))},
            {JointID.FootLeft, new SolidColorBrush(Color.FromRgb(255, 126, 0))},
            {JointID.HipRight, new SolidColorBrush(Color.FromRgb(181, 165, 213))},
            {JointID.KneeRight, new SolidColorBrush(Color.FromRgb(71, 222,  76))},
            {JointID.AnkleRight, new SolidColorBrush(Color.FromRgb(245, 228, 156))},
            {JointID.FootRight, new SolidColorBrush(Color.FromRgb(77,  109, 243))}
        };

        private void Window_Loaded(object sender, EventArgs e)

        {
           
            nui = new Runtime();
            try
            {
                nui.Initialize(RuntimeOptions.UseDepthAndPlayerIndex | RuntimeOptions.UseSkeletalTracking | RuntimeOptions.UseColor);
            }
            catch (InvalidOperationException)
            {
                System.Windows.MessageBox.Show("Runtime initialization failed. Please make sure Kinect device is plugged in.");
                return;
            }


            try
            {
                nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);
                nui.DepthStream.Open(ImageStreamType.Depth, 2, ImageResolution.Resolution320x240, ImageType.DepthAndPlayerIndex);
            }
            catch (InvalidOperationException)
            {
                System.Windows.MessageBox.Show("Failed to open stream. Please make sure to specify a supported image type and resolution.");
                return;
            }


           //  lastTime = DateTime.Now;
           // BTC.Start(nui);
            //BTC.LoadPoses("C:\\Users/asrinivas/Desktop/skeletal/Skeletal_Tracking/gn/poses.pbtc");
           // BTC.LoadGestures("C:\\Users/asrinivas/Desktop/skeletal/Skeletal_Tracking/gn/gestures.gbtc");
           // BTC.OnEventRecognized += new BTCRunTime.EventRecognizedEventHandler(Game_OnEventRecognized);
           // BTC.OnEventStopRecognized += new BTCRunTime.EventStopRecognizedEventHandler(Game_OnEventStopRecognized);
            nui.DepthFrameReady += new EventHandler<ImageFrameReadyEventArgs>(nui_DepthFrameReady);
            nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
            nui.VideoFrameReady += new EventHandler<ImageFrameReadyEventArgs>(nui_ColorFrameReady);
            sg.OnGestureDetected += On_GestureDetected;
          
         
          }
        void Game_OnEventRecognized(EventRecognizedEventArgs gca)
        {
            foreach (KinectEvent Event in gca.Events)
            {
                if (Event.Type == KEvents.Gesture)
                {
                    if (Event.Value == "gr") System.Windows.MessageBox.Show("RightGesture");
                }
            }
        }
        void Game_OnEventStopRecognized(EventStopRecognizedEventArgs gca)
        {
            foreach (KinectEvent Event in gca.Events)
            {
                if (Event.Type == KEvents.Pose)
                {
                    if (Event.Value == "s1")
                    {
                        // no action defined for this pose yet
                    }
                }
            }
        }

        // Converts a 16-bit grayscale depth frame which includes player indexes into a 32-bit frame
        // that displays different players in different colors
        byte[] convertDepthFrame(byte[] depthFrame16)
        {
            for (int i16 = 0, i32 = 0; i16 < depthFrame16.Length && i32 < depthFrame32.Length; i16 += 2, i32 += 4)
            {
                int player = depthFrame16[i16] & 0x07;
                int realDepth = (depthFrame16[i16+1] << 5) | (depthFrame16[i16] >> 3);
                // transform 13-bit depth information into an 8-bit intensity appropriate
                // for display (we disregard information in most significant bit)
                byte intensity = (byte)(255 - (255 * realDepth / 0x0fff));

                depthFrame32[i32 + RED_IDX] = 0;
                depthFrame32[i32 + GREEN_IDX] = 0;
                depthFrame32[i32 + BLUE_IDX] = 0;

                // choose different display colors based on player
                switch (player)
                {
                    case 0:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity / 2);
                        break;
                    case 1:
                        depthFrame32[i32 + RED_IDX] = intensity;
                        break;
                    case 2:
                        depthFrame32[i32 + GREEN_IDX] = intensity;
                        break;
                    case 3:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 4);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 4:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity / 4);
                        break;
                    case 5:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 4);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 6:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 7:
                        depthFrame32[i32 + RED_IDX] = (byte)(255 - intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(255 - intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(255 - intensity);
                        break;
                }
            }
            return depthFrame32;
        }

        void nui_DepthFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            PlanarImage Image = e.ImageFrame.Image;
            byte[] convertedDepthFrame = convertDepthFrame(Image.Bits);

            depth.Source = BitmapSource.Create(
                Image.Width, Image.Height, 96, 96, PixelFormats.Bgr32, null, convertedDepthFrame, Image.Width * 4);

            ++totalFrames;

            DateTime cur = DateTime.Now;
            if (cur.Subtract(lastTime) > TimeSpan.FromSeconds(1))
            {
                int frameDiff = totalFrames - lastFrames;
                lastFrames = totalFrames;
                lastTime = cur;
                frameRate.Text = frameDiff.ToString() + " fps";
            }
        }

        private Point getDisplayPosition(Joint joint)
        {
            float depthX, depthY;
            nui.SkeletonEngine.SkeletonToDepthImage(joint.Position, out depthX, out depthY);
            depthX = depthX * 320; //convert to 320, 240 space
            depthY = depthY * 240; //convert to 320, 240 space
            int colorX, colorY;
            ImageViewArea iv = new ImageViewArea();
            // only ImageResolution.Resolution640x480 is supported at this point
            nui.NuiCamera.GetColorPixelCoordinatesFromDepthPixel(ImageResolution.Resolution640x480, iv, (int)depthX, (int)depthY, (short)0, out colorX, out colorY);

            // map back to skeleton.Width & skeleton.Height
            return new Point((int)(skeleton.Width * colorX / 640.0), (int)(skeleton.Height * colorY / 480));
        }

        Polyline getBodySegment(Microsoft.Research.Kinect.Nui.JointsCollection joints, Brush brush, params JointID[] ids)
        {
            PointCollection points = new PointCollection(ids.Length);
            for (int i = 0; i < ids.Length; ++i )
            {
                points.Add(getDisplayPosition(joints[ids[i]]));
            }

            Polyline polyline = new Polyline();
            polyline.Points = points;
            polyline.Stroke = brush;
            polyline.StrokeThickness = 5;
            return polyline;
        }

        void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            SkeletonFrame skeletonFrame = e.SkeletonFrame;
            int iSkeleton = 0;
            Brush[] brushes = new Brush[6];
            brushes[0] = new SolidColorBrush(Color.FromRgb(255, 0, 0));
            brushes[1] = new SolidColorBrush(Color.FromRgb(0, 255, 0));
            brushes[2] = new SolidColorBrush(Color.FromRgb(64, 255, 255));
            brushes[3] = new SolidColorBrush(Color.FromRgb(255, 255, 64));
            brushes[4] = new SolidColorBrush(Color.FromRgb(255, 64, 255));
            brushes[5] = new SolidColorBrush(Color.FromRgb(128, 128, 255));

            skeleton.Children.Clear();
            //Stream stream = File.OpenRead("C:\\Users/asrinivas/Desktop/hello.replay");
            //BinaryReader reader = new BinaryReader(stream);
            //SkeletonTrackingState TrackingState = (SkeletonTrackingState)reader.ReadInt32();
            //Microsoft.Research.Kinect.Nui.Vector Position = reader.ReadVector();
            //int TrackingID = reader.ReadInt32();
            //int UserIndex = reader.ReadInt32();
            //SkeletonQuality Quality = (SkeletonQuality)reader.ReadInt32();
            //int jointsCount = reader.ReadInt32();
            //List<Joint> Joints = new List<Joint>();
            //List<Vector2> points = new List<Vector2>();

            //for (int index = 0; index < jointsCount; index++)
            //{
            //    Joint joint = new Joint
            //    {
            //        ID = (JointID)reader.ReadInt32(),
            //        TrackingState = (JointTrackingState)reader.ReadInt32(),
            //        Position = reader.ReadVector()
            //    };
            //    Vector2 point = new Vector2(joint.Position.X, joint.Position.Z);
            //    points[index] = point;
            //    Joints.Add(joint);
            //}
            foreach (SkeletonData data in skeletonFrame.Skeletons)
            {
                if (SkeletonTrackingState.Tracked == data.TrackingState)
                {
                    // Draw bones
                    Brush brush = brushes[iSkeleton % brushes.Length];
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.Spine, JointID.ShoulderCenter, JointID.Head));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.ShoulderCenter, JointID.ShoulderLeft, JointID.ElbowLeft, JointID.WristLeft, JointID.HandLeft));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.ShoulderCenter, JointID.ShoulderRight, JointID.ElbowRight, JointID.WristRight, JointID.HandRight));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.HipLeft, JointID.KneeLeft, JointID.AnkleLeft, JointID.FootLeft));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.HipRight, JointID.KneeRight, JointID.AnkleRight, JointID.FootRight));

                    // Draw joints
                    foreach (Joint joint in data.Joints)
                    {
                        Point jointPos = getDisplayPosition(joint);
                        Line jointLine = new Line();
                        jointLine.X1 = jointPos.X - 3;
                        jointLine.X2 = jointLine.X1 + 6;
                        jointLine.Y1 = jointLine.Y2 = jointPos.Y;
                        jointLine.Stroke = jointColors[joint.ID];
                        jointLine.StrokeThickness = 6;
                        skeleton.Children.Add(jointLine);
                        sg.Add(joint.Position, nui.SkeletonEngine);
                       
                    }
                }

                //RecordedPath path = new RecordedPath(points.Count);
                //bool m = path.Match(points, 1, 0.7f, 0);
                //if (m)
                //{
                //    System.Windows.MessageBox.Show("hello");
                //}
                iSkeleton++;
            }

            //ProcessForwardBackGesture(data.Joints[JointID.Head], data.Joints[JointID.HandRight], data.Joints[JointID.HandLeft]);
        }
       


        void On_GestureDetected(string gesture)
        {
            System.Windows.MessageBox.Show(gesture);
           
        }




        private void ProcessForwardBackGesture(Joint head, Joint rightHand, Joint leftHand)
        {
            // Right hand out to the right of the head and within a vertical band around it
            if (rightHand.Position.X > head.Position.X + .45 && rightHand.Position.Y < head.Position.Y + .45 && rightHand.Position.Y > head.Position.Y - .35)
            {
                System.Windows.MessageBox.Show("Right");
            }
        }
        void nui_ColorFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            // 32-bit per pixel, RGBA image
            PlanarImage Image = e.ImageFrame.Image;
            video.Source = BitmapSource.Create(
                Image.Width, Image.Height, 96, 96, PixelFormats.Bgr32, null, Image.Bits, Image.Width * Image.BytesPerPixel);
        }

        private void Window_Closed(object sender, EventArgs e)
        {
            nui.Uninitialize();
            BTC.Stop();
            Environment.Exit(0);
        }

    }
}


Sep 22, 2011 at 10:18 AM

Hello,

I also tried to match the recorded path in a replay file against the current gesture using the code below, but the Match() method, called on a RecordedPath instance in the SkeletonFrameReady event handler, throws an invalid operation error. Can you please tell me what the problem is here?

 


using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using Microsoft.Research.Kinect.Nui;
using BeTheController;
using Kinect.Toolbox;
namespace SkeletalViewer
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
       
        //BTCRunTime BTC = new BTCRunTime();
       
        public MainWindow()
        {

            InitializeComponent();
           

        }

        Runtime nui;
        BTCRunTime BTC = new BTCRunTime();
      
        int totalFrames = 0;
        int lastFrames = 0;
        DateTime lastTime = DateTime.MaxValue;

        // We want to control how depth data gets converted into false-color data
        // for more intuitive visualization, so we keep 32-bit color frame buffer versions of
        // these, to be updated whenever we receive and process a 16-bit frame.
        const int RED_IDX = 2;
        const int GREEN_IDX = 1;
        const int BLUE_IDX = 0;
        byte[] depthFrame32 = new byte[320 * 240 * 4];
        SwipeGestureDetector sg = new SwipeGestureDetector();
       
       
        Dictionary<JointID,Brush> jointColors = new Dictionary<JointID,Brush>() {
            {JointID.HipCenter, new SolidColorBrush(Color.FromRgb(169, 176, 155))},
            {JointID.Spine, new SolidColorBrush(Color.FromRgb(169, 176, 155))},
            {JointID.ShoulderCenter, new SolidColorBrush(Color.FromRgb(168, 230, 29))},
            {JointID.Head, new SolidColorBrush(Color.FromRgb(200, 0,   0))},
            {JointID.ShoulderLeft, new SolidColorBrush(Color.FromRgb(79,  84,  33))},
            {JointID.ElbowLeft, new SolidColorBrush(Color.FromRgb(84,  33,  42))},
            {JointID.WristLeft, new SolidColorBrush(Color.FromRgb(255, 126, 0))},
            {JointID.HandLeft, new SolidColorBrush(Color.FromRgb(215,  86, 0))},
            {JointID.ShoulderRight, new SolidColorBrush(Color.FromRgb(33,  79,  84))},
            {JointID.ElbowRight, new SolidColorBrush(Color.FromRgb(33,  33,  84))},
            {JointID.WristRight, new SolidColorBrush(Color.FromRgb(77,  109, 243))},
            {JointID.HandRight, new SolidColorBrush(Color.FromRgb(37,   69, 243))},
            {JointID.HipLeft, new SolidColorBrush(Color.FromRgb(77,  109, 243))},
            {JointID.KneeLeft, new SolidColorBrush(Color.FromRgb(69,  33,  84))},
            {JointID.AnkleLeft, new SolidColorBrush(Color.FromRgb(229, 170, 122))},
            {JointID.FootLeft, new SolidColorBrush(Color.FromRgb(255, 126, 0))},
            {JointID.HipRight, new SolidColorBrush(Color.FromRgb(181, 165, 213))},
            {JointID.KneeRight, new SolidColorBrush(Color.FromRgb(71, 222,  76))},
            {JointID.AnkleRight, new SolidColorBrush(Color.FromRgb(245, 228, 156))},
            {JointID.FootRight, new SolidColorBrush(Color.FromRgb(77,  109, 243))}
        };

        private void Window_Loaded(object sender, EventArgs e)

        {
           
            nui = new Runtime();
            try
            {
                nui.Initialize(RuntimeOptions.UseDepthAndPlayerIndex | RuntimeOptions.UseSkeletalTracking | RuntimeOptions.UseColor);
            }
            catch (InvalidOperationException)
            {
                System.Windows.MessageBox.Show("Runtime initialization failed. Please make sure Kinect device is plugged in.");
                return;
            }


            try
            {
                nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);
                nui.DepthStream.Open(ImageStreamType.Depth, 2, ImageResolution.Resolution320x240, ImageType.DepthAndPlayerIndex);
            }
            catch (InvalidOperationException)
            {
                System.Windows.MessageBox.Show("Failed to open stream. Please make sure to specify a supported image type and resolution.");
                return;
            }


           //  lastTime = DateTime.Now;
           // BTC.Start(nui);
            //BTC.LoadPoses("C:\\Users/asrinivas/Desktop/skeletal/Skeletal_Tracking/gn/poses.pbtc");
           // BTC.LoadGestures("C:\\Users/asrinivas/Desktop/skeletal/Skeletal_Tracking/gn/gestures.gbtc");
           // BTC.OnEventRecognized += new BTCRunTime.EventRecognizedEventHandler(Game_OnEventRecognized);
           // BTC.OnEventStopRecognized += new BTCRunTime.EventStopRecognizedEventHandler(Game_OnEventStopRecognized);
            nui.DepthFrameReady += new EventHandler<ImageFrameReadyEventArgs>(nui_DepthFrameReady);
            nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
            nui.VideoFrameReady += new EventHandler<ImageFrameReadyEventArgs>(nui_ColorFrameReady);
            sg.OnGestureDetected += On_GestureDetected;
          
         
          }
        void Game_OnEventRecognized(EventRecognizedEventArgs gca)
        {
            foreach (KinectEvent Event in gca.Events)
            {
                if (Event.Type == KEvents.Gesture)
                {
                    if (Event.Value == "gr") System.Windows.MessageBox.Show("RightGesture");
                }
            }
        }
        void Game_OnEventStopRecognized(EventStopRecognizedEventArgs gca)
        {
            foreach (KinectEvent Event in gca.Events)
            {
                if (Event.Type == KEvents.Pose)
                {
                    if (Event.Value == "s1")
                    {
                        // no action defined for this pose yet
                    }
                }
            }
        }

        // Converts a 16-bit grayscale depth frame which includes player indexes into a 32-bit frame
        // that displays different players in different colors
        byte[] convertDepthFrame(byte[] depthFrame16)
        {
            for (int i16 = 0, i32 = 0; i16 < depthFrame16.Length && i32 < depthFrame32.Length; i16 += 2, i32 += 4)
            {
                int player = depthFrame16[i16] & 0x07;
                int realDepth = (depthFrame16[i16+1] << 5) | (depthFrame16[i16] >> 3);
                // transform 13-bit depth information into an 8-bit intensity appropriate
                // for display (we disregard information in most significant bit)
                byte intensity = (byte)(255 - (255 * realDepth / 0x0fff));

                depthFrame32[i32 + RED_IDX] = 0;
                depthFrame32[i32 + GREEN_IDX] = 0;
                depthFrame32[i32 + BLUE_IDX] = 0;

                // choose different display colors based on player
                switch (player)
                {
                    case 0:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity / 2);
                        break;
                    case 1:
                        depthFrame32[i32 + RED_IDX] = intensity;
                        break;
                    case 2:
                        depthFrame32[i32 + GREEN_IDX] = intensity;
                        break;
                    case 3:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 4);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 4:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity / 4);
                        break;
                    case 5:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 4);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 6:
                        depthFrame32[i32 + RED_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(intensity / 2);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(intensity);
                        break;
                    case 7:
                        depthFrame32[i32 + RED_IDX] = (byte)(255 - intensity);
                        depthFrame32[i32 + GREEN_IDX] = (byte)(255 - intensity);
                        depthFrame32[i32 + BLUE_IDX] = (byte)(255 - intensity);
                        break;
                }
            }
            return depthFrame32;
        }

        void nui_DepthFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            PlanarImage Image = e.ImageFrame.Image;
            byte[] convertedDepthFrame = convertDepthFrame(Image.Bits);

            depth.Source = BitmapSource.Create(
                Image.Width, Image.Height, 96, 96, PixelFormats.Bgr32, null, convertedDepthFrame, Image.Width * 4);

            ++totalFrames;

            DateTime cur = DateTime.Now;
            if (cur.Subtract(lastTime) > TimeSpan.FromSeconds(1))
            {
                int frameDiff = totalFrames - lastFrames;
                lastFrames = totalFrames;
                lastTime = cur;
                frameRate.Text = frameDiff.ToString() + " fps";
            }
        }

        private Point getDisplayPosition(Joint joint)
        {
            float depthX, depthY;
            nui.SkeletonEngine.SkeletonToDepthImage(joint.Position, out depthX, out depthY);
            depthX = depthX * 320; //convert to 320, 240 space
            depthY = depthY * 240; //convert to 320, 240 space
            int colorX, colorY;
            ImageViewArea iv = new ImageViewArea();
            // only ImageResolution.Resolution640x480 is supported at this point
            nui.NuiCamera.GetColorPixelCoordinatesFromDepthPixel(ImageResolution.Resolution640x480, iv, (int)depthX, (int)depthY, (short)0, out colorX, out colorY);

            // map back to skeleton.Width & skeleton.Height
            return new Point((int)(skeleton.Width * colorX / 640.0), (int)(skeleton.Height * colorY / 480));
        }

        Polyline getBodySegment(Microsoft.Research.Kinect.Nui.JointsCollection joints, Brush brush, params JointID[] ids)
        {
            PointCollection points = new PointCollection(ids.Length);
            for (int i = 0; i < ids.Length; ++i )
            {
                points.Add(getDisplayPosition(joints[ids[i]]));
            }

            Polyline polyline = new Polyline();
            polyline.Points = points;
            polyline.Stroke = brush;
            polyline.StrokeThickness = 5;
            return polyline;
        }

        void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            SkeletonFrame skeletonFrame = e.SkeletonFrame;
            int iSkeleton = 0;
            Brush[] brushes = new Brush[6];
            brushes[0] = new SolidColorBrush(Color.FromRgb(255, 0, 0));
            brushes[1] = new SolidColorBrush(Color.FromRgb(0, 255, 0));
            brushes[2] = new SolidColorBrush(Color.FromRgb(64, 255, 255));
            brushes[3] = new SolidColorBrush(Color.FromRgb(255, 255, 64));
            brushes[4] = new SolidColorBrush(Color.FromRgb(255, 64, 255));
            brushes[5] = new SolidColorBrush(Color.FromRgb(128, 128, 255));

            skeleton.Children.Clear();
            // Read one recorded skeleton from the replay file
            Stream stream = File.OpenRead("C:\\Users/asrinivas/Desktop/hello.replay");
            BinaryReader reader = new BinaryReader(stream);
            SkeletonTrackingState TrackingState = (SkeletonTrackingState)reader.ReadInt32();
            Microsoft.Research.Kinect.Nui.Vector Position = reader.ReadVector();
            int TrackingID = reader.ReadInt32();
            int UserIndex = reader.ReadInt32();
            SkeletonQuality Quality = (SkeletonQuality)reader.ReadInt32();
            int jointsCount = reader.ReadInt32();
            List<Joint> Joints = new List<Joint>();
            List<Vector2> points = new List<Vector2>();

            for (int index = 0; index < jointsCount; index++)
            {
                Joint joint = new Joint
                {
                    ID = (JointID)reader.ReadInt32(),
                    TrackingState = (JointTrackingState)reader.ReadInt32(),
                    Position = reader.ReadVector()
                };
                Vector2 point = new Vector2(joint.Position.X, joint.Position.Z);
                points.Add(point); // the indexer (points[index]) would throw on an empty list
                Joints.Add(joint);
            }
            foreach (SkeletonData data in skeletonFrame.Skeletons)
            {
                if (SkeletonTrackingState.Tracked == data.TrackingState)
                {
                    // Draw bones
                    Brush brush = brushes[iSkeleton % brushes.Length];
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.Spine, JointID.ShoulderCenter, JointID.Head));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.ShoulderCenter, JointID.ShoulderLeft, JointID.ElbowLeft, JointID.WristLeft, JointID.HandLeft));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.ShoulderCenter, JointID.ShoulderRight, JointID.ElbowRight, JointID.WristRight, JointID.HandRight));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.HipLeft, JointID.KneeLeft, JointID.AnkleLeft, JointID.FootLeft));
                    skeleton.Children.Add(getBodySegment(data.Joints, brush, JointID.HipCenter, JointID.HipRight, JointID.KneeRight, JointID.AnkleRight, JointID.FootRight));

                    // Draw joints
                    foreach (Joint joint in data.Joints)
                    {
                        Point jointPos = getDisplayPosition(joint);
                        Line jointLine = new Line();
                        jointLine.X1 = jointPos.X - 3;
                        jointLine.X2 = jointLine.X1 + 6;
                        jointLine.Y1 = jointLine.Y2 = jointPos.Y;
                        jointLine.Stroke = jointColors[joint.ID];
                        jointLine.StrokeThickness = 6;
                        skeleton.Children.Add(jointLine);
                        //sg.Add(joint.Position, nui.SkeletonEngine);
                       
                    }
                  
                   
                   

                }

                RecordedPath path = new RecordedPath(points.Count);
                bool m = path.Match(points, 1, 0.7f, 0);
                if (m)
                {
                    System.Windows.MessageBox.Show("hello");
                }
                iSkeleton++;
            }

            //ProcessForwardBackGesture(data.Joints[JointID.Head], data.Joints[JointID.HandRight], data.Joints[JointID.HandLeft]);
        }
       


        void On_GestureDetected(string gesture)
        {
            System.Windows.MessageBox.Show(gesture);
           
        }




        private void ProcessForwardBackGesture(Joint head, Joint rightHand, Joint leftHand)
        {
            // Right hand out to the right of the head and within a vertical band around it
            if (rightHand.Position.X > head.Position.X + .45 && rightHand.Position.Y < head.Position.Y + .45 && rightHand.Position.Y > head.Position.Y - .35)
            {
                System.Windows.MessageBox.Show("Right");
            }
        }
        void nui_ColorFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            // 32-bit per pixel, RGBA image
            PlanarImage Image = e.ImageFrame.Image;
            video.Source = BitmapSource.Create(
                Image.Width, Image.Height, 96, 96, PixelFormats.Bgr32, null, Image.Bits, Image.Width * Image.BytesPerPixel);
        }

        private void Window_Closed(object sender, EventArgs e)
        {
            nui.Uninitialize();
            BTC.Stop();
            Environment.Exit(0);
        }

    }
}


Apr 30, 2012 at 3:45 AM

Hey everyone,

Currently I have a project for school, and we have been using the first Kinect SDK rather than the beta 2 SDK. Should the toolkit still work with that, or should I get an older version of the toolkit?

Apr 30, 2012 at 5:45 PM

Never mind - got everything working.