Body tracking

LightBuzz has created the world’s most accurate body tracking software solution. Companies and universities worldwide use our SDK to develop commercial apps for desktop and mobile devices.

After Microsoft announced the (not so sudden) death of Kinect, we at LightBuzz committed ourselves to providing viable alternatives for our customers and their businesses. LightBuzz has been developing motion-analysis applications for the past seven years, so we know first-hand how demanding such projects are.

Today, I would like to introduce you to the Orbbec Astra sensor and the Nuitrack body-tracking SDK. This is an in-depth tutorial for software developers who would like to use a new depth sensor and a new body-tracking SDK to develop amazing motion applications. We are going to develop a simple Windows app that will do the following:

  • Visualize and display the depth feed
  • Track the skeleton of the closest person
  • Display the human body joints on top of the depth stream

Here’s how it’s going to look:

During the past few years, LightBuzz has helped numerous companies embrace motion technology in their business. In case you are planning to migrate your existing Kinect project or just need to create a motion-analysis product from scratch, feel free to start a project with LightBuzz.

Contact us

So, let’s meet Orbbec Astra and Nuitrack!



The Sensor: Orbbec Astra

Orbbec Astra is a tiny yet powerful depth camera. It comes with a 1280×960 RGB color camera and a 640×480 depth camera. It weighs just 300 grams (0.6 pounds). More importantly, the camera only needs a USB 2 port; no separate power supply is required.

Orbbec Astra sensor

Forget about that bulky Kinect adapter and its USB 3 limitations. Orbbec Astra is as portable as your laptop.

Here is a table of its hardware specs:

| Color resolution | Depth resolution | Range | Field of view | Microphones |
|---|---|---|---|---|
| 1280×960 | 640×480 | 0.6 – 8.0 m (2 – 26 ft) | 60° (hor) × 49.5° (vert) × 73° (diag) | 2 |
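To get a feel for these numbers, here is a quick back-of-the-envelope calculation (plain geometry, not part of any SDK) of how wide an area the 60° horizontal field of view covers at a given distance:

```python
import math

def coverage_width(distance_m, fov_deg=60):
    """Width (in meters) of the visible area at a given distance,
    derived from the horizontal field of view."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

print(round(coverage_width(0.6), 2))  # ~0.69 m wide at the minimum range
print(round(coverage_width(8.0), 2))  # ~9.24 m wide at the maximum range
```

In other words, at its minimum range the camera sees a strip barely wider than a person, while at 8 meters it can comfortably fit a small group.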

Let’s now meet the software we’ll be using to detect and track the human body joints.

The SDK: Nuitrack

Nuitrack is a cross-platform framework that allows us to access the camera streams and track the human body. Nuitrack provides the following stream readers:

  • ColorSensor: Provides access to the raw RGB color data.
  • DepthSensor: Provides access to the raw depth data.
  • UserTracker: Determines the depth points that belong to the tracked players.
  • SkeletonTracker: Provides access to the 3D and 2D coordinates of the human body joints.

Nuitrack is compatible with desktop computers and mobile devices. I’ll talk more about its mobile capabilities in my next article.

Set up Astra & Nuitrack

Before diving into the development work, you first need to install the proper software packages on your Windows computer.

Step 1 – Orbbec drivers

Start by downloading the official Orbbec Astra drivers; the download link can be found on Orbbec’s website. If you skip this step, your camera will not function properly.

Step 2 – Nuitrack

Now that the camera drivers are installed, it’s time to download Nuitrack from its official website. Go to the Nuitrack Downloads page and select the package that matches your system architecture. In my case, I am running a 64-bit machine, so I selected the 64-bit option.

Download and extract the package on your computer. It’s important to select a location that’s easy for you to remember. You’ll need to use the location of the Nuitrack folder in step #3 below. In my case, the path is:


To develop motion-tracking apps using Nuitrack, you’ll also need to download the Nuitrack SDK. The SDK can be stored in a separate folder.

Nuitrack is a paid package; however, you can try it for free for as long as you like. The trial version has a 3-minute time limit: upon reaching it, you’ll need to restart the app.

Nuitrack supports a variety of cameras, but, for now, we’ll be focusing on Orbbec Astra.

Step 3 – OpenNI

Remember OpenNI? OpenNI was one of the first open-source software kits for accessing the streams of depth cameras. Nuitrack uses OpenNI, too, so you should install the bundled OpenNI package:


Step 4 – Environment Variables

We are almost done… Nuitrack will be referenced from your projects using the path we specified back in step #2. The path should be stored in two environment variables.

Hit the Start button and search for “Environment Variables”.

Windows Search Environment Variables

The System Properties window should open. Click the “Environment Variables” button and navigate to the System variables panel.

Add a new property named NUITRACK_HOME. Assign the installation folder of the Nuitrack SDK as its value:

Environment Variables - Nuitrack Home

Finally, search for the Path environment variable and click Edit. Append the Nuitrack “bin” subfolder to it:

Environment Variables - Nuitrack Path

Step 5 – Test!

If everything was done correctly, you should be able to run one of the Nuitrack samples.

Connect the Orbbec Astra to a USB port and navigate to the Nuitrack installation folder. Open the “bin” directory (e.g. “C:\Users\Vangos\Desktop\Nuitrack\nuitrack\bin”). Then, double-click the nuitrack_c11_sample.exe to test the camera. If you see something like this, congratulations!

Nuitrack sample

Your camera and SDK are working properly!

Developing a body-tracking application

Let’s get to the good parts now. We’ll develop our first body-tracking application using Nuitrack and Unity3D. I assume you have already downloaded the Nuitrack SDK.

Open the VicoVRSDK.unitypackage sample. This will automatically launch Unity3D and prompt you to create a new project:

Nuitrack Unity sample project

Unity will then prompt you to import the package. You only need the contents of the Nuitrack and Plugins folders, but you may import everything so you can experiment with the built-in demos.

Nuitrack Unity import package

To get started, we’ll add a new Unity scene with the following components:

  • A Canvas element; this will contain the user interface of the application.
  • A RawImage element; this will display a visualization of the depth stream.
  • 17 RawImage elements; each one corresponds to a human joint.

Also, add a NuitrackSample.cs script where you’ll reference all of the above. To use Nuitrack, you first need to import its namespace:

using nuitrack;

To access the depth and skeletal information, you need to create a DepthSensor and a SkeletonTracker object.

The DepthSensor object will give us access to the raw depth stream. The raw depth stream is an array of distance values.

The SkeletonTracker object will let us access a high-level representation of the body data.

private DepthSensor depthSensor;
private SkeletonTracker skeletonTracker;

In your Start() method, initialize Nuitrack, create an instance of the DepthSensor & SkeletonTracker object, and subscribe to their corresponding events:

private void Start()
{
    Nuitrack.Init("");
    depthSensor = DepthSensor.Create();
    depthSensor.OnUpdateEvent += DepthSensor_OnUpdateEvent;
    skeletonTracker = SkeletonTracker.Create();
    skeletonTracker.OnSkeletonUpdateEvent += SkeletonTracker_OnSkeletonUpdateEvent;
    Nuitrack.Run();
}

Last but not least, call the Run() method.

In your OnApplicationQuit() method, remember to dispose of the unmanaged resources and unsubscribe from the events:

private void OnApplicationQuit()
{
    if (depthSensor != null)
        depthSensor.OnUpdateEvent -= DepthSensor_OnUpdateEvent;
    if (skeletonTracker != null)
        skeletonTracker.OnSkeletonUpdateEvent -= SkeletonTracker_OnSkeletonUpdateEvent;
    // Dispose of the unmanaged Nuitrack resources.
    Nuitrack.Release();
}

In your Unity Update() method, simply call Nuitrack.Update():

private void Update()
{
    Nuitrack.Update();
}

Displaying the Depth stream

To display the depth stream, we need a reference to the RawImage component. We also need a Texture2D object that will be updated whenever we have a new frame available. The raw frame data will be stored into a simple byte array. To check whether the current frame is new, we’ll hold a reference to its timestamp.

private RawImage image;
private Texture2D texture;
private byte[] colorData;
private ulong previousTimestamp;
private readonly ushort MaxDepth = 8000;

Bringing everything together, this is how you can create a visualization of the depth frame:

private void DepthSensor_OnUpdateEvent(DepthFrame frame)
{
    if (frame != null && frame.Timestamp != previousTimestamp)
    {
        previousTimestamp = frame.Timestamp;

        if (texture == null)
        {
            texture = new Texture2D(frame.Cols, frame.Rows, TextureFormat.RGBA32, false);
            colorData = new byte[frame.Cols * frame.Rows * 4];
            image.texture = texture;
        }

        int index = 0;

        for (int i = 0; i < frame.Rows; i++)
        {
            for (int j = 0; j < frame.Cols; j++)
            {
                ushort depth = frame[i, j];
                byte color = (byte)(depth * 255 / MaxDepth);

                colorData[index + 0] = 0;     // Red
                colorData[index + 1] = color; // Green
                colorData[index + 2] = color; // Blue
                colorData[index + 3] = 255;   // Alpha

                index += 4;
            }
        }

        // Upload the new pixel data to the texture.
        texture.LoadRawTextureData(colorData);
        texture.Apply();
    }
}
You can experiment with the byte array to create a visualization of a different color. I have chosen to create a shade-of-blue visualization. The MaxDepth value is the maximum allowed depth: 8000 millimeters, i.e., 8 meters (about 26 feet).

Nuitrack depth sample

All of the depth values are measured in millimeters; one millimeter is approximately 0.039 inches. Nuitrack uses the metric system.
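To make the scaling step concrete, here is the same depth-to-intensity mapping expressed in Python (an illustration of the formula only; the real work happens in the C# loop above):

```python
MAX_DEPTH = 8000  # millimeters, matching the MaxDepth field in the C# code

def depth_to_intensity(depth_mm):
    """Map a depth reading in millimeters to a 0-255 intensity,
    mirroring (byte)(depth * 255 / MaxDepth)."""
    return min(depth_mm, MAX_DEPTH) * 255 // MAX_DEPTH

print(depth_to_intensity(0))     # 0   (closest point, darkest)
print(depth_to_intensity(4000))  # 127 (mid-range)
print(depth_to_intensity(8000))  # 255 (farthest point, brightest)
```

Anything at or beyond 8 meters is clamped to full intensity, which is why the visualization fades smoothly from dark (near) to bright (far).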

Tracking the user Body/Skeleton

As of now, our application can show the depth visualization, but it cannot detect any people. Let’s go to the event handler of the SkeletonTracker. The following code snippet shows how you can acquire the 3D and 2D coordinates of the tracked bodies. Keep in mind that Nuitrack can track up to 6 people.

private void SkeletonTracker_OnSkeletonUpdateEvent(SkeletonData skeletonData)
{
    if (skeletonData != null)
    {
        Debug.Log("Tracked users: " + skeletonData.NumUsers);

        Skeleton body = skeletonData.Skeletons.Closest();

        if (body != null)
        {
            var head3D = body.Joints[(int)JointType.Head].Real;
            var head2D = depthSensor.ConvertRealToProjCoords(head3D);
            var neck3D = body.Joints[(int)JointType.Neck].Real;
            var neck2D = depthSensor.ConvertRealToProjCoords(neck3D);
            var torso3D = body.Joints[(int)JointType.Torso].Real;
            var torso2D = depthSensor.ConvertRealToProjCoords(torso3D);
            // etc...
        }
    }
}

The 3D coordinates (X, Y, Z) are measured in millimeters.

The 2D coordinates (X, Y) are measured in pixels within the bounds of the 640×480 depth frame.
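Under the hood, ConvertRealToProjCoords performs a perspective projection using the camera's calibrated intrinsics. The sketch below approximates that mapping in Python with idealized pinhole intrinsics derived from the 60° horizontal FOV and the 640×480 depth resolution; the focal lengths here are assumptions, not the sensor's real calibration values:

```python
import math

WIDTH, HEIGHT = 640, 480
# Idealized focal length from a 60-degree horizontal FOV (an assumption;
# the real SDK uses the camera's factory calibration).
FX = (WIDTH / 2) / math.tan(math.radians(30))
FY = FX  # square pixels assumed

def real_to_proj(x_mm, y_mm, z_mm):
    """Pinhole projection: 3D millimeters -> 2D depth-frame pixels."""
    u = WIDTH / 2 + FX * x_mm / z_mm
    v = HEIGHT / 2 - FY * y_mm / z_mm  # image Y grows downward
    return u, v

# A point on the optical axis, 1 m away, lands in the frame center.
print(real_to_proj(0, 0, 1000))  # (320.0, 240.0)
```

The key intuition is the division by Z: the farther a joint is from the camera, the fewer pixels a given real-world offset spans.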

To detect the closest body, I created a simple extension method that selects the skeleton whose waist joint has the smallest distance (Z value) from the sensor:

public static class NuitrackExtensions
{
    public static Skeleton Closest(this Skeleton[] skeletons)
    {
        Skeleton body = null;
        float minDistance = float.MaxValue;

        foreach (Skeleton current in skeletons)
        {
            // Use the waist Z coordinate as the distance from the sensor.
            float distance = current.Joints[(int)JointType.Waist].Real.Z;

            if (distance < minDistance)
            {
                minDistance = distance;
                body = current;
            }
        }

        return body;
    }
}
You can use the information above to update the positions of the RawImage elements that correspond to the human body joints. You can also draw lines between the points. Here is how the result looks:

Nuitrack skeleton sample

During the past few years, LightBuzz has helped numerous companies embrace motion technology in their business. In case you are planning to migrate your existing Kinect project or just need to create a motion-analysis product from scratch, feel free to start a project with me.

Contact us


Troubleshooting

Here are a few comments regarding common failures.

The application cannot detect the camera

Ensure you have installed the camera drivers from the manufacturer’s website. Also, run the nuitrack_c11_sample.exe file to verify that Nuitrack is installed properly.

The application is crashing at runtime while in use

The Nuitrack trial has a 3-minute limit. This means you can test your body-tracking apps for no longer than 3 minutes. An app using the trial version of Nuitrack will automatically crash after 3 minutes of activity. To distribute your app, you need to purchase the commercial version.

Unity 2017 Error “PlayerSettings.mobileMTRendering”

Unity 2017 may display the following error:

Assets/Platform Changer/Editor/PlatformChanger.cs(77,28):
error CS0117:
`UnityEditor.PlayerSettings' does not contain a definition for `mobileMTRendering'

The mobileMTRendering property is missing. Navigate to the PlatformManager.cs file and change line 77 from:

PlayerSettings.mobileMTRendering = GetPlatform().multithreadedRendering;

to:
PlayerSettings.MTRendering = GetPlatform().multithreadedRendering;

Unity throws ModuleNotInitializedException

The ModuleNotInitializedException usually means that no depth camera is connected to the computer. If your camera is connected, there was a problem with the installation of Nuitrack or the drivers. Unplug the camera, re-install the drivers, and plug the camera back in.

So, did you like the Orbbec Astra camera? Did you enjoy the Nuitrack SDK? Let me know in the comments below!

‘Til the next time… Keep coding!

Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.


  • Satomi Watanabe says:

Sorry for the sudden question.
    I have a question about Nuitrack SDK.

    Can Nuitrack SDK recognize voice?

    Could you please tell me?

  • An says:

    I want to apply Realsense D415 for Vitruvius.
    But I don’t know how to apply camera Realsense D415.
    Could you support me via email:

  • Nicolas says:

I do the same thing as with the DepthSensor, but the ColorSensor doesn’t work; after Nuitrack.Update(), colorFrame is never updated, while depthFrame is.
What is different when initializing the ColorSensor?


    • Hi Nicolas. This is a little tricky and I had to contact the company behind Nuitrack to find out. Here is what you need to do:

      0) Close Unity3D.
      1) Navigate to your Nuitrack installation folder (e.g. C:\Program Files\Nuitrack\nuitrack\data).
      2) Open the nuitrack.config file using a text editor.
      3) Find the following entry:

      "AstraProPerseeDepthProvider": {
          "Windows": {
              "CameraID": -1
          },
          "POSIX": {
              "PID": "0x0501",
              "VID": "0x2bc5"
          }
      }

      4) Change the CameraID value to 0, 1, 2, or 3.
      5) Save the file.
      6) Run Unity3D again.

    • Nicolas says:

      Thanks, it works with 0

  • Mykel Kristoffer Divino says:


    I’m initializing another PC for my other project and I’m having the error:

    [DllNotFoundException: libnuitrack
    nuitrack.NativeNuitrack.Init (System.String config)
    nuitrack.Nuitrack.Init (System.String config, NuitrackMode mode)
    NuitrackManager.NuitrackInit () (at Assets/NuitrackSDK/Nuitrack/NuitrackManager.cs:183)
    NuitrackManager.Awake () (at Assets/NuitrackSDK/Nuitrack/NuitrackManager.cs:110)]

    What step did I miss I wonder.


  • Ehren says:


Is there a version of the body tracking code for Visual Studio 2017? I’ll be very excited to receive your reply.


  • Ehren says:

Yes, I’m working in Visual Studio 2017 Community. Actually, I want to know how to build a body-tracking application in VS 2017 with the Nuitrack SDK. The procedure presented above is for Unity3D, but I have never used Unity3D. So is there any code or procedure for using Nuitrack in the VS 2017 IDE that I could study?

    • You could download the official Nuitrack binaries and add them to a different type of project (e.g. WinForms or WPF). Most of the source code would remain the same. The interaction with the UI would be different, though, based on your target platform.

  • Justin de Guia says:

    Hi. Pretty much a newbie on Unity. May i ask to clarify this further?

    A Canvas element; this will contain the user interface of the application.
    A RawImage element; this will display a visualization of the depth stream.
    17 RawImage elements; each one corresponds to a human joint.


    • Hello Justin. These are all Unity user interface elements. Canvas is the main 2D render frame. All 2D UI elements go in there. A RawImage element displays an image. Using the RawImage elements, you can visualize a dynamic stream (such as the color or depth data streams) or a static picture (e.g., a PNG or JPG file).
