
Ever since the new Kinect for Azure device came out, I have been receiving one question over and over: how do you animate an avatar in Unity3D?

Today, I am going to show you how to add avateering to your Kinect applications using just a few lines of C# code!

Prerequisites

To run the demos, you need a computer with the following specifications:

  • 7th Gen Intel® Core™ i5 Processor (Quad-Core 2.4 GHz or faster)
  • 4 GB Memory
  • NVIDIA GeForce GTX 1070 or better
  • Dedicated USB3 port
  • Windows 10

To write and execute code, you need to install the following software:

  • Unity3D
  • The Azure Kinect SDK for Unity3D


Azure Kinect Avateering step-by-step

Before even writing a single line of code, we need to prepare a 3D humanoid avatar to animate. Humanoid models contain built-in information about joints and bones.

Humanoid model structure

A humanoid model should have the following joint and bone structure:

Azure Kinect 3D model skeleton

The Unity Engine already defines a hierarchy of joints: a structure of interconnected bones that form the human skeleton.

Commonly used models, such as Unity Chan, follow these principles. Your models should follow the exact same hierarchy, too.
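
If you want to double-check the mapping in code, Unity's Animator API can resolve each standard humanoid bone to the corresponding joint of your model. Here is a minimal sketch (the BoneInspector class is hypothetical; GetBoneTransform and HumanBodyBones are standard Unity APIs):

using UnityEngine;

// Hypothetical helper: logs which joints of your model Unity mapped to a few
// of the standard humanoid bones. Requires the model's rig to be imported as
// "Humanoid".
public class BoneInspector : MonoBehaviour
{
    [SerializeField] private Animator animator;

    private void Start()
    {
        // Animator.GetBoneTransform resolves a standard humanoid bone to the
        // actual joint transform of this particular model (null if unmapped).
        Debug.Log(animator.GetBoneTransform(HumanBodyBones.Hips)?.name);
        Debug.Log(animator.GetBoneTransform(HumanBodyBones.Spine)?.name);
        Debug.Log(animator.GetBoneTransform(HumanBodyBones.Head)?.name);
        Debug.Log(animator.GetBoneTransform(HumanBodyBones.LeftHand)?.name);
        Debug.Log(animator.GetBoneTransform(HumanBodyBones.RightFoot)?.name);
    }
}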

If you open the avatar file in the Editor, its joint/bone structure will look like this:

Azure Kinect Avateering Unity3D model joint bone structure

You can read more about configuring your avatar here.

You or your artist can design humanoid avatars in programs such as Blender or 3ds Max. However, in this tutorial, I am going to use the free humanoid avatars from LightBuzz. Personally, I like the Codeman avatar (the Codeman looks like Iron Man, but he’s a nerd programmer 😉).

So, after downloading the 3D model, do the following:

  • Open Unity3D and import the Azure Kinect SDK.
  • Create a new scene or use one of the demo scenes.
  • Drag-and-drop the avatar into your Unity3D Project folder.
  • Finally, drag-and-drop the .fbx file from the Project folder into your scene.

Your scene should look like this:

Azure Kinect Avateering Unity3D scene

Step 1 – Connect the visual elements

Once you have created your scene and its core elements, it’s time to connect the avatar with the tracked body. Create a C# MonoBehaviour script and import the Azure Kinect SDK namespace:

using LightBuzz.Kinect4Azure;

The Azure Kinect SDK for Unity3D includes its own Avatar class. Since Unity already defines a UnityEngine.Avatar type, import it with an alias to avoid ambiguity:

using Avatar = LightBuzz.Kinect4Azure.Avateering.Avatar;

Then, create an Avatar member that will be visible in the Editor:

[SerializeField] private Avatar avatar;

Remember to drag-and-drop your Codeman to the Avatar Root element:

Azure Kinect Avateering - Avatar in Unity Editor

Step 2 – Configure the Kinect device

It’s now time to configure the Azure Kinect device. As described in my first Kinect for Azure article, starting and stopping the device is straightforward. Simply create a KinectSensor reference and use Unity’s Start() and OnDestroy() methods to open and close it, respectively:

private KinectSensor sensor;

private void Start()
{
    // Find the connected Azure Kinect device and start streaming.
    sensor = KinectSensor.GetDefault();
    sensor?.Open();
}

private void OnDestroy()
{
    // Stop streaming when the scene is destroyed.
    sensor?.Close();
}
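
If you’d rather fail loudly than silently when no device is connected, you could expand Start() with an explicit check. This is a sketch of my own, assuming GetDefault() returns null when no sensor is available (which the null-conditional operators above hint at):

private void Start()
{
    sensor = KinectSensor.GetDefault();

    // Assumption: GetDefault() returns null when no device is found.
    if (sensor == null)
    {
        Debug.LogError("No Azure Kinect device was found.");
        return;
    }

    sensor.Open();
}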

Step 3 – Update the Avatar

Finally, it’s time for the actual avateering part! Head to the Update method and grab the latest Kinect frame. Then, use the BodyFrameSource to acquire the closest skeleton Body object. Lastly, feed the Avatar reference with the skeleton data by calling its own Update method.

private void Update()
{
    // Grab the latest Kinect frame.
    Frame frame = sensor.Update();

    if (frame != null)
    {
        // Acquire the skeleton of the person closest to the camera.
        Body body = frame.BodyFrameSource.Bodies.Closest();

        // Feed the skeleton data to the avatar.
        avatar.Update(body);
    }
}

That’s right! All of the avateering magic happens with just one line of C# code:

avatar.Update(body);
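
Putting it all together, here is the complete script, assembled from the snippets above. The class name AvateeringSample is mine, and the body null check is an extra safeguard I’ve added on the assumption that Closest() returns null when no one is in view:

using UnityEngine;
using LightBuzz.Kinect4Azure;
using Avatar = LightBuzz.Kinect4Azure.Avateering.Avatar;

public class AvateeringSample : MonoBehaviour
{
    // The humanoid avatar to animate (e.g. drag-and-drop the Codeman here).
    [SerializeField] private Avatar avatar;

    // The Azure Kinect device.
    private KinectSensor sensor;

    private void Start()
    {
        // Find the connected device and start streaming.
        sensor = KinectSensor.GetDefault();
        sensor?.Open();
    }

    private void Update()
    {
        // Grab the latest frame.
        Frame frame = sensor.Update();

        if (frame != null)
        {
            // Acquire the skeleton of the person closest to the camera.
            Body body = frame.BodyFrameSource.Bodies.Closest();

            // Assumption: Closest() returns null when nobody is tracked.
            if (body != null)
            {
                // Animate the 3D model with the tracked skeleton.
                avatar.Update(body);
            }
        }
    }

    private void OnDestroy()
    {
        // Stop streaming when the object is destroyed.
        sensor?.Close();
    }
}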

Source code

You’ve made it to this point? Awesome! In this article, you’ve learnt how to develop your own avateering applications using the Azure Kinect sensor and Unity3D. Avateering can also be used to build cool applications, such as virtual dressing rooms! You get all of this functionality right out of the box in the Azure Kinect SDK for Unity3D.

Download the Azure Kinect SDK for Unity3D

Resources

You can refer to the following webpages for more in-depth information.

Before you go…

LightBuzz has been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking for software developers to create your next motion project, get in touch with us.

Sharing is caring!

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep coding!

Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.

4 Comments

  • Herb Schilling says:

    Thank you so much for this article.

I am working with a high school student on a project to create a kind of virtual dressing room. But this one would be special, because the outfits would be astronaut suits! It would be a fun activity for attendees of the NASA events we do: a hi-tech version of the cardboard-cutout photo booth we use now, where people stick their head in the hole where the astronaut’s face is and we take a photo.

    Your article was perfect timing. Still lots more to figure out but this helps a lot.

    • Thanks for your comment, Herb. The Azure Kinect SDK for Unity also includes a fitting room demo. Instead of a NASA suit, it allows the users to put on the Codeman model. If you need any help, feel free to reach out to us.

  • Vinay Vidhani says:

    Hello Lightbuzz,

I want to buy the Azure Kinect plugin for Unity, but I have a small doubt. I saw in your description that Video Recording is supported; what exactly does video recording mean?

Along with this, I want to record human motion and reuse it as many times as I want. Is that the same as video recording?

    I will look forward to your reply.

    Thanks

    • Hey Vinay. Video recording allows you to record color, depth, and skeleton data. You can then play them without having a Kinect device connected! Our Unity package includes both recording and playback demos. When playing the videos, you can measure angles, animate avatars, etc. Just like the real-time view.
