Need LiDAR support?

Update April 2020: If you are looking for the latest iPad LiDAR body-tracking support, check our latest article.

Apple is officially the newest player in the Body-Tracking industry! With its new pose estimation capabilities, ARKit is a Kinect alternative for mobile devices. People occlusion and human pose estimation are now core parts of the latest ARKit 3 framework.

So, without further ado, I am going to show you how to develop body-tracking apps for iPhone and iPad devices!

ARKit 3 body tracking

Prerequisites

Since we are developing for the Apple ecosystem, we need the proper Mac computer to develop our applications and the proper iOS device to run them.

Hardware

In terms of hardware, you need a Mac computer that can run macOS Catalina. Also, body-tracking applications need the powerful Apple A12 Bionic processor (or newer) to run properly. The following Mac computers and iOS devices are eligible:

Computers:

  • 12-inch MacBook
  • MacBook Air (2012 and later)
  • MacBook Pro (2012 and later)
  • Mac mini (2012 and later)
  • iMac (2012 and later)
  • iMac Pro
  • Mac Pro (2013 and later)

Mobile devices:

  • iPhone XS
  • iPhone XS Max
  • iPhone XR
  • iPad Pro 11-inch
  • iPad Pro 12.9-inch

For this guide, I am using a Mac mini with an 11-inch iPad Pro.

Software

To run the demos, you need to install the following software on your Mac computer:

  • Unity3D 2019.1.5f1 with iOS build target
  • macOS Catalina 10.15 (Beta)
  • Xcode 11 (Beta)

Your iOS device should be updated to iOS 13 (Beta) or iPadOS 13 (Beta).

As you can see, as of this writing, most of this software is in Beta. Keep in mind that beta software may leave your devices unstable or unresponsive, so be extra careful not to lose valuable data. New articles will follow upon the public release of ARKit 3, iOS 13, and macOS 10.15.

If you are in a hurry, download the complete source code on GitHub. Keep reading to understand how to create your own body-tracking apps!

LightBuzz has been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to get your business to the next level, get in touch with us.

Contact us

Body Tracking step-by-step

Enough said… Let’s dive right into the ARKit magic. On your computer, launch Unity3D 2019.1 and create a new project.

[Image: Unity3D 2019 New Project]

Step 1 – Set up the main scene

Unity3D will start with an empty scene. Before adding any visual objects or writing any code, we first need to import the proper dependencies. The skeleton-tracking functionality is part of the ARKit toolkit, so we need to import the ARKit and ARFoundation dependency packages.

Now, create a new scene and add an AR Session and an AR Session Origin object. These objects control the iOS camera and provide a ton of ARKit goodies.

[Image: Unity3D XR - Add new AR Session Origin]

Also, add an empty game object, name it e.g. Human Body Tracking, and attach a new C# script (HumanBodyTracking.cs).

The structure of the scene should look like this:

[Image: Body tracking ARKit - Unity scene setup]

Step 2 – Set up the Skeleton

Now that the visual elements are in place, we can start adding some interactivity. Open the HumanBodyTracking.cs script and add a reference to the ARHumanBodyManager class. ARHumanBodyManager is the primary component that analyzes the camera data to detect human bodies.

[SerializeField] private ARHumanBodyManager humanBodyManager;
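
The reference above is meant to be wired up in the Unity Inspector. As an optional safety net (my addition, not part of the original sample), you could also resolve the component at runtime:

private void Awake()
{
    // Optional fallback: locate the manager in the scene if the
    // Inspector reference was left empty.
    if (humanBodyManager == null)
    {
        humanBodyManager = FindObjectOfType<ARHumanBodyManager>();
    }
}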

To display the joints, we'll use some simple Unity3D spheres. Each sphere will correspond to a specific joint type. Add a C# Dictionary to store the joint data and update it frame by frame.

private Dictionary<JointIndices3D, Transform> bodyJoints;

Finally, add references to the user interface elements of the skeleton. We’ll need a sphere object for the joints and a line object for the bones.

[SerializeField] private GameObject jointPrefab;
[SerializeField] private GameObject lineRendererPrefab;
private LineRenderer[] lineRenderers;
private Transform[][] lineRendererTransforms;

You can find the complete C# code in the HumanBodyTracking.cs class on GitHub.
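
For the snippets above to compile, the script needs the proper using directives. Here is a minimal skeleton, assuming the ARFoundation and ARKit preview packages used at the time of writing; if the compiler cannot find ARHumanBodyManager or JointIndices3D (a common question in the comments below), double-check that both preview packages are installed, or use the package versions pinned in the GitHub project:

using System.Collections.Generic;
using Unity.Collections;            // NativeArray<T>
using UnityEngine;
using UnityEngine.XR.ARFoundation;  // ARHumanBodyManager, ARHumanBody
using UnityEngine.XR.ARSubsystems;  // XRHumanBodyJoint

public class HumanBodyTracking : MonoBehaviour
{
    // Wired up in the Unity Inspector.
    [SerializeField] private ARHumanBodyManager humanBodyManager;
    [SerializeField] private GameObject jointPrefab;
    [SerializeField] private GameObject lineRendererPrefab;

    // Created at runtime, once a body is detected.
    private Dictionary<JointIndices3D, Transform> bodyJoints;
    private LineRenderer[] lineRenderers;
    private Transform[][] lineRendererTransforms;

    // Event subscriptions and body updates are shown in the next steps.
}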

Step 3 – Detect the Tracked Bodies

This is the most important part of the tutorial! ARKit has made body-tracking incredibly easy and accessible. All you need to do is use the ARHumanBodyManager object and subscribe to the humanBodiesChanged event.

private void OnEnable()
{
    // Start receiving body-tracking updates.
    humanBodyManager.humanBodiesChanged += OnHumanBodiesChanged;
}

private void OnDisable()
{
    // Stop receiving updates when the component is disabled.
    humanBodyManager.humanBodiesChanged -= OnHumanBodiesChanged;
}

The event handler is where the magic happens. The information about the tracked bodies is part of the event arguments. This is how to acquire the bodies:

private void OnHumanBodiesChanged(ARHumanBodiesChangedEventArgs eventArgs)
{
    // Bodies detected for the first time in this frame.
    foreach (ARHumanBody humanBody in eventArgs.added)
    {
        UpdateBody(humanBody);
    }

    // Bodies that were already tracked and received fresh joint data.
    foreach (ARHumanBody humanBody in eventArgs.updated)
    {
        UpdateBody(humanBody);
    }
}
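
The event arguments also expose a removed list for bodies that ARKit has stopped tracking. This tutorial ignores it, but a minimal sketch inside the same handler would look like this (CleanUpSkeleton is a hypothetical helper you would implement to hide or destroy the spheres and lines):

// Hypothetical extension to OnHumanBodiesChanged:
foreach (ARHumanBody humanBody in eventArgs.removed)
{
    CleanUpSkeleton(humanBody); // your own cleanup helper, not in the GitHub code
}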

Piece of cake, right? So, let’s bring everything together and display the skeleton in the Unity user interface we created in the previous steps.

Note: as of this writing, ARKit only supports one tracked body.

Step 4 – Display the Skeleton

The following lines of code update the positions of the joints in the camera space. The spheres and lines are overlaid on top of the iOS camera feed.

private void UpdateBody(ARHumanBody arBody)
{
    if (jointPrefab == null) return;
    if (arBody == null) return;
    if (arBody.transform == null) return;

    // Create the spheres and lines the first time a body is detected.
    InitializeObjects(arBody.transform);

    NativeArray<XRHumanBodyJoint> joints = arBody.joints;

    // Move each sphere to the latest pose of its joint.
    foreach (KeyValuePair<JointIndices3D, Transform> item in bodyJoints)
    {
        UpdateJointTransform(item.Value, joints[(int)item.Key]);
    }

    // Redraw the bones. LineRenderer.SetPositions expects a Vector3[],
    // so we read the current position of each joint transform.
    for (int i = 0; i < lineRenderers.Length; i++)
    {
        Vector3[] positions = new Vector3[lineRendererTransforms[i].Length];
        for (int j = 0; j < positions.Length; j++)
        {
            positions[j] = lineRendererTransforms[i][j].position;
        }
        lineRenderers[i].SetPositions(positions);
    }
}

Apple provides 92 joint types (indices). However, not all of these joint types are actually tracked! Most of them are inferred from the positions of their neighboring joints. For convenience, I have selected 14 joint types, allowing a fair comparison with the Kinect camera.
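
If you want to explore the full list before settling on a subset, a quick throwaway sketch (my addition) is to dump the enum values to the console, e.g. once from Start():

// Log every joint index that the JointIndices3D enum exposes.
foreach (JointIndices3D joint in System.Enum.GetValues(typeof(JointIndices3D)))
{
    Debug.Log($"{(int)joint}: {joint}");
}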

This is how to connect the proper joints and form the human bones:

private void InitializeObjects(Transform arBodyT)
{
    if (bodyJoints == null)
    {
        // Run once: create one sphere per joint we want to visualize.
        bodyJoints = new Dictionary<JointIndices3D, Transform>
        {
            { JointIndices3D.head_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.neck_1_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_arm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_arm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_forearm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_forearm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_hand_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_hand_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_upLeg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_upLeg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_leg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_leg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_foot_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_foot_joint, Instantiate(jointPrefab, arBodyT).transform }
        };
        // One LineRenderer per chain of bones.
        lineRenderers = new LineRenderer[]
        {
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // head neck
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // upper
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // lower
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // right
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>() // left
        };
        // The joints each line connects, in draw order.
        lineRendererTransforms = new Transform[][]
        {
            new Transform[] { bodyJoints[JointIndices3D.head_joint], bodyJoints[JointIndices3D.neck_1_joint] },
            new Transform[] { bodyJoints[JointIndices3D.right_hand_joint], bodyJoints[JointIndices3D.right_forearm_joint], bodyJoints[JointIndices3D.right_arm_joint], bodyJoints[JointIndices3D.left_arm_joint], bodyJoints[JointIndices3D.left_forearm_joint], bodyJoints[JointIndices3D.left_hand_joint]},
            new Transform[] { bodyJoints[JointIndices3D.right_foot_joint], bodyJoints[JointIndices3D.right_leg_joint], bodyJoints[JointIndices3D.right_upLeg_joint], bodyJoints[JointIndices3D.left_upLeg_joint], bodyJoints[JointIndices3D.left_leg_joint], bodyJoints[JointIndices3D.left_foot_joint] },
            new Transform[] { bodyJoints[JointIndices3D.right_arm_joint], bodyJoints[JointIndices3D.right_upLeg_joint] },
            new Transform[] { bodyJoints[JointIndices3D.left_arm_joint], bodyJoints[JointIndices3D.left_upLeg_joint] }
        };
        for (int i = 0; i < lineRenderers.Length; i++)
        {
            lineRenderers[i].positionCount = lineRendererTransforms[i].Length;
        }
    }
}

ARKit gives us the position and rotation of each joint in 3D space! This is how to update the scale, position, and rotation of each sphere, relative to the body anchor:

private void UpdateJointTransform(Transform jointT, XRHumanBodyJoint bodyJoint)
{
    // Joint poses are expressed relative to the body anchor, so we set
    // the local transform of each sphere.
    jointT.localScale = bodyJoint.anchorScale;
    jointT.localRotation = bodyJoint.anchorPose.rotation;
    jointT.localPosition = bodyJoint.anchorPose.position;
}
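
XRHumanBodyJoint also exposes a tracked flag. As a small variation (my addition, not part of the original tutorial), you can skip joints that ARKit reports as untracked, so they keep their last known pose instead of snapping to a default transform:

private void UpdateJointTransform(Transform jointT, XRHumanBodyJoint bodyJoint)
{
    // Ignore joints that ARKit is not currently tracking.
    if (!bodyJoint.tracked) return;

    jointT.localScale = bodyJoint.anchorScale;
    jointT.localRotation = bodyJoint.anchorPose.rotation;
    jointT.localPosition = bodyJoint.anchorPose.position;
}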

This is it! Let’s build and run our project on an actual iOS device!

Step 5 – Build and Deploy

Finally, we need to build and run the project on an actual device. Given that ARKit is part of iOS and iPadOS, we cannot test our code on macOS (I would love to see a simulator, though).

In Unity, select File → Build Settings. Select the iOS build target and hit the Build button. You'll need to specify a location to store the generated project. Wait patiently until Unity finishes the build process.

Unity will create an Xcode project (.xcodeproj). Open the project with Xcode 11 Beta. If you use a previous version of Xcode, you'll get an error and your project will not run properly.

When the project opens, provide your iOS development credentials, connect your iOS 13 device, and click the Run button to deploy the project to the device.

When deployment finishes, point the camera at a person and you'll see the 3D overlay on top of the tracked body!

LightBuzz has been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to get your business to the next level, get in touch with us.

Contact us

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep coding!

Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.

38 Comments

  • FrankMiller says:

    Can you record the animations and link that up to a skeleton?

  • Nitin says:

    Is it possible to do this using Unity and ARCore? Thanks!

  • Bharanidharan says:

    Hello. This is BharaniDharan.
    The namespace for ARHumanBodyManager seems to be missing. How do I solve this?

  • Omar says:

    Hi, Is it possible to record a specific human motion and then use it to detect if camera detected that specific motion?

    • It’s feasible, yes. It’s not something Apple provides out of the box, though. You’ll need to build your own matching system.

    • Omar says:

      I understand!
      Unfortunately Microsoft Kinect already provides this feature, I can do it but it requires developing a tool first to capture the motion data, then implement a machine learning solution to detect if the current motion matches the recorded one, too much work 🙁
      Thank you though for the tutorial!

  • Shawn says:

    Thanks for your example. I have worked with skeleton tracking and nuitrack previously and am trying to do some similar things with arkit3. I need to track the root joint in 3d per your example and I also need to be able to get the screen coordinates for that point. I have been trying to figure out a simple way to get the screen coordinates and wondering if you might have any suggestions. Thanks.

    • Shawn says:

      I figured out the following which I think works ok for what I need.
      Vector3 screenPos = Camera.main.WorldToScreenPoint(jointT.position);

  • Cyan says:

    Hi,

    I'm running Xcode 11 and iOS 13 on macOS 10.15 (Catalina). I also have Unity 2019.2.9f1, since I can't see "Unity3D 2019.1.5f1" on the install menu.
    I downloaded the source code and ran the project, but I just see a pink screen, and the error in Xcode says 'Shader Shader is not supported on this GPU'.

    Do you know how to resolve this?

    I'm pretty new to using all this, so please bear with me. I hope you can help me. Thanks!

  • So apparently the production release of Catalina causes internal errors when building shaders in Unity (hence the pink screen mentioned above).

    This only happens if you build with a Graphics setting of Auto or Metal, a setting you can't get around with ARKit and ARFoundation, as they rely on Metal to build.

    So until Unity addresses this internal error, unless anyone else has managed to find a workaround, you can't build iOS 13-specific ARKit/ARFoundation apps from Unity, so no human tracking.

  • Nikhil says:

    Hey there! I’m trying to follow your tutorial but I’m struggling to use the `JointIndices3D` class. I can’t seem to find any reference to it in the Unity docs or the ARFoundation release notes. Any suggestions?

  • Bilal says:

    Hello,
    Thanks for the tutorial.
    Can we measure the height of the tracked body using ARKit 3?

    • Hi Bilal. The height functionality is not supported out-of-the-box. However, you can add the distances between the various segments (e.g. Head to Neck, Neck to Spine, Spine to Knees, Knees to Ankles).

    • Bilal says:

      Sorry for late reply.
      I tried to get the distance between head and feet but it is the same (about 150 cm) for multiple human bodies (50 cm – 180 cm).
      So I think it gets the distance between parts of the model and not of the human (model has constant height).

      I am trying to calculate it using another way. Using humanStencilTexture and humanDepthTexture from ARHumanBodyManager.

      The problem that I faced now is how to get the distance between device and tracked human body.

      Can you please help me if you have any idea?

      Thanks,

  • BKB III says:

    Hi, for some reason ARHumanBodyManager cannot be found. Any suggestions?

  • BKB III says:

    Awesome thank you so much for the reply! I am a college student really interested in this technology. Do you have a rough idea of when such a tutorial will be available?

  • Kan says:

    Hey,
    First of all, thanks for a great tutorial.
    I've been using this as a basis for a project I'm doing and it works quite well, but I've encountered a problem I was hoping you might have some insight into.
    I have a menu scene that loads the AR session scene, which works well the first time. But once I move to another scene and then go back to the AR scene, all I get is a black screen.
    It seems something happens to the AR scene after switching scenes, but there's no error, nothing to follow up on.
    It could be that once I change scenes, something from the AR scene is left over and sticks around.
    I've tried a lot of options and the internet isn't filled with a lot of info.
    Any help would be much appreciated.

    Thanks,
    Kan

  • Kan says:

    Thanks for the quick reply (it doesn't let me reply directly).
    I looked into this quite a lot, and it seems my problem is that after OnDestroy is called, the ARSession state changes to 'ready'.
    I'm not sure if I'm missing something, but the restart function for some reason isn't usable for me; I couldn't find anything on it.
    I did find it inside the ARSession script, but I could not call it.
    It seems I need to somehow reset it, as you said, but I'm a bit lost.
    Do I need to Instantiate it before I change scenes and then somehow use the clone? Or once I call Instantiate, will it become the main object without any further action?
    (Sorry for the rather long question; I don't own a Mac, so I can't test this stuff until I can get to a place with one.)

    Thanks,
    Kan

  • Hey Kan. Consider updating to the latest Preview packages (ARKit XR Plugin & AR Foundation Plugin) from the Package Manager.

  • Kaspra says:

    I've tried building the project as described with Unity 2019, with no warnings or errors after building in Unity and Xcode, and I included the prefabs and other scripts. However, I'm not sure why the skeleton is still not visible after running from Xcode. Is there something I'm missing?

    • Hi Kaspra. You need to run it on an actual iPad device. ARKit may take some time to start, usually no longer than 30 seconds.

    • Kaspra says:

      Thanks for the reply! I’m running on an iPhone XR so I thought it should still work. Thanks for the help! 🙂

  • Saurabh Goel says:

    Assets\script_my\HumanBodyTracking.cs(17,24): error CS0246: The type or namespace name ‘JointIndices3D’ could not be found (are you missing a using directive or an assembly reference?)

    Can you help?

  • Poria says:

    Hello
    error CS0246: The type or namespace name ‘JointIndices3D’ could not be found (are you missing a using directive or an assembly reference?)

    How should I fix this?
