Nowadays, Kinect and ARKit have become the leading body-tracking technologies for desktop and mobile devices, respectively.

Given the current Coronavirus situation, home fitness applications are on the rise. People are more willing than ever to use the power of Artificial Intelligence and transform their computers into personal trainers! However, the development of an accurate fitness application requires a very particular set of skills: **Maths**!

When it comes to Motion Analysis, developers need to follow in the footsteps of Pythagoras, the most famous Ancient Greek mathematician of all time. Today, I am going to show you how to measure an angle between two bones in the 3D space!

## Prerequisites

To run the demos, you need a computer with the following specifications:

- 7th Gen Intel Core i5 Processor (Quad-Core 2.4 GHz or faster)
- 4 GB Memory
- NVIDIA GeForce GTX 1070 or better
- Dedicated USB3 port
- Windows 10

To write and execute the code, you also need to install Unity3D and the Azure Kinect SDK for Unity3D (see the Source code section below).

## Did you know?…

LightBuzz has been helping Fortune-500 companies and innovative startups create amazing Fitness applications. If you need software engineers to develop your next Fitness app, get in touch with us.

## Source code

The source code of this tutorial is part of the Azure Kinect SDK for Unity3D.

## Measuring an angle in the 3D space

Before diving deeper into the Mathematical equations, let me introduce the Kinect coordinate system.

### The Kinect coordinate system

Kinect is a brilliant device. The sensor combines depth and color data to provide us with the coordinates of the human joints in the 3D world space. Every joint is represented as a set of coordinates:

- Horizontal (X-axis)
- Vertical (Y-axis)
- Depth (Z-axis)

Think of the device itself as the reference point (0, 0, 0) of the 3D coordinates.

X and Y values may be positive or negative. The Z values are positive only (the camera cannot “see” behind itself, right?).

Every single joint is represented as a set of X, Y, and Z values (e.g. `{ 0.85, -1.42, 3.97 }`).

Kinect uses the metric system, so the units in the example above are **meters**. I’m sorry, my fellow American friends.

In the image below, every single joint is positioned in the 3D space. It’s precisely how Kinect perceives the world!

Breaking the problem down, let’s assume we need to find how much the user is bending their knees. We need to measure the angle between the **hip**, the **knee**, and the **ankle** joints.

Using the Azure Kinect SDK, we can grab the coordinates of these joints as follows:

```csharp
Vector3D hip = body.Joints[JointType.HipLeft].Position;
Vector3D knee = body.Joints[JointType.KneeLeft].Position;
Vector3D ankle = body.Joints[JointType.AnkleLeft].Position;
```

### Introducing Vectors

Mathematicians have created an intuitive structure to interpret the physical world (and do stuff such as measuring angles). This structure is called a Vector. Vectors are quantities with **size** and **direction**. In our example, we are interested in two particular vectors:

- The vector formed by the Hip and Knee joints.
- The vector formed by the Knee and Ankle joints.

Finding the vector between 2 points in the 3D space is dead easy. All we have to do is subtract the coordinates of the joints:

```csharp
Vector3D vector1 = new Vector3D
{
    X = knee.X - hip.X,
    Y = knee.Y - hip.Y,
    Z = knee.Z - hip.Z
};

Vector3D vector2 = new Vector3D
{
    X = ankle.X - knee.X,
    Y = ankle.Y - knee.Y,
    Z = ankle.Z - knee.Z
};
```

### Measuring the Angle

We now have our two vectors, so it’s time to measure the angle between them. We know from trigonometry that the Angle Formula is the following:
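In vector notation, where θ is the angle we are after and v₁, v₂ are our two vectors:

```latex
\theta = \arccos\left( \frac{\vec{v}_1 \cdot \vec{v}_2}{\lVert \vec{v}_1 \rVert \, \lVert \vec{v}_2 \rVert} \right)
```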

Stay with me for a moment. The formula is pretty straightforward when we break it down.

The angle is given by the inverse cosine of the dot product divided by the multiplication of the lengths.

### Length

You can think of the length of a vector as the distance between 2 points (in our case, Hip – Knee and Knee – Ankle). Here are the lengths of the vectors in C# code:

```csharp
float length1 = (float)Math.Sqrt(
    (vector1.X * vector1.X) +
    (vector1.Y * vector1.Y) +
    (vector1.Z * vector1.Z));

float length2 = (float)Math.Sqrt(
    (vector2.X * vector2.X) +
    (vector2.Y * vector2.Y) +
    (vector2.Z * vector2.Z));
```
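In symbols, the length (also called the magnitude) of a vector v = (x, y, z) is:

```latex
\lVert \vec{v} \rVert = \sqrt{x^2 + y^2 + z^2}
```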

### Dot product

The dot product is the sum of the products of the corresponding components of two vectors. Here is the dot product of `vector1` and `vector2` in C#:

```csharp
float dot = (vector1.X * vector2.X) + (vector1.Y * vector2.Y) + (vector1.Z * vector2.Z);
```
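In symbols, what the line above computes is:

```latex
\vec{v}_1 \cdot \vec{v}_2 = x_1 x_2 + y_1 y_2 + z_1 z_2
```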

### Angle

Putting it all together, we can now apply the angle formula in C#:

```csharp
float angle = (float)Math.Acos(dot / (length1 * length2)); // Radians
```

As you may remember from school, angles are measured in **radians**. To convert radians to **degrees**, we need to multiply by 180 and divide by Pi (3.14159…).

```csharp
angle *= 180.0f / (float)Math.PI; // Degrees
```

And there you have it! You can now measure the angle formed by any three points in the 3D space! Pretty amazing how science works, huh?
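To see all of the steps in one place, here is a minimal, self-contained sketch. The `Vector3D` struct and the sample joint coordinates below are stand-ins for this example only, not the actual SDK types:

```csharp
using System;

public struct Vector3D
{
    public float X, Y, Z;
}

public static class AngleDemo
{
    // Angle (in degrees) between the vectors start→middle and middle→end.
    public static float Angle(Vector3D start, Vector3D middle, Vector3D end)
    {
        // Vectors formed by the three joints.
        Vector3D v1 = new Vector3D { X = middle.X - start.X, Y = middle.Y - start.Y, Z = middle.Z - start.Z };
        Vector3D v2 = new Vector3D { X = end.X - middle.X, Y = end.Y - middle.Y, Z = end.Z - middle.Z };

        // Lengths (magnitudes) of the two vectors.
        float length1 = (float)Math.Sqrt(v1.X * v1.X + v1.Y * v1.Y + v1.Z * v1.Z);
        float length2 = (float)Math.Sqrt(v2.X * v2.X + v2.Y * v2.Y + v2.Z * v2.Z);

        // Dot product of the two vectors.
        float dot = (v1.X * v2.X) + (v1.Y * v2.Y) + (v1.Z * v2.Z);

        // Angle formula: inverse cosine of the dot product over the product
        // of the lengths, then converted from radians to degrees.
        float radians = (float)Math.Acos(dot / (length1 * length2));
        return radians * 180.0f / (float)Math.PI;
    }

    public static void Main()
    {
        // Hypothetical joint positions (meters): a leg bent at a right angle.
        Vector3D hip = new Vector3D { X = 0f, Y = 2f, Z = 0f };
        Vector3D knee = new Vector3D { X = 0f, Y = 1f, Z = 0f };
        Vector3D ankle = new Vector3D { X = 1f, Y = 1f, Z = 0f };

        Console.WriteLine(AngleDemo.Angle(hip, knee, ankle)); // 90
    }
}
```

Note that with this convention a perfectly straight limb produces an angle of 0 degrees, since the two vectors point in the same direction.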

## Summary

In this tutorial, we explored how to measure the angle formed by 3 points in the 3D space. We got an introduction to Vectors and applied Math concepts in our C# code.

## Source code

You’ve made it to this point? Awesome! Here is the source code for your convenience. The code also includes these fancy visualizations.

## Before you go…

We know Math is hard and Motion Analysis is tricky. So, if you are looking for software developers who can apply Science and help your business grow, get in touch with us.

## Sharing is caring!

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep Kinecting!

Great Masterclass Vangos, thanks!

I’m working on a project that involves pose estimation with ARKit. I calculated the angles in 3D space like you described, however I have my doubts about the angles that I find. For example, standing straight up, the angle between hip, knee, and the ankle is around 150 degrees. Same for a straight arm position: the angle between shoulder, elbow, and the wrist is around 140 degrees. Any ideas why the result is not more in the range of 170–180 degrees? (I’m not using the new Lidar sensor)

Hey Niels. Thanks for your comment. That’s usually because the ARKit module is not so accurate compared to the Kinect. Many times the elbows or the knees appear to be bent. Try positioning the iPad device on a steady surface, parallel to the ground.