It’s rumored that Microsoft will showcase the second version of HoloLens in the next few days. So, I thought about bringing some holograms to the all-favorite Cloud.

This is Vangos from LightBuzz and today we are going to combine two amazing Microsoft technologies: Azure and HoloLens.

What is Microsoft Azure?

Microsoft Azure is a secure Cloud solution that allows you to build fast and scalable web applications. It’s easy to use, provides a lot of customizable services, and has global availability with 50 regions worldwide.

What is Microsoft HoloLens?

Microsoft HoloLens is a cutting-edge Mixed Reality headset, ready to get its first major revamp since 2016. HoloLens maps the 3D space around you and projects holograms in front of your eyes, allowing you to interact with them in a natural way.

What are we going to do?

Remember this simple To-Do list demo, which is bundled with every Azure tutorial? Stay tuned because I am going to show you how to implement this very same example on HoloLens.

As you can see, we’ll develop a simple Mixed Reality application with the following functionality:

  • The application will fetch a list of to-do items from the Cloud and will display them as holograms.
  • The user will be able to air-tap a to-do item to mark it as “done”.
  • The application will sync with the Cloud and display the updated information.

Grab the Source Code on GitHub

Prerequisites

To run the demos, you need a Windows 10 computer with the following software installed:

  • Unity3D (with Windows Store / UWP build support)
  • Visual Studio (with the Universal Windows Platform development workload)

You can run the application on your HoloLens device or in the HoloLens emulator.

Bringing Mixed Reality to the Cloud

Time to get started! I assume you already have Unity3D and Visual Studio open.

Step 0 – Setting up the development environment

Complex tasks become easy when you use the right tools. A couple of plugins will help us streamline the process.

The first one is the Azure SDK for Unity3D. The Azure SDK is a Unity plugin that allows us to connect to a remote App Service and update data. Download the Unity Package and import it to the Unity Editor.

The second plugin we are going to use is Microsoft’s Mixed Reality Toolkit. Mixed Reality Toolkit (formerly known as “HoloToolkit”) includes a ton of holographic controls and interface building elements, including cursors, spatial mapping, gesture detection, and more.

The Azure SDK and the Mixed Reality Toolkit can smoothly work together.

LightBuzz Azure SDK for Unity - HoloLens demo

Now that the Unity Editor has the required plugins, create an empty scene and add 4 objects:

  • LightBuzz Azure Manager (take it from GitHub)
  • HoloLens Camera (take it from the Mixed Reality Toolkit)
  • Default Cursor (same)
  • Input Manager (same)

Azure HoloLens - Unity Editor

Step 1 – Fetching data from the Cloud

In a real-world scenario, you should have your Azure services and databases deployed. For our demo, I have set up an Azure service for you:

private string mobileAppUri = "https://testtodolightbuzz.azurewebsites.net";

This service includes a dummy database with to-do items. A to-do item is represented by the TodoItem C# class:

public class TodoItem
{
    [JsonProperty(PropertyName = "id")]
    public string Id { get; set; }

    [JsonProperty(PropertyName = "text")]
    public string Text { get; set; }

    [JsonProperty(PropertyName = "complete")]
    public bool Complete { get; set; }
}

To fetch the to-do items, you need to create an Azure Service Client, an Azure Service Data Access Object, and a list of the to-do items:

private LightBuzzMobileServiceClient azureClient;
private AppServiceTableDAO<TodoItem> todoTableDAO;
private List<TodoItem> todoItems;

You can now connect to the remote service and fetch the data in your Unity Start() method (declared async so we can await the calls):

// Initialize Azure (SampleMobileClient is a subclass of LightBuzzMobileServiceClient)
bool supportLocalDatabase = true; // cache the data on the device for offline use

azureClient = new SampleMobileClient(mobileAppUri, supportLocalDatabase);
await azureClient.InitializeLocalStore();

// Retrieve the items from the server
todoTableDAO = new AppServiceTableDAO<TodoItem>(azureClient);
todoItems = await todoTableDAO.FindAll();

Hint: the Azure SDK for Unity3D also allows you to store the data locally. This is extremely important when developing HoloLens applications, since an Internet connection may not always be active. Being able to store your data on the device and sync it with the Cloud later is a life-saver.
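To illustrate the offline scenario, here is a hedged sketch of a sync step. It assumes LightBuzzMobileServiceClient derives from the standard Azure Mobile Apps MobileServiceClient, whose SyncContext exposes PushAsync(); check the SDK source for the exact sync entry point it provides.

```csharp
// Sketch: push offline edits back to Azure once a connection is available.
// Assumption: azureClient exposes the standard Azure Mobile Apps SyncContext.
private async Task SyncWithCloud()
{
    if (Application.internetReachability != NetworkReachability.NotReachable)
    {
        await azureClient.SyncContext.PushAsync(); // upload pending local changes
        todoItems = await todoTableDAO.FindAll();  // refresh the in-memory list
    }
}
```

Call a method like this whenever the device regains connectivity, so the local store and the Cloud database converge.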

Now, the data can be easily displayed in our Unity 3D user interface. The source code comes with a HoloLensClickableElement prefab. A HoloLensClickableElement is an object we can interact with using simple HoloLens gestures. This is how to instantiate it:

[SerializeField]
private HoloLensClickableElement tile; // prefab reference, assigned in the Unity Editor

for (int i = 0; i < todoItems.Count; i++)
{
    TodoItem item = todoItems[i];

    // Offset each tile vertically so the holograms don't overlap.
    HoloLensClickableElement obj = Instantiate(tile, new Vector3(0f, -0.1f * i, 2f), Quaternion.identity);
    obj.Setup(item);
}

Step 2 – Interacting with holograms

Let’s dive deeper into the HoloLensClickableElement class. How can you interact with a 3D hologram? The answer is the air-tap gesture, which is currently the only way to “click” an object in 3D space.

Microsoft has done an incredible job of exposing all of the necessary functionality through an easy-to-use API. Thanks to the Mixed Reality Toolkit, all we need to do is implement the IInputClickHandler interface:

public class HoloLensClickableElement : MonoBehaviour, IInputClickHandler
{
    public void OnInputClicked(InputClickedEventData eventData)
    {
        Debug.Log("Item Clicked");
    }
}

Again, you can find the complete source code on GitHub. For the sake of simplicity, I am not including all of the source code here. The complete source code also includes event handlers and animations for focusing on an element.
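As a reference for those focus handlers, here is a minimal sketch using the toolkit's IFocusable interface, which fires when the gaze cursor enters or leaves an object. The highlight color is my own choice for illustration, not taken from the original source:

```csharp
public class HoloLensClickableElement : MonoBehaviour, IInputClickHandler, IFocusable
{
    private Renderer tileRenderer;

    private void Awake()
    {
        tileRenderer = GetComponent<Renderer>();
    }

    public void OnFocusEnter()
    {
        // Highlight the tile while the gaze cursor is on it.
        tileRenderer.material.color = Color.cyan;
    }

    public void OnFocusExit()
    {
        // Restore the default color when the gaze moves away.
        tileRenderer.material.color = Color.white;
    }

    public void OnInputClicked(InputClickedEventData eventData)
    {
        Debug.Log("Item Clicked");
    }
}
```

The complete class on GitHub replaces these color swaps with proper animations.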

Step 3 – Updating the Cloud

Let’s bring it all together now:

  • The HoloLensClickableElement listens for the air-tap gesture.
  • When an air-tap gesture is detected, we mark the to-do item as “done” and sync the change with the Cloud.

Let’s expand the HoloLensClickableElement class to include a reference to the to-do item:

public class HoloLensClickableElement : MonoBehaviour, IInputClickHandler
{
    public event EventHandler OnClick;

    public TodoItem Item { get; set; }

    public void Setup(TodoItem item)
    {
        Item = item;
    }

    public void OnInputClicked(InputClickedEventData eventData)
    {
        OnClick?.Invoke(this, new EventArgs());
    }
}

After that, all we need to do is subscribe to the OnClick event:

// x, y, z: the desired position of the tile in the scene
HoloLensClickableElement obj = Instantiate(tile, new Vector3(x, y, z), Quaternion.identity);
obj.Setup(item);

obj.OnClick += async (sender, e) =>
{
    HoloLensClickableElement source = sender as HoloLensClickableElement;

    await todoTableDAO.Delete(source.Item);
};

This is it! In this simplified demo, a “done” item is deleted from the database, so the Cloud is now up to date.
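If you would rather keep the record and only flag it as complete, the DAO pattern suggests an update call instead of a delete. The Update method name below is an assumption by analogy with FindAll() and Delete(); verify it against the AppServiceTableDAO source before relying on it:

```csharp
obj.OnClick += async (sender, e) =>
{
    HoloLensClickableElement source = sender as HoloLensClickableElement;

    // Flip the flag instead of deleting the record.
    source.Item.Complete = true;

    // Hypothetical: check AppServiceTableDAO for the exact update method name.
    await todoTableDAO.Update(source.Item);
};
```

Either way, the local store and the remote database stay in sync through the DAO.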

Step 4 – Deploying the application to HoloLens

There is one final step: deploying the application to the HoloLens device (or emulator). To do so, navigate to the Unity menu and select the following option:

Mixed Reality Toolkit → Configure → Apply Mixed Reality Project Settings

Unity3D Mixed Reality Project Settings

Click “Apply” and then go to File → Build Settings to create your UWP project. Wait patiently for Unity to build the project.

Finally, open the generated project in Visual Studio, connect your HoloLens device, and click the Run button.

For a complete tutorial on building and deploying a HoloLens project, refer to my Getting Started with HoloLens article about building your project for UWP.

Resources

Want to become an expert in Mixed Reality and Cloud computing? Experts start from the basics:

Getting started tutorials

Software Development Kits (open-source)

LightBuzz tutorials and guides

I would like to thank Georgia Makoudi, author of the Azure SDK for Unity, who helped me write this article and polish the demos.

So, did you like this tutorial? How are YOU planning to combine Mixed Reality and Cloud computing? Let me know in the comments below.

‘Til the next time, keep coding!

Vangos Pterneas


Vangos is helping innovative businesses and Fortune-500 companies create amazing digital products. He's an award-winning Microsoft Most Valuable Professional.

2 Comments

  • Marko says:

    Hi Vangos,
    Working on a project for our client, an aftermarket car parts retailer, together with our partner, who is doing lots of cool stuff with AR and VR. The idea is to use HoloLens to help our client with logistics operations in their big warehouse: scenarios like freeing the warehouse workers’ hands (pick-by-sight), showing navigation through HoloLens, scanning barcodes, etc. Anyway, we are trying to get some funding from the local Microsoft office here in Croatia, but unfortunately the only way to get funding is if there will be some serious Azure consumption. I’m not a tech person on this project, and at the moment it seems very difficult to figure out the potential Azure consumption for a project like this. Most of the work done so far has concerned indoor location: how to locate a person in a warehouse. What is your opinion? Any thoughts on our problem?
    Thanks in advance,
    Marko
