ARKit Guide: Augmented Reality App Development for iOS


Based on the keynote “ARKit: Creating Non-Standard Augmented Reality Solutions”, delivered at MobileTechCon 2018 by Andrew Makarov, Head of Mobile Development at MobiDev.

It seems like only yesterday Augmented Reality was just a buzzword. Today it is part of our daily lives, brought to millions of mobile users by technologies such as Apple's ARKit for iOS development. This story is about cases and non-standard solutions that we applied in real-life projects. You will see how we managed to overcome a number of technological constraints and shape optimal technical solutions for our clients' businesses.

How ARKit works

It is like creating visual effects for films, but with one major difference: everything is performed on a mobile device in real time, and that is impressive.

The process of augmentation takes three steps: tracking, scene understanding, and rendering. Input data comes from the camera, accelerometer, and gyroscope. It is then processed to calculate the motion of the camera in the physical world, which makes it possible to draw 3D content on top of the camera image.

Correct operation of ARKit requires a well-textured and well-lit environment, a flat surface for visual odometry, and a static scene for motion odometry. If the environment does not meet these requirements, ARKit informs the user about the tracking state. There are three possible states: not available, normal, and limited.
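These three states correspond directly to `ARCamera.TrackingState` in the ARKit API. A minimal sketch of how an app might react to them in an `ARSessionObserver` callback (the handler bodies are illustrative, not from the project):

```swift
import ARKit

// Part of an ARSessionDelegate / ARSessionObserver implementation.
// Called whenever the tracking quality changes.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .notAvailable:
        print("Tracking not available")
    case .limited(let reason):
        // reason is e.g. .insufficientFeatures, .excessiveMotion, .initializing
        print("Tracking limited: \(reason)")
    case .normal:
        print("Tracking normal")
    }
}
```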


As a rule, we can place a virtual object on or near a surface in the real world. Starting from version 1.5, ARKit can also recognize vertical surfaces and images.

Standard solutions are good because we can easily and quickly add AR to any app. We can create new experiences and draw attention to the product. That is why the App Store is currently full of AR apps, and that is not bad at all. However, we want to take a broader look at augmented reality.

The thing is, if you want to be noticed and recognized on the market, you have to create something new, something that others might even call impossible. Here are several cases.


Case study: Augmented Reality in moving vehicles

Right after the announcement of ARKit, we started implementing a product that would allow passengers in moving vehicles to augment the reality outside. The official documentation said that ARKit could work only in static scenes; in a moving car or bus it would “misbehave” owing to the mismatch between visual and motion data.

The solution was found in simplification. For our task, we had to render objects outside the vehicle according to geographical coordinates. That is why we created a custom tracking solution, based on periodic location updates, routing to calculate intermediate positions, and compass data. This let us place objects in the scene correctly. Our algorithm was far simpler than the one used in ARKit, but this exact algorithm solved the problem.

Later in the same project we faced another issue: vibration during movement. The real image shook, and so did the virtual content. It was barely noticeable, yet we wanted to remove it. The most obvious and logical choice was to implement an image stabilization algorithm, as is done in optics: take the position calculations from the previous task and add a data-smoothing phase, for example a moving average. However, this solution was problematic. The average of 90 degrees and 92 degrees is 91, all right, but if we take 360 degrees and 2 degrees, the average is 181, which points in a completely different direction.
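The wrap-around problem can be avoided by averaging headings as unit vectors instead of raw angles. A minimal sketch (the function name is ours, not from the project):

```swift
import Foundation

/// Averages compass headings (in degrees) correctly across the 0°/360°
/// wrap-around by averaging unit vectors instead of the raw angle values.
func circularMean(ofDegrees angles: [Double]) -> Double {
    let sumSin = angles.map { sin($0 * .pi / 180) }.reduce(0, +)
    let sumCos = angles.map { cos($0 * .pi / 180) }.reduce(0, +)
    var mean = atan2(sumSin, sumCos) * 180 / .pi
    if mean < 0 { mean += 360 }  // normalize to [0, 360)
    return mean
}

// circularMean(ofDegrees: [90, 92]) is 91, as expected,
// and circularMean(ofDegrees: [358, 2]) is 0, not 180.
```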


In addition, we didn't want the image to lag whenever the user quickly turned the phone. To solve this, we implemented an adaptive algorithm that responds to the input data. In other words, small changes mean vibration, so we apply the smoothing algorithm; larger rotations mean the user has deliberately turned the camera, so we rotate the virtual camera just as quickly, temporarily disabling smoothing and re-enabling it afterwards.
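Such an adaptive filter can be sketched as a simple threshold switch between exponential smoothing and pass-through. The type name, the 5° threshold, and the smoothing weight below are illustrative assumptions, not the production values:

```swift
import Foundation

/// Smooths small heading changes (sensor jitter) but passes large ones
/// through unchanged, so a deliberate turn is never lagged by the filter.
struct AdaptiveHeadingFilter {
    private var smoothed: Double?
    let jitterThreshold = 5.0   // degrees; illustrative value
    let smoothingFactor = 0.2   // exponential-smoothing weight

    mutating func update(with heading: Double) -> Double {
        guard let previous = smoothed else {
            smoothed = heading
            return heading
        }
        // Signed shortest angular difference, mapped into (-180, 180]
        var delta = (heading - previous).truncatingRemainder(dividingBy: 360)
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }

        var next: Double
        if abs(delta) > jitterThreshold {
            next = heading                            // fast turn: follow immediately
        } else {
            next = previous + smoothingFactor * delta // jitter: smooth it out
        }
        next = next.truncatingRemainder(dividingBy: 360)
        if next < 0 { next += 360 }
        smoothed = next
        return next
    }
}
```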

Case study: scene scaling

ARKit uses real-world scale. That is very convenient when you need to place an object on a surface near the user. But what if we are creating an app that shows signs above buildings? If a building is far away, perspective will make the sign too small to notice. In our case, we had two possible solutions. The first was to scale signs proportionally with distance. The second was to transform coordinates with a projection onto an invisible sphere.

We chose the second way. Besides solving the scaling problem, it helped us with clustering. If marks on a map are too close to each other, they are gathered into a cluster. The same applies in AR, only in a different coordinate system, from the user's point of view.
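The projection itself is simple vector math: keep the direction to the object but normalize its distance to a fixed sphere radius around the user, so every sign renders at a comparable size. A minimal sketch under that assumption (the `Vector3` type and radius are ours):

```swift
import Foundation

struct Vector3 { var x, y, z: Double }

/// Projects a real-world point (in user-centered coordinates, metres) onto an
/// invisible sphere of fixed radius around the user. Direction is preserved;
/// distance is normalized, so far-away signs stay readable.
func projectOntoSphere(_ point: Vector3, radius: Double = 50) -> Vector3 {
    let length = (point.x * point.x + point.y * point.y + point.z * point.z).squareRoot()
    guard length > 0 else { return point }
    let scale = radius / length
    return Vector3(x: point.x * scale, y: point.y * scale, z: point.z * scale)
}

// A building 500 m away ends up 50 m away in the same direction.
```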


Case study: indoor navigation with Augmented Reality

Navigation is essential to our lives. When we are in a foreign city, all we have to do is open the map, choose the destination, and that’s it—we’ve got the route!

Indoor navigation is different. You have probably faced the problem of wasting time searching for the right way in airports or shopping malls. Many companies are working to solve this problem. For example, Apple and Google have created detailed maps of large airports. So what's the problem then?


It's all about scene positioning. We need to determine the user's location. GPS works well outdoors, but it is not accurate enough indoors because of interference, and it does not “see” which floor you are on. Some solve this task with iBeacon, a small Bluetooth device that periodically transmits small packets of data.

If we have several beacons in different locations, we can perform triangulation and determine the location of the device. However, there are a number of problems. One: the operating range of such beacons is 10 to 100 meters, depending on the model. Two: the signal can be blocked by walls, people, plants, and other obstacles. Three: at least three active beacons are required for triangulation. It takes a lot of beacons to ensure accurate positioning, and their maintenance—deployment, batteries, diagnostics, urgent replacement—is expensive as well. That is why iBeacon never became a popular solution for indoor navigation.
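For reference, positioning from three beacon distances (strictly speaking, trilateration) reduces to a small linear system: subtracting the first circle equation from the other two gives two linear equations solvable with Cramer's rule. A sketch with illustrative coordinates:

```swift
import Foundation

/// Estimates a 2D position from three beacon positions and measured distances.
/// Subtracting the first circle equation from the other two yields a 2x2
/// linear system, solved here with Cramer's rule.
func trilaterate(beacons: [(x: Double, y: Double)],
                 distances: [Double]) -> (x: Double, y: Double)? {
    guard beacons.count == 3, distances.count == 3 else { return nil }
    let (x1, y1) = beacons[0]
    let (x2, y2) = beacons[1]
    let (x3, y3) = beacons[2]
    let (r1, r2, r3) = (distances[0], distances[1], distances[2])

    let a = 2 * (x2 - x1), b = 2 * (y2 - y1)
    let c = r1 * r1 - r2 * r2 - x1 * x1 + x2 * x2 - y1 * y1 + y2 * y2
    let d = 2 * (x3 - x1), e = 2 * (y3 - y1)
    let f = r1 * r1 - r3 * r3 - x1 * x1 + x3 * x3 - y1 * y1 + y3 * y3

    let det = a * e - b * d
    guard abs(det) > 1e-9 else { return nil } // beacons must not be collinear
    return ((c * e - b * f) / det, (a * f - c * d) / det)
}
```

In practice, noisy RSSI-based distance estimates make the raw result jumpy, which is part of why beacon positioning disappoints indoors.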

Full Research Article On AR Indoor Navigation

This is where ARKit helped us. Thanks to its integration with machine vision, we were able to solve this problem in a different way. All we needed to do was place markers with location metadata in the corresponding places: on floors or on walls. After scanning one marker, we get the user's accurate position in 3D coordinates, which becomes the starting point. Even better, visual markers need no batteries that run down or spare parts that can break. They can be printed images or already existing signboards and panels.
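Once a marker is recognized, its known building coordinates and its reported position in the AR session define a translation between the two spaces. A simplified sketch of that mapping (the `Point3` type and function names are ours; a full solution would also align rotation, e.g. from the compass heading):

```swift
import Foundation

struct Point3 { var x, y, z: Double }

/// When a marker with known building coordinates is recognized, ARKit reports
/// its position in session coordinates. The difference between the two gives
/// a translation that maps any session-space point into building space.
func buildingOffset(markerInBuilding: Point3, markerInSession: Point3) -> Point3 {
    Point3(x: markerInBuilding.x - markerInSession.x,
           y: markerInBuilding.y - markerInSession.y,
           z: markerInBuilding.z - markerInSession.z)
}

/// Converts a session-space point (e.g. the device position) into building space.
func toBuildingSpace(_ p: Point3, offset: Point3) -> Point3 {
    Point3(x: p.x + offset.x, y: p.y + offset.y, z: p.z + offset.z)
}
```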

We got the starting point; then we needed to draw the route. Initially, the 3D layer simply overlays the image from the camera. When the route goes around a corner, we expect it to be occluded by the wall, but in reality we see the entire route. It looks neither good nor natural. It looks confusing.


There were 3 possible ways to solve this problem.

  • The first and simplest way, though not very convenient for users, was to show the route with an arrow, like a compass. Good for a treasure-hunt game, but not the best solution for a navigation app.
  • The second way was to show the route only within a certain distance from the user. It would not require much implementation time, and it would solve the problem.
  • However, we opted for the third and most advanced way: creating a low-poly model of the building, which is easy to produce if we have a 2D map of the building. It clips the part of the route that must not be visible. If the route goes around a corner, it is clipped by that corner. If we are in a long corridor, we see the route not within a certain radius but all the way to the next corner. That is much more convenient: the route looks natural and is easy to understand. This solution offered the highest quality and was the most cost-effective.
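In SceneKit, such clipping can be implemented with an “occlusion material” on the low-poly building model: the model writes to the depth buffer but not to the color buffer, so it invisibly hides route geometry behind walls. A sketch (the helper name is ours, not from the project):

```swift
import SceneKit

/// Material for an invisible occluder: it draws no color, but still writes
/// depth, so geometry behind it (the hidden part of the route) is clipped.
func makeOcclusionMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // render nothing to the color buffer...
    material.writesToDepthBuffer = true  // ...but still occlude what is behind
    return material
}

// Apply to the low-poly building model and render it before the route:
// buildingNode.geometry?.materials = [makeOcclusionMaterial()]
// buildingNode.renderingOrder = -1
```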

A good video demo will speak better than words, so we invite you to watch it!

You may learn more about this demo here. Furthermore, Android does not lag behind in this race, and similar solutions can be implemented with its ARCore. Feel free to check our Augmented Reality Demo Collection for further examples.

Case study: face-based Augmented Reality

Face-based AR currently works exclusively on the iPhone X, which is equipped with a TrueDepth camera. It is based on the same principles as traditional AR, including the same three stages: tracking, scene understanding, and rendering.

After ARKit processes the data, we get the following information: tracking data, a face mesh (in other words, face geometry), and blend shapes (face parameters expressed as percentages: how open the eyes are, how raised the eyebrows are, and so on).
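The blend shapes are exposed through `ARFaceAnchor.blendShapes`, a dictionary of coefficients from 0.0 to 1.0. A minimal sketch of reading a couple of them (the function name and printed format are illustrative):

```swift
import ARKit

/// Reads a few blend-shape coefficients from a detected face. Each value is
/// an NSNumber between 0.0 (neutral) and 1.0 (fully expressed).
func logExpressions(for faceAnchor: ARFaceAnchor) {
    let shapes = faceAnchor.blendShapes
    if let jawOpen = shapes[.jawOpen]?.floatValue,
       let browUp = shapes[.browInnerUp]?.floatValue {
        print("jaw open: \(jawOpen), inner brow raised: \(browUp)")
    }
}
```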


Face-based AR is widely used for creating face masks (as in Snapchat, Instagram, and Facebook) and for animating virtual faces (animoji). But we have a case of a more advanced application.

Sportspeople want to see the results of their efforts; they need motivation. Yet sometimes, when we try to lose weight, the number on the scale stays the same. It can fluctuate within several kilograms depending on a recent meal, or we might lose fat while gaining muscle. Even the clothes we are currently wearing affect the measurements.

AR can provide a solution. When a person gains extra weight, the face gets wider; when a person keeps fit, facial features get sharper. And it is the TrueDepth camera that can measure face parameters, helping an app user see progress and maintain a log of changes.


There can be many more such non-standard, unique applications. You only have to use your imagination; ARKit, a good team, and proper AR testing will do the rest.

Augmented Reality Development Guide For Business Owners

Augmented Reality development has all the potential to become far more functional and business-oriented than it might seem at first glance. AR will become even more integrated into our daily lives. For businesses, it is an opportunity to create unique products with the help of an augmented reality development company and occupy new niches in the market.

About the Author

Author: Andrew Makarov, Augmented Reality Solution Architect. Andrew Makarov is the Augmented Reality Solution Architect at MobiDev, with more than 9 years of software development experience and in-depth expertise in encryption, prognostics, numerical computing, and algorithm design. He is an active speaker at the world's top tech events, and his latest passion is the development of mobile Augmented Reality software.
