A workaround for the limitations of ARKit 2

Mobile app placing picture on wall using Augmented reality

The release of ARKit 2 in 2018 was highly anticipated, and its popularity spread like wildfire. Our team was equally excited to use ARKit to update Pixels, an AR app we had previously built for one of our clients. As we set out on those updates, however, we came across some challenges. Before I jump into the details of how we solved them, let me provide some context on the app we were commissioned to build, and on ARKit itself.

What is ARKit?

Apple’s ARKit is a set of software tools that makes it easier for developers to create augmented reality apps. Because the framework is powerful and fully supported by a tech giant like Apple, many AR apps, including Pokémon GO, have been reworked to build on ARKit. Pixels was no exception.

The Pixels app

Pixels, a marketplace for buying and selling artwork, first approached us back in 2015 to build an innovative app using AR technology. To help with the purchasing decision, they wanted to give their users the ability to see mockups of the artwork on their walls using the app.


To make this happen, TTT came up with a way to use printed targets to pinpoint the position of the artwork and superimpose it onto the wall. This was before Apple came out with ARKit, and even before the launch of the wildly popular AR game Pokémon GO. When ARKit arrived three years later, it made sense to update the aging app and create a version that doesn’t require a printed target…which brings us to our challenge.

Challenges

ARKit 2 brought a lot of new features to the table: multiplayer interaction within AR sessions, environment texturing, image tracking for 2D objects, object detection for 3D objects, and face tracking enhancements. Even with these new features, however, ARKit 2 definitely isn’t perfect. We found that vertical plane detection in particular is tricky and unreliable. The issue is especially apparent with white or plain-colored walls: the lack of texture on a standard wall makes it difficult, and often nearly impossible, for ARKit to detect.

So, although ARKit 2 technically does support vertical plane detection, it does not work well in many real-world situations. Taking our office walls as an example, ARKit successfully detects a textured wall but fails to detect a plain white one.
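For context, here is roughly what enabling plane detection looks like in an ARKit 2 app. This is a minimal sketch rather than the Pixels code itself: on a textured wall the delegate callback eventually fires with a vertical plane anchor, while on a plain white wall it often never does.

```swift
import UIKit
import ARKit

class WallDetectionViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // Ask ARKit for both horizontal (floor) and vertical (wall) planes.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a plane. For a featureless white wall,
    // a vertical anchor may simply never arrive.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected \(planeAnchor.alignment == .vertical ? "vertical" : "horizontal") plane")
    }
}
```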

Given that Pixels is a marketplace app whose sole purpose is placing artwork on walls, relying on ARKit alone would result in detection issues for the vast majority of users. And so, we had ourselves a challenge.

Working our way around this issue

To tackle this issue, we combined UX design with mathematics to come up with a solution. We designed a UX flow in which the user helps ARKit detect an untextured vertical plane: to find the wall, the user simply aligns a line on the screen with the edge where the wall meets the floor.
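Concretely, one way to turn that aligned guide line into world coordinates is to hit-test a couple of screen points on the line against the horizontal floor plane ARKit has already detected. The snippet below is our own illustrative sketch of that step, assuming the ARKit 2 era hitTest(_:types:) API and a guide line drawn horizontally across the screen; the point positions are placeholders.

```swift
import ARKit

// Hypothetical sketch: project two screen points lying on the on-screen guide line
// onto the detected floor plane. The resulting world-space points sit on the line
// where the wall meets the floor.
func wallBasePoints(in sceneView: ARSCNView) -> (SCNVector3, SCNVector3)? {
    // Two points on the guide line (assumed horizontal across the screen here).
    let guideY = sceneView.bounds.midY
    let leftPoint  = CGPoint(x: sceneView.bounds.width * 0.25, y: guideY)
    let rightPoint = CGPoint(x: sceneView.bounds.width * 0.75, y: guideY)

    // Hit-test against planes ARKit has already detected (the floor).
    guard let left  = sceneView.hitTest(leftPoint,  types: .existingPlaneUsingExtent).first,
          let right = sceneView.hitTest(rightPoint, types: .existingPlaneUsingExtent).first
    else { return nil }

    let l = left.worldTransform.columns.3
    let r = right.worldTransform.columns.3
    return (SCNVector3(l.x, l.y, l.z), SCNVector3(r.x, r.y, r.z))
}
```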

We then used 3D geometry to calculate the position of the wall from that marker and drew the painting in front of the camera, positioned flat against the wall. Users can also drag the painting to any spot along the wall, just as if they were dragging an object in a game. To activate the AR features, the app displays an AR button alongside all of the ARKit-compatible models.
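With the base of the wall known, a bit of vector math gives the wall's orientation, and the artwork can be hung as a textured plane facing out from it. Again, this is a hedged reconstruction rather than the app's actual implementation; the hanging offset, dimensions, and image are placeholder values.

```swift
import ARKit

// Hypothetical sketch: given two world-space points along the wall/floor edge,
// hang a textured plane (the artwork) flat against the implied wall.
func makeArtworkNode(baseLeft: SCNVector3,
                     baseRight: SCNVector3,
                     image: UIImage,
                     widthMeters: CGFloat,
                     heightMeters: CGFloat) -> SCNNode {
    // The artwork is simply a flat plane textured with the painting image.
    let plane = SCNPlane(width: widthMeters, height: heightMeters)
    plane.firstMaterial?.diffuse.contents = image
    let node = SCNNode(geometry: plane)

    // Position the artwork at the midpoint of the base line, lifted by half its
    // height plus a hanging offset (1.0 m here, purely a placeholder value).
    let hangingOffset: Float = 1.0
    node.position = SCNVector3((baseLeft.x + baseRight.x) / 2,
                               (baseLeft.y + baseRight.y) / 2 + Float(heightMeters) / 2 + hangingOffset,
                               (baseLeft.z + baseRight.z) / 2)

    // Yaw the plane so its face runs parallel to the wall: the wall direction is
    // the vector from baseLeft to baseRight projected onto the floor (xz-plane).
    let dx = baseRight.x - baseLeft.x
    let dz = baseRight.z - baseLeft.z
    node.eulerAngles.y = atan2(-dz, dx)

    return node
}
```

Dragging the painting along the wall then amounts to moving the node along that same base-line direction, which is why it feels like dragging an object in a game.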

To check out this feature for yourself, download the Pixels app for free and test it out!

You are bound to encounter limitations and get stuck at certain stages while using tools and frameworks. This is true even for a framework as well supported as ARKit. Luckily, most issues are solvable with the right amount of research and some thinking outside of the box. Such is the life of a developer!