What does the iPhone X’s AR engine mean for UX Designers?

An explanation of what AR is, and a look at its current and potential future applications. We also explore what AR could mean for UX designers.

Andrew Wilshere | Sep 12, 2017

Apple today announced the latest iteration of the iPhone – the iPhone X. Alongside the usual incremental improvements that most of us are now largely indifferent to, the arrival of iOS 11 delivers something more significant: for the first time, a mainstream, mass-market mobile platform has an Augmented Reality (AR) engine baked right into the OS.

Read on for an explanation of what AR is, and a look at its current and potential future applications. We also explore what AR could mean for UX designers, and, finally, our take on the $162bn question: are people going to use it?

1. What is AR?

AR, or Augmented Reality, is a computational technology that works by modifying a real image captured from the user’s point of view. Most often AR adds extra visual elements such as text or graphics, but it can also be used to remove, highlight, or modify parts of the original image.

Although AR is still in its infancy as a mass-market technology, you’re probably already familiar with a few mainstream applications. Above all (in terms of user numbers), there’s Snapchat, which takes selfies, uses AR to identify natural facial patterns, and then applies filters to the image.

The famous Snapchat hot dog

Then there are apps like Google Translate that allow you to point your phone’s camera at a passage of text and receive an instant translation in the original visual context. (This technology made its way into Google Translate following Google’s acquisition of Quest Visual, the maker of Word Lens.)

Word Lens, an AR translation app that has now been integrated into Google Translate

And, of course, there was all of last year’s excitement about Pokémon Go, a mobile game that uses AR technology to populate the player’s real surroundings with Pokémon to find and collect:

We’ve also grown increasingly accustomed to AR features in TV weather reporting and sports coverage, including in NFL and cricket games:

AR technology allows a first-down line to be rendered on the field of play in live action

AR technology assists cricket umpires with decision reviews

However, AR technology dates back way further than you might imagine – indeed, the first head-mounted display was developed by Ivan Sutherland back in 1968, and it’s the decades of experimentation and development since that have brought us to the point where AR is poised to become truly mainstream. Check out this awesome timeline for more information about the history of AR.

2. With iOS 11, AR hits the mainstream

Up to now, designers and developers who wanted to use AR have usually had to build AR capabilities into their apps themselves. As well as being expensive, the lack of a shared AR engine inevitably meant inconsistent user experiences, since visual and interaction patterns varied significantly from one application to the next.

The iPhone X

Today sees the unveiling of Apple’s iPhone X, and along with it, iOS 11, which has AR technology baked right into the OS. The iPhone X also features a dual rear camera, positioned vertically to allow for 3D AR experiences when the phone is held in a landscape orientation.

Arriving hot on the heels of Google’s announcement of ARCore a few weeks ago, Apple’s own ARKit makes AR more attractive and accessible to developers for two main reasons: first, since iOS 11 will be rolled out across hundreds of millions of Apple devices, a huge AR-ready customer base will be created overnight; and second, the AR engine will be built into the OS itself, waiting to be called by iOS apps. Developers will no longer need to find their own AR solution when creating an app.
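To give a sense of how little is now asked of developers, here’s a minimal sketch – our own illustration, not Apple’s sample code – of a view controller that starts a world-tracking AR session in an ARSCNView using the ARKit framework that ships with iOS 11:

```swift
import UIKit
import ARKit

// Minimal sketch: start an ARKit world-tracking session on iOS 11.
// The sceneView outlet is assumed to be connected in a storyboard.
class ARViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking follows the device's position and orientation in
        // real space; plane detection finds horizontal surfaces (floors,
        // tables) that virtual content can be anchored to.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal

        // The AR engine ships with the OS, so no third-party SDK is needed.
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

A real app would also need to declare camera usage (NSCameraUsageDescription) in its Info.plist and check ARWorldTrackingConfiguration.isSupported, since world tracking requires an A9 chip or later.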

Apple’s ability to roll out iOS updates to all users at once puts it in a considerably stronger position than Google when it comes to driving uptake of a new capability like AR. Unlike iOS, Android has to accommodate thousands of different hardware configurations, and updates are often extremely slow to reach users, because many manufacturers add a proprietary skin to each OS release before shipping it to their devices. For these reasons, the majority of Android users do not run the latest version of the software.

In an interview with British newspaper The Independent, Tim Cook left us in no doubt about how he sees the arrival of mainstream AR:

“I regard it as a big idea like the smartphone. The smartphone is for everyone, we don't have to think the iPhone is about a certain demographic, or country or vertical market: it’s for everyone. I think AR is that big, it’s huge. I get excited because of the things that could be done that could improve a lot of lives. And be entertaining. I view AR like I view the silicon here in my iPhone, it’s not a product per se, it’s a core technology.”

Some companies have already spotted the potential of AR to enhance their customers’ product experiences – check out the Twitter account @madewithARKit for a growing collection.

For example, IKEA have already developed an app, Place, that uses AR to simulate how items of furniture will look in your house, automatically measuring spaces and even replacing existing furnishings. It’s ready for iOS 11, and for many will be the first place they experience the iPhone’s new AR capabilities.

IKEA’s Place app. Image: IKEA
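The “measuring spaces” part of an experience like this rests on ARKit’s plane detection. As a rough, hypothetical illustration – not IKEA’s actual implementation – an app can hit-test two points the user taps on screen against a detected plane and compute the real-world distance between them:

```swift
import UIKit
import ARKit
import simd

// Hypothetical helper: estimate the real-world distance (in metres)
// between two tapped screen points by hit-testing each one against
// the planes ARKit has detected.
func distance(from pointA: CGPoint, to pointB: CGPoint,
              in sceneView: ARSCNView) -> Float? {
    guard
        let hitA = sceneView.hitTest(pointA, types: .existingPlaneUsingExtent).first,
        let hitB = sceneView.hitTest(pointB, types: .existingPlaneUsingExtent).first
    else {
        return nil // at least one tap didn't land on a detected surface
    }

    // The last column of each hit's worldTransform holds its position
    // in world coordinates.
    let a = simd_float3(hitA.worldTransform.columns.3.x,
                        hitA.worldTransform.columns.3.y,
                        hitA.worldTransform.columns.3.z)
    let b = simd_float3(hitB.worldTransform.columns.3.x,
                        hitB.worldTransform.columns.3.y,
                        hitB.worldTransform.columns.3.z)

    return simd_distance(a, b)
}
```

ARKit’s world coordinates are expressed in metres, so the result maps directly onto whether that sofa will actually fit along the wall.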

3. What does AR mean for UX designers?

There’s no doubt that specialized applications of AR will continue to be developed to increase the efficiency and reliability with which professional tasks can be performed. For example, AR creates opportunities for the reduction of information overload in the time-sensitive operations of military personnel, surgeons, and engineers. In these use cases, AR can be used to more effectively direct the attention of human operators and reduce human error.

However, the impact that AR will have on mass market consumer applications is less certain. In principle, future applications of AR are limited only by the imagination and creativity of designers and developers on the one hand, and by how consumer appetites and attitudes towards augmented experiences develop on the other.

I asked Sara Vilas Santiago, Designlab’s in-house UX curriculum supremo, for her thoughts.

“AR uptake will start with changes in things that are very functional, like those Google Translate and IKEA apps. From there it’s up to designers to use AR to take the next step in experiences – making the digital more ‘real’.

“To take an analogy, texting via SMS was great, but then WhatsApp made the experience easier, cheaper, and more real-time. You can see who is online, you can send pictures, send videos, have video calls… I think getting closer to the ‘real thing’ in digital experiences can always win.

“But as designers we have to understand the need to do it properly – avoiding the creation of awkward experiences. It’s important to understand human psychology when it comes to new technologies, just as Steve Jobs understood that with the iPhone and iPad people didn’t really need a Star Trek-style device, but instead something that offered rich experiences that we could use naturally with our hands and in our everyday lives.”

One of the more urgent tasks facing designers and developers is therefore to think about and influence the overall user experience of AR as a technology.

The understandable desire in each sector to be first to market with an AR product risks rushing poorly engineered AR experiences out of the door, with predictable and avoidable consequences: scandals around weak privacy and security measures, or applications that cause harm or inconvenience by presenting unreliable instructions or information.

Around 40% of mobile apps are uninstalled within an hour of initial installation. For AR apps to be successful, they must above all establish trust with the user, and then quickly prove their usefulness. The first impressions the industry makes on consumers could be the difference between a step change in how we interact with smart technology and a repeat of the market failure of Google Glass.

4. The $162bn question: will people use it?

Business Insider predicts that the AR market will be worth $162bn within a few years. Obviously, this depends on user uptake, and one of the questions that has been asked of AR in the past is whether it meets any real user need. Apps like Pokémon Go have demonstrated AR’s novelty value and potential for fun. But can it stand the test of time in more serious use cases?

Does AR add enough value to the user experience?

When it comes to applications like measuring up a room for your new IKEA furniture, there are two questions. First, do people care enough to use an app – i.e., does it add enough value to justify the user’s perception of extra effort? And second, will people trust the technology to deliver on its promises – for example, will they trust the app’s furniture measurements, or will they end up checking them with a tape measure anyway?

It’s tough to predict what consumer reaction will be like. But one thing is sure: whatever the eventual uptake, a crucial factor will be the quality of apps and ideas generated by designers and developers. A product that hits the market in an already robust and reliable form could make the difference between failure and long-term success.

Can designers understand user needs in the AR space?

The opportunities and challenges for UX designers will center on designing experiences that are delightful and functional for consumers who don’t yet know what they want. Most users don’t yet have an understanding of what AR is, what they want from it, or how it could benefit them.

UX designers will have an important role in shaping those expectations, and in establishing beneficial and valuable products. Novelty is never enough. Historically, Apple has been expert in this kind of creative space – the iPod’s success was an example of how the company understood our desires as users (to carry a whole music collection in our pockets) before most of us were aware that we had that desire at all.

Will developers accommodate established user preferences for screen-based interaction?

Another thing to consider is how surprisingly little the basic format of human-computer interaction has changed over the past 50 years. We still primarily interact with screen interfaces through clicks and keyboards. The main change in that time is that screens now respond to touch (though we still use keyboard and mouse about half the time).

A likely reason for the failure of wearable AR products like Google Glass, which reached consumers in 2014 and was discontinued in early 2015, is that they required a radical shift not only in the way users consumed information, but also in the devices through which that information was delivered.

AR stands the greatest chance of consumer uptake and market success where it accommodates and augments device experiences that people already favor – i.e., smartphones, tablets, and laptops. If we start asking people to wear special glasses or to mount phones on their faces in purpose-built cradles, the chances are that consumers will simply reject that change as being too big a step.

Successful AR will perhaps not represent a change in how we interact with machines, so much as a change in the way they present information to us.

We recently launched a major update of UX Academy, including a curriculum revamp and the full rollout of Group Crits & Career Services. Find out more – including admission criteria and details of our unique tuition reimbursement policy!
