Jeff Bier’s Impulse Response—Putting the Reality in Augmented and Virtual Reality

Submitted by Jeff Bier on Mon, 10/10/2016 - 22:01

Based on the pace of investment and acquisitions, and the level of buzz (some would say "hype") surrounding augmented reality and virtual reality, it is obvious that these technologies are hot.

With good reason, I think.

Augmented and virtual reality have long held enormous promise, but the challenge of making them work robustly – along with the cost, size and power consumption of the necessary hardware – has severely limited their use. In the past few years, though, what had been steady advances in the hardware and software essential for AR and VR have accelerated, thanks to companies and investors that have poured billions of dollars into these technologies, convinced that they'll be rewarded with big markets in the not-too-distant future.

What will those markets be, exactly?

Like many observers, I believe the biggest markets for virtual reality will be in gaming and entertainment. In 20 years, as we enjoy games, movies and sporting events through holodeck-like, immersive experiences, we may wonder at how we ever tolerated electronic entertainment when we were limited to viewing it through small, two-dimensional windows.

Like virtual reality, augmented reality also has big potential in entertainment. To get a sense of this potential, compare, for example, the gaming experience enabled by Microsoft's HoloLens (which implements sophisticated AR) with that delivered by Pokemon Go (which uses a very simplistic AR approach). But I agree with those who believe that AR will first gain substantial markets elsewhere, such as in industrial and retail settings. In these markets, AR is already being used for a surprising range of applications, from training welders to enabling shoppers to visualize how a LEGO kit will look when assembled.

In a recent presentation to Embedded Vision Alliance Member companies, market researcher Sam Rosen, vice president at ABI Research, shared his firm's outlook for AR hardware and software. Rosen believes this market will grow to more than $100 billion per year within the next five years, mainly on the strength of industrial, healthcare, and government/military applications.

To achieve widespread adoption, AR product developers must deploy robust solutions. What does robust mean? One key aspect is tight synchronization between computer-generated information and the physical objects it relates to. This, of course, requires sophisticated computer vision to reliably identify and seamlessly track objects and their orientations. One could say that embedded vision puts the reality in augmented reality: without dependable, real-time vision, AR doesn't work. And, if AR solutions are going to be deployed widely, they need to be affordable and, in many cases, able to operate for many hours on the power supplied by a body-worn battery pack.
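To give a flavor of what that vision workload involves, here's a minimal sketch of the kind of pose estimation an AR system performs to anchor virtual content to a physical object. It uses OpenCV's solvePnP; the marker corners, image detections, and camera intrinsics below are placeholder values I've made up for illustration, not the pipeline of any particular product.

```python
# Minimal sketch (not any vendor's actual pipeline): estimate the pose of a
# known physical object from matched 2D-3D points, so virtual content can be
# anchored to it. Placeholder data; a real AR system would get the 2D points
# from a feature detector or marker tracker running every frame.
import numpy as np
import cv2

# 3D points on the object, in the object's own coordinate frame (meters):
# here, the four corners of a 10 cm square marker (assumed for illustration).
object_points = np.array([
    [-0.05, -0.05, 0.0],
    [ 0.05, -0.05, 0.0],
    [ 0.05,  0.05, 0.0],
    [-0.05,  0.05, 0.0],
], dtype=np.float64)

# Where those corners were detected in the current camera image (pixels).
image_points = np.array([
    [310.0, 240.0],
    [410.0, 238.0],
    [412.0, 342.0],
    [308.0, 344.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (focal length and principal point).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume an undistorted image

# Recover the object's rotation and translation relative to the camera.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    # R and tvec now tell the renderer where to place virtual content so it
    # appears attached to the physical object.
    print("Object position relative to camera (m):", tvec.ravel())
```

A real headset runs this kind of computation, plus feature detection, tracking and filtering, on every frame and at low latency, which is exactly why specialized vision processing hardware matters.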

Fortunately, thanks to specialized processors built for vision, it's starting to become possible to deliver the necessary processing power at reasonable cost and power consumption, as Microsoft has shown with its HoloLens Holographic Processing Unit chip.

Because augmented reality devices aim to blend the virtual and physical world in a seamless manner, it's obvious that embedded vision is an essential ingredient. But what about virtual reality? In VR, the user is fully immersed in the virtual world, so what role does vision play?

It turns out that vision also plays crucial roles in VR. Tracking the user's location and head orientation is key to an immersive (and non-nauseating) VR experience – so that when you turn your head, for example, the virtual scene you're viewing shifts – instantly – in a manner consistent with how the real world works. There are multiple ways to track user position and head pose, but it appears that embedded vision may be the best way. At the Oculus Connect conference last week, Facebook CEO Mark Zuckerberg disclosed that Oculus is developing a wireless VR headset that uses an integrated camera to track the wearer.
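To make the "scene shifts with your head" idea concrete, here's a small illustrative sketch (my own, not Oculus's code) of how a tracked head orientation, reported as a quaternion, might be applied as the view rotation for the rendered scene each frame.

```python
# Illustrative sketch only: apply a tracked head orientation to the rendered
# view. The quaternion below is a stand-in for whatever the headset's
# vision/IMU tracker reports each frame.
import numpy as np

def quat_to_rotation_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Example: head turned 30 degrees to the left about the vertical (y) axis.
angle = np.radians(30.0)
head_orientation = np.array([np.cos(angle / 2), 0.0, np.sin(angle / 2), 0.0])

# The view rotation is the inverse (transpose) of the head rotation: when the
# head turns left, the world must appear to rotate right by the same amount.
head_rotation = quat_to_rotation_matrix(head_orientation)
view_rotation = head_rotation.T

# A point straight ahead of the user in world coordinates...
point_world = np.array([0.0, 0.0, -1.0])
# ...ends up off to the side in view coordinates after the head turn.
print("Point in view coordinates:", view_rotation @ point_world)
```

The hard part, of course, isn't applying the rotation; it's producing an accurate, low-latency pose estimate in the first place, which is where camera-based tracking comes in.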

Then there are hands. It turns out they're very useful in the virtual world, just like in the real one. And – you guessed it – embedded vision is what enables the hand tracking that lets a VR user reach into the virtual world with their own hands.

Another interesting potential use of embedded vision in VR goggles is for eye tracking. Eye tracking can serve several purposes. First, it can enable gaze to function as part of the user interface – stare at an object to select it, for example. In addition, it can enable VR product designers to take advantage of the fact that human vision is most sensitive to detail in the area where our gaze is focused. Reducing resolution in other areas of the displays can enable significant reductions in power consumption.
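As a toy illustration of that last idea (often called foveated rendering), here's a sketch that keeps a window around the gaze point at full resolution and coarsens everything else. The frame, gaze location and window size are all invented for the example, and in a real headset the savings come from rendering the periphery at low resolution in the first place rather than downsampling it afterward, as this post-hoc version does.

```python
# Toy sketch of gaze-dependent resolution: keep a window around the gaze
# point at full resolution and downsample everything else. Frame contents,
# gaze location, and window size are invented for illustration.
import numpy as np

def foveate(frame, gaze_x, gaze_y, fovea_radius=100, periphery_scale=4):
    """Return a frame that is sharp near the gaze point and coarse elsewhere."""
    h, w = frame.shape[:2]

    # Cheap periphery: downsample by striding, then repeat pixels back up.
    coarse = frame[::periphery_scale, ::periphery_scale]
    periphery = np.repeat(np.repeat(coarse, periphery_scale, axis=0),
                          periphery_scale, axis=1)[:h, :w]

    # Paste the full-resolution foveal window back in around the gaze point.
    x0, x1 = max(0, gaze_x - fovea_radius), min(w, gaze_x + fovea_radius)
    y0, y1 = max(0, gaze_y - fovea_radius), min(h, gaze_y + fovea_radius)
    out = periphery.copy()
    out[y0:y1, x0:x1] = frame[y0:y1, x0:x1]
    return out

# Example with a random 1080p frame and a gaze point from a (hypothetical)
# eye tracker.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
foveated = foveate(frame, gaze_x=960, gaze_y=540)
print(foveated.shape)  # same size, but most of the image carries less detail
```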

It's an exciting time for AR and VR. Thanks to improvements in underlying technologies – especially embedded vision – AR and VR devices are getting tantalizingly close to being ready for widespread deployment in a wide range of applications.

If you're interested in learning more about augmented reality, I invite you to attend a webinar, hosted by the Embedded Vision Alliance and presented by Xilinx, on December 6. Xilinx will present a number of AR use cases outside of the more commonly known consumer examples, and some of the latest technologies for implementing AR devices. Visit the webinar web page for further details and to register.

Jeff Bier is president of BDTI and founder of the Embedded Vision Alliance. Post a comment here or send him your feedback at http://www.BDTI.com/Contact.
