Interview with Brian Vogelsang of Qualcomm
Qualcomm, the $22 billion semiconductor and telecommunications giant (and new AREA member), has been deeply involved in AR and VR for more than a decade. We recently caught up with Brian Vogelsang, Qualcomm’s Senior Director, Product Management for XR Strategy and Partnerships, to learn more about the company, its AR strategy, and its vision for the future.
AREA: How would you describe Qualcomm’s role in the enterprise AR ecosystem?
Vogelsang: We’re a technology provider in the ecosystem, delivering chipsets that power AR experiences. Our Qualcomm Snapdragon platform provides the best silicon we can customize to meet the needs of the enterprise XR ecosystem. You’ll see our chipsets in products today from customers like Vuzix and RealWear. Then there’s the Microsoft HoloLens 2 that was announced at Mobile World Congress; it uses our Snapdragon 850 Mobile Platform. Vuzix also announced its M400 platform at Mobile World Congress, which is powered by the Qualcomm Snapdragon XR1 platform. Finally, there are new, emerging OEMs, such as nreal, Realmax, Shadow Creator, and ThirdEye. Our goal is to optimize the technology to put more capability into lighter-weight designs that can drive more immersive experiences at the lowest possible power levels, but with full connectivity.
AREA: People might have thought that Qualcomm was getting out of AR when it sold the Vuforia business to PTC three years ago, but the company is still very much committed to VR and AR, isn’t it?
Vogelsang: That’s correct. We’ve been working in this space for over a decade. We have a long history of computer vision expertise and of exploring how to build the technology and optimize it in hardware in ways that allow more immersive experiences while running at the lowest possible power. To date, that has been predominantly on smartphones. However, our long-term vision is that within a decade we will start transitioning from a handheld device (the smartphone) to a head-worn device – sleek AR glasses that people can wear all day. And that’s really what we’re looking at: how do we accelerate that innovation and make those kinds of experiences happen – initially for enterprises, but long term for consumers.
AREA: So, you expect enterprises to be the early adopters of wearables, then the consumer market will develop from there?
Vogelsang: That’s right. Today, in the wearable form factor, there’s a spectrum of devices, from Assisted Reality devices for remote expert or guided work instructions to full augmented or mixed reality devices like HoloLens or Magic Leap. Enterprises are willing to adopt these technologies if they solve a problem and deliver an ROI – and we’re excited about that. But long term, we think the technology needs to get smaller, lighter, and more ergonomic – more like standard eyeglasses. Because of those size requirements, that’s going to be particularly challenging technically. Delivering an immersive experience at the lowest possible power requires deep systems expertise, and that’s right in Qualcomm’s wheelhouse. It’s going to take a few years for the industry to deliver mass adoption of consumer-class AR eyewear, so for the short term, the enterprise is going to be doing a lot to drive the market.
AREA: How closely do you work with wearables manufacturers?
Vogelsang: We work really closely with them on their products and roadmaps, collaborating with them to achieve their market objectives. There are always tradeoffs as OEMs balance cost, weight, form factor and ergonomics, optics and display capability, performance, and thermals – and these often impact immersiveness. So we work really closely with them to understand their use cases and objectives, and then help them with hardware, software, and support to meet those objectives. We also give them insight into future technology developments, and their future requirements inform our chipset roadmap. We can’t solve all the problems. Things like displays, optics, and camera modules are a big part of the equation in building an AR device, and while we don’t build those technologies, we work closely with the suppliers of these components and assist OEMs with integration through our reference designs and HMD Accelerator Program, which pre-validates and qualifies components so OEMs can get to market more quickly.
AREA: It seems as if technologies are starting to converge in new ways: 5G networks, Artificial Intelligence, the Internet of Things, and AR. Do you get that impression as well?
Vogelsang: Definitely. We see 5G as the connectivity fabric that’s going to allow the mobile network not only to interconnect people, but also to interconnect and control machines, objects, and devices. 5G is going to deliver performance and efficiency that will enable these new experiences and connect new industries, delivering multi-gigabit-per-second connectivity at ultra-low latency. Latency is hugely important when it comes to Augmented and Virtual Reality experiences. And of course, 5G means more capacity. AI is already being used in Augmented Reality experiences, enabling things like head tracking, hand tracking, 3D reconstruction, object recognition, and light estimation. AI is a really important part of that. And I think 5G will also enable some capabilities to be moved off the device and processed at the edge of the mobile network. That ultimately will help us enable lighter-weight designs with richer, more immersive graphics at the low power threshold that we need. So all three – 5G, AI and AR – are coming together. And I think IoT will be a part of AR as well, feeding contextual information about the enterprise environment into AR experiences; IoT will feed the insights, which will be bubbled up as AR experiences.
AREA: What do you hope to get out of being a member of the AREA?
Vogelsang: Qualcomm’s customers are OEMs. We don’t sell to end customers, the people who would buy those devices or experiences. However, we do need to understand what their needs are so that we can better evolve our technology roadmap to support where those end users want to go. So, one of the things that excites us about becoming a member of the AREA is to begin hearing directly from some of the end customers who are deploying wearable AR technology. We know this is a marathon and we believe XR – spanning both Augmented and Virtual Reality – will be the next computing platform. So, we’re taking a long-term view and investing now in the technology that will enable this market. As a result, we’re very interested in learning from other AREA members about how the technology is being applied today to solve concrete problems in the enterprise so we can inform our roadmap. Those learnings will help us deliver products that can accelerate the pace of innovation and grow the overall AR wearable market.
We’re doing some trials and proofs of concept where we get more directly engaged with end-customer use cases, so being able to collaborate with other AREA members in that space would be really valuable. Also, we’d like to get involved in the committees. We have a human factors team here, and I’d like to get them engaged with the work that’s being done on the human factors side. While we don’t build end devices ourselves, we still need to understand, as we’re building out the technology, how human factors such as weight, size, and thermals impact the user experience and ergonomics.
We’d also like to get involved in requirements. We think we’d really benefit from learning more about requirements from a horizontal cross-section of the AREA membership. And finally, I think we’d like to get involved in the marketing side, as well. We would be interested in using our platform to help tell the story and accelerate industry adoption.
AREA: Where do you see things headed in XR over the next three to five years? What are the next big milestones people should be looking for?
Vogelsang: I think we’ll see a transition from smart glasses and Assisted Reality experiences to more Augmented Reality or spatial, immersive computing experiences. Over the next few years, that transition will really start to accelerate. We’re already seeing the early promise of what’s to come with technology such as HoloLens or Magic Leap. I’m really excited to see the companies that are deploying smart glasses or Assisted Reality experiences today start to adopt Augmented Reality or immersive computing in a much larger way.