
Recapping the AREA/DMDII 2nd Enterprise AR Workshop

The Augmented Reality Enterprise Alliance (AREA) and the Digital Manufacturing and Design Innovation Institute (DMDII), a UI LABS collaboration, recently hosted the 2nd Enterprise AR Workshop at UI LABS in Chicago. The event drew over 110 attendees, ranging from enterprises that have purchased and are deploying AR solutions, to providers offering leading-edge AR solutions, to non-commercial organisations such as universities and government agencies.

“The goal of the workshop is to bring together practitioners of Enterprise AR to enable open and wide conversation on the state of the ecosystem and to identify and solve barriers to adoption,” commented Mark Sage, the Executive Director of the AREA.

Hosted at the excellent UI LABS and supported by the AREA members, the attendees enjoyed two days of discussions, networking, and interactive sessions.

Here’s a brief video summary capturing some of the highlights.

Introduction from the Event Sponsors

Sponsored by Boeing and Upskill, the workshop was kicked off by Paul Davies, Associate Technical Fellow at Boeing and the AREA President. His introduction focused on the status of the Enterprise AR ecosystem, highlighting the benefits gained from AR and some of the challenges that need to be addressed.

Summary of AR Benefits

Mr Davies added, “We at Boeing are pleased to be Gold sponsors of this workshop. It was great to listen to and interact with other companies who are working on AR solutions. The ability to discuss in detail the issues and potential solutions allows Boeing and the ecosystem to learn quickly.”

 Developing the Enterprise AR Requirements Schema

The rest of the day focused on brainstorming and developing a set of use cases that the AREA will build on to create the AREA requirements / needs database and ultimately be added to the AREA Marketplace. The session was led by Glen Oliver, Research Engineer from AREA member Lockheed Martin, and Dr. Michael Rygol, Managing Director of Chrysalisforge.

The attendees were organized into 17 teams and presented with an AR use case (based on the use cases documented by the AREA). The teams were asked to add more detail to the use case and define a scenario (a definition of how to solve the business problems often containing a number of use cases and technologies).

The following example was provided:

  • A field service technician arrives at the site of an industrial generator. They use their portable device
    to connect to a live data stream of IoT data from the generator to view a set of diagnostics and
    service history of the generator.
  • Using the AR device and app they are able to pinpoint the spatial location of the reported error code
    on the generator. The AR service app suggests a number of procedures to perform. One of the
    procedures requires a minor disassembly.
  • The technician is presented with a set of step-by-step instructions, each of which provides an in-context 3D display of the step.
  • With a subsequent procedure, there is an anomaly which neither the technician nor the app is able to diagnose. The technician makes an interactive call to a remote subject matter expert who connects into the live session. Following a discussion, the SME annotates visual locations over the shared display, resulting in a successful repair.
  • The job requires approximately one hour to perform, meaning the portable device should function without interruption throughout the task.
  • With the job complete, the technician completes the digital paperwork and marks the job complete
    (which is duly stored in the on-line service record of the generator).


The tables were buzzing with debate and discussion with a lot of excellent output. The use of a maturity model to highlight the changes in scenarios was a very useful tool. At the end of the session the table leaders were asked to present their feedback on how useful the conversations had been.

Technology Showcase and Networking Session

The day ended with a networking session where a range of companies provided demos of their solutions.

Day 2: Focus on Barriers to AR Adoption

The second day of the workshop started with an insightful talk from Jay Kim, Chief Strategy Officer at Upskill (the event's Silver sponsor), who outlined the benefits of Enterprise AR and how to avoid “pilot purgatory” (i.e., the continual cycle of delivering pilots with limited industrialisation of the solution).

Next, Lars Bergstrom of Mozilla Research provided a look into how enterprises will soon be able to deliver AR experiences to any AR device via a web browser. The attendees found the session a fascinating introduction to the potential of WebAR and how it might benefit their organisations.

Barriers to Enterprise AR Adoption – Safety and Security

The next two sessions generated discussion and debate on two of the key barriers to adoption of Enterprise AR. Each was expertly moderated by the respective AREA committee chairs:

  • Security – Tony Hodgson, Bob Labelle and Frank Cohee of Brainwaive LLC
  • Safety – Dr. Brian Laughlin, Technical Fellow at Boeing

Both sessions provided an overview of the potential issues for enterprises deploying AR and providers building AR solutions. Again, many attendees offered contributions on the issues, possible solutions, and best practices in these fields.

The AREA will document the feedback and share the content with the attendees, as well as using it to help inform the AREA committees dedicated to providing insight, research and solutions to these barriers.

Barriers to Enterprise AR Adoption – Change Management

Everyone was brought back together to participate in a panel session focusing on change management, both from an organisation and human perspective.

Chaired by Mark Sage, the panel included thought leaders and practitioners:

  • Paul Davies – Associate Technical Fellow at Boeing
  • Mimi Hsu – Corporate Digital Manufacturing lead at Lockheed Martin
  • Beth Scicchitano – Project Manager for the AR Team at Newport News Shipbuilding
  • Jay Kim – Chief Strategy Officer at Upskill
  • Carl Byers – Chief Strategy Officer at Contextere

After a short introduction, the questions focused on whether AR should be a topic discussed at the CEO level or by the IT / Innovation teams. After insightful comments from the panel, the audience was asked to provide their input.

Questions then focused on how to convince the workforce to embrace AR. Boeing, Newport News Shipbuilding and Lockheed Martin provided practical and useful examples.

There followed a range of questions from the audience with the panel members offering their experiences in how their organisations have been able to overcome some of the change management challenges when implementing AR solutions.

Final Thoughts

The general feedback on the two days was excellent. The ability to share, debate and discuss the potential and challenges of Enterprise AR was useful for all attendees.

The AREA is the only global, membership-funded, non-profit alliance dedicated to accelerating the adoption of Enterprise AR. It supports the growth of a comprehensive ecosystem, helps its members develop thought-leadership content, works to reduce the barriers to adoption, and runs workshops that help enterprises implement Augmented Reality technology effectively to create long-term benefits.

The AREA will continue to work with the Digital Manufacturing and Design Innovation Institute (DMDII), where innovative manufacturers go to forge their futures. In partnership with UI LABS and the Department of Defense, DMDII equips U.S. factories with the digital tools and expertise they need to build every part better than the last. As a result, more than 300 partners increase their productivity and win more business.

If you are interested in AREA membership, please contact Mark Sage, Executive Director.

To inquire about DMDII membership, please contact Liz Stuck ([email protected]), Director of Membership Engagement.




The AREA Releases Member-Exclusive ROI Calculator and Best Practice Report

The AREA recently distributed the products of its second AREA-funded research project, an ROI Calculator and Best Practice Report. Conducted by Strategy Analytics under the supervision of AREA member PEREY Research & Consulting, the research examined the approaches taken by market leaders that are identifying, prioritizing, and managing costs and returns on their AR investments.

The ROI Calculator and Best Practice Report are available exclusively to AREA members, but non-members can download an abstract of the Best Practice Report and a sample ROI Case Study here.

Based on interviews conducted with AREA members and members of the Strategy Analytics Enterprise Customer panel, the research report identifies five critical best practices for companies to follow in preparing and conducting successful ROI analyses for enterprise AR projects. Companies following these practices are able to articulate and explain technology spending that will aid in decision making, and to accrue the greatest measurable benefits from their AR investment over the long term.

Along with the best practice report, the AREA has produced an Excel spreadsheet that enables companies to enter variables and calculate the ROI for their AR projects following well-established methodologies used by financial analysts.

The spreadsheet walks the user through the process: establishing the business case; assessing AR solution costs; inputting business financial metrics. The ROI Calculator produces annual AR solution costs and benefits allocations and an ROI analysis and cost/benefit overview that reveals the cumulative net benefits over several years.
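The arithmetic behind such a calculator is standard cost/benefit and discounted-cash-flow math. As a minimal illustrative sketch (the figures and function structure below are hypothetical and are not taken from the AREA spreadsheet):

```python
# Hypothetical sketch of the arithmetic behind an AR project ROI analysis,
# using standard cost/benefit and discounted-cash-flow formulas. The figures
# and the structure are illustrative only, not taken from the AREA spreadsheet.

def npv(rate, cashflows):
    """Net present value of yearly net cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

def simple_roi(costs, benefits):
    """Cumulative net benefit divided by cumulative cost."""
    total_cost = sum(costs)
    return (sum(benefits) - total_cost) / total_cost

# Year 0 = upfront deployment; years 1-3 = running costs and accruing benefits.
costs = [100_000, 20_000, 20_000, 20_000]    # hardware, software, support
benefits = [0, 60_000, 90_000, 120_000]      # productivity gains, fewer errors

net = [b - c for b, c in zip(benefits, costs)]
print(f"Simple ROI over 4 years: {simple_roi(costs, benefits):.0%}")
print(f"NPV at a 10% discount rate: {npv(0.10, net):,.0f}")
```

With these sample figures the simple ROI comes out to about 69% with a positive net present value: the kind of cumulative net-benefit view over several years that the spreadsheet produces.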

“Companies that follow these practices when undertaking ROI analysis on their AR projects will be able to articulate and explain technology spending,” said Christine Perey of PEREY Research & Consulting. “This will aid their decision making and enable them to accrue the greatest measurable benefits from their AR investments over the long term.”

Download your free copies of the Best Practice Report abstract and ROI Case Study here.   To obtain the full ROI Calculator and Best Practice Report – and enjoy a host of other benefits – join the AREA. Click here to learn more.

 

 




The AREA Issues Call for Proposals for AR ROI Research Project

The AREA has issued a request for proposals for a funded research project that will develop a full set of best practices for performing analyses of the return on investment (ROI) of enterprise AR projects.

Organizations with relevant expertise in ROI analysis are invited to respond to the invitation by August 14th.

The goals of the AREA-directed research project are:

  • To define and answer common questions about how to measure ROI for enterprise AR projects. The AREA members will then be able to follow the best practices and guidelines when developing ROI estimates for their internal decision makers, or to assist their customers and partners in the development of ROI estimates for AR.
  • To increase understanding and demystify ROI for AR across the industry and ecosystem (members and non-members) through an information/awareness campaign.

The research project will produce:

  • A report that provides AREA members with a full set of best practices to prepare an ROI analysis for enterprise AR. This should be based on widely-accepted techniques and best practices for ROI of adjacent technologies, such as IoT and mobile.
  • An ROI calculator in the form of an annotated spreadsheet with sample formulas and instructions on how to fill in fields with which to begin preparation of an ROI estimate.
  • A case study (suitable for public release) with sample figures showing the use and interpretation of the ROI calculator tool in a fictional (or anonymized) organization.

All proposals will be evaluated by the AREA research committee co-chairs on the following criteria:

  • Demonstrated knowledge of ROI analysis methods
  • Clear qualifications of research organization and any partners in the domain of ROI in AR, if possible, or adjacent technologies
  • Review of prior research reports and calculator samples
  • Feedback from references

The AREA will provide detailed replies to submitters on or before August 18th. The research project is expected to be completed and finished deliverables produced by October 31st.


The AREA ROI research project was awarded to Strategy Analytics in August 2017.

The project produced the first and most definitive report on the topic of measuring the ROI of AR, along with the AREA AR ROI calculator. Both are exclusive to AREA members. The executive summary of the final report is available to non-members and, upon furnishing your name and contact details, can be downloaded from this page.




AR Adds a New Dimension to Financial Trading and Analysis

AR/MR-assisted trading and data analysis platforms empower traders and investors with advanced fintech, which is capable of monitoring and visualizing financial markets with new depth. Holographic visualization presents a new enhanced view of dynamic data, with flat images evolving into 3D shapes and innovative heatmaps to reveal revolutionary new data insights. With an AR/MR-assisted user interface, users are no longer restricted by the physical size of a computer screen, mobile or tablet, and can get a true 360-degree view with unlimited applications.

Utilizing light, portable MR headset technology or AR smart glasses, advanced holographic representations of financial data and feeds are overlaid on, and exist in addition to, the real-world view of the user’s workspace. The phrase “workplace everywhere” takes on new meaning with AR, which lets users operate a laptop or smartphone, or speak with a person physically in the room, while also conversing with a virtual colleague via videoconferencing.

The AR financial landscape can remain completely private (safe from inquisitive eyes), or users can share data by mirroring their views to an external laptop and even enable “spectator view” for colleagues or clients who are also using AR/MR technology. Users can even invite clients or advisors located anywhere in the world to a virtual conference room, where they can collaboratively and seamlessly analyze and interact with their financial landscape.

The technologies behind the solution

Powerful AR/MR-assisted trading and market data analysis for the finance sector can be viewed through Microsoft’s HoloLens MR headsets, and through new technology currently in development that will look and feel like ordinary eyeglasses. Building the software in Unity, a powerful editor, enables it to be ported to other wearable hardware. While the solution largely uses HoloLens gesture-recognition technology, voice recognition is also possible using embedded Microsoft Cortana functionality, along with holographic object manipulation, which can be useful in certain scenarios.

The core of the AR/MR-assisted financial trading and market data analysis platform is built on an existing data solution called dxFeed, one of the world’s largest cloud-based, fully managed data tickerplants focused exclusively on the capital markets industry. dxFeed uses a unique technology called QD, designed and built by Devexperts, for market data distribution. The result is a powerful tool that can transform and adapt any data feed into an AR/MR-assisted virtual market data infrastructure. The platform gathers and stores historical data from the key exchanges in the USA, Canada, and Europe; every single change of price (tick-by-tick market data) is streamed live and can be accurately viewed and interrogated through the AR/MR headset.
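To make the streaming idea concrete, here is a minimal sketch of how tick-by-tick updates might be mapped onto points in a 3D scene for a holographic heatmap. The `Tick` structure and the lane/percent-move mapping are illustrative assumptions; dxFeed's actual QD protocol and APIs are not shown:

```python
# Hypothetical sketch: mapping live tick data onto 3D points for an AR/MR
# price heatmap. The Tick structure and the mapping are illustrative
# assumptions; dxFeed's actual QD protocol and APIs are not shown here.

from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float
    volume: int

def to_heatmap_point(tick: Tick, baseline: float, lane: int):
    """Map a tick to (x = symbol lane, y = % move vs baseline, z = volume)."""
    pct_move = (tick.price - baseline) / baseline * 100.0
    return (lane, pct_move, tick.volume)

# Two ticks for one symbol, rendered in lane 0 of the 3D scene; a renderer
# would update the corresponding holographic bar as each tick arrives.
ticks = [Tick("AAPL", 151.5, 300), Tick("AAPL", 149.0, 120)]
points = [to_heatmap_point(t, baseline=150.0, lane=0) for t in ticks]
```

The same mapping works for a live stream: each incoming tick replaces the previous point in its lane, giving the tick-by-tick view described above.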

What it means for traders and analysts

The advent of AR/MR-assisted trading and data analysis delivers many benefits to financial services firms:

  • Organizations can replace multiple monitors in a fixed location with a lightweight wireless MR headset or AR  smart glasses, freeing users from the physical size restrictions of computers, mobile devices, and tablets.
  • Companies can implement “workplace everywhere” – with a 360-degree view, users can work literally on any surface and even in the air.
  • Colleagues and customers can collaborate on projects from anywhere in the world via videoconferencing; point-of-view capabilities enable users to monitor and jointly analyze financial data, limiting miscommunication and strengthening decision-making.
  • Users can increase their productivity and dramatically improve market visualization with advanced holographic data representation – a key element for traders needing to make important data-driven decisions quickly.
  • A more intuitive user interface makes it easier to view, analyze and manipulate large quantities of complex data.
  • Users can gain rapid access to stored historical market data and use tick-market replay and back-testing while simultaneously keeping a sharp eye on current market activity.
  • Users stay better informed with streamlined integrated news feeds and financial information, aggregated from multiple providers in text view – with support for live streaming of news channels.

Who are the target users?

Fintech is more than a buzzword. In order to stay ahead of the competition, banks, investment funds, hedge funds, FX desks, proprietary traders, and exchanges are adopting AR/MR technology. The driving force behind AR/MR-assisted trading and data analysis, however, is individual traders, investors and advisors working for financial institutions across the globe, who will find ease of collaboration from anywhere hugely beneficial.

Some typical scenarios

  • An investor can connect an advisor to a virtual conference room, enabling them to share their point of view and explain how a drop or rise affects the portfolio and what decisions they can make now.
  • A trader can take action faster as a result of a more intuitive interface highlighting hotspots and revealing opportunities.
  • An investor looking to enter new markets can accurately view historical data, use tick-market replay and back-testing and make informed decisions based on the hard facts.
  • A financial analyst required to monitor a particular stock on a major exchange can access and visualize full-depth data, explore how well the stock has performed in the past, and instantly communicate that information to a client, in the form of a holographic data representation.
  • Students or new employees learning to trade can use AR/MR-assisted fintech to study and analyze patterns using historical data and market replay, and immerse and interact with the financial market.

Dmitry Parilov is Managing Director of Data Products at Devexperts and Simon Raven is a technical writer.




Mixed Reality: Just One Click Away

Author: Aviad Almagor, Director of the Mixed Reality Program, Trimble, Inc.

Though best known for GPS technology, Trimble is a company that integrates a wide range of positioning technologies with application software, wireless communications, and services to provide complete commercial solutions. In recent years, Trimble has expanded its business in building information modeling, architecture and construction, particularly since the company’s 2012 acquisition of SketchUp 3D modeling software from Google. Mixed Reality is becoming a growing component of that business. This guest blog post by Trimble’s Aviad Almagor discusses how Trimble is delivering mixed reality solutions to its customers.

Many industries – from aerospace to architecture/engineering/construction (AEC) to mining – work almost entirely in a 3D digital environment. They harness 3D CAD packages to improve communication, performance, and the quality of their work. Their use of 3D models spans the full project lifecycle, from ideation to conceptual design and on to marketing, production, and maintenance.

Take AEC, for example. Architects design and communicate in 3D. Engineers design buildings’ structures and systems in 3D. Owners use 3D for marketing and sales. Facility managers use 3D for operation and maintenance.

And yet, we still consume digital content the same way we have for the last 50 years: behind a 2D screen. For people working in a 3D world, the display technology has become a limiting factor. Most users of 3D content have been unable to visualize the content their jobs depend on in full 3D in the real world.

However, mixed reality promises to change that. Mixed reality brings digital content into the real world and supports “real 3D” visualization.

The challenge

There are several reasons why mixed-reality 3D visualization has not yet become an everyday reality. Two of the primary reasons are the user experience and the processing requirements.

For any solution to work, it needs to let engineers, architects, and designers focus on their core expertise and tasks, following their existing workflow. Any technology that requires a heavy investment in training or major changes to the existing workflow faces an uphill battle.

Meanwhile, 3D models have become increasingly detailed and complex. It is a significant challenge – even for beefy desktop workstations – to process large models and support visualization at 60 fps.

One way around that problem is to use coding and specialized applications and workflows, but that approach is only acceptable to early adopters and innovation teams within large organizations – not the majority of everyday users.

To support real projects and daily activities – and be adopted by project engineers — mixed reality needs to be easily and fully integrated into the workflow. At Trimble, we call this “one-click mixed reality” – getting data condensed into a form headsets can handle, while requiring as little effort from users as possible.

Making one-click mixed reality possible

The lure of “one-click” solutions is strong. Amazon has its one-click ordering. Many software products can be downloaded and installed with a single click. The idea of one-click mixed reality is to bring that ease and power to 3D visualization.

Delivering one-click mixed reality requires a solution that extends the capabilities of existing tools by adding mixed reality functionality without changing the current workflow. It must be a solution that requires little or no training. And any heavy-lifting processing that’s required should be done in the background. From a technical standpoint, that means any model optimization – including polycount reduction, occlusion culling, and texture handling – is performed automatically, without the need for manual, time-consuming specialized processes.
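As a sketch of what "performed automatically" could look like, the publish step can be modeled as a fixed pipeline of optimization stages run against asset budgets. The stage functions and budget figures below are invented for illustration and are not Trimble's actual implementation:

```python
# Hypothetical publish-time optimization pipeline for "one-click" MR.
# Stage names and budget figures are illustrative assumptions, not
# Trimble's actual implementation.

from dataclasses import dataclass

@dataclass
class Model:
    polygons: int
    textures_mb: float

POLY_BUDGET = 200_000      # assumed headset-friendly polygon count
TEXTURE_BUDGET_MB = 64.0   # assumed texture memory budget

def decimate(model: Model, budget: int) -> Model:
    """Reduce polygon count toward the budget (stand-in for real decimation)."""
    return Model(min(model.polygons, budget), model.textures_mb)

def compress_textures(model: Model, budget_mb: float) -> Model:
    """Downscale/compress textures to fit the memory budget."""
    return Model(model.polygons, min(model.textures_mb, budget_mb))

def publish(model: Model) -> Model:
    """One-click publish: run every optimization stage automatically."""
    model = decimate(model, POLY_BUDGET)
    model = compress_textures(model, TEXTURE_BUDGET_MB)
    return model

# A heavy CAD export is reduced to headset-friendly size with no user steps.
optimized = publish(Model(polygons=5_000_000, textures_mb=512.0))
```

A real pipeline would run genuine mesh decimation and texture compression; the point of the sketch is that every stage executes on publish, with nothing left for the user to configure.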

At Trimble, we’re working to deliver one-click mixed reality by building on top of existing solutions. Take SketchUp for example, one of the most popular 3D packages in the world. We want to make it possible for users to design a 3D model in SketchUp, click to publish it, and instantly be able to visualize and share their work in mixed reality.

We’re making sure that we support users’ existing workflow in the mixed reality environment. For example, we want to enable users to use scenes from SketchUp, maintain layer control, and collaborate with other project stakeholders in the way they’re accustomed.

And we’re taking it one step further by making it possible to consume models directly from SketchUp or from cloud-based environments, such as SketchUp 3D Warehouse or Trimble Connect. This will eliminate the need to install SketchUp on the user’s device in order to visualize the content in mixed reality. As a next step, we are exploring with our pilot customers a cloud-based pre-processing solution which will optimize models for 3D visualization.

We’re making good progress. For example, in his Packard Plant project (which was selected to represent the US at the Venice Architecture Biennale), architect Greg Lynn used SketchUp and SketchUp Viewer for Microsoft HoloLens to explore and communicate his design ideas. In this complex project, a pre-processing solution was required to support mixed reality visualization.

“Mixed reality bridges the gap between the digital and the physical. Using this technology I can make decisions at the moment of inception, shorten design cycles, and improve communication with my clients.”

– Architect Greg Lynn

One-click mixed reality is coming to fruition. For project teams, that means having the ability to embed mixed reality as part of their daily workflow. This will enable users to become immediately productive with the technology, gain a richer and more complete visualization of their projects, and build on their existing processes and tools.

The advent of one-click mixed reality indicates that the world of AR/VR is rapidly approaching the time when processing requirements, latency, and user experience issues will no longer be barriers.

Aviad Almagor is Director of the Mixed Reality Program at Trimble, Inc.




The 1st AREA Ecosystem Survey is Here!

The Augmented Reality (AR) marketplace is evolving so rapidly, it’s a challenge to gauge the current state of market education, enterprise adoption, provider investment, and more. What are the greatest barriers to growth? How quickly are companies taking pilots into production? Where should the industry be focusing its efforts? To answer these and other questions and create a baseline to measure trends and momentum, we at the AREA are pleased to announce the launch of our first annual ecosystem survey.

Please click here to take the survey. It won’t take more than five minutes to complete. Submissions will be accepted through February 8, 2017. We’ll compile the responses and share the results as soon as they’re available.

Make sure your thoughts and observations are captured so our survey will be as comprehensive and meaningful as possible. Thank you!




The AREA Issues Call for Proposals for an AR Research Project

The AREA has issued a request for proposals for a funded research project that its members will use to better understand relevant data security risks associated with wearable enterprise AR and mitigation approaches.

Organizations with expertise in the field of data security risks and mitigation and adjacent topics are invited to respond to the invitation by January 30, 2017.

The goals of the AREA-directed research project are:

  • To clarify questions about enterprise data security risks when introducing enterprise AR using wearables
  • To define and perform preliminary validation of protocols that companies can use to conduct tests and assess risks to data security when introducing wearable enterprise AR systems

The research project will produce:

  • An AREA-branded in-depth report that: details the types of data security risks that may be of concern to IT managers managing AR delivery devices and assets; classifies the known and potential threat to data security according to potential severity levels; and proposes risk mitigation measures
  • An AREA-branded protocol for testing wearable enterprise AR devices for their hackability or data exposure threat levels
  • An AREA-branded report documenting the use of the proposed protocol to test devices for their security exposure threat levels.

All proposals will be evaluated by the AREA research committee co-chairs on the following criteria:

  • Demonstrated knowledge and use of industry best practices for research methodology
  • Clear qualifications of research organization and any partners in the domain of data security threats and mitigation, and AR, if possible
  • Review of prior research report and testing protocol samples
  • Feedback from references

The AREA will provide detailed replies to submitters on or before February 13, 2017. The research project is expected to be completed and finished deliverables produced by May 1, 2017.

Full information on the request for proposals, including a submission form, can be found here.

 




GE’s Sam Murley Scopes Out the State of AR and What’s Next

General Electric (GE) has made a major commitment to Augmented Reality. The industrial giant recently announced that it plans to roll out AR in three business divisions in 2017 to help workers assemble complex machinery components. In his role leading Innovation and Digital Acceleration for Environmental Health & Safety at General Electric, Sam Murley is charged with “leading, generating and executing digital innovation projects to disrupt and streamline operations across all of GE’s business units.” To that end, Sam Murley evangelizes and deploys immersive technologies and digital tools, including Augmented Reality, Virtual Reality, Artificial Intelligence, Unmanned Aerial Vehicles, Natural Language Processing, and Machine Learning.

As the first in a series of interviews with AREA members and other ecosystem influencers, we recently spoke with Sam to get his thoughts on the state of AR, its adoption at GE, and his advice for AR novices.

AREA: How would you describe the opportunity for Augmented Reality in 2017?

SAM MURLEY: I think it’s huge — almost unprecedented — and I believe the tipping point will happen sometime this year. This tipping point has been primed over the past 12 to 18 months with large investments in new startups, successful pilots in the enterprise, and increasing business opportunities for providers and integrators of Augmented Reality.

During this time, we have witnessed examples of proven implementations – small-scale pilots, larger-scale pilots, and companies rolling out AR in production – and we should expect this to continue to increase in 2017. You can also expect continued growth of assisted reality devices scalable for industrial use cases in manufacturing and the service industries, as well as new adoption of spatially-aware, consumer-focused mixed reality and augmented reality devices for automotive, retail, gaming, and education use cases. We’ll see new software providers emerge, existing companies taking the lead, key improvements in smart eyewear optics and usability, and a few strategic partnerships will probably form.

AREA: Since it is going to be, in your estimation, a big year, a lot of things have to fall into place. What do you think are the greatest challenges for the Augmented Reality industry in 2017?

SAM MURLEY: While it’s getting better, one challenge is interoperability and moving from proprietary and closed systems into connected systems and open frameworks. This is really important. All players — big, medium and small — need to work towards creating a connected AR ecosystem and democratize authoring and analytical tools around their technology. A tool I really like and promote is Unity3D as it has pretty quickly become the standard for AR/VR development and the environment for deployment of AR applications to dozens of different operating systems and devices.

It’s also important that we find more efficient ways to connect to existing 3D assets that are readily available, but too heavy to use organically for AR experiences. CAD files that are in the millions of polygons need some finessing before they can be imported and deployed as an Augmented Reality object or hologram. Today, a lot of texturing and reconstruction has to be performed to keep the visual integrity intact without losing the engineering accuracy. Hopefully companies such as Vuforia (an AREA member) will continue to improve this pipeline.
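The scale of the "finessing" involved is easy to quantify. As a back-of-envelope sketch (the 200,000-polygon budget is an assumed figure for illustration, not one from GE or Vuforia):

```python
# Back-of-envelope estimate of how much of a CAD model must be decimated
# before AR deployment. The 200k polygon budget is an illustrative
# assumption, not a figure from GE or Vuforia.

def decimation_ratio(source_polys: int, budget: int = 200_000) -> float:
    """Fraction of triangles that must be removed to fit the budget."""
    if source_polys <= budget:
        return 0.0
    return 1.0 - budget / source_polys

# A five-million-polygon CAD export against the assumed budget.
ratio = decimation_ratio(5_000_000)
print(f"{ratio:.1%} of triangles must go before the model is AR-ready")
```

For a five-million-polygon model and that budget, 96% of the triangles must be removed while preserving visual and engineering fidelity, which is why an automated pipeline for this step matters so much.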

For practical and wide-scale deployment in an enterprise like GE, smart glasses need to be intrinsically safe, safety rated, and out-of-the box ready for outdoor use. Programmatically, IT admins and deployment teams need the ability to manage smart glasses as they would any other employee asset such as a computer or work phone.

AREA: GE seems to have been a more vocal, public proponent of Augmented Reality than a lot of other companies. With that level of commitment, what do you hope to have accomplished with Augmented Reality at GE within the next year? Are there certain goals that you’ve set or milestones you hope to achieve?

SAM MURLEY: Definitely. Within GE Corporate Environmental Health & Safety we have plans to scale AR pilots that have proven to be valuable to a broader user base and eventually into production.

Jeff Immelt, our Chairman and CEO, in a recent interview with Microsoft’s CEO Satya Nadella, talked specifically about the use of Microsoft HoloLens in the enterprise. He put it perfectly, “If we can increase productivity by one percent across the board, that’s a no brainer.” It’s all about scaling to increase productivity, scaling to reduce injuries, and scaling based on user feedback. In 2017, we will continue to transform our legacy processes and create new opportunities using AR to improve worker performance and increase safety.

AREA: Do you have visibility into all the different AR pilots or programs that are going on at GE?

SAM MURLEY: We’re actively investigating Augmented Reality and other sister technologies, in partnership with our ecosystem partners and the GE Businesses. Look, everyone knows GE has a huge global footprint and part of the reward is finding and working with other GE teams such as GE Digital, our Global Research Centers, and EHS Leaders in the business units where AR goals align with operational goals and GE’s Digital Industrial strategy.

At the 2016 GE Minds + Machines conference, our Vice President of GE Software Research, Colin Parris, showed off how the Microsoft HoloLens could help the company “talk” to machines and service malfunctioning equipment. It was a perfect example of how Augmented Reality will change the future of work, giving our customers the ability to talk directly to a Digital Twin — a virtual model of that physical asset — and ask it questions about recent performance, anomalies, and potential issues, and receive answers back in natural language. We will see Digital Twins of many assets, from jet engines to compressors. Digital Twins are powerful – they allow tweaking and changing aspects of your asset in order to see how it will perform, prior to deploying in the field. GE’s Predix, the operating system for the industrial Internet, makes this cutting-edge methodology possible. “What you saw was an example of the human mind working with the mind of a machine,” said Parris. With Augmented Reality, we are able to empower the workforce with tools that increase productivity, reduce downtime, and tap into the Digital Thread and Predix. With Artificial Intelligence and Machine Learning, Augmented Reality quickly allows language to become the next interface between the Connected Workforce and the Internet of Things (IoT). No keyboard or screen needed.

However, we aren’t completely removing mobile devices and tablets from the AR equation in the short term. Smart glasses still have some growing and maturing to do. From a hardware adoption perspective, smart glasses are very new – it’s a new interface, a new form factor and the workforce is accustomed to phones, tablets, and touch screen devices. Mobile and tablet devices are widely deployed in enterprise organizations already, so part of our strategy is to deploy smart eyewear only when absolutely needed or required and piggyback on existing hardware when we can for our AR projects.

So, there is a lot going on and a lot of interest in developing and deploying AR projects in 2017 and beyond.

AREA: A big part of your job is navigating that process of turning a cool idea into a viable business model. That’s been a challenge in the AR world because of the difficulty of measuring ROI in such a new field. How have you navigated that at GE?

SAM MURLEY: That’s a good question. To start, we always talk about and promote the hands-free aspects of using AR when paired with smart glasses to access and create information. AR in general, though, is a productivity driver. Consider a half-hour operation or maintenance task out in the field: if we can save a worker just a few minutes – save them from having to stop what they’re doing, go back to their work vehicle, search for the right manual, find the schematic only to realize it’s out of date, and then make a phone call to try and solve a problem or get a question answered – an AR solution can pay for itself quickly, because all of that friction is removed. We can digitize all of that with the Digital Twin and supply the workforce with a comfortable, hands-free format that keeps them safe from equipment and environmental hazards, and engaged with the task at hand.
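As an aside, the payback arithmetic implied here can be sketched in a few lines. The figures below (minutes saved, labor rate, device cost) are hypothetical placeholders for illustration, not GE numbers:

```python
# Back-of-the-envelope payback estimate for a hands-free AR deployment.
# All inputs are hypothetical; substitute your own baseline measurements.

def payback_days(minutes_saved_per_task: float,
                 tasks_per_day: float,
                 workers: int,
                 hourly_rate: float,
                 device_cost_per_worker: float) -> float:
    """Working days until the labor time saved pays for the hardware."""
    daily_saving = (minutes_saved_per_task / 60.0) * tasks_per_day * workers * hourly_rate
    total_hardware_cost = device_cost_per_worker * workers
    return total_hardware_cost / daily_saving

# Example: 5 minutes saved on each of 8 daily tasks, 10 workers,
# a $60/hour loaded labor rate, and $1,500 smart glasses per worker.
days = payback_days(5, 8, 10, 60.0, 1500.0)
print(f"Payback in roughly {days:.0f} working days")
```

Even with conservative inputs, a few minutes saved per task across a crew adds up quickly; the baseline metrics gathered before the pilot are what make this calculation credible.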

Usability is key though – probably the last missing piece in all of this – the tipping point. Our workforce is accustomed and trained to use traditional devices – phones, tablets, workstations, etc. Introducing smart glasses needs to be handled with care and with an end-user focus. The best AR device will be one that requires little to no learning curve.

It is important to run a working session at the very start. Grab a few different glasses if you can and let your end users put them on and listen to their feedback. You need to baseline your project charter with pre-AR performance metrics and then create your key performance indicators.

AREA: At a company like GE, you’ve got the size and the resources to be able to explore these things. What about smaller companies?

SAM MURLEY: That’s definitely true. I hope we see some progress and maturation in the AR ecosystem so everyone can benefit – small companies, large organizations, and consumers. The cost of hardware has been a challenge for everyone. Microsoft came out with the HoloLens and then announced a couple of months later that their holographic engine in the system was going to be opened to OEMs. You could have an OEM come in and say, maybe I don’t need everything that’s packed in the HoloLens, but I still want to use the spatial sensing. That OEM can potentially build out something more focused on a specific application for a fraction of the cost. That’s going to be a game changer because, while bigger companies can absorb high-risk operations and high-risk trials, small to medium size companies cannot and may take a big hit if it doesn’t work or rollout is slow.

Hopefully we’ll see some of the prices drop in 2017 so that the level of risk is reduced.

AREA: Can you tell us about any of the more futuristic applications of AR that you’re exploring at GE?

SAM MURLEY: The HoloLens demo at Minds + Machines mentioned earlier is a futuristic but not-that-far-off view of how humans will interact with data and machines. You can take it beyond that, into your household. Whether it’s something you wear or something like the Amazon Echo sitting on your counter, you will have the ability to talk to the things around you as if you were carrying on a conversation with another person. Beyond that, we can expect that things such as refrigerators, washing machines, and lights in our houses will be powered by artificial intelligence and have embedded holographic projection capabilities.

The whole concept around digital teleportation or Shared Reality is interesting. Meron Gribetz, Meta’s CEO, showcased this on stage during his 2016 TEDx – A Glimpse of the Future Through an Augmented Reality Headset. During the presentation, he made a 3D photorealistic call to his co-founder, Ray. Ray passed a digital 3D model of the human brain to Meron as if they were standing right next to each other even though they were physically located a thousand miles apart.

That’s pretty powerful. This type of digital teleportation has the potential to change the way people collaborate, communicate, and transfer knowledge. Imagine a worker out in the field who encounters a problem. What do they do today? They pick up their mobile device and call an expert or send an email. The digital communication stack of tomorrow won’t involve phones or 2D screens, but rather holographic calls in full spatial, photorealistic 3D.

This is really going to change a lot of, not only heavy industrial training or service applications, but also applications well beyond the enterprise over the next few decades.

AREA: One final question. People are turning to the AREA as a resource to learn about AR and to figure out what their next steps ought to be. Based on your experience at GE, do you have any advice for companies that are just embarking on this journey?

SAM MURLEY: Focus on controlled and small scale AR projects to start as pilot engagements. Really sharpen the pencil on your use case and pick one performance metric to measure and go after it. Tell the story, from the start to the end about how and what digital transformation can and will do when pitching to stakeholders and governing bodies.

My other recommendation is to leverage organizations like the AREA. The knowledge base within the AREA organization and the content that you push out on almost a daily basis is really good information. If I were just dipping my toe in the space, those are the types of things that I would be reading and would recommend other folks dig into as well. It’s a really great resource.

To sum up: stay focused with your first trial, determine what hardware is years away from real-world use and what is ready today, find early adopters willing to partner in your organization, measure effectiveness with insightful metrics and actionable analytics, reach out to industry experts for guidance, and don’t be afraid to fail.




The AR Market in 2017, Part 4: Enterprise Content is Not Ready for AR

Previous: Part 3: Augmented Reality Software is Here to Stay

 

As I discussed in a LinkedIn Pulse post about AR apps, we cannot expect users to run a different app for each real world target they want to use with AR or one monolithic AR application for everything in the physical world. It is unscalable (i.e., far too time-consuming and costly). It’s unclear precisely when, but I’m confident that we will, one day, rely on systems that make content ready for AR presentation as a natural result of digital design processes.

The procedures or tools for automatically converting documentation or any digital content into AR experiences for enterprise use cases are not available. Nor will they emerge in the next 12 to 18 months. To begin the journey, companies must develop a path that leads from current procedures that are completely separate from AR presentation to the ideal processes for continuous AR delivery.

Leaders need to collaborate with stakeholders to focus on areas where AR can make a difference quickly.

Boiling the Ocean

There are hundreds of AR use cases in every business. All AR project managers should maintain a catalog of possible use cases. Developing such a catalog begins with identifying the challenges facing a business. As simple as this sounds, revealing challenges increases exposure and reduces confidence in existing people and systems, so most of the data for this process is buried or burned before it escapes. Without data to support the size and type of challenges in a business unit, the AR advocate is shooting in the dark. The risk of not focusing on the best use cases and challenges is too high.

There need to be initiatives to help AR project managers and engineers focus on the problems most likely to be addressed with AR. Organizational change management would be a likely group to drive such initiatives once its managers are, themselves, trained to identify the challenges best suited for AR.

In 2017, I expect that some systems integration and IT consulting companies will begin to offer programs that take a methodical approach through the AR use case development process, as part of their services to clients.

Prioritization is Key

How do stakeholders in a company agree on the highest priority content to become AR experiences for their top use cases? It depends. On the one hand, there must be consistent monitoring of AR technology maturity; on the other, the use case requirements need to be carefully defined.

To choose the best use case, priorities need to be defined. If users perceive a strong need for AR, that should weigh heavily. If content for use in the AR experience is already available, then the costs and time required to get started will be lower.

A simple method of evaluating the requirements appears below. Each company needs to define their own priorities based on internal drivers and constraints.
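To make the idea concrete, a weighted-scoring approach along these lines could be sketched as follows. The criteria names and weights below are hypothetical examples, not the specific method shown in the figure:

```python
# Hypothetical weighted scoring of candidate AR use cases.
# Each company should substitute its own criteria and weights,
# reflecting internal drivers and constraints.

CRITERIA_WEIGHTS = {
    "user_need": 0.4,      # how strongly users perceive a need for AR
    "content_ready": 0.3,  # is usable content already available?
    "tech_maturity": 0.2,  # can today's AR technology support it?
    "safety_impact": 0.1,  # potential to reduce risk or injuries
}

def score(use_case: dict) -> float:
    """Weighted sum of 0-10 ratings for one candidate use case."""
    return sum(use_case[c] * w for c, w in CRITERIA_WEIGHTS.items())

candidates = [
    {"name": "field inspection", "user_need": 8, "content_ready": 6,
     "tech_maturity": 7, "safety_impact": 9},
    {"name": "warehouse picking", "user_need": 6, "content_ready": 9,
     "tech_maturity": 8, "safety_impact": 3},
]

# Rank the highest-priority use cases first.
for uc in sorted(candidates, key=score, reverse=True):
    print(f"{uc['name']}: {score(uc):.1f}")
```

The value of such a sketch is less in the arithmetic than in forcing stakeholders to agree, explicitly, on which criteria matter and by how much.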

A simple process for prioritizing AR use cases (Source: PEREY Research & Consulting).

Markers Won’t Stick

One of the current trends in enterprise AR is to use markers as the target for AR experiences. Using computer vision with markers shows the user where to point their device and focus their attention, consumes less power, and can be more robust in real-world conditions than 3D tracking technologies.

However, for many enterprise objects that are subject to sun, wind and water, markers are not a strategy that will work outside the laboratory. Those companies that plan to use AR with real-world targets that can’t have markers attached need to begin developing a new content type: trackables using natural features.

In 2017 more enterprise AR project managers will be asking for SDKs and tools to recognize and track the physical world without markers. For most, the technologies they will test will not meet their requirements. If well managed, the results of testing in 2017 will improve the SDKs as suggested in our post about AR software.

The AR Ecosystem and Technology are Immature

While the title of this post suggests that enterprise content is not yet in the formats, or associated with the metadata, needed to make AR experiences commonplace, the reverse statement is also true: not all the required AR components are ready for enterprise introduction.

Projects I’ve been involved with in 2016 have shown that while there are a few very solid technologies (e.g., tracking with markers on print), most components of AR solutions with which we are working are still very immature. The hardware for hands-free AR presentation is one area that’s changing very rapidly. The software for enterprise AR experience authoring is another. As more investments are made, improvements in the technology components will come, but let’s be clear: 2017 will not be the year when enterprise AR goes mainstream.

If you have seen the results of one or two good proofs of concept, there will be many people who need your help to be educated about AR. One of the important steps in that education process is to participate in the activities of the AREA and to share with others in your company or industry how AR could improve workplace performance.

When your team is ready to introduce AR, call in your change management group. You will need all the support you can get to bring the parts of this puzzle together in a successful AR introduction project!

Do you have some predictions about what 2017 will bring enterprise AR? Please share those with us in the comments to this post. 




The AR Market in 2017, Part 2: Shiny Objects Attract Attention

Previous: Part 1, Connecting the Dots

 

There’s a great deal of attention being paid to the new, wearable displays for Augmented Reality. Hardware permits us to merge the digital and physical worlds in unprecedented ways. Wearable hardware delivers AR experiences while the user is also able to use one or both hands to perform tasks. The tendency to pay attention to physical objects is not unique to AR industry watchers. It is the result of natural selection: genes that gave early humans the ability to detect and respond quickly to fast moving or bright and unusual objects helped our ancestors survive while others lacking those genes did not.

Although this post focuses on the hardware for Augmented Reality, I don’t recommend focusing exclusively on advancements in AR hardware when planning for success in 2017. The hardware is only valuable when combined with the software, content and services for AR in specific use cases.

Now, considering primarily AR hardware, there are important trends that we can’t ignore. This post only serves to highlight those that, in my opinion, are the most important at an industry-wide level and will noticeably change in 2017.

Chips accelerate changes

Modern Augmented Reality hardware benefits hugely from the continued reduction in size and cost in hardware components for mass market mobile computing platforms. We need to thank all those using smart phones and watches for this trend.

As semiconductor manufacturers gain experience and hard-code more dedicated vision-related computation into their silicon, the performance of complete AR display devices is improving. Intel RealSense – combined with the technology Intel recently acquired from Movidius, which should produce significant improvements in wearable display performance beyond 2017 – is an example of a chip-driven technology to monitor. Other offerings will likely follow from NVIDIA and Apple in 2017.

When available for production, the improvements in semiconductors for wearable AR devices will be measurable in terms of lower latency to recognize a user’s environment or a target object, less frequent loss of tracking, higher stability in the digital content that’s rendered, lower heat and longer battery life. All these are gradual improvements, difficult to quantify but noticeable to AR experts.

As a result of the optimization of key computationally intensive tasks (e.g., 3D capture, feature extraction, graphics rendering) in lower-cost hardware, the next 12 to 18 months will bring new models of AR display devices – not just a few models, or many models in small batches, but many models produced at scale.

These next-generation wearable display models with dedicated silicon will deliver at least a basic level of AR experience (delivery of text and simple recognition) for an entire work shift. Customers will begin to place orders for dozens and even, in a few cases, hundreds of units.

Optics become sharper

In addition to semiconductors, other components will be changing rapidly within the integrated wearable AR display. The next most important developments will be in the display optics. Signs of this key trend were already evident in 2016 – for example, when Epson announced the OLED optics designed for the Moverio BT-300.

It’s no secret that over the next few years, optics will shrink in size, drop in weight and demand less power. In 2017, the size and weight of fully functional systems based on improved optics for AR will decline. Expect smart glasses to weigh less than 80 grams. Shrinking the optics will make longer, continuous and comfortable use more likely.

Developers raised issues about color quality and fidelity when testing devices introduced in 2015 and 2016. Color distortion (such as an oil spill rainbow effect) varies depending on the type of optics and the real-world scene at which the user is looking (the oil spill pattern is particularly noticeable on large white surfaces). The 2017 models will offer “true” black and higher fidelity colors in a wider range of settings. Again, the experts will feel these improvements first and “translate” them to their customers.

Another key area of improvement will be the Field of View. Some manufacturers will announce optics with 50° diagonal (a few might even reach 80° diagonal) in 2017. When combined with advanced software and content, these changes in optics will be particularly important for making AR experiences appear more realistic.

Combined with new polychromatic materials in lenses, lower weight and stronger material in the supports, optics will be more tolerant of changes in environmental conditions, such as high illumination, and will fit in more ruggedized packages.

More options to choose from

Speaking of packaging, as of 2016 there are three form factors for AR displays:

  • monocular “assisted reality” hardware that clips onto other supports (frames) or can be worn over a user’s ear,
  • smart glasses that sit on the user’s nose bridge and ears, and
  • head-worn displays that use straps and pads and a combination of ears, noses and the user’s skull for support.

The first form factor does not offer an immersive experience and isn’t appropriate for all use cases, but assisted reality systems have other significant advantages (e.g., lower cost, longer battery life, lighter weight, easy to store) so they will remain popular in 2017 and beyond.

At the opposite end of the spectrum, the highly immersive experiences offered by head-worn devices will also be highly appealing for different reasons (e.g., depth sensing, accuracy of registrations, gesture-based interfaces).

We need to remember that the use cases for enterprise AR are very diverse, and the displays available to users can be equally diverse. The new wearable AR display device manufacturers entering the fray in 2017 will stay with the same three general form factors but offer more models.

In addition to diversity within these three form factors there will be extensions and accessories for existing products – for example, charging cradles, corrective lenses, high fidelity audio and materials specifically designed to tolerate adverse conditions in the workplace environment.

The results of this trend are likely to include:

  • those selling wearable displays will be challenged to clearly explain new features to their potential customers and translate these features into user benefits,
  • those integrating AR displays will be more selective about the models they support, becoming partners with only a few systems providers (usually leaning towards the bigger companies with brand recognition), and
  • buyers will need to spend more time explaining their requirements and aligning their needs with the solutions available in their budget range.

Wearable display product strategists will realize that with so many use cases, a single user could need to have multiple display models at their disposal. One possible consequence of this trend could be reduced emphasis on display systems that are dedicated to one user. We could see emergence of new ways for multiple users in one company or group to reserve and share display systems in order to perform specific tasks on schedule.

Rapid personalization, calibration and security will offer new opportunities to differentiate wearable AR display offerings in 2017.

Enterprise first

All of these different form factors and options are going to be challenging to sort out. Outside enterprise settings, consumers will not be exposed to the hardware diversity in 2017. They simply will not invest the time or the money.

Instead, companies offering new hardware, even the brands that have traditionally marketed to mass market audiences, will target their efforts toward enterprise and industrial users. Enterprises will increase their AR hardware budgets and develop controlled environments in which to compare AR displays before they can really make informed decisions at corporate levels. Third party services that perform rigorous product feature evaluations will be a new business opportunity.

While this post highlights the trends I feel are the most important when planning for success with AR hardware in 2017, there are certainly other trends on which companies could compete.

To learn more about other options and trends in wearable AR displays in 2016, download the EPRI Technology Innovation report about Smart Glasses for AR in which I offer more details.

What are the trends you think are most important in AR hardware and why do you think they will have a significant impact in 2017?

 

Next: AR software matures and moves toward standardization