
When a Developer Needs to Author AR Experiences, Part 1

There’s an established process for creating a new Web page. If it’s not already available, you begin by defining and developing the content. Then, there’s the formatting. Often there’s some scripting to provide interactivity. When the “authoring” is done, a page is published.

It’s not all that different for AR. Once an Augmented Reality project’s use case is clear, the experiences come about through an authoring process that resembles that of preparing and publishing content for the Web.


Figure 1. An AR authoring system combines trackables (created using features of the real world and a tracking library) with digital content that is encoded into presentation data and then assigned interactive functions (e.g., see more details, show relevant info, move and freeze in position, hide/close). The AR authoring system uses databases to store the scene elements – trackables, presentation data and interactions. (Source: PEREY Research & Consulting)
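
To make the figure concrete, here is a minimal sketch in TypeScript of how those scene elements might be modeled; the type and field names are illustrative assumptions, not the schema of any particular authoring system.

```typescript
// Hypothetical shapes for the scene elements an AR authoring system stores;
// every name below is illustrative, not a specific product's schema.
interface Trackable {
  id: string;
  kind: "marker" | "object3D" | "slam" | "geolocation"; // real-world feature the tracking library recognizes
  referenceData: string; // e.g., a marker image or a 3D feature map
}

interface Presentation {
  id: string;
  trackableId: string;              // which trackable anchors this content
  asset: string;                    // the encoded digital content (model, video, text panel, ...)
  offset: [number, number, number]; // position relative to the trackable
}

interface Interaction {
  presentationId: string;
  trigger: "tap" | "gaze" | "voice";
  action: "showDetails" | "showRelevantInfo" | "moveAndFreeze" | "hide";
}

// The authoring system stores these records in its databases; the AR
// execution engine assembles them into a runnable scene at runtime.
interface ARScene {
  trackables: Trackable[];
  presentations: Presentation[];
  interactions: Interaction[];
}
```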

Today, Content Management Systems for the Web support the steps for page development with grace. Systems like WordPress and Drupal are so easy to use and commonplace that we hardly notice their existence.

In contrast, there are many AR authoring systems from which a developer can choose and none are as mature as CMS for the Web. The choice of tool and approach depends on the project requirements, skills of the developer and the resources available.

Define the AR Project Requirements

Before choosing an AR authoring system, the requirements must be clear. An AR experience design process should generate a storyboard and, from the storyboard, the following factors are defined (a sketch of how they might be recorded appears after the list):

  • User settings (indoor, outdoor, noise levels, etc.)
  • Need for a user management system to provide experience personalization or tracking
  • Need for live communication with any remote experts during the experience
  • Type of recognition and tracking required (marker, 3D, SLAM, etc.)
  • Need to access device GPS and compass for geospatial context
  • Preferred display device (smartphone, tablet, smart glasses or another type of HMD)
  • Human interaction modalities (hands-free, touch, speech, gaze)
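
As a rough illustration of how these factors might be recorded, the following TypeScript sketch captures them as a simple configuration object. Every field name is a hypothetical example distilled from the list above, not something required by any particular authoring tool.

```typescript
// Hypothetical project-requirements record distilled from the storyboard;
// field names are examples, not a schema required by any authoring tool.
interface ARProjectRequirements {
  userSetting: "indoor" | "outdoor" | "mixed";
  noiseLevel: "low" | "moderate" | "high";
  needsUserManagement: boolean;    // personalization or usage tracking
  needsRemoteExpertCall: boolean;  // live communication with remote experts
  trackingType: "marker" | "object3D" | "slam";
  needsGeospatialContext: boolean; // device GPS and compass
  displayDevice: "smartphone" | "tablet" | "smartGlasses" | "otherHMD";
  interactionModalities: Array<"handsFree" | "touch" | "speech" | "gaze">;
  supplementaryFiles?: string[];   // e.g., videos, PDFs handed to the developer
  updateCadence?: string;          // expected frequency of post-launch updates
}

// Example drawn from a hypothetical indoor inspection scenario.
const requirements: ARProjectRequirements = {
  userSetting: "indoor",
  noiseLevel: "high",
  needsUserManagement: true,
  needsRemoteExpertCall: false,
  trackingType: "object3D",
  needsGeospatialContext: false,
  displayDevice: "smartGlasses",
  interactionModalities: ["handsFree", "speech"],
};
```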

In addition to the above variables that can be deduced from the storyboard, there could be other factors to consider. For example, if the target device is connected by an IoT protocol or if there are any supplementary files (e.g., videos, PDFs, etc.), then these need to be provided to the developer as early as possible. The project manager should also specify the frequency and types of updates that may be required after the initial AR experience is introduced to users.

When these project requirements and parameters are defined, the developer can choose the tools best suited for the AR experience authoring.

Want to know more about your choices of authoring platforms? There’s more in the next post.




When a Developer Needs to Author AR Experiences, Part 2

This post is a continuation of the topic introduced in another post on the AREA site.

Choose a Development Environment

Someday, the choice of an AR development environment will be as easy as choosing a CMS for the Web or an engineering software package for generating 3D models. Today, it’s a lot more complicated for AR developers.

Most of the apps that have the ability to present AR experiences are created using a game development environment, such as Unity 3D. When the developer publishes an iOS, Windows 10 or Android app in Unity 3D, it is usually ready to load and will run using only local components (i.e., it contains the MAR Scene, all the media assets and the AR Execution Engine).

Although there’s a substantial learning curve with Unity, the developer community and the systems to support the community are very well developed. And, once using Unity, the developer is not limited to creating only those apps with AR features. The cost of the product for professional use is not insignificant but many are able to justify the investment.

An alternative to using a game development environment and AR plugin is to choose a purpose-built AR authoring platform. This is appropriate if the project has requirements that can’t be met with Unity 3D.

Though they are not widely known, there are over 25 software engineering platforms designed specifically for authoring AR experiences.


Table 1. Commercially Available AR Authoring Software Publishers and Solutions (Source: PEREY Research & Consulting).

The table above lists the platforms I identified in early 2016 as part of a research project. Please contact me directly if you would like to obtain more information about the study and the most current list of solutions.

Many of the AR authoring systems are very intuitive (featuring drag-and-drop actions and widgets presented through a Web-based interface); however, most remain unproven and their respective developer communities are relatively small.

Some developers of AR experiences won’t have to learn an entirely new platform because a few engineering software publishers have extended their platforms designed for other purposes to include authoring AR experiences as part of their larger workflow.

Or Choose a Programming Language

Finally, developers can write an AR execution engine and the components of the AR experience into an app “from scratch” in the programming language of their choice.

To optimize AR experiences for the best possible performance on a specific chip set or AR display, some developers write in compiled languages such as C++ that produce code the AR display device can run natively.

Many developers already using JavaScript are able to leverage their skills to access valuable resources such as WebGL, but creating an application in JavaScript alone is slow going and, depending on the platform, the result could fail to perform at the levels users expect.

To reduce some of the effort and build upon the work of other developers, Argon.js and AWE.js are Open Source JavaScript frameworks for adding Augmented Reality content to Web applications.
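
As a rough sketch of the kind of work these frameworks (or a hand-rolled execution engine) take on, the fragment below renders an overlay with three.js and drives the virtual camera from a tracking result. The `tracker` module and its `getCameraPose()` method are hypothetical stand-ins for whatever computer-vision library a project actually uses; they are not part of Argon.js or AWE.js.

```typescript
import * as THREE from "three";
import { tracker } from "./tracker"; // hypothetical: wraps the computer-vision library

// Transparent WebGL canvas layered over the live camera feed.
const renderer = new THREE.WebGLRenderer({ alpha: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.01, 100);
camera.matrixAutoUpdate = false; // the pose comes from the tracker, not from user controls

// A simple augmentation anchored where the trackable is detected.
const annotation = new THREE.Mesh(
  new THREE.BoxGeometry(0.1, 0.1, 0.1),
  new THREE.MeshBasicMaterial({ color: 0x00ff00 })
);
scene.add(annotation);

function frame(): void {
  const pose = tracker.getCameraPose(); // hypothetical: 16-element column-major matrix, or null
  if (pose) {
    camera.matrix.fromArray(pose);
    camera.matrixWorldNeedsUpdate = true;
    renderer.render(scene, camera);
  }
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```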

Results Will Depend on the Developer’s Training and Experience

In my experience, it’s difficult to draw a line between the selection of an AR authoring tool or approach and the quality or richness of the final AR application. The sophistication and quality of the AR experience in an app is a function of both the tools chosen and the skills of the engineers. When those behind the scenes (a) ensure the digital content is well suited to delivery in AR mode; (b) choose the components that match requirements; and (c) design the interactions well, a project will have the best possible impacts.

As with most things, the more experience the developer has with the components that the project requires, the better the outcomes will be. So, while the developer has the responsibility for choosing the best authoring tool, it is the AR project manager’s responsibility to choose the developer carefully.




New EPRI Report Offers Insights for Wearable AR Display Customers

Innovation in wearable technology continues to accelerate. Smart watch vendors are making so many announcements there are portals dedicated to helping customers sort through the details. There is also a portal to help customers compare the features of wearable displays for AR.

And new wearable segments are being defined. For example, Snap recently introduced its $130 Spectacles.

Is this all good?

Thinly veiled behind the shiny new products is a vicious cycle.

The continual stream of announcements confirms for readers of this blog that the wearable AR display segment is still immature. This means that those customers with limited budgets seeking to select the best hands-free AR display for their projects in 2016 are likely to be disappointed when an update or new model appears, making the model they just brought in-house out of date. Risk-averse organizations may put their resources in another product category.

On the other side of this conceptual coin, the companies developing components and building integrated solutions for wearable AR must continue to invest heavily in new platforms. These investments are producing results — but without clear customer requirements, the “sweet spot” for which the products should aim is elusive. And when customers lack clear requirements, differentiating the latest offerings while avoiding hype is a continual challenge.

Breaking the cycle with specific requirements

When customers are able to prioritize their needs and provide specific product requirements and budgets, there’s hope of breaking this cycle.

The Electric Power Research Institute (EPRI) and PEREY Research & Consulting, both AREA Founding Sponsor members, have collaborated on the preparation of a new report entitled Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays.

Targeting the buyers of wearable technology for use when performing AR-assisted tasks in utilities (and by extension, in other enterprise and industrial environments), the report seeks to demystify the key product features that can become differentiators for wearable AR solutions.

Based on these differentiators, the first multi-feature wearable AR display classification system emerges.


Source: Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays. EPRI, Palo Alto, CA: 2016. 3002009258.

The report also discusses challenges to widespread wearable AR display adoption in technology, user experience, financial, and regulatory/policy domains.

Descriptions of a few “lighthouse” projects in utilities, logistics, manufacturing, and field service provide readers valuable insight into how early adopters are making the best of what is currently available.

This report is available for download at no charge as part of the EPRI Program on Technology Innovation.

If you have comments or feedback on the report, please do not hesitate to address them to the authors, Christine Perey and John Simmins.




How Optical Character Recognition Makes Augmented Reality Work Better

Today, companies in many industries seek to develop AR and VR applications for their needs, with the range of existing Augmented Reality solutions extending from gimmicky marketing applications to B2B software. Helping production companies train their workers on the job by augmenting service steps onto broken machines is one such solution.

Augmented Reality could help designers or architects see a product while it is still in development. It could facilitate the marketing and sales process, because customers can “try on” a product from a digital catalog. Or it could assist warehouse systems so that users get support in the picking and sorting process.

The list of opportunities is endless and new use cases are constantly arising. The whole point of using AR is to make processes easier and faster. While at first, Augmented Reality and devices like smart glasses seemed way too futuristic, new use cases make them increasingly suitable for everyday life in the workplace.

Recognizing Objects and Characters

Augmented Reality is based on a vital capability: object recognition. For a human being, recognizing a multitude of different objects is not a challenge; even objects partially obstructed from view can still be identified. For machines and devices, however, this remains difficult, and for Augmented Reality it is crucial.

A smartphone or smart glasses can’t display augmented overlays without first recognizing the object. For correct augmentation, the device has to be aware of its surroundings and adapt its display in real time to each situation, even as the camera’s viewing angle changes. Augmented Reality applications use object detection and recognition to determine what information should be added to the display. They also use object tracking technologies to follow an object’s movements continuously rather than re-detecting it in every frame, so the object remains in the frame of reference even if the device is moved around.
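
A minimal sketch of that detect-then-track pattern follows; `detectObject`, `trackObject` and `drawOverlay` are hypothetical helpers standing in for whatever vision library and renderer an application actually uses.

```typescript
// Hypothetical helpers: detectObject() scans a whole frame (expensive),
// trackObject() follows a previously detected object between frames (cheap).
import { detectObject, trackObject, drawOverlay, type Pose } from "./vision"; // hypothetical module

let trackedPose: Pose | null = null;

function onCameraFrame(frame: ImageData): void {
  if (trackedPose === null) {
    // No object locked on yet: run full detection and recognition.
    trackedPose = detectObject(frame);
  } else {
    // Object already known: just update its pose; fall back to detection if tracking is lost.
    trackedPose = trackObject(frame, trackedPose) ?? detectObject(frame);
  }
  if (trackedPose !== null) {
    // Keep the augmentation registered to the object even as the camera moves.
    drawOverlay(trackedPose);
  }
}
```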

Character recognition is also crucial to a device’s understanding of the environment: depending on the use case, the device may not only need to recognize objects but also to “read” them. This provides an even better discernment of the types of information that are important to process.

[Image: OCR example (Anyline)]

Optical Character Recognition

Optical Character Recognition (OCR) deals with the problem of recognizing optically processed characters, such as those in the featured image above. Both handwritten and printed characters may be recognized and converted into computer readable text. Any kind of serial number or code consisting of numbers and letters can be transformed into digital output. Put in a very simplified way, the image taken will be preprocessed and the characters extracted and recognized. Many current applications, especially in the field of automation and manufacturing, use this technology.

What OCR doesn’t take into account is the actual nature of the object being scanned; it simply “looks” at the text that should be converted. Putting together Augmented Reality and OCR therefore provides new opportunities: not only is the object itself recognized, but so is the text printed on that object. This boosts the amount of information about the environment gathered by the device, and increases the decision-support capabilities offered to users.
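
A short sketch of how the two results might be combined: `recognizeObject` and `readText` below are hypothetical wrappers around an object-recognition library and an OCR engine, not a specific product's API.

```typescript
import { recognizeObject, readText } from "./recognition"; // hypothetical wrappers

interface AnnotatedObject {
  label: string;         // what the object is (from object recognition)
  serialNumber?: string; // what is printed on it (from OCR)
}

// Combine object recognition with OCR so the overlay can show both
// what the device is looking at and what is written on it.
async function annotate(frame: ImageData): Promise<AnnotatedObject | null> {
  const object = await recognizeObject(frame);
  if (!object) return null;

  // Run OCR only on the image region the recognized object occupies.
  const text = await readText(frame, object.boundingBox);
  return { label: object.label, serialNumber: text ?? undefined };
}
```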

The Potential of OCR

Data import still requires high processing power and camera resolution, and it is expensive. Nevertheless, OCR offers a viable alternative to voice recognition or input via typing.

Using OCR with smart glasses offers improvements for different kinds of business processes. Imagine a warehouse worker who needs both hands free to do his job efficiently. Using smart glasses to overlay virtual information on his environment can make him more efficient, and the ability to automatically scan codes printed on objects just by glancing at them keeps his hands free for other tasks.

Another example would be the automation of meter reading. When a device identifies a meter hanging on a wall, along with its shape and size, and then automatically scans its values, a greater number of meters can be read per day. This use case could be valuable to energy providers.

When you look around, you will realize how many numbers, letters and codes need to be either written down or typed into a system every single day. Such processes, which can be very error prone, can become much less painful using OCR.




A Partnership Model for Augmented Reality Enterprise Deployments

Due to its potential to radically change user engagement, Augmented Reality has received considerable and growing attention in recent months. Pokémon Go certainly has helped and, in turn, generated many expectations for the advancement of AR-based solutions. In fact, the game has provided the industry with a long overdue injection of mass appeal and, as a result, significant investment from (and among) tech giants around the world.

From corner shops to large utility providers, the spike in popularity of this technology has everyone buzzing about how it could improve their business. The flexibility of implementation, from improving processes to stand-out marketing solutions, has also altered the expectations of these prospective clients as they seek personalized enterprise-level AR-based solutions. Consequently, the time has come for vendors and suppliers to consider a new model when it comes to managing customer expectations.

When deploying Augmented Reality solutions in an enterprise context, it is essential to build strong partnerships with your customers, and in many cases to take on the role of a trusted advisor. This becomes more important through the stages of delivering a project, from defining a proof of concept (POC) through implementing bleeding-edge solutions with operational teams and, ultimately, with the end users of the technology.

While the primary value of Augmented Reality systems is to allow the contextual overlay of information to enable better decision making, the visual data overlay, the various data sources, and the devices and location sensors that trigger it all come into play, converging in the form of a complex mesh. Vendors must note that partnerships are key to solving the pieces of this puzzle.

Service Delivery—Creating Value from the Complex Mesh

This complex mesh is what ultimately garners value as the assimilation of these technologies creates new and innovative social and business ecosystems and associated processes. When addressing enterprise adaptation, one must be aware of the following questions:

  • How best can value be driven into workable solutions in an enterprise?
  • How well does it integrate with existing legacy systems?
  • Would new skills be required to introduce and manage the change?
  • Does the solution deliver increased productivity or efficiencies, i.e., better utilization of resources or better decision making through information?
  • Does the solution enable new revenue models for the organization that are consistent with the existing product and service offerings?
  • In turn, how does this solution affect the profitability of the organization?
  • Last, but not least, is the business rationale clear for the implementation of such a solution?

The move towards customer-centric systems means that your customer (or your customer’s customer) is at the center of all decision making. This may be a shift from their existing system practices, meaning it’s even more critical that the chosen change management process be well aligned to the client’s corporate culture.

The Client’s Point of View—Questions to Ask When Going Beyond the POC

Some of the questions that vendors need to consider when it comes to implementing the solution beyond the POC are:

  • What is changing?
  • Why are we making the change?
  • Who will be impacted by the change?
  • How will they react to the change?
  • What can we do to proactively identify and mitigate their resistance to the change?
  • Will the solution introduce new business or revenue models?

Working as one with your customers through innovations to operations is a key factor for success. The complex mesh of AR, VR, IoT and Big Data technologies makes this even more critical as enterprises see an integration of their digital content, systems and processes.

It is essential to take a partnership mindset—where the Augmented Reality innovation solution is built both for and with the customer, and through a customer-implemented change management process—to quickly and easily create ROI as well as tangible, actionable outcomes.




Two Months In: An Update From the Executive Director

Further to my last post about the AWE ’16 conference, I want to share some thoughts and areas for future focus from my first two months as the Executive Director of the AREA.

It’s exciting to be involved in such a vibrant and dynamic ecosystem of AR providers, customers and research institutions. I’m amazed at the sheer breadth of member organizations, their offerings, skills and achievements, and their desire to work with the AREA to help achieve our shared mission of enabling greater operational efficiencies through the smooth introduction and widespread adoption of interoperable AR-enabled enterprise systems.

Success and Challenges

Through my initial conversations with the members, I’ve learned of many success stories and also the challenges of working in a relatively young and rapidly changing industry.

For example, AREA members talk about the prototypes they’re delivering with the support of software, hardware and service providers. However, I would like to see more examples of wider rollouts, beyond the prototype stage, which will encourage more buying organizations to investigate AR and understand its massive potential.

The AWE conference in Santa Clara in June, and the subsequent AREA Members Meeting, added emphasis to my initial thoughts. The AR in Enterprise track of AWE, sponsored by the AREA, highlighted a number of organizations that are already using AR to create real benefits, ranging from enabling real-time compliance and better use of resources to applying the most relevant data and reducing time, errors and costs. It was great to see that many member companies understand the benefit of working together to enable the whole AR ecosystem to become successful.

Carrying on the Momentum

My continued focus over the coming weeks will be to carry on the great momentum that has been started. I’m briefing more organizations from all over the world about the benefits of becoming an AREA member. I’ll continue the focus on developing and curating thought leadership content, including case studies, frameworks and use cases, and delivering it via the AREA website, webinars and social media. We’re enhancing our value proposition through our development of research committees that increase the capabilities of the industry.

This is an exciting time for the enterprise AR industry and the AREA; I’m very interested in any feedback or comments you may have so please contact me at [email protected]. I look forward to hearing from you and working with our growing membership to meet our goals of realizing the potential of Augmented Reality in the workplace.




New Executive Director Reports on AWE ’16 and Members Meeting

As the incoming executive director of the AR for Enterprise Alliance, I was very excited to attend my first Augmented World Expo and to meet some of the 34 members of the AREA.

AWE is one of the largest and best-attended events worldwide about Augmented Reality, and typically hosts thousands of attendees and hundreds of companies. This year’s event was no exception and did not disappoint. I was pleased to meet a high number of innovative AR companies from the AREA provider segment and attend demos of their groundbreaking solutions. It’s clear to me that AR in enterprise is here to stay and the AREA occupies a strategic position in growing the entire ecosystem to the benefit of everyone.

Benefits of AR in Enterprise

The event gave me the opportunity to speak with a range of attendees from many companies and markets. It was exciting to be asked so many different and interesting questions on many topics, and one conclusion that came up time and again was the importance of AR in enterprise. The potential benefits and savings of AR are getting the attention of the C-suite rather than just the innovation and technology teams. The trajectory towards a real reduction in time, costs and errors is critical for companies as they look to streamline their business and increase the return on investment.

Enterprise AR Track at AWE ‘16

The focus on enterprise was supported by an impressive number of customers and providers presenting their experiences during the Enterprise AR track—sponsored by the AREA. I learned a lot from all the presentations but it was also instructive to listen to the members of the AREA’s customer segment. They were insightful and provided a unique perspective on the benefits and issues they experienced when implementing AR solutions within their companies. It’s clear that there are many lessons to learn and the AREA is well placed to help the AR ecosystem make effective and informed decisions based on shared knowledge and experience.

The AREA at AWE

At AWE we experienced a constant stream of people visiting our stand and asking questions. Many expressed appreciation of the AREA’s work and benefits achieved for the ecosystem. A number of them even mentioned regularly visiting the AREA website when trying to find information about AR, and that the AREA’s content was insightful and informative.

For those who hadn’t heard of us, it was useful to discuss our mission, benefits, membership options and growth. Much interest was expressed and I hope new members will join based on these discussions.

AWE was my first real experience meeting the enterprise AR community and it was a very useful and insightful one. I look forward to following up with the many attendees I met and to helping drive the AREA’s development and its role in supporting this nascent ecosystem.

AREA Members Meeting

After AWE, we held an AREA Members Meeting in Palo Alto, California, on June 3. It was an honor to chair my first such meeting. AREA in-person meetings occur around three times a year and they’re a great opportunity to meet with members, discuss progress made, define future strategic plans to further develop the ecosystem and have some fun.

Thanks to Atheer for hosting the event at the beautiful Palo Alto Art Center.

The morning agenda items included:

  • Progress updates from the various AREA committees
  • Upcoming events in which the AREA can support its members

The afternoon included various brainstorming sessions around the content and the way the AREA positions itself to potential new members.

The day was full of insightful and interesting discussions, and from a personal perspective it was great to interact with many leaders and understand how we can work together as an alliance to support and grow the ecosystem and provide thought leadership to possible new customers and providers of AR.

If you are interested in joining the AREA, please complete this form.




Augmented Reality in Future Manufacturing

In a previous post we described how, by developing a new framework that leverages Augmented Reality, IoT, social networking and advances in hardware, the members of the European SatisFactory consortium seek to increase productivity in manufacturing.

After a period of design and development, SatisFactory solutions and technologies will be validated at three pilot manufacturing facilities.

Each of the three pilot sites corresponds to a different industry:

  • Chemical Processes: The Chemical Process Engineering Research Institute (CPERI) is a non-profit research and technological development organization based in Thessaloniki, Greece.
  • Industrial Automation: Comau S.p.A is a global supplier of industrial automation systems and services based in Turin, Italy.
  • Energy: Systems Sunlight S.A. headquartered in Athens, Greece, is a manufacturer of energy storage and power systems for industrial and consumer applications.


Installation and validation of the SatisFactory framework at each of the sites is an iterative process, with Augmented Reality devices (from technology partner GlassUp) and other technologies to be implemented in the coming months.

While this article focuses on plans for Augmented Reality at each of the three pilot sites, there are many more technological aspects bringing together innovations for streamlining efficiency in the factory. A publicly available project report sheds light on the use cases described below.

Continuous Production in Chemical Processing

CPERI is an institute that performs research and provides services to industries related to chemical engineering, energy and materials. CPERI is an ideal site for testing and improving continuous processes. In contrast to batch manufacturing of goods, continuous processes for chemical, pharmaceutical, food processing and other types of plants impose different challenges. As the facilities use equipment that must run continuously, any downtime can be costly. When shutdowns occur, incomplete products must often be disposed of, and the corresponding infrastructure (e.g., pipes, vessels, etc.) thoroughly cleansed of remaining materials.

Startup and shutdown procedures must be validated and documented to prevent unwanted impacts. Typically such procedures require several hours to complete. CPERI is pioneering Augmented Reality in a use case for plant startup procedures, in which an operator using an AR device works through the many sequential steps that normally take several hours to perform.

The AR-enabled system incorporates a human-machine interface to display real time feedback to the operator from the Supervisory Control and Data Acquisition (SCADA) automation system and other process control systems. Without AR, such tasks must be performed by an experienced operator referencing a manual, while the new AR solution proposes a workflow with reduced attention switching and can be performed by someone with less prior experience.
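
As a rough illustration of that kind of human-machine interface, and not a description of the actual SatisFactory implementation, the sketch below polls a few process values and pushes them into the operator's AR overlay. The `scadaClient` and `arDisplay` objects, and the tag names, are hypothetical.

```typescript
// Hypothetical interfaces: scadaClient reads tag values from the plant's
// SCADA/process-control systems, arDisplay updates the operator's overlay.
import { scadaClient, arDisplay } from "./plant"; // hypothetical module

const startupTags = ["reactor.temperature", "feed.pressure", "pump.status"]; // illustrative tag names

async function refreshOperatorOverlay(step: number): Promise<void> {
  // Fetch live readings relevant to the current startup step.
  const readings = await Promise.all(
    startupTags.map(async (tag) => ({ tag, value: await scadaClient.read(tag) }))
  );

  // Show the step instruction and the live values together, so the operator
  // does not have to switch attention to a separate console or manual.
  arDisplay.showStep(step);
  arDisplay.showReadings(readings);
}

let currentStep = 1; // advanced elsewhere as the operator confirms each step
setInterval(() => void refreshOperatorOverlay(currentStep), 5000); // poll every few seconds
```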


Additional use cases for AR are being designed and discussed with other SatisFactory pilot sites. CPERI, along with its partners, is also authoring a set of standard operating procedures for enabling and using Augmented Reality to improve productivity and compliance in continuous manufacturing and other chemical industrial processes.

Discrete Manufacturing in Heavy Industry

Part of the Fiat Group, Comau provides industrial automation systems for manufacturing in the automotive, aerospace, steel and petrochemical industries. The company specializes in:

  • Body welding equipment for a variety of vehicle types
  • Manufacturing systems for engine powertrain components
  • Robotic systems for a range of manufacturing use cases

Comau is developing an Augmented Reality-enhanced system to help a user assemble a robot wrist, a process that normally requires four hours and over 290 individual steps to complete. The proposed AR solution, to be provided on a device that’s either fixed or wearable (e.g., smart glasses), uses animations that appear over the real world to guide the technician through steps to complete the task in less time and with fewer errors than existing methods.

As with CPERI, the AR solution will eventually be integrated into the site’s IT infrastructure, including warehouse management, enterprise resource planning and manufacturing execution systems to provide real time data and support.

Remote operator assistance and technician training are other use cases being evaluated. Operators will be able to request live help and collaboration from remote experts for tasks, as well as choose training scenarios from online repositories for on-the-spot examples and guidance. Comau is also evaluating how AR can be used to record processes performed for verification and future training purposes.

Augmented Reality in Power System Manufacturing

Systems Sunlight’s integrated energy products span a range of industries from vehicles and consumer electronics to utilities and defense. The company operates a manufacturing facility where assembly lines produce batteries, transforming raw materials into ready-to-use products. Battery production requires continuous monitoring of variables such as cell temperature, which is measured with a thermal camera.

Augmented Reality is being evaluated for producing “motive batteries” that power machinery. In a two-hour assembly process of six major steps, an operator places battery cells in a metal box in a sequence that depends on the battery type. The cells are then connected by means of a battery string and later checked to determine whether they need additional electrolyte filling; they are then sealed with regular or water-filling plugs. In the next two stages, the terminal plugs and labels are installed and the batteries are checked for quality. Finally, the batteries are transported to a warehouse for dispatch to customers.

Systems Sunlight will implement Augmented Reality guidance on a fixed or wearable device so the technician can work hands free. Besides the expected benefits that animated steps overlaid on the field of view can provide, the company anticipates the technology will increase the overall motivation of technicians and operators. They plan to measure this increase with surveys after the conclusion of pilot testing.

If the pilot is successful, the company will explore use of Augmented Reality on further production lines and create a training system combining AR and gamification, along with quizzes to reinforce knowledge.

A Template for Manufacturing Efficiency

The lessons learned in these three pilot project sites will allow the SatisFactory solutions to be fine-tuned and demonstrate their value for the manufacturing sector.

In 2017, they will be made available to the European manufacturing industry to improve efficiency through novel interaction and collaboration technologies. The solutions also aim to improve the quality of life and overall working experience of factory operators, and mark a major step forward in European manufacturing competitiveness.




Augmented Reality and Gartner’s Hype Cycle

Industry watcher and analyst firm Gartner has been studying emerging technologies for over 20 years. The company has become widely recognized for publishing its annual Hype Cycle, the chart that captures Gartner analysts’ assessments of the maturity of emerging information and communication technologies.

Interpreting the Hype Cycle

As stated on Gartner’s website, the chart is designed for the firm’s clients: “Clients use Hype Cycles to get educated about the promise of an emerging technology within the context of their industry and individual appetite for risk.”

The sidebar on the same page goes on to suggest that the Hype Cycle:

  • Separates hype from the real drivers of a technology’s commercial promise
  • Reduces the risk of your technology investment decisions
  • Compares your understanding of a technology’s business value with the objectivity of experienced IT analysts

In my slides introducing the March AREA webinar on the topic of forecasting the growth of enterprise Augmented Reality, I provided 15 Hype Cycle figures of the years between 2000 and 2015 showing where Gartner placed Augmented Reality. These figures were compiled by Dr. Robin (Rab) Scott of the AMRC, an AREA member, and used in the webinar with Dr. Scott’s permission.

The figures show how this influential firm has followed Augmented Reality for over a decade. I pointed out in my remarks that readers should not interpret the position of any technology on the Gartner curve as highly definitive.

Looking at Gartner’s positioning of Augmented Reality over the years, and anticipating the 2016 Hype Cycle to be published, I am recommending in this post that Gartner consider treating Augmented Reality and its associated technologies as separate nodes on the cycle. By giving more attention to AR’s enabling technologies Gartner will help its clients better achieve their goals and better serve our industry.

Augmented Reality Isn’t One Technology

My primary concern about Augmented Reality appearing as a dot on the Gartner 2015 Hype Cycle is that it suggests that Augmented Reality is one technology. I don’t think this was ever the case in the past and it certainly isn’t today.

In its press release about last year’s Hype Cycle, the company stated that more than 2,000 technologies were studied. It would be helpful if the firm pointed out which of the hundreds of AR-enabling technologies it considered in positioning the “whole AR” on its cycle.

In my opinion, Gartner needs to begin explaining how technologies are treated differently. For instance, some technologies on the cycle are “general” (representing many enablers at different stages of evolution), and others are not. In 2015, for example, brain-computer interfaces were in the first phase, while gesture control technologies, another relatively precise technology label, were on the Slope of Enlightenment. Another example is natural language question answering (a very specific technology in my framework, but probably also composed of many enablers), which was positioned on the line between the Peak of Inflated Expectations and the Trough of Disillusionment. And, by the way, when will the questions asked be answered correctly all the time?

On the other hand, Augmented Reality is not the only example of the ambiguity and confusion caused when a general category is represented as a dot on the cycle. For example, wearables and Internet of Things are other labels (represented as dots) on the 2015 Hype Cycle that could benefit from being represented by an array of enabling technologies (or are they enablers?).

In my opinion, the company would better serve its clients and readers by tracing the progress of some of the important enablers or components for AR and other technologically-powered systems, such as autonomous driving vehicles. A few components that I have recently studied for a technology maturity assessment, and that I believe should be added to the Hype Cycle, include:

  • Depth-sensing technologies
  • Computer vision-based 3D target object recognition and tracking
  • Optics for use in wearable displays
  • Gaze detection and tracking technologies

Enlightenment Is a Process

Enlightenment about the benefits of a technology does not happen by simply turning on a light. The processes by which technologies move from barely understood to mainstream use differ widely.

Twenty-five years ago I began reading and writing about the future of multimedia information. Multimedia was not on the Gartner curve in 2000 because it had already reached something approaching maturity; now it’s an archaic term. I have been an outspoken proponent of the adoption of mobile technologies for over 12 years. Mobile technology was not a dot on the Gartner curve in 2004, but its enablers such as MMS and 802.11g certainly were.

Would it not be better to leave Augmented Reality off of the 2016 and future Hype Cycle figures and, rather, to point the spotlight on the state of dozens of key enablers?

Do you feel the Gartner Hype Cycle correctly portrays the state of Augmented Reality? What would you like to see added or removed from the Gartner Hype Cycle in 2016?




Augmented Reality Boosts Efficiency in Logistics

Fulfilling customer orders at a warehouse, or order picking, can be costly. A well-known study on warehouse management cited the typical costs of order picking as being nearly 20% of all logistics costs and up to 55% of the total cost of warehousing. The use of technology to streamline order picking offers an important opportunity to reduce cost.  

While great strides have been made in automating warehouse processes, customer expectations also continue to rise. For example, Amazon offers same-day delivery in many US metropolitan areas and this is becoming a standard elsewhere. Increasing fulfillment and delivery speeds may result in increased errors that are not caught prior to shipment.


Augmented Reality can significantly increase order picking efficiency. An AR-enabled device can display task information in the warehouse employee’s field of view. Logistics companies such as DHL, TNT Innight and others have been collaborating with providers of software and hardware systems to test the use of Augmented Reality in their warehouses.

A recent study by Maastricht University conducted in partnership with Realtime Solutions, Evolar and Flos brings to light the impact smart glasses can have on order fulfillment. The research sought to:

  • Confirm prior research that smart glasses improve efficiency compared with paper-based approaches
  • Study usability, required physical and mental effort and potential empowerment effects of the technology in a real world environment
  • Assess the impact of an individual’s technology readiness on previously introduced performance and well-being measures

Design of the Study

Sixty-five business students at Maastricht University participated in a three-day study conducted in a controlled environment. Participants were instructed to pick individual items from storage bins and place them into the appropriate customer bins:

  • One group picked items from 28 bins using item IDs printed on paper and then matched those to IDs on customer bins. The study assessed order picking efficiency by measuring the ability and speed of participants to place the items in the correct customer bins.
  • The other group used AR-enabled smart glasses to scan barcodes in item bins and follow the displayed instructions to place them in the customer bins.

The researchers evaluated metrics such as:

  • Performance measures of error rates and picking times per bin
  • Health and psychological measures such as heart rate variability, cognitive load and psychological empowerment
  • Usability measures such as perceived ease of use
  • “Technology readiness” on a scale measuring personal characteristics such as optimism for, and insecurity with new technologies


Faster with Smart Glasses

The researchers found that smart glasses using code scanners permitted users to work 45% faster than those using paper-based checklists, while reducing the error rate to 1% (smart glasses users made one-tenth as many picking errors as the control group).

The smart glasses group also expended significantly less mental effort to find the items, while showing the same heart rate variability as the group using paper.

Overall, the use of smart glasses empowered users and engendered positive attitudes toward their work and the technology: in comparison with the group following checklists, they felt the successful completion of tasks was more attributable to their own behavior. This corroborates other studies of efficiency gains, such as this one, and demonstrates the level of impact Augmented Reality can have in the workplace.

You can read about more Augmented Reality research from Maastricht University and other university partners at this portal.
