
Factories of the Future

In a blog post last month, Giuseppe Scavo explored the Industrial Internet of Things (IIoT) and the growing trend of connected devices in factories. Smart devices and sensors can bring down production and maintenance costs while providing data for visualization in Augmented Reality devices.

Connecting AR and IIoT requires applied research. In this article we’ll look at the EU-sponsored SatisFactory project, which focuses on improving employee satisfaction in factories through the introduction of new technologies.

Innovation in Industrial Production

In 2014, the European Union launched Horizon 2020, a seven-year research and innovation program (ending in 2020) dedicated to enhancing European competitiveness. Horizon 2020 is a partnership between public and private entities and receives nearly $90 billion in public funds. As the program’s website describes, Horizon 2020 aims to drive smart, sustainable and inclusive growth and jobs.


Within this push is the Factories of the Future initiative, a roadmap providing a vision and plan for adding new manufacturing technologies to the European production infrastructure. Objectives of the Factories of the Future initiative include:

  • Increasing manufacturing competitiveness, sustainability and automation
  • Promoting energy-efficient processes, attractive workplaces, best practices and entrepreneurship
  • Supporting EU industrial policies and goals

To meet these objectives, ten partner companies and institutions from five European countries founded the SatisFactory consortium in 2015. SatisFactory is a three-year project aimed at developing and deploying technologies such as Augmented Reality, wearables and ubiquitous computing (e.g., AR-enabled smart glasses), along with customized social communication and gamification platforms, for context-aware control and adaptation of manufacturing processes and facilities.

The solutions SatisFactory develops aim to deliver higher productivity and flexibility, on-the-job training for workers, incident management, proactive maintenance and, above all, a balance between workers’ performance and satisfaction. The solutions are currently being validated at three pilot sites (one small- and two large-scale industrial facilities) before release for use at industrial facilities throughout Europe.


Industry 4.0

SatisFactory’s vision of Industry 4.0 includes a framework with four sets of technologies:

  • Smart sensors and data analytics for collecting and processing multi-modal data. The results of this real time data aggregation include diagnosing and predicting production issues, understanding how workplace occupancy evolves (e.g., balancing the number of workers per shift) and enhancing context-aware control of production facilities (e.g., semantically enhanced knowledge about intra-factory production facilities, re-adaptation of production facilities, etc.).
  • Decision support systems for production line efficiency and worker safety and well-being. These systems can take many forms, ranging from Augmented Reality for human visualization of data to systems for incident detection and radio frequency localization.
  • Tools for collaboration and knowledge sharing, including knowledge bases and social collaboration platforms. Augmented Reality for training by remote instructors will provide flexibility and increase engagement. Collaborative tools, combined with learning systems, also allow employees to exchange information and experiences.
  • Augmented Reality and gamification tools for increasing engagement. SatisFactory will use tools previously developed by consortium partners and, in pilot projects, explore the use of smart glasses and human-machine interfaces. Interaction techniques and ubiquitous interfaces are also being explored.


Pilot Sites

SatisFactory solutions are being tested at the pilot sites of three European companies:

  • The Chemical Process Engineering Research Institute (CPERI) is a non-profit research and technological development organization based in Thessaloniki, Greece. The institute provides a test site for continuous manufacturing processes.
  • Comau S.p.A is a global supplier of industrial automation systems and services and is based in Turin, Italy. The company provides manufacturing systems for the automotive, aerospace, steel and petrochemical industries.
  • Systems Sunlight S.A. is headquartered in Athens, Greece, and produces energy storage and power systems for industrial, advanced technology and consumer applications.

In the next post, we’ll look at activities at the sites and how the project is applying Augmented Reality at the different production facilities.




Unity Gives Augmented Reality the Nod during Vision Summit 2016

If you saw the headlines coming out of Unity’s Vision Summit, you probably noticed a trend: Virtual Reality was the star of Vision Summit 2016. Valve’s Gabe Newell gave everyone an HTC Vive Pre. The Oculus Rift will come with a four-month Unity license. Unity is getting native support for Google Cardboard. At the summit, the expo floor had long lines for the “big three” head-mounted displays (HMDs): Sony’s PlayStation VR, Oculus Rift and HTC Vive.

It’s not that Augmented Reality was absent from what was billed as “The Definitive Event for Innovators in VR/AR,” but rather that it made up only a small share of the tools on display. This is the year of Virtual Reality, with the big three VR providers launching major products in March (Oculus), April (HTC) and sometime in the fall (Sony). The event was hosted by Unity, which caters almost exclusively to game developers needing comprehensive cross-platform development tools, and gaming in VR is expected to be huge. Virtual Reality was even the focus of the keynote, but astute observers might have noticed something else.

Best Days of Augmented Reality Are Ahead

Unity’s own keynote referenced a report by Digi-Capital which predicts that the AR industry will generate negligible revenue in 2016 but will surpass VR in 2019; by 2020, AR is predicted to be worth three times as much as VR. Take this with a grain of salt; Unity is in the business of selling licenses for its cross-platform game development toolset, so it is incentivized to predict massive growth. But even heavily discounted, these numbers point to massive promise in a new field.

Most of this growth may be in gaming, but the AR presence on the expo floor leaned toward enterprise use. Epson was demonstrating its Moverio line of smart glasses, which has been around since 2012. Vuzix had its M100 available to try, and was eager to tout its upcoming AR3000 smartglasses. In its booth, Vuforia demonstrated a Mixed Reality application on Gear VR that allowed the viewer to disassemble a motorcycle and view each part individually, which could be handy for vehicle technicians.

Of course, you can learn the most from hands-on experience with enterprise AR, which is exactly what NASA presented. They showed how they replaced complicated written procedures with clear, contextual, relevant instructions delivered in AR using HoloLens. They also had a suite of visualization tools for collaborating on equipment design.

I presented the results of a year-long collaboration between Float and the CTTSO to develop an AR application designed to assist users in operational environments. We discussed the ins and outs of developing a “true AR” experience from the ground up, in addition to all of the lessons we learned doing image processing, using Project Tango, and more. At the end, I demonstrated the finished app, with integrated face recognition, text recognition, and navigation assistance supported either on an Epson Moverio or the Osterhout R-6.

An Increasing Focus

Vision Summit 2016 may have been largely focused on VR, but that’s not a reflection of a lack of interest in AR. In our own research, we estimated that AR was lagging behind VR in technology readiness level by a few years. This was confirmed at the Vision Summit, but there’s still plenty of AR to get excited about. Valve even stated that they’d let developers access the external camera on the HTC Vive “in the long run” for Augmented and Mixed Reality applications. Expect next year’s Vision Summit to have a much larger focus on AR as this industry begins to truly take shape.

Did you attend Vision Summit 2016? What did you observe? Do you plan to attend the Unity event in 2017?




Augmented Reality: the Human Interface with the Industrial Internet of Things

Are you noticing an emerging trend in manufacturing? After years of hype about Industry 4.0 and digital manufacturing, companies with industrial facilities are beginning to install Internet-connected sensors organized in networks of connected devices, also known as the Industrial Internet of Things (IIoT), in growing numbers.

Industrial IoT Is Not a Fad

According to a recent report published by Verizon, the number of IoT connections in the manufacturing sector rose 204% from 2013 to 2014. These connect data from industrial machines to services that provide alerts and instructions on consoles in control rooms to reduce plant downtime. The same Verizon study provides many examples of IIoT benefits in other industries as well: companies that move merchandise are reducing fuel consumption using data captured, transmitted and analyzed in near real time. Connected “smart” streetlights require less planned maintenance when their sensors send an alert for needed repairs. Other examples include smart meters in homes, which reduce the cost of operations for utilities. An analysis from the World Economic Forum describes other near-term advantages of globally introducing IIoT such as operational cost reduction, increasing worker efficiency and data monetization. These are only the tip of the iceberg of benefits.

Many predict that as a result of IIoT adoption, the global industrial landscape is shifting towards a more resource-efficient, sustainable production economy. Part of the equation includes combining IIoT with other technologies. Companies that deploy IIoT must also build and maintain advanced systems to manage and mine Big Data.

Big Data

To act upon and even predict factory-related events, companies need to mine Big Data and continually detect patterns in large-scale data sets with Deep Learning technologies. Combined with vast processing power “for hire” in the cloud, these technologies are putting cost-saving processes like predictive maintenance and dynamic fault correction within reach of many more companies. With predictive technologies, managers can optimize responses and adapt their organizations more quickly to address incidents. A study from General Electric in collaboration with Accenture highlights that, for this reason, two managers out of three are already planning to implement Big Data mining as a follow-up to IIoT implementation.
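As a rough illustration of the kind of pattern detection involved, here is a minimal sketch of a predictive-maintenance check, assuming Python with pandas and scikit-learn and hypothetical CSV files of machine sensor readings (file and column names are illustrative, not taken from any project described here):

```python
# Minimal predictive-maintenance sketch: flag anomalous sensor readings.
# File names and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Historical readings collected from a machine's IIoT sensors.
history = pd.read_csv("press_line_3_history.csv")
features = history[["vibration_rms", "bearing_temp_c"]]

# Fit an unsupervised anomaly detector on normal operating data.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(features)

# Score the latest batch of readings; -1 marks likely anomalies that a
# maintenance planner could investigate before a failure occurs.
latest = pd.read_csv("press_line_3_latest.csv")
latest["anomaly"] = model.predict(latest[["vibration_rms", "bearing_temp_c"]])
print(latest[latest["anomaly"] == -1])
```

In practice the models and features would be far richer, but the pattern is the same: learn what “normal” looks like from historical data, then watch live data for deviations.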

Data and Objects Also Need Human Interfaces

Having post-processing analytics and predictive technologies is valuable to those in control centers, but what happens when a technician is dispatched to the field or to the factory floor to service a connected machine? Augmented Reality provides the human workforce with an interface between the data from these sensors and the real world.

The real time visualization (or “consumption”) of sensor data is an important component of the larger equation. Sensor tracking protocols are not new; in fact, SCADA can be traced back to the ’70s. But when combined with Augmented Reality, new options become available. As industrial equipment becomes more and more complex, workers constantly face long procedures that involve monitoring and decision-making. When assisted by Augmented Reality during this process, a worker who has contextual guidance and all the up-to-date information required for successful decision-making at hand can perform tasks more quickly and with fewer errors.

How It Works

Let’s examine a compelling use case for AR and IIoT: maintenance of Internet-connected machines. A worker servicing a machine that has faulted needs access to real time readings of the internal variables of all the machine’s components in order to diagnose the problem and choose the right procedure to apply. Today, the worker needs to phone the central control room to access the data or, in some cases, retrieve the readings from a nearby terminal and then return to the machine. With an AR-enabled device, the worker can simply point the device at the machine, visualize the real time internal readings overlaid on top of the respective components, and decide on the best procedure (as shown in the ARise event presentation about data integration). The same device can then provide guidance for the procedure, presenting the worker with the contextual data needed at every step.
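To make the data flow concrete, here is a minimal sketch of the retrieval side of this scenario, assuming a hypothetical IIoT gateway with a REST endpoint (the URL, field names and machine identifier are all illustrative):

```python
# Fetch live readings for one machine and group them by component so that an
# AR client could overlay each value on the corresponding part.
# The gateway URL and JSON fields are hypothetical.
import requests

IIOT_API = "https://iiot.example.com/api/machines"

def readings_by_component(machine_id: str) -> dict:
    """Return {component_id: [reading, ...]} for a single machine."""
    resp = requests.get(f"{IIOT_API}/{machine_id}/readings", timeout=5)
    resp.raise_for_status()
    grouped = {}
    # Each reading might look like:
    # {"component": "pump_1", "name": "pressure", "value": 4.2, "unit": "bar"}
    for reading in resp.json():
        grouped.setdefault(reading["component"], []).append(reading)
    return grouped

# An AR application would call this once the device recognizes the machine,
# then anchor each component's readings to that component in the camera view.
for component, values in readings_by_component("extruder-07").items():
    print(component, values)
```

The AR-specific work of recognizing the machine and registering overlays to its components is handled by the AR toolkit; the sketch only shows how the live sensor values could be organized for that overlay.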

Another use case that can benefit from the combination of AR and IoT is job documentation. Through interaction with real time sensor data, workers can document the status of machines at each step, feeding data directly into ERP systems without having to fill out long paper-based forms as part of their service documentation. Procedures can be documented with greater precision, reducing the possibility of human error during data gathering.
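A minimal sketch of what that step-by-step documentation could look like follows, again with a hypothetical ERP endpoint and record structure (none of the names below come from a real system):

```python
# Capture a sensor snapshot as each procedure step is completed and push the
# record to an ERP endpoint. URL and field names are hypothetical.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import requests

ERP_API = "https://erp.example.com/api/service-records"

@dataclass
class StepRecord:
    work_order: str
    step: int
    machine_id: str
    sensor_snapshot: dict   # readings at the moment the step was confirmed
    completed_at: str

def document_step(work_order: str, step: int, machine_id: str, snapshot: dict) -> None:
    record = StepRecord(work_order, step, machine_id, snapshot,
                        datetime.now(timezone.utc).isoformat())
    resp = requests.post(ERP_API, json=asdict(record), timeout=5)
    resp.raise_for_status()

# Called by the AR client each time the worker confirms a step is done.
document_step("WO-1042", 3, "extruder-07", {"pressure_bar": 4.2, "temp_c": 71.5})
```

Because the snapshot is captured automatically at the moment of confirmation, the record reflects what the machine actually reported rather than what the worker remembered.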

Big Data and Augmented Reality

When deploying IoT in industrial contexts, entrepreneurs should take into account the two complementary sources of value in the data this technology produces. The offline processing capabilities of Big Data mining algorithms provide a powerful prediction and analysis tool. In parallel, implementing Augmented Reality allows those in the field to reap the benefits of real time, onsite contextual data.

Some AREA members are already able to demonstrate the potential of combining sensors, Big Data and Augmented Reality. Have you heard of projects that tap IIoT in new and interesting ways with Augmented Reality? Share with us in the comments of this post.




Efficiency Climbs Where Augmented Reality Meets Building Information Management

At Talent Swarm we envisage that by using pre-existing platforms and standards for technical communication, our customers will reach new and higher levels of efficiency. Our vision relies on video calling to make highly qualified remote experts available on demand, and the data from Building Information Management (BIM) systems will enhance those live video communications using Augmented Reality.

Converging Worlds

There have been significant improvements in video calling and data sharing platforms and protocols since their introduction two decades ago. The technologies have expanded in terms of features and ability to support large groups simultaneously. Using H.264 and custom extensions, a platform or “communal space” permits people to interact seamlessly with remote presence tools.  The technology for these real time, parallel digital and physical worlds is already commonplace in online video gaming. 

But there are many differences between what gamers do at their consoles and enterprise employees do on job sites. As our professional workforce increasingly uses high-performance mobile devices and networks, these differences will decline. Protocols and platforms will connect a global, professionally certified talent pool to collaborate with their peers on-site. 

Enterprises also have the ability to log communications and activities in the physical world in a completely accurate, parallel digital world.

Growth with Lower Risk

We believe that introducing next generation Collaborative Work Environments (CWE) will empower managers in many large industries, such as engineering, construction, aviation and defense. They will begin tapping the significant infrastructure now available to address the needs of technical personnel, as well as scientific research and e-commerce challenges. When companies in these industries put the latest technologies to work for their projects, risks will decline.

Most IT groups in large-scale engineering and construction companies now have an exhaustive register of 3D models that describe every part of a project. These are developed individually and used from initial design through construction, but they have yet to be put to their full use. One reason is that they are costly to produce, and companies are not able to re-use models created by third parties. There are no codes or systems that help a company’s IT department determine the origin of a model or whether a proposed model is accurate. The risks of relying on uncertified models, then learning that there is a shortcoming or that the model is not available when needed, are too great.

Another barrier to our vision is that risk-averse industries and enterprises are slow in evaluating and adopting new hardware. Meanwhile, hardware evolves rapidly. In recent years, video conferencing has matured in parallel with faster processors and runs on many mobile platforms. Specialized glasses (such as ODG’s R-7s, Atheer Air and, soon, Microsoft’s HoloLens), helmets (DAQRI’s Smart Helmet), real time point-cloud scanners (such as those provided by Leica or Dot Products) or even tablets and cell phones can capture the physical world to generate “virtual environments.”

With enterprise-ready versions of these tools coupled with existing standards adopted for use in specific industries, the digital and physical worlds can be linked, with data flowing bi-directionally in real time. For example, a control room operator can see a local operator as an avatar in the digital world. By viewing the video streaming from a camera mounted on the local operator’s glasses, the remote operator can provide remote guidance in real time. 

Standards are Important Building Blocks

At Talent Swarm, we have undertaken a detailed analysis of the standards in the construction industry and explored how to leverage and extend these standards to build a large-scale, cloud-based repository for building design, construction and operation.

We’ve concluded that Building Information Management (BIM) standards are reaching a level of maturity that makes them well suited for developing a parallel digital world as we suggest. Such a repository of 3D models of standard parts and components will permit an industry, and eventually many disparate industries, to reduce significant barriers to efficiency. Engineers will not need to spend days or weeks developing the models they need to describe a buttress or other standard components.
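As a small illustration of how such a repository could be browsed programmatically, here is a sketch using the open-source ifcopenshell library against a hypothetical IFC file of standard components (the file name and the element types chosen are illustrative):

```python
# Browse standard components in a BIM (IFC) model instead of remodeling them.
# Assumes the open-source ifcopenshell library; the file name is hypothetical.
import ifcopenshell

model = ifcopenshell.open("standard_components.ifc")

# List reusable structural elements by IFC type, e.g. to populate a shared
# parts catalog that engineers can pull from rather than rebuild.
for ifc_type in ("IfcBeam", "IfcColumn", "IfcWall"):
    for element in model.by_type(ifc_type):
        print(ifc_type, element.GlobalId, element.Name)
```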

Partnerships are Essential

The project we have in mind is large and we are looking for qualified partners in the engineering, construction and oil and gas industries, and with government agencies, to begin developing initial repositories of 3D models of the physical world.

By structuring these repositories during the design phase, and maintaining and adding to this information in real time from on-site cameras, we will be able to refine and prove CWE concepts and get closer to delivering on the promise.

Gradually, throughout the assembly and construction phases we will build a database that tracks the real world from cradle to grave. Analyzing these databases of objects and traces of physical world changes with Big Data tools will render improvement and maintenance insights previously impossible to extract from disjointed, incomplete records. We believe that such a collaborative project will pave the way towards self-repairing, sentient systems.

We look forward to hearing from those who are interested in testing the concepts in this post and collaborating towards the development of unprecedented Collaborative Work Environments.  




New Augmented Reality Case Studies Suggest Productivity Improvement

In the future, Augmented Reality could play a role in a variety of production or assembly processes. On the one hand it can provide support for those working on individual, custom products made in mom-and-pop shops or by specialized welders on location. At the other extreme, Augmented Reality can also play a role in high-volume, low-mix manufacturing in factories full of automated and specialized machines.

In highly automated production facilities, workers are few and far between. Their role is to anticipate and respond to the needs of machines. These machines usually have dozens or even hundreds of sensors continually capturing information about the machine’s activities in the real world.

In today’s factories, most sensor data is sent directly to a control room. Human operators receive alerts or make decisions based on raw readings or on algorithms that analyze the sensor observations, and then go to the machine to perform planned and unplanned procedures on the equipment. The operator travels between the control room and the production machinery to determine status as procedures are implemented. The data may change while the operator is in transit, and the operator may make mental errors, forgetting or transposing data when transcribing observations in the control room or at the machine.

New case studies recently released by AREA member DAQRI provide a glimpse into the future.

Kazakhstan Seamless Pipe Steel Operators See More

A team of DAQRI solution architects visited the Kazakhstan Seamless Pipe Steel (KSP Steel) factory in Pavlodar, Kazakhstan and studied the problems facing machinery operators up close. They then developed and demonstrated an application for Hot Rolling Mill Line optimization using the DAQRI Smart Helmet (DSH).

Those using the DSH on the shop floor could see live machine performance data in real time. The factory supervisor remarked that this technology has the potential to “decentralize” the control room and reduce the time it takes workers to respond to machinery performance data.

The results of the demonstration suggest that using Augmented Reality in the manner implemented by this project could reduce downtime by 50% and increase machine operator productivity by 40%.

More information about this project and a video of the DSH in use are available on the DAQRI web site.

HyperLoop Welders Receive Support on the Spot

A project involving the DSH on the HyperLoop, a transportation system invented by Elon Musk and being prototyped in 2016, demonstrates another use case that has a great deal of potential to offer productivity gains.

In a proof of concept with HyperLoop engineers and the DSH Remote Expert application, experts in a central “command” center view live video coming from remote robotic welders. The supervising engineer in the Los Angeles office sees construction progress and provides audio and telestration guidance while a welder performs a very specific spot weld. The description of the project and a video of the DSH in use are also available on DAQRI’s web site.

Tip of the Iceberg

These case studies reveal the potential for dramatic productivity improvements when workers are equipped with Augmented Reality-assisted systems such as the DSH.

Other enterprise customers are testing the use of Augmented Reality for manufacturing and production of a wide range of products. Stay tuned! New case studies with details about the potential for significant customer benefit will soon be coming to light.

If you have a case study that you would like to share, provide a link to it in the comments of this post or contact the AREA’s editorial team. We will be happy to support the preparation and publication of your case studies and testimonials.





Enterprise Augmented Reality Makes a Splash at CES 2016 – Part 2

In our previous post about the event we focused on the exhibits and demonstrations of enterprise Augmented Reality found on the CES exhibition floor. But to cast CES as only an exhibition experience is shortsighted. Discussions and demonstrations of enterprise Augmented Reality during the four-day event were not limited to the vast and crowded exhibition halls.

Beyond the Exhibition Halls

Some companies, including Atheer, an AREA founding sponsor member, were demonstrating their new hands-free display technology and development kit in private suites. Such environments are more conducive to advancing business discussions with potential partners and customers.

Some of those prospective new partners and customers joined AREA members and guests on the evening of January 6. Over 40 enterprise AR providers and customers gathered for casual networking during which members provided insights and shared their views on enterprise Augmented Reality trends. The international crowd included representatives from Portugal, Spain, Canada, Taiwan, Hong Kong, Germany, France and, of course, the United States.  

Augmented Reality in the CES Conference

In addition to featuring the DAQRI Smart Helmet during the Intel keynote, this year CES also featured enterprise AR during a panel discussion in which I participated on behalf of the AREA. While I shared the stage with Christopher Stapleton of Simyosis, Neil Trevett of NVIDIA and Ralph Osterhout of Osterhout Design Group (ODG), Mashable’s Tech Editor Pete Pachal moderated the “What’s Next for Augmented Reality?” session.

We began by debating the age-old question of whether it is important for customers to understand the differences between Virtual and Augmented Reality. The confusion between these two concepts lingers and increases the risk of customers thinking that Augmented Reality is “just a game” or a gimmick.  In the end, we agreed that when there is an opportunity, a first-hand demonstration quickly clarifies the differences between AR and VR. 

We explored the wide range of use cases for Augmented Reality and shared opinions about which industries or use cases would be likely to break out in 2016. Panelists also explored if and when interoperability might come and the role of emotions as part of delivering meaningful value to users.


The Future

Despite its name, CES isn’t just a consumer electronics or technology show. In my opinion, it’s currently the world’s most important event when it comes to seeing and trying on the latest (and future) enterprise Augmented Reality hardware. And, even if the environment does not lend itself to realistic demonstrations, touching new hardware is extremely important when making buying decisions. This is the appeal that motivates customers and providers to make plans to attend CES, then drives them to crawl the halls looking for those high value partners.


“The quality of the discussions at CES is far more mature than in past years,” Jay Kim, Chief Strategy Officer of APX Labs, remarked to me. “This year we’re spending very little time explaining the concept of Augmented Reality or the use cases for it, and having more meaningful discussions with new partners and customers.”

While establishing new contacts at CES is a compelling benefit, converting those contacts into customers and generating new revenue streams or highly successful enterprise AR projects remains a year-long (or longer) process.

Furthermore, CES logistics are an issue. Getting to Las Vegas for the event is fraught with problems due to congested air traffic. Accommodations are expensive, and it’s painfully difficult to navigate the large exhibition halls. There’s a lot of waiting around in long lines. While waiting or walking around, it’s common to feel that there are people we should be meeting but whom, due to lack of time or sheer congestion, we miss.

What do you think? Do so many people really need to endure this annual punishment to see the future of enterprise Augmented Reality hardware first-hand?  Did you attend CES 2016 and have observations you can share with us?




Meeting and Managing Enterprise Augmented Reality Risks

As with other new technology introductions, enterprise Augmented Reality projects are fraught with familiar and new risks. To move forward with enterprise AR projects on reasonable budgets and schedules, it’s important to acknowledge the existence of risks and to find ways to creatively manage them.

Risk 101

Managers should take a short course on risk as part of their preparation for taking on Augmented Reality introduction projects. One of the first lessons in risk management offers different ways to classify and prioritize the risk types or sources. 

In my view, there are four classes of risk for enterprise Augmented Reality:

  • Technological maturity risks
  • Financial risks
  • User acceptance risks 
  • Regulatory/corporate policy risks

Four classes of AR project risk

While this is good theory, in practice most risks are interconnected. Sometimes addressing one risk increases another.  Furthermore, the type of project will impact the number and type of risks within each class.

Bring in All the Project Stakeholders

Just as with other aspects of Augmented Reality, management of risks is a multi-disciplinary process. It’s important to have representatives of all the stakeholders in the discussion of risk and to keep them engaged as the project advances to monitor and adjust the risk assessment.

For example, a representative from your corporate finance group will have different perspectives than the representative of the labor union. The IT department will keep an eye on security and the safety managers will be looking for a different set of risk sources.

If possible, establish a shared risk tracking system for the project and maintain a schedule of regular reviews.  Some risks are reduced or eliminated quickly while others could escalate and derail a very promising project.

Where Rubber Meets the Road

In the end, all stakeholders realize that, as with any new technology introduction, there’s not a silver bullet that will remove all known and imagined risks. The best the project manager can aim for is reasonable management of risk.

Watch the AREA webinar archive to learn more about this topic and about recommendations for project managers. AREA members will also be discussing how they have approached or are addressing these risks in real world settings.




Enterprise Augmented Reality Makes a Splash at CES 2016 – Part 1

This year, enterprise Augmented Reality was highly visible at CES, the tech industry’s annual gathering in Las Vegas.

Enterprise Augmented Reality’s first “prime time moment” of 2016 came when Brian Mullins, CEO of DAQRI, an AREA founding sponsor member, was invited by Intel CEO Brian Krzanich onto the CES main stage. During the event’s pre-show keynote address, thousands of media and analysts and tens of thousands of attendees watched as an assistant wearing the shiny white helmet examined a maze of pipes. As Mullins described the DAQRI Smart Helmet’s features and benefits, the helmet’s video output, including the pipes with readings visible in Augmented Reality, appeared on the stage’s mega screens.


Rising Numbers

Although it lasted less than five minutes of the nearly two-hour keynote, DAQRI’s Smart Helmet demonstration caught the attention of major media outlets and produced dozens of interviews, posts and articles. And, with this coverage, enterprise attendees at CES and the tens of thousands of professionals who have watched (or will watch) the segment since the event can more easily understand that enterprise Augmented Reality has the potential to improve workplace performance and reduce risk.

While many new customers are only beginning to understand its potential, many of those who have done pilots and are now seeking to go to scale also attended CES 2016. Visitors from hundreds of large enterprises such as Caterpillar, Phillips 66, Pratt & Whitney, CNH Industrial and Northrop Grumman, and dozens of government agencies and smart cities, including Brussels, Amsterdam and New York, were prowling around the booths of AR technology vendors, listening carefully and asking probing questions about volume pricing and service options.

If the DAQRI Smart Helmet had been worn around the 2.47 million square feet of CES exhibition floor, it would have helped its user to find DAQRI’s demonstration which was featured in a corner of the Intel booth alongside the AR-assisted sand table provided by Design Mill, also an AREA founding sponsor member.

It could also have helped customers find other AREA members, including Bosch, Huawei and APX Labs, which had demonstrations in both the Sony and Vuzix booths. The helmet might have led its user to nearly 80 other booths where enterprise Augmented Reality-enabling technologies or systems were featured. While small in comparison with the roughly 3,800 CES exhibitors in total, 2016 brought out nearly double the number of relevant exhibitors and demonstrations we found in 2015. In 2016, most AR exhibitors at CES were showing or using transparent hands-free display technology or components with which such products are manufactured.

Its Own Marketplace

CES helps visitors focus on product segments by creating zones it refers to as “marketplaces.” At the center of the large Virtual Reality and Gaming Marketplace was a perpetual line of attendees wrapped around the giant black Oculus booth patiently waiting for their turn to sit in a theater while wearing a VR display for 10 minutes.


Next to it was the first and significantly smaller CES Augmented Reality Marketplace, where exhibitors included Marxent, Matter & Form, Occipital, Lumus Optical, VanGogh Imaging, InfinityAR, ODG and Sony Electronics.

ODG had the Augmented Reality Marketplace’s largest footprint dedicated to enterprise AR demonstrations. Inside a closely guarded cage (a miniature “marketplace” within the CES Augmented Reality Marketplace) were members of the ODG Reticle Partner Program showing their solutions to increase workplace safety, improve productivity and streamline complex workflows. Demonstrations by Optech4D, Vital Enterprises, Augmenta, and ScopeAR featured utilities, oil and gas, aerospace, logistics and automotive industry use cases, while other parts of the booth allowed visitors to discover ODG’s R-7 and to try on the next generation device sporting a 1080p resolution and 50-degree field of view display. ODG suggests that users of the next generation smart glasses will also have the ability to control opacity, offering both optical see-through Augmented Reality and Virtual Reality experiences with the same device.


Also within the CES Augmented Reality Marketplace:

  • Sony had a small booth in which several partners were demonstrating enterprise solutions using SmartEyeglass and a station showing the new Rochester Optical lenses.
  • InfinityAR demonstrated an Augmented Reality-assisted office using its 3D tracking based on the company’s stereoscopic camera technology combined with an InvenSense IMU.
  • Nearby, Lumus was showing its latest optics for integration into smartglasses and its new developer kit. The company announced that it has entered into a partnership with InfinityAR and SUNCORPORATION, a Japanese IT and entertainment provider, to use its optics in AceReal, a new product targeting enterprise markets.

Surrounding the Augmented Reality Marketplace were many vendors showing the latest mobile 3D scanning systems that capture the real world when authoring AR experiences. Partnerships with these providers should make such authoring faster. For example, VanGogh Imaging announced that it has integrated its advanced tracking technology with 3D capture technology provided by Orbbec3D, a newcomer to this product segment.

But the CES Augmented Reality Marketplace didn’t meet all the relevant exhibitors’ budgets or requirements, so finding other examples of enabling technology and enterprise Augmented Reality experiences involved careful research and route planning. 

Beyond the Marketplace

Augmented Reality was in many of the automotive industry booths as a feature of new “safe driving” technology packages and a future component of automated or computer-assisted driving. Although these consumer-facing solutions are quickly coming to market, the same technologies could also be made available for helping workers to navigate, operate or service their trucks, forklifts or other types of industrial vehicles.


Elsewhere, dozens of enterprise AR use cases were illustrated. In the Dassault Systèmes booth, Augmented and Mixed Reality were shown as part of a creative data-driven workflow. In Vuzix’s CES booth, partners using the M100 were illustrating use cases in service, maintenance and logistics. XOEye Technologies conducted regular remote service calls with representatives at Lee Company, its customer based in Nashville, TN. The printed brochure and industrial design of the M300 were available at one of the stands but, unfortunately, working models for demonstration purposes were not. The latest Vuzix VR and video streaming products for consumers occupied the other half of the booth and received considerable attention.

The latest products for enterprise Augmented Reality were also being shown in Sands Expo by:

  • Optinvent, a French provider of optics and fully integrated eyewear
  • Brilliant Service, a Japanese company, introducing mirama, eyewear targeting industrial users
  • Sharing the booth with Brilliant Service was Telepathy, which showed its new “Walker” product targeting Augmented Reality gamers
  • Ryosho, a Japanese company which is distributing the InfoLinker, manufactured by Westunities, in Japan and internationally (from its office in San Jose, CA)

In Westgate hall, the spokespeople in the expansive AltoTech booth explained that the company is preparing to launch its next generation of the Cool Glass product which, despite several differentiators, closely resembles Google Glass. AltoTech plans to release the next generation Cool Glass product internationally from its new offices in the US as well as in China later this year.

To read my observations about other CES 2016 features and highlights, please read the second post on this topic.

Were there CES announcements that you want everyone to know about? What did you find most valuable at CES 2016? Share your thoughts in comments below.




Technical Communicators are Keen to Learn about Augmented Reality

Technical communicators are a technology-savvy audience so they’ve read and heard about Augmented Reality. But most people in this role have yet to acquire knowledge about how it works and hands-on experience with the tools. The 2015 edition of tcworld, the annual conference of the European Association for Technical Communication (tekom), offered a unique opportunity for attendees to satisfy their curiosity and begin filling the gaps in their understanding of this new technology, but left them hungry for more.

Held in Stuttgart, Germany, from November 10-12, tcworld drew over 4,200 delegates from 48 countries and featured ten sessions on Augmented Reality topics. To view abstracts of the Augmented Reality track sessions, select “Augmented Reality” from the “Topic Area” drop-down menu on the English-language program here, and on the German program here. The AR sessions delivered in German, the primary business language of the attendees, were so full that people stood along the walls; the English-language sessions, while not filled to capacity, were also well attended.


Technical Communication Toolsets and AR Demos

In addition to the conference sessions, tcworld has an extensive exhibition floor. Over 200 vendors offering software and services filled two halls, and a half dozen technical communication associations, several emphasizing localization, had a zone of their own. Some exhibitors, such as Bosch, Cognitas, Semcon, Kothes! and others, demonstrated that their tools can produce and manage Augmented Reality experiences, and showed the added value of AR as an alternative to or extension of traditional user manuals and service documentation. Although many questions were asked about AR delivery hardware platforms such as smart glasses, the tcworld vendors’ AR demonstrations exclusively used tablets. The Oculus Rift demonstration in the Canon/Cognitas booth attracted curious visitors, many of whom were unclear on the distinctions between VR and AR.

Some vendors shared that, compared with last year’s offerings, they are able in 2015 to demonstrate improved object tracking and more complete AR-enabled systems as a result of dedicating more internal resources to research and development. It’s clear that both those in the aisles and the booths consider AR a promising new technical information visualization and delivery method.

Enterprise Augmented Reality Use Cases

In contrast to some other events focusing on AR topics, the tcworld AR session speakers consistently featured real world use cases for the technology. Their high-quality presentations highlighted practical benefits of introducing AR, such as how employees can do their jobs faster and better with the technology, rather than using AR as a marketing gimmick to sell more products.

For example, Andrew Head of Semcon Product Information described and then demonstrated an AR-assisted training use case executed as part of a project with J.C. Bamford Excavators Limited (JCB).

In this use case, overlays of engine maintenance information offer advantages over standard service manuals such as:

  • Improved learning efficiency and knowledge retention as a result of users associating instructions with an object’s features and visual cues, thus promoting spatial learning.
  • Increased awareness and retention of the safety regulations, as a result of users being required to read regulations prior to starting the AR-based experiences.

Head reported that users were generally happy with the technology and were motivated to explore its further usage, asking such questions as, “When can I get this on my phone?”

In another presentation, Dirk Schart of RE’FLEKT GmbH presented projects underway for urban dwellers and service technicians and described experience delivery on a variety of AR-enabled devices, such as smartphones and smart glasses.

Specifically, Schart described use cases involving:

  • An AR-enabled emergency responders’ system (such as this one) for firemen that displays a user interface projected on a fireman’s glove and features remote streaming with dispatchers.
  • A smart helmet by AREA member DAQRI for providing both hard hat protection and AR visualization in the field of view.
  • A tangible UI developed by the MIT Media Lab, permitting AR-based visualization of additional user interface features overlaid on products.
  • Hybrid-city lighting that uses projection AR to guide pedestrians along walkways.
  • An AR-enabled window created by RE’FLEKT and displaying information for passengers as part of the Hyperloop transportation system.

Robert Schaefer of TID Informatik GmbH and Daniel Schultheiss of AllVisual presented interesting AR-assisted helicopter maintenance use cases, developed on the basis of the CAD models of Schaefer’s CATALOGcreator product.

The product not only enables real time guidance of mechanics in their daily work, but also showcases another, just as important, aspect of enterprise AR: the visualization of enterprise data and its role in Industry 4.0.

Schultheiss spoke about the sheer amount of data collected by their helicopters before, during and after flight, and how they leverage that data to enhance maintenance. For example, in-flight data can be collected and used both for (predictive) maintenance and for insurance. They use tool chains from SAP, TID Informatik and Wikitude to simplify tablet-enabled helicopter servicing, and have developed an integrated data model and cloud technologies to support efficient data collection and usage.

Requirements for Augmented Reality Systems

Several sessions focused on practical aspects of implementing Augmented Reality systems in enterprise.

Representatives from Bosch shared their experiences and challenges with AR projects, and compiled a list of the right questions to ask. The presentation revisited many of the themes that AREA member representative Juergen Lumera spoke about at the AR in Automotive conference in Cologne on October 5th.

Simone Schappert, a Master’s student at Karlsruhe University of Applied Sciences, provided a survey of tools and technologies for incorporating Augmented Reality in work instructions, and explained the basics of tracking methods (such as marker and markerless tracking), as well as the current landscape of tool vendors. She emphasized the strengths of AR technology in user manuals, such as providing a more immediately helpful (and emotional) experience for users.
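For readers new to the topic, marker-based tracking is the simpler of the two approaches: the system looks for a known fiducial pattern in the camera image and anchors content to it. Here is a minimal sketch using OpenCV’s ArUco module (the camera index and dictionary are arbitrary choices, and the exact ArUco function names vary slightly between OpenCV versions):

```python
# Minimal marker-tracking sketch: detect ArUco fiducial markers in one camera
# frame. An AR work-instruction app could anchor overlays to the detected
# markers. Note: the ArUco API differs slightly across OpenCV versions.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)          # first attached camera
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # Each id identifies a known marker; its corner coordinates give the
        # screen-space location where an instruction overlay could be drawn.
        print("Detected markers:", ids.flatten().tolist())
cap.release()
```

Markerless tracking replaces the printed pattern with natural features of the object or environment, which is more flexible but computationally more demanding.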

In another session of the AR track, Rob Heemels from Canon Business Services discussed the creation of a Learning Activity Plan (LAP) using inputs from intelligent devices and sensors, with Augmented Reality (along with some VR) for presentation. Based on recent projects with the Dutch Océ International Training Center, he showed that Mixed and Augmented Reality increase target audience engagement with training materials when the service professionals are remote. Professionals who successfully use Mixed and Augmented Reality as part of training are likely to feel comfortable extending its use to service and support use cases.

Augmented Reality is Technical Communication

At heart, the visualization of data in context with Augmented Reality is an important development in the field of technical communication. The presence of AR as a topic in the conference presentations, as well as the interest of traditional technical communication vendors in the technology, is clear evidence of AR’s growing role. We’re sure to see increasing numbers of AR use cases in the workplace, as well as sessions at tcworld.

Did you attend tcworld 2015? What were your impressions of the Augmented Reality offerings? Leave your comment below.





Advancing Toward Open and Interoperable Augmented Reality

Enterprise Augmented Reality engineers and content managers who published experiences created with Metaio’s software tools have encountered, or will soon encounter, a situation they didn’t anticipate: the publishing and delivery environments are unsupported and not evolving to take advantage of the latest enabling technologies.

Are you among this group? If so, you are not the only one to find yourself in this uncomfortable situation.

Customers of other AR software providers that are no longer supporting or advancing their platforms with the latest technology innovations have hit the same roadblock when they had a mandate to continue providing the value of their AR experiences to end users. Prior to agreement on standards, they could not “port” their experiences to another AR platform. Evaluating and choosing another proprietary AR technology platform, and then investing in re-authoring, testing and re-deploying AR experiences based on their original designs, was the only way forward.

Unfortunately, some of those reading this blog are in this awkward position today.

Successfully addressing the root causes of low AR experience “portability” and the inherent lack of integration or interoperability between AR authoring and publishing systems is an important, highly collaborative process. Different parts of the AR ecosystem must first agree that there are issues, and then agree on principles for collaboration. Then, based on shared conceptual frameworks, they must work together to implement those principles in their workflows and solutions.

Supporting that collaborative process is the reason I’ve been leading the grassroots community for open and interoperable Augmented Reality content and experiences since 2009.

Is There Really a Problem?

Interoperable Augmented Reality is not a high priority for most people. Only about a hundred people are consistently investing their time in advancing the principles of open and interoperable Augmented Reality. We know one another on a first name basis; many of us compare notes in person a few times per year. Another few hundred people know of such activities but don’t directly invest in meaningful ways.

For most companies, the investment in AR has not been great. A few tens of thousands of dollars to rebuild and deploy a half dozen carefully handcrafted AR experiences is minor by comparison to investments in other enterprise technologies. 

“There’s still too much innovation to begin working on standards” is another commonly heard refrain. Clearly, those who say this haven’t been reading the posts or listening to the presentations made by AREA member IEEE Standards Association or the leaders of other standards development groups. When designed collaboratively and to address interoperability in strategic places, standards have often accelerated innovation rather than stifled it.

There are other reasons for turning a blind eye to these problems, and they are valid for different people to different degrees.

This is a Serious Problem

In my opinion, ignoring the lack of open and interoperable Augmented Reality solutions and services is doing everyone a disservice.

The fact that only a relatively small amount of money has been invested to date is a poor justification for investing yet more time and money in building experiences with another proprietary platform, only to face the same scenario again in a matter of months or years.

In fact, innovation in Augmented Reality is not what it should be today because many of the best AR developers are busy building a better mousetrap: smart engineers are working to solve, in a different way, problems that have for the most part already been solved by others. Whether it’s to avoid encroaching on a third party’s patents or for some other reason, this effort goes into highly integrated proprietary silos, at the expense of solving other problems that remain unaddressed.

There are three more serious problems with having only proprietary technology silos and very low use of widely agreed standards for Augmented Reality experiences. The first is that enterprises with assets that could be leveraged for AR experiences are unable to integrate the production of AR experiences into their corporate workflows. This lack of integration between AR as a method of information delivery and other information delivery systems (e.g., web pages and mobile services without AR support) means we cannot seriously stand before a CIO and recommend they support the development of AR content, because what we are recommending requires setting up an entirely separate and different content management system.

In the same vein, the second reason that enterprise CIOs and CFOs are justifiably reluctant to deepen their investment in AR projects is that they cannot deploy modular architectures in which multiple vendors can propose different components. In today’s landscape of offerings, it’s all or nothing. The customer can buy into provider A’s system or that offered by provider B. If provider C comes along with a better option, too bad.

The third reason the lack of standards is a serious problem worthy of your support is closely related to the other two. Deep collaboration between AR-enabling technology vendors and service providers is currently very difficult. They are not working to improve customer outcomes; they are working much more on competing with one another for attention and for the small investments that might be made.

Three serious enterprise AR obstacles that agreements about open and interoperable AR could reduce:

  1. Little or no portability of content or experiences between proprietary technology silos

  2. Strong customer aversion to the risks of vendor lock-in

  3. Low cooperation between competitors or ecosystem members to partner for the best customer outcomes

This situation with lack of interoperability and fear of vendor lock-in would be addressed if the vendors took a more serious look at possible open interfaces and standards within a larger framework. Conversely, vendors might study new approaches and establish some level of interoperability if they believed that customers would respond by increasing their budgets for Augmented Reality.

This is all very serious.

Another recent development is not helping: it’s clear that some internet and IT giants are paying a lot of attention to AR. The lack of visibility into what highly competitive and successful companies like Microsoft, Google, Apple and PTC will do about AR interoperability and integration has cast a chill over enterprise AR adoption.

Their lack of support for standards and their unwillingness (to date) to shed light publicly on how they will cooperate or how their proposed (future) systems will interoperate is causing considerable uncertainty. No CIO or CFO should seriously invest in enterprise Augmented Reality until these companies’ plans with respect to integration and interoperability are clearer.

Progress is Being Made

We should be open to the possibility that 2016 will be different.

Thanks to the dedication of members of the grassroots community, the situation is not as bleak as it could be. A few weeks ago a few dozen members met in Seoul, Korea, to compare notes on progress. SK Telecom, a strong supporter of open and interoperable Augmented Reality, hosted two days of sessions. We heard status updates from four standards organizations that have highly relevant activities ongoing (Khronos Group, Open Geospatial Consortium, IEEE and ISO/IEC). We also received reports from AR developers who are working to advance their solutions to support standards.

The fact that the ISO/IEC JTC1 Joint Adhoc Group for Mixed and Augmented Reality Reference Model is nearing completion of its work is a major development about which I presented in Seoul.

In the spirit of full disclosure: the community of people in support of open and interoperable AR was the environment in which this work began, and I have been a member of that ad hoc group since its formation. If you would like to obtain a draft of the Mixed and Augmented Reality Reference Model, please send me an email request.

We are also seeing increased interest from industry-centric groups. There is a German government-supported project that may propose standards for use in automotive industry AR. The results of an EU-funded project for AR models in manufacturing became the basis for the establishment of the IEEE P1589 AR Learning Experience Model working group (which I co-chair). In a recent meeting of oil and gas industry technologists, formation of a new group to work on requirements for hands-free display hardware was proposed.

These are all encouraging signs that some are thinking about open and interoperable Augmented Reality. If you want to monitor the activities of the grassroots community focusing on this topic, and to receive announcements of upcoming meetings, visit this page and register yourself for one or more of the mailing lists.

Have you seen other signs that there is increasing awareness of the problems? Do you know about any new standards that should be monitored by and presented during a future meeting of the grassroots community?