Introducing the AR in Strategic Enterprise Sessions

In contrast to companies that merely react to changing conditions without a plan, strategic enterprises systematically apply the best planning and management processes.

A strategic enterprise successfully integrates emerging and mature systems to improve processes and outcomes. Managers in strategic enterprises factor in their existing information systems development and maintenance efforts, as well as any new technology introduction when guiding their businesses towards the achievement of goals.

The AREA and AR in Strategic Enterprises

AREA members met with strategic enterprise managers in Sheffield on July 1. The focus of the event was on how to introduce and integrate AR into strategic enterprises.

Over the course of the day, AREA members shared their experiences and recommendations for choosing use cases, preparing data for use in AR experiences, choosing and training users for AR pilots and introduction activities, measuring impacts and managing risks associated with AR introduction.

The AREA’s Value Added

The sessions are a perfect example of AREA members demonstrating their thought leadership and collaborating to share knowledge with others. In addition to the valuable discussions made possible during the networking and panel sessions, the recordings of the presentations are now available for viewing on YouTube.

Through the ARise event and its sessions, the AREA and its members are accelerating AR adoption in the corporate environment. As Executive Director of the AREA, I am proud to present the 11-session series and hope you will gain additional insights into the ways Augmented Reality can benefit your enterprise.




Augmented Reality Can Increase Productivity

Technological and cultural shifts that result in enhancements in manufacturing tend to increase complexity in products and processes. In turn, this complexity increases requirements in manufacturing and puts added pressure on organizations to squeeze out inefficiencies and lower costs where and when feasible.

This trend is acute in aerospace, where complexity, quality and safety require a large portion of final assembly to be done by humans. Corporations like AREA member Boeing are finding ways to improve assembly workflows by making tasks easier and faster to perform with fewer errors.

At ARise ’15, Paul Davies of Boeing presented a wing assembly study conducted in collaboration with Iowa State University, showing dramatic differences in performance when complex tasks are performed using 2D work instructions versus Augmented Reality.

A Study in Efficiency

In the study, three groups of participants were asked to assemble parts of a wing, a task requiring over 50 steps and nearly 30 different parts. Each group performed the task using a different mode of work instruction:

  • A desktop computer screen displaying a work instruction PDF file. The computer was immobile and sat in the corner of the room away from the assembly area.
  • A mobile tablet displaying a work instruction PDF file, which participants could carry with them.
  • A mobile tablet displaying Augmented Reality software showing the work instructions as guided steps with graphical overlays. A four-camera infrared tracking system provided high-precision motion tracking for accurate alignment of the AR models with the real world.

Subjects assembled the wing twice: during the first attempt, observers measured first time quality (see below); the wing was then disassembled and participants reassembled it so that the effect of the instruction mode on the learning curve could be measured.

Participants’ movements and activities were recorded using four webcams positioned around the work cell. In addition, participants wore a plastic helmet fitted with reflective tracker balls, allowing optical tracking of head position and orientation so that researchers could visualize how tasks were carried out. Tracker balls were also attached to the tablet (in both AR and non-AR modes).

First Time Quality

To evaluate the ability of a novice trainee with little or no experience to perform an operation the first time (“first time quality”), errors are counted and categorized. The study revealed that tablet mode yielded significantly fewer errors (on average) than desktop mode.

In the diagram above, the blue bar represents the first assembly attempt and the green bar is the second. The diagram also shows that subjects using Augmented Reality mode made zero errors on average per person, indicating the potential of AR to improve first time quality for assembly tasks.

Rapid assembly

This diagram measures the time taken to complete the tasks in each mode, for both the first and second attempts. AR-assisted participants completed the tasks faster on their first attempt than participants using either of the other modes.

Conclusions

Overall, the study found an almost 90% improvement in first time quality between desktop and Augmented Reality modes, with AR reducing the time to build the wing by around 30%. Researchers also found that when instructions are presented with Augmented Reality, people understand tasks more quickly and need less convincing that they have been performed correctly.
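
To see how such comparisons reduce to simple arithmetic, here is a minimal sketch with hypothetical per-mode averages. The numbers and mode labels below are placeholder assumptions for illustration, not the study's raw data.

```python
# Illustrative only: per-mode averages are hypothetical placeholders,
# not data from the Boeing / Iowa State wing assembly study.
modes = {
    "desktop":   {"mean_errors": 8.0, "mean_minutes": 50.0},
    "tablet":    {"mean_errors": 4.0, "mean_minutes": 45.0},
    "tablet_ar": {"mean_errors": 1.0, "mean_minutes": 35.0},
}

def percent_reduction(baseline: float, candidate: float) -> float:
    """Percentage by which `candidate` improves on (reduces) `baseline`."""
    return 100.0 * (baseline - candidate) / baseline

quality_gain = percent_reduction(modes["desktop"]["mean_errors"],
                                 modes["tablet_ar"]["mean_errors"])
time_gain = percent_reduction(modes["desktop"]["mean_minutes"],
                              modes["tablet_ar"]["mean_minutes"])

print(f"First time quality improvement (desktop -> AR): {quality_gain:.0f}%")
print(f"Assembly time reduction (desktop -> AR):        {time_gain:.0f}%")
```

With these placeholder values the sketch prints roughly an 88% reduction in errors and a 30% reduction in assembly time, which is how figures like those quoted above are typically derived from per-group averages.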

The bottom line is that this study demonstrates and quantifies how complex tasks performed for the first time can benefit from Augmented Reality work instructions. When a task is done faster and with fewer errors, the impact on productivity is highly significant.

Where can Augmented Reality make an impact in your organization?




ARLU—the Right Event at the Right Time

EPRI is proud to collaborate with the AREA on the first ever Augmented Reality in Leading-Edge Utilities (ARLU) event this July, where we will help the industry assess a disruptive technology and anticipate and solve issues through collaborative effort. In fact, ours is the only industry we know of where Augmented Reality as a disruptive innovation is being openly discussed. This isn’t going unnoticed. Other industries are pointing at utilities and saying, “Hey, look what they’re doing.” Utilities are rarely perceived as having an active role in exciting new trends.

Three in One

The ARLU event is, in fact, three events in one. First, it’s a meeting where EPRI and utility industry representatives will present their Augmented Reality research and projects to vendors developing applications for the utility industry. Vendors will see where utilities are placing emphasis in their development efforts and learn about the issues they’re encountering. Requirements such as the size, weight and battery life of wearable technologies will be explored through the presentations, giving participants a deeper understanding of the issues facing the introduction of Augmented Reality in utilities.

Next, vendors will present their latest technologies for immediate feedback from industry experts. Not all technologies fit every utility situation, and discussions around the fit for purpose of the presented technologies will be lively and informative. Finally, a workshop on gaps in existing standards will bring multiple perspectives to the problems of creating safe, comfortable and interoperable AR experiences in the utility space.

Thought Leaders

Having subject matter experts together in one room is one of the key objectives of this meeting. As we’ve been preparing the ARLU event, we’ve invited some of the brightest people in the utility and utility software industries to mix with thought leaders in Augmented Reality. We expect the impact to last much longer than the two days in July, because new ideas will emerge in the weeks and months that follow as the participants who meet in Charlotte continue to develop relationships.

We expect to capture some of the ideas these thought leaders generate and to share the outcomes of discussions with the broader community so that many others can also benefit.

Time is Right

We feel this is the right time for such a conference. Today, judging a technology solely by what it can do right now is the wrong way to look at it. Advances occur almost daily, and it’s better to first define what’s needed and then build toward that future state of the technology. That’s where Augmented Reality is today. Practical applications are just now being introduced, but an explosion of functionality is coming. By the time the average person notices the ubiquity of Augmented Reality, many of the issues we are going to discuss in Charlotte will already have been settled.

Wearable technologies with Augmented Reality are at a stage where real utility applications are possible. At the same time, shifting demographics at utilities are bringing in younger, less experienced workers—as older, more practiced workers are leaving. There needs to be an orchestrated “changing of the guard” where institutional knowledge, gained by years of hard work and experience, is transferred to a younger, more tech-savvy generation. The technologies presented at ARLU will deliver remote expertise and put information at the fingertips of crews composed of less seasoned individuals.

The wise man says it’s better to act on a lightning flash than wait to hear the thunder. That’s why we planned this event in 2015 and look forward to seeing many of the readers of this blog at the first ARLU event.




Why Augmented Reality and Collaboration Make for a Safer and Better World

Augmented Reality (AR)-enabled systems can show a mechanic how to repair an engine or, perhaps in the future, guide an inexperienced surgeon through a delicate heart operation. In my opinion, it’s when AR is combined with human collaboration that the magic begins. AR will soon work its way into a variety of applications that are bound to improve our lives, but more importantly, I am convinced it will become a catalyst for greater human understanding and world peace.

Augmented Reality Can Bring Us Closer

Everyone’s heart raced when Jake Sully, the wheelchair-bound Marine in the movie Avatar, first connected his thoughts to those of his avatar, walked and then ran. His mission was to infiltrate the society of the natives, learn their customs and, having gathered that information, help destroy their world. Of course, we all know how the story ends… It’s difficult to do harm to those we know. The first step in Hitler’s campaign to eliminate those he considered unworthy was to convince his followers that the others were less than human. In fact, this is a universal technique of incitement to violence against another group. It is only when we finally get to know someone that, even if we don’t agree, we can begin to understand and care about them.

Sharing Experiences

AR allows a user to see an enhanced view of reality, placing graphic images and 3D models over the real background. This will be great for building and repairing things by ourselves, but when we combine that capability with modern telecommunications, remote users will be able to participate in those processes with local users in real time, and appear to the wearer of the glasses as if standing alongside them. We won’t just see our grandkids in a Skype screen; we will take them with us on new adventures around the world or in our backyard. An astronaut in space will literally see the hand of the equipment specialist on earth pointing to the board to be replaced as they speak.

Gutenberg changed the world because the printed page could easily display the manuals that apprentices used to learn the trades that freed them from the fields. Radio and then television added sound, motion and, recently, 3D to the flood of information. Telecommunications has brought the cost of distributing it to practically zero. Now AR combines these capabilities and creates an infinite number of parallel worlds that you can create and visit, and in which you can acquire skills through one-on-one instruction. It’s the closest thing to teleportation this side of Star Trek.

Non-verbal communication is said to account for between 55 and 97% of communication between people, depending on the study. AR will convey practically the same information by enabling virtual “belly to belly” proximity. You will be able to virtually sit in a conference room and interact with other remote participants, watch a theater performance in your living room or tag along with a friend on an exotic trip to a foreign land. That friend will be able to see you, too.

New Ways of Displaying Information

Talk about disruptive. This is downright neutron bomb material. Why do you need a laptop or tablet when you can see the screen suspended in mid-air, with the glasses projecting a keyboard onto any surface? Gone are the large-screen TVs around which everyone sat stationary, watching the game from the same angle; why wouldn’t they prefer it in perfect 3D? Forget glass cockpits in airplanes; why not have all the instruments projected in your field of view? How about infrared images of deer or pedestrians in fog or at night shown on the windshield of your car, in time to avoid hitting them?

Augmented Reality and Collaboration

But, again, collaboration use cases will take the cake. The level of empathetic bonding that occurs when you’re in the room with another person will make current social messaging seem like sending smoke signals. Professionals in other countries will get to know you virtually and work with you on projects, as I am proposing with the Talent Swarm platform. Along with such proximity-enabled work will come a better understanding of other countries and cultures.

Collaboration is key, but it can’t happen at scale if everyone needs to buy and use exactly the same hardware and software. Collaboration across networks and companies as diverse as the places where humans live and work builds upon deep interoperability. Interoperability with existing and future systems will require a globally agreed-upon set of open standards. We will work within the AREA to strongly advocate for interoperable systems and push for global standards together with other AREA members. Once we have collaborative AR platforms, the benefits of this technology will rapidly serve all people of the world.

Becoming an AREA founding sponsor member is, for Talent Swarm, not only common sense but a stake in the ground, demonstrating our leadership for a more productive and peaceful world. We will avoid embarking on another wasteful battle such as VHS vs. Beta, and we will not allow a single company to reduce the opportunities or lock others out. Christine Perey, Executive Director of the AREA, refers to it as our mandate: to ensure that the ecosystem of AR component and solution providers is in harmony with customers’ needs and able to deliver the diversity and innovation upon which economic success is based.

Path to the Future

With a concerted group goal centered on the advancement of AR, and with many technological developments both in the works and being introduced at an increasingly fast pace, we will one day look back at 2015 and ask: how did we ever get along without Augmented Reality?




Augmented Reality at CES 2015 is Better and Bigger Than Ever

There’s something for everyone at CES. Do you need a way to store your earbuds so the cables aren’t tangled? What about printing icing on a cake?

Roger Kay, a technology analyst who writes for Forbes, recommends breaking the event up into ten parts. For him it’s not about the horrendous taxi lines or the other logistical issues of dealing with so many people in a relatively small area; he wants his topics served up in concentrated exhibition floor zones. For my part, I walk everywhere I go: I leisurely covered twenty-four miles of flat Las Vegas ground in four days, and there are buses to and from the airport.

Like Kay, I find that many of CES’ themes lie outside my areas of interest, but despite the headaches caused by the crowds, having the option to see and sample developments in a variety of fields is one of the reasons I return each year.

Finding what I need to see isn’t a matter I treat lightly. A month before heading to Las Vegas I begin planning my assault, because the CEA’s website is horrendously inefficient and its new mobile app pathetic. Using brute force, I locate all the providers of head-mounted personal displays, the providers of hardware that is or could be AR-enabling, and the “pure” AR firms with whom I already have relationships. I also plan a long, slow visit through the innovation zones, such as Eureka Park. I know another half day will be dedicated to Intel, Samsung, Sony, LG Electronics and Qualcomm. Then I search for outliers by name.

A few days prior to the event I begin following the news feeds on social media and technology trade blogs, and I scan the headlines for surprises.

Highlights of my CES 2015

For reasons that have nothing to do with Google Glass, vendors are making good progress in the personal display space. The first reason is that more companies are experimenting with new combinations of familiar technology components, particularly hardware. Optinvent is recombining its optical technology with a set of headphones. Seebright is adding a remote control to your smartphone. Technical Illusions is combining reflector technology and projectors with new optics. It’s like gene mixing to produce new capabilities and life forms.

Vuzix demonstrated the new waveguide technology in their optical see-through personal displays for Augmented Reality.

That’s not to say that designs for the “traditional” optical see-through display form factor are standing still. Getting new investments, such as Vuzix received from Intel, is a major accelerator. ODG’s sales of patents to Microsoft in 2014 produced sufficient revenues for the company to develop a new model of their device targeting consumers.

The second reason for the significant advances in the personal display product category is the evolution of components. I saw firsthand in many exhibits the “familiar” components these displays must include, such as motion and other sensors, eye tracking kits and optics. All are rapidly improving. For these components, “improving” means smaller packaging and lower power consumption.

It was good to focus—if only briefly—on the familiar faces of AREA members such as APX Labs and NGRAIN, who were participating in the Epson developer ecosystem booth, and to see the latest Epson products, which seem to be increasingly popular in the enterprise. I found APX again in the Sony SmartEyewear zone, where I was able to try on the Sony prototype. I also caught up with executives and saw impressive new AR demonstrations by companies whom I don’t often see attending my events. If you’re interested, I encourage you to click on these links to learn about Meta, InfinityAR, Occipital, ScopeAR, Technical Illusions, LYTE, XOeye Technologies, FOVE, Jins Company, Elvision Technologies, Avegant and Augumenta. I’m sorry if I neglected to include others that I saw at CES.

Although they were around and showing AR or AR-enabling technologies, and we may have crossed paths unknowingly, I didn’t have a chance to meet with Metaio, Lumus, Lemoptix or Leap Motion.

I spent more time than expected visiting and observing the booths of the Virtual Reality headset providers who were at CES. There were several exhibition zones dedicated to Oculus VR, with the new Crescent Bay device. The lines waiting to try on the new Razer OSVR (Open Source VR) system were stunningly long. It amazes me that a small company like Sulon could afford such a huge footprint in South Hall, setting up private briefing rooms for its Cortex display for AR and VR while also exhibiting openly outside them.

Elsewhere there were hordes swarming at the Samsung Gear VR and the Sony Project Morpheus zones. What good are all these headsets without content? I stopped in at JauntVR, which seems to be getting a lot of attention these days. I’m sure there were dozens more showing VR development software, but VR is peripheral to my focus.

I was impressed by the NVIDIA booth’s focus on Advanced Driver Assistance Systems this year, demonstrating real-time processing of six video feeds simultaneously on the Tegra K1 Visual Computing Module. There were also excellent demonstrations of enterprise use of AR in the Hewlett Packard exhibit. Intel dedicated a very significant portion of its footprint to RealSense. And, similarly, the Vuforia zone in Qualcomm’s booth had expanded compared to 2014. The IEEE Standards Association offered an AR demonstration to engage people with its work.

Automotive companies were also showing Augmented Reality. I saw examples in the BMW pavilion, in Daimler’s area, the Bosch booth, and Hyundai’s prototype cars.

At the other end of the spectrum there were many exciting new products in the pico projector category. MicroVision and Celluon were both showing HD pico projectors for use with smartphones; such technology will certainly be considered for projection AR in enterprise. ZTE and Texas Instruments also introduced their latest pico projector models at CES 2015.

Digging in Deeper

Although I was no longer in Las Vegas, and despite my careful advance planning, I continued with my CES homework for at least a week. For example, I watched the archive of the “New Realities” panel and played back other videos covering AR and VR at CES on CNET, Engadget, Tested and the Financial Times.

The IEEE published an analysis of AR at CES in Spectrum that reaches the same conclusion I drew: the “C” in CES is for Consumer, but a lot of consumer technology is going into corporate IT.

I hope I will have digested all that I gathered at CES 2015 before I begin preparations for 2016.




The IEEE Standards Association at Inside AR 2014

This is a guest post by the IEEE Standards Association (IEEE-SA), on their participation at the 2014 edition of InsideAR in Munich.

There has been a lot of hype about Augmented Reality, and concrete examples help us all to grasp how far we have come and how much is yet to be done in the field. For this reason, we at IEEE Standards Association (IEEE-SA) were delighted to see all the new technologies showcased at InsideAR 2014. We also enjoyed talking with the diverse crowd of visitors to our booth, which included folks from wearable tech, mobile software and business development, research, academia and marketing.

Many visitors were aware of IEEE-SA’s activities and were interested in knowing more about our “Bringing Standards to Life” AR app experience, showing how IEEE standards (with focus on IEEE 802® standards) impact their daily lives. Some were interested in IEEE-SA’s role in AR, and whether there were any current standards available.

IEEE believes AR is an enabling tool for a number of technologies, including the broad tech that IEEE serves. – Mary Lynne Nielsen, Global Operations and Outreach Program Manager at IEEE Standards Association

Tools for Expanding Augmented Reality Markets

At IEEE we help companies interested in AR to plan for the future. For example, we offer tools for business planning, such as our complimentary copy of IEEE Scenarios for AR in 2020. Also, the standards-development process offers a path for engineers to realize the full potential of Augmented Reality, as the adoption of open standards fosters innovation and market growth through economies of scale and wider interoperability.

https://www.youtube.com/watch?v=OszYuLx_Onk

We lead campaigns and projects to advance open and interoperable AR. This makes sense, given the scope of IEEE expertise across technology areas that contribute to AR and the proven track record of IEEE for serving as a facilitator and catalyst in widely adopted technologies, such as networking communications and the smart grid.

The IEEE-SA offers a platform for developers and users to innovate for open and interoperable AR. For example, the IEEE-SA’s standards-development process is based on broad global participation and consensus—in alignment with the “OpenStand” principles for global, open, market-driven standards. And, indeed, a wide variety of IEEE standards and projects relevant to AR already exists today.

To facilitate participation from emerging AR domains, the IEEE also explores the establishment of new study groups, projects or standards based on requirements for all segments of the AR ecosystem. To that end, an IEEE-SA Industry Connections activity has been launched to, in part, identify use cases, glossaries, and best practices in the AR technology space.

Furthermore, through participating in meetings of the AR Community and conferences like InsideAR, the IEEE-SA proactively engages with other leaders around the world to encourage global AR market growth.

Conclusion

Overall, we found InsideAR 2014 to be a well-organized and very enlightening event, shedding light on the endless opportunities in the AR space, as it relates to technology overall. There were great sessions covering a range of topics that could provide inspiration across many other industries. We’re looking forward to next year’s event.




It Is All About People

In his presentation on the InsideAR 2014 stage, AREA member Carl Byers of NGRAIN Corporation shared with the audience his conclusion that Augmented Reality is “all about people.” When in the middle of a technology-centric event taking place in the center of the densest AR-populated region of the world (Munich, Germany), it is important to reframe why all activities and investments matter: Augmented Reality helps people to see digital data in context.

The “all about people” guideline applies in medicine as well. Improving patient outcomes is at the heart of Dr. Kavin Andi’s research at St. George’s Hospital at the University of London.

Dr. Andi is an oral and maxillofacial surgery consultant who also practices microvascular reconstructive facial plastic surgery. In his InsideAR presentation, Dr. Andi explained how Augmented Reality could provide value in:

  • Designing and communicating tumor removal and reconstructive processes
  • Detecting airway obstruction
  • Planning bone and tissue harvesting

The presentation also introduced some of the many tools surgeons use to achieve positive patient outcomes. Some tools are physical: scalpels, saws, clamps and hoses of many types. Others are software. In addition to the many credentials he has earned in his journey from molecular biology to reconstructive surgery, Dr. Andi has invested heavily in mastering the use of a dozen different software products in a graphics pipeline.

Beginning with scans of patients’ bodies, the processes he has defined permit surgeons to better prepare for surgery. By studying and planning the procedures to be performed in the operating theater in minute detail in advance, surgeons reduce the time spent in the actual theater.

Patient scanning is highly sophisticated, involving measurements of tumor and bone density, blood flow and other critical factors. But as in other fields where big data is accessible to the expert, the data needs to be accompanied by analytical tools and, in the case of surgery, by real-time visualization.

The first gap Dr. Andi needs technology to address is advanced segmentation. Better segmentation of the data will separate the volume of the tumor from the surrounding area that is affected. Then, with this data superimposed on the patient, Augmented Reality can help surgeons and assistants—and ultimately the patient—to visualize the proposed treatment.
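
As a loose illustration of what segmentation means at the data level, the sketch below separates a bright “tumor” region from a synthetic 3D scan volume using a simple intensity threshold and estimates its volume. This is a hedged, hypothetical example with made-up data and an assumed voxel size; it is not Dr. Andi’s pipeline, and real clinical segmentation is far more sophisticated than a single threshold.

```python
import numpy as np

# Illustrative only: a synthetic 16x16x16 "scan" with a bright blob standing
# in for a tumor. Real pipelines work on calibrated CT/MRI data and use far
# more advanced segmentation than one intensity threshold.
rng = np.random.default_rng(seed=0)
scan = rng.normal(loc=100.0, scale=10.0, size=(16, 16, 16))  # background tissue
scan[5:9, 5:9, 5:9] += 80.0                                  # brighter "tumor" region

threshold = 150.0                 # hypothetical intensity cut-off
tumor_mask = scan > threshold     # boolean volume marking the segmented region

voxel_volume_mm3 = 1.0            # assumed isotropic 1 mm voxels
tumor_volume_mm3 = tumor_mask.sum() * voxel_volume_mm3
print(f"Segmented voxels: {int(tumor_mask.sum())}, volume ~ {tumor_volume_mm3:.0f} mm^3")
```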

Leaving diseased tissue in the patient, or removing too much tissue due to low-accuracy visualization, can impact patient outcomes. That is why surgeons also need sub-millimeter accuracy on deforming surfaces for registration and tracking with hands-free displays.
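
To give a feel for what a sub-millimeter requirement means as a measurable quantity, here is a minimal, hypothetical check rather than an actual tracking algorithm: it computes the root-mean-square distance between reference landmark positions and the positions a tracker reports, then compares it against a 1 mm tolerance. The landmark coordinates are invented for illustration.

```python
import numpy as np

# Illustrative only: hypothetical reference landmarks and the positions a
# tracking system reports for them, both in millimeters.
reference = np.array([[0.0, 0.0, 0.0],
                      [10.0, 0.0, 0.0],
                      [0.0, 10.0, 0.0],
                      [0.0, 0.0, 10.0]])
tracked = reference + np.array([[ 0.2, -0.1,  0.0],
                                [-0.3,  0.2,  0.1],
                                [ 0.1,  0.0, -0.2],
                                [ 0.0,  0.3,  0.1]])

# Root-mean-square distance between tracked and reference landmarks.
rms_error_mm = np.sqrt(np.mean(np.sum((tracked - reference) ** 2, axis=1)))
print(f"RMS registration error: {rms_error_mm:.2f} mm")
print("Meets sub-millimeter requirement:", bool(rms_error_mm < 1.0))
```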

When this can be achieved, Dr. Andi envisages that he and other surgeons will be able to perform complex procedures with 3D digital overlay on the patient instead of constantly referring to a display on the side.

To learn more, watch Dr. Andi’s InsideAR 2014 presentation below.




Christine Perey to Speak at 8th Annual InsideAR Conference

We are excited to be a part of Metaio’s InsideAR in Munich, Germany, the largest annual Augmented Reality conference in Europe. Every year, the conference brings together international corporations and thought leaders to discuss developments in Augmented Reality and to showcase the latest innovations. Last year’s conference brought together over 800 participants, 45 speakers and more than 400 international companies.

This year, the two-day conference’s themes will be wearable computing and 3D sensors. Among the main stage speakers will be Mary Lynne Nielsen of AREA member IEEE Standards Association and AREA executive director Christine Perey. The two will present an exciting IEEE project in a talk entitled “AR in 2020: Scenarios for the Future.”

Following the AR in 2020 presentation, there will be a panel discussion about enterprise Augmented Reality, in which Christine will share her views on the opportunities and barriers ahead.

We also look forward to meeting our readers and members at the conference. We will be reporting our experiences and impressions about InsideAR on our blog, so keep an eye out for new posts!