
Tech trends driving Industry to v5.0 – Rockwell Automation

Rarely has industrial automation changed at such an exponential rate. The combination of various technology trends has propelled enterprises into Industry 4.0 so fast that Frost & Sullivan has already delivered an Industry 5.0 blueprint to guide the journey.

Edge-and-cloud integration, converged development environments, artificial intelligence (AI) and autonomous production are far more than conceptual. These technological innovations are already happening.

“This is a unique time in our industry,” explained Cyril Perducat, who shared the automation supplier’s plans for the immediate future at Automation Fair 2021 in Houston. “The future is a trajectory, a path that we are already on. When I think of Industry 4.0, which was first coined in 2011, there is certainly a lot of learning over the past 10 years of what Industry 4.0 can deliver. And COVID has accelerated many of those dimensions.”

Remote connectivity, advanced engineering with multiple digital twins, mixing physical and digital assets, and the change of human-machine interaction are driving industry along that path toward Industry 5.0.

Perducat questioned whether it’s too soon to look at Industry 5.0 when all the promise of Industry 4.0 has not yet been delivered, but he identified five changes that are attainable and impactful in Frost & Sullivan’s comparison of Industry 4.0 to Industry 5.0:

  • delivery of customer experience,
  • hyper customization,
  • responsive and distributed supply chain,
  • experience-activated (interactive) products, and
  • return of manpower to factories.

“We are able to bring more capabilities to people,” said Perducat. “Human resources are scarce. By delivering systems that make the human-machine interaction more efficient, we make it more impactful while remaining safe.”


Rockwell Automation has identified four areas where technology can move companies along that journey:

  • evolution of cloud, edge and software,
  • universal control and converged integrated development environments (IDEs),
  • AI native operation management, including software as a service (SaaS) and digital services, and
  • autonomous systems and augmented workforce.

“We believe in control at the enterprise level,” explained Perducat. “We believe in systems with software-defined architecture and the underlying hardware. It doesn’t mean hardware is becoming obsolete. And it’s not that every piece of the system needs to be smart. The entire system, from the device to the edge and to the cloud, is smart. Edge + cloud architecture is fundamental.”

In the converged environment, control, safety and motion all come together and must work in an integrated fashion. This is especially true with the growth of robotics. “The boundaries between control and robotics are becoming more and more blurred,” said Perducat. “Safety is very fundamental in this more complex architecture. It does not work if it is not safe.”

Operations management becomes more efficient when AI is native to the architecture and is at the level of the enterprise. “A holistic view requires a lot of data and the ability to process that data,” explained Perducat. “Part of this has to be autonomous using the power of applied AI; it’s not just one more tool but is everywhere in the architecture. We can use AI on the machine to translate vibrations into data. We can think of AI in terms of process modeling. And model predictive control is evolving with AI. When you can orchestrate all the elements of the architecture, that is a system.”
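Translating machine vibrations into data, as Perducat describes, typically means reducing a raw sensor trace to a handful of numeric features a model can consume. The sketch below is a generic illustration of that idea, not Rockwell-specific code; the feature choices (RMS, peak-to-peak, zero crossings) are common but assumed here.

```python
import math

def vibration_features(samples):
    """Reduce a raw vibration trace to simple numeric features."""
    n = len(samples)
    mean = sum(samples) / n
    # RMS amplitude indicates overall vibration energy
    rms = math.sqrt(sum((s - mean) ** 2 for s in samples) / n)
    # Peak-to-peak range hints at impact events
    peak_to_peak = max(samples) - min(samples)
    # Zero-crossing count is a crude proxy for dominant frequency
    crossings = sum(
        1 for a, b in zip(samples, samples[1:])
        if (a - mean) * (b - mean) < 0
    )
    return {"rms": rms, "peak_to_peak": peak_to_peak, "zero_crossings": crossings}

# Example: a sine-like trace sampled 8 times per cycle for 4 cycles
trace = [math.sin(2 * math.pi * (i + 0.5) / 8) for i in range(32)]
features = vibration_features(trace)
```

A real deployment would feed features like these into an anomaly-detection or predictive-maintenance model rather than inspecting them by hand.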

FactoryTalk Analytics LogixAI is a modeling engine that enables closed-loop optimization through four steps—observe (sensor), infer (model), decide (controller) and act (actuator).
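The observe-infer-decide-act loop above can be sketched in outline. The function names, the trivial "model," and the proportional controller below are illustrative assumptions, not LogixAI's actual API:

```python
def observe(sensor_reading):
    # Observe: take a raw measurement from the process
    return sensor_reading

def infer(measurement, model_gain=1.0):
    # Infer: a stand-in "model" estimating the true process value
    return measurement * model_gain

def decide(estimate, setpoint, kp=0.5):
    # Decide: a proportional controller computes a corrective action
    return kp * (setpoint - estimate)

def act(process_value, action):
    # Act: apply the actuator command to the process
    return process_value + action

# Closed-loop simulation: drive a process value toward a setpoint of 10.0
value = 0.0
for _ in range(50):
    estimate = infer(observe(value))
    action = decide(estimate, setpoint=10.0)
    value = act(value, action)
```

After a few dozen iterations the loop settles at the setpoint; in a production system the "infer" step would be a trained model and "decide" a model predictive controller.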

Finally, transforming from automated systems to autonomous systems enables better decisions that expand human possibility.

AI can also help to simplify a new generation of design. “You can use AI to help to generate blocks of code, like individuals working together peer-to-peer, but one of them is AI, augmenting human possibility,” explained Perducat.

“We see the next step to autonomous manufacturing as an opportunity to deliver value to our customers,” he said. “The autonomous system is reimagining the fundamental principles of autonomous control systems. You don’t need to rip and replace. We have the ability to augment existing systems with new technology.”

Perducat stressed that it cannot be just technology innovation. “Technology only creates possibilities or potential values,” he explained. “It has to be accessible by users, so we have to innovate on the user experience point of view. We want to bring that to all the products, experiences and models. In a digital native world, innovation extends beyond technology and features.”




HP is Using HoloLens to Help Customers Remotely Repair Industrial Printers

While many AR companies are focused on building AR products, HP is making an interesting move in using the technology as an add-on to improve an existing line of its business. The company’s newly announced xRServices program promises to deliver remote AR support for its industrial printer customers.

The program employs Microsoft’s HoloLens 2 headset, which HP’s customers can use to access AR training and live guided instructions to fix issues that arise with complex, commercial-scale printers.

HP is pitching the solution as a way to allow even untrained individuals to fix issues with the help of a specialist on the other end who can guide them step by step through troubleshooting and repairs with AR instruction. Further, the company says the service can be used to provide AR training for various workflows and issues that may arise with the company’s industrial printers.

HP hasn’t clearly detailed exactly what software it’s running on HoloLens to facilitate xRServices, but it seems likely that it is leveraging Microsoft’s Dynamics 365 Remote Assist platform which includes many of the AR functions that HP showcased in its xRServices concept video—like augmented annotation, document visualization, and video chatting through the headset.





Hands-Free Thanks to Augmented Reality

The Bazeley Pilot Facility at the Parkville site in Melbourne has been trying out Apprentice IO, an intelligent batch execution system that includes augmented reality. CSL, the world’s third-largest biotech company, uses pilot plants to test manufacturing processes on a small scale. The company specializes in rare and serious diseases as well as influenza prevention.

Learn about a CSL Behring pilot plant in Illinois.

Working with Apprentice IO (an AREA member) in Australia means a near-complete reimagining of core operations, including product development, manufacturing and supply chain solutions, said Sharon Orr, CSL’s Manager, Innovation and Technical Operations, Pilot Scale Operations. As part of the six-month test, the team is exploring alternatives to paper-based batch records, standard operating procedures and work instructions. The exercise has fundamentally changed the way operators think about and approach instructions from the outset, Orr said.

When integrated with lab facilities, the augmented reality headset and linked iPad can provide on-the-spot feedback, process directives and problem-solving techniques in real time. Paper records require a four-eyes approach for calculations, checking raw-material information and weights, she said. The experimental platform replaces manual cross-checking methods with automated formulas, ranges and barcoding. If results don’t match, the system flags it to the operator, saving time and ensuring compliance.
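The automated cross-check described above can be sketched as a simple validation step. The field names, barcode format and tolerance below are hypothetical illustrations, not Apprentice IO’s actual data model:

```python
def check_weigh(material, scanned_barcode, measured_weight):
    """Flag a raw-material weigh that fails its barcode check
    or falls outside its approved weight range."""
    issues = []
    if scanned_barcode != material["barcode"]:
        issues.append("barcode mismatch")
    low, high = material["weight_range"]
    if not (low <= measured_weight <= high):
        issues.append("weight out of range")
    return issues  # an empty list means the weigh passes

# Hypothetical batch-record entry for one raw material
material = {"barcode": "RM-0042", "weight_range": (9.5, 10.5)}

ok = check_weigh(material, "RM-0042", 10.1)       # within range
flagged = check_weigh(material, "RM-0042", 11.2)  # out of range
```

Encoding the checks this way is what removes the need for a second person to verify each weigh on paper.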

“Once a procedure has been augmented and approved, I can use this technology to perform a process ‘hands-free’ without having to worry about cumbersome paper-based data collection and manual checking,” Orr said.

Approved standard operating procedures can be accessed at the click of a button, Pilot Scale Operations scientist Hugh Harris said. With the new manufacturing software and augmented reality capabilities, colleagues at different sites would also be able to share data and see what’s happening in real time. CSL may bring the software system to other pilot facilities in Australia and the United States.

“Watching the team explore this new technology at the forefront of next generation manufacturing has been truly inspiring,” said Matthias Zimmermann, CSL’s Executive Director of Bioprocess Development. “They have shown a willingness to embrace the technology and incorporate it within our already existing processes. They have also worked with the software program designers to inform the next version, in some ways advancing this technology together.”

 




Visor-Ex® 01 – Collaboration between ECOM Instruments and IRISTICK

The Pepperl+Fuchs brand ECOM Instruments introduces Visor-Ex® 01 smart glasses for industrial use in hazardous areas.

The intelligent wearable, weighing just 180 g, combines high camera and display quality with reliable communication features in an ergonomic design for the user’s utmost comfort. This gives mobile workers an optimal companion for tasks that require hands-free use as well as continuous communication, for example with a remote expert.

This product is the result of a long-term, close collaboration between ECOM Instruments and IRISTICK, bringing together ECOM’s in-depth knowledge of the requirements of hazardous areas with IRISTICK’s profound experience in smart glasses development.

Innovative tool for the mobile worker in hazardous areas

A total of three integrated cameras transform Visor-Ex® 01 into the remote expert’s bionic eye. Two 16-megapixel cameras are centrally positioned to depict the wearer’s natural field of vision – this way the remote expert views what is happening from the same angle and perspective as the mobile worker. A secondary camera offers a 6x optical zoom for zooming in without loss of quality and fast scanning of barcodes and QR codes. The system utilises the ECOM Smart-Ex® 02 smartphone for hazardous areas as a computing unit with LTE connectivity and a pocket unit with a replaceable battery for power supply, all combined in an intelligent ecosystem for a wide range of application scenarios in the industrial sector.

The distribution of functions across the individual system components helps to minimise the weight of the headset unit – without compromising on performance, connectivity or battery life.

By connecting to the Smart-Ex® 02, users can continue to use their tried-and-tested smartphone for harsh environmental conditions without restriction and benefit from all the advantages, security features and controls of the Android 11 operating system, including over-the-air updates, leading to ease of use and a low total cost of ownership.

Visor-Ex® 01 will be certified for ATEX/IECEx Zone 1/21 and 2/22 as well as NEC/CEC Division 1 and 2 and will have protection class IP68. It can be used within a temperature range of -20 to +60 °C.

Read Iristick’s AREA member profile 

For more information: www.visor-ex.com

For sales information : [email protected] or [email protected]




Magic Leap partners with Geopogo on Augmented Reality solution for architecture and design

Geopogo is a California-based 3D design software company that is working to transform the design and construction process. The company’s software allows architects and designers to create renderings and a virtual reality (VR) or augmented reality experience in minutes by importing existing CAD models or building directly with the Geopogo 3D creator tool.

Now, with Geopogo’s software on Magic Leap’s AR headset platform, the interaction of digital content with the physical world will help to bring architectural designs to life, according to the companies. “This is a phenomenal opportunity to make architectural design understandable and accessible to project clients, city officials, and the general public,” said Geopogo’s Creative Director, Michael Hoppe.

According to Magic Leap, the American Institute of Architects, San Francisco (AIASF) utilized the partnership’s technology as part of its ‘Shape Your City’ campaign, an ongoing fundraising effort to build its new headquarters in the Bay Area’s new Center for Architecture + Design. The organization also sought to fund expanded architecture-focused tours, exhibitions, educational programs, and events for people of all ages.

As a result, AIASF hosted on-site building tours to build excitement and engagement for the project from the architectural community and the public, and offered tour participants a 3D virtual model of the future Center. The integration of AR technology during the building tours allowed for a more interactive, transparent, immersive, and exciting way to visualize what the space will look like, even before construction has started.

“The power of the AR experience succeeded in inspiring donors to contribute much-needed construction funding for the project, as hoped for by the non-profit organizations. We were especially happy to see how the AR experience brought so much delight to the faces of the non-profit Board, the organization members, and members of the larger community,” said Dave Alpert, Geopogo CEO and Cofounder. 

“The AR model has allowed our project partners, Board members, potential donors, and community to experience the future Center first-hand and visualize the positive impact it will have on future generations,” agreed AIASF Executive Director, Stacy Williams.

For more information on Geopogo and its augmented reality solutions for the architecture and design industry, click here. For more information on Magic Leap and its AR hardware solutions, click here.

 




Qualcomm is trying to simplify app creation for AR glasses

The ultimate aim is to make AR more accessible. Ideally, developers will make apps directly available to you through mobile app stores, using glasses tethered to smartphones. You might not see Snapdragon Spaces used for stand-alone glasses, at least not at first.

The manufacturer support will be there. Spaces won’t be widely available until spring 2022, but Qualcomm has lined up partners like Lenovo (including Motorola), Oppo and Xiaomi. Carriers like T-Mobile and NTT DoCoMo will help build “5G experiences” using Spaces. Lenovo will be the first to make use of the technology, pairing its ThinkReality A3 glasses with an unnamed Motorola phone.

It’s too soon to know if Snapdragon Spaces will have a meaningful effect on AR. While this should streamline app work, that will only matter if there are both compelling projects and AR glasses people want to buy. This also won’t be much help for iPhone owners waiting on possible Apple AR devices. Efforts like this might lower some of the barriers, though, and it’s easy to see a flurry of AR software in the near future.

 




Extended Reality – Mixed Reality Versus Augmented Reality

Augmented Reality Defined

Augmented Reality is quickly making its way into a variety of settings. Retailers use it to help customers visualize a product before they buy it. Engineers turn to augmented reality as a way of accessing valuable information about a product without fumbling with physical manuals. With AR, users can embed or overlay elements of the digital world into the physical world.

Tools like Apple’s ARKit and Google’s ARCore even allow users to build their own immersive smartphone experiences. However, it is possible to further enhance AR experiences through devices like smart glasses. These overlay the digital content you need to see onto the real world in a much more immersive way, without requiring you to hold a phone in front of your face.

Mixed Reality Defined

Mixed Reality is a hybrid of AR and VR (virtual reality), though it goes further than AR when it comes to immersion. Through MR, virtual or digital content isn’t just overlaid onto the real world; it’s embedded in a way that lets users interact with it.

This form of MR is an advanced kind of AR, which makes the digital elements you bring into your environment feel more authentic and realistic. MR can have elements of both virtual and augmented reality within it. However, the major difference is that the focus is on blending everything together. You’re not entirely replacing an environment, or simply augmenting it with new content. Instead, you’re creating an entirely new reality by combining both the physical and digital environment.

Exploring AR and MR

There are numerous differences between AR and MR, but the biggest noticeable aspects are:

  • Device requirements – AR is usable on most smartphones or tablets, with the added option of specialist headsets. However, to provide a MR experience, more power and sensors are required.
  • Realistic interaction – AR offers limited interactivity with the virtualized elements. The computer-generated content can’t interact with the real-world elements users see.

It’s up to you whether to use AR or MR for your project. Each is suited to particular tasks. For many companies, augmented reality will be one of the easiest ways to enter the world of extended reality. The environment is accessible because you can create applications and tools that work on smartphones, as well as through smart glasses and headsets. However, as the technology available to us continues to evolve, Mixed Reality may also become more accessible.

Many leading companies are experimenting with MR already, though it’s still technically the youngest technology in the XR space.

In manufacturing, an important hurdle to overcome when trying to bring together several emerging technologies in one place is data connectivity. At the Manufacturing Technology Center (MTC) in the UK, they understand this issue all too well and are working to combat it using ATS Bus.

ATS Bus is a platform for their VIVAR (Virtual Instruction, Inspection and Verification using Augmented and/or Virtual Reality) project which investigates “how augmented and virtual reality could be used to enhance the operator experience when viewing work instructions and increase efficiency and accuracy for both instruction delivery and data capture.”

ATS Bus translates the work orders it receives into a standard data format and sends them down to the shop floor, where it translates them again into the format required by the Adv (Advanced Display Device) server.
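A data bus of this kind is essentially a translator sitting between systems that speak different formats. A minimal sketch of the idea follows; the field names and formats are invented for illustration and do not reflect ATS Bus’s actual schemas:

```python
def to_standard(erp_order):
    """Translate a (hypothetical) ERP work order into a neutral format."""
    return {
        "order_id": erp_order["OrderNo"],
        "product": erp_order["Material"],
        "quantity": erp_order["Qty"],
    }

def to_shop_floor(standard_order):
    """Translate the neutral format into what a display server expects."""
    return {
        "id": standard_order["order_id"],
        "instructions_for": standard_order["product"],
        "units": standard_order["quantity"],
    }

erp_order = {"OrderNo": "WO-1001", "Material": "Bracket-A", "Qty": 25}
shop_floor_order = to_shop_floor(to_standard(erp_order))
```

The value of the intermediate standard format is that adding a new source or destination system only requires one new translation, not one per pairing.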

You can read the original article on INFRASI’s website.




As the Metaverse & AR Mature, Will They Fall Into Tech’s Common Silos?

As the world of AR and the Metaverse matures, the ability for software and hardware products to integrate with one another becomes a huge factor in the adoption and use of these technologies.

Dan chats with Christine Perey, the founder and principal analyst of Perey Research & Consulting and founder of The AREA, on how history reflects tech’s tendency to embrace operational and hardware silos, and why siloed products cause significant inefficiencies and increase cost.

Abridged Thoughts:

“[Interoperability in the AR world] is the ability for components, software, hardware, services from any vendor, to be able to exchange data without the user needing to concern themselves with who made that part, and so it’s the ability for multiple vendors to combine parts and their customers also to be able to combine parts into new and unique ways and come up with new, innovative solutions that solve a specific problem.

And so the interoperability also allows the market to go to scale because you’re no longer going to be focusing only on one use case or only on one component of the whole system. You can take your component into many, many different pieces of hardware, for example, something I know a lot about, or software; you could take your content and deliver it on any browser, any player.”

– Christine Perey 

 




AR enables efficient remote support – XMReality

One of the best-known examples of AR technology is the popular mobile app Pokémon Go, which allows players to locate and capture Pokémon characters that appear in the real world. Beyond entertainment, augmented reality is also used in other areas, such as marketing, fashion, tourism, and retail.

Overall, the use of AR is growing as mobile devices that are powerful enough to handle AR software become more accessible around the world. However, AR is not a new invention. In fact, the first AR technology was developed back in 1968, when the Harvard computer scientist Ivan Sutherland created an AR head-mounted display system.

Following in Sutherland’s footsteps, university labs, companies, and national agencies developed AR for wearables and digital displays. But it was not until 2008 that the first commercial AR application was created by German agencies in Munich. They designed a printed magazine ad for a BMW Mini car. When held in front of a computer’s camera, the ad let the user control the car on the screen simply by manipulating the printed page.

Since then, one of the most successful uses of AR for commercial purposes has been the ability to try on products, such as clothes, jewelry, and even make-up, without having to leave your house. In addition, many tourism apps use AR technology to bring the past to life at historical sites. For example, at Pompeii in Italy, AR can project views of ancient civilizations over today’s ruins. Other examples include neurosurgeons using an AR projection of a 3D brain to aid them in surgeries and airport ground crews wearing AR glasses to see information about cargo containers. Needless to say, the potential of augmented reality is endless.

 

AR enables efficient remote support 

At XMReality, we have embraced augmented reality from the beginning. Founded in 2007 by researchers from the Swedish Defense Research Agency, our first project was to help bomb disposal experts defuse landmines in the field. For six years, we performed advanced contract research in AR for the Swedish Defense Materiel Administration and BAE Systems.

Though we continue to work and innovate in the defense sector, we have expanded to help other industries with our remote support solution, XMReality Remote Guidance. In remote support calls, you can use the AR feature Hands Overlay to guide your counterpart by overlaying your hand gestures on top of real-time video.

This is especially useful when you need to show someone how to turn a screw, explain what cord goes where, or provide other instructions where technical support is needed. And it comes in handy when you need both your hands to give instructions or guide someone through complex tasks.

The user-friendly software and AR technology enables you to improve operational efficiency and quality for processes like audits, maintenance, service, repair, training, and support at production sites, packaging, energy grids or properties. Find more information about how to use remote support in different industries here.

Don’t tell it, show it with AR

In a rapidly growing AR marketplace, we continue to develop the use of AR technology. To enhance the Hands Overlay experience, we have introduced additional hardware: the Pointpad.

Together with the Hands Overlay, the Pointpad is useful for experts in a helpdesk setup who are using XMReality Remote Guidance from a desktop computer or support station. It allows you to enhance hand gestures for clear instructions during everyday calls.

Imagine that you are a technician dealing with electricity substations: extremely complex industrial installations with myriad switchgear, screens, and interfaces. When you are restricted to voice-only support, you have to rely on the customer to explain what they see in front of them, and you must give them support while working blind.

By using XMReality and its AR technology, you can not only see exactly what the customer sees but also guide their hands with your own. This way you don’t have to trust the customer to explain everything just right, and you don’t need to remember every detail the customer has said, since you can continuously see it while you and the customer troubleshoot together. You also don’t need to worry about language barriers or having to phrase every instruction in the most easily understood way, since you can use your hands to show the customer what to do with their own. The reduced risk of misunderstandings, combined with faster trouble resolution, is a great way to achieve happier customers and more efficient processes.

You can read the original blog post by XMReality here.




Case Study of AR Technology: Hirschmann Automotive and RealWear

The Challenge

With seven factories worldwide, Hirschmann Automotive needed a more cost-effective and time-efficient knowledge-transfer approach to maintaining and repairing equipment than flying experts around the world.

“If something isn’t working properly at one of our plants, technicians have to call our headquarters in Austria. And even then, they might not be able to solve the problem. Then it becomes an issue of flying someone around the world to assess the problem in person.”

That’s when Fliri and his team looked at virtual and augmented reality solutions. Unfortunately, most devices were too delicate for the production plant environment — until Fliri discovered the RealWear HMT-1.

The Solution

Deploying RealWear running Cisco Webex Expert on Demand allowed Hirschmann Automotive to streamline collaboration and reduce equipment downtime.

The Results

  • Reduced travel needs and costs
  • Improved maintenance and repair response
  • Streamlined information accessibility and collaboration
  • Increased first-time fix rates
  • Shortened first-time resolution time

Hands-Free Use Case

  • Remote mentoring

Readers can download the case study for free on RealWear’s website.