3 Steps to Escaping Pilot Purgatory & Nailing Your Wearable Tech Pilot

Ever since I became involved in the wearable and immersive tech space, I’ve wondered how a digital revolution really gets underway in an organization. What goes on behind the scenes within organizations? What’s the best starting point? What are the most common mistakes made during the pilot phase? What should enterprises know before piloting or adopting wearables, and how can they avoid pilot purgatory? I spoke with Sanjay Jhawar, co-founder and president of RealWear, maker of the HMT-1 and HMT-1Z1, to get the inside scoop. Read on for best practice advice, pilot lessons, and steps to nailing a pilot:

In a May 2017 survey of companies exploring digital manufacturing strategies, 84% of respondents said they had been stuck in pilot mode for over a year, while fewer than 30% were beginning to scale (McKinsey & Co.). In another McKinsey report, 41% of industrial firms surveyed said they were in pilot limbo and 30% were still discussing how to start a pilot—that’s 71% stuck in pilot purgatory. Though these findings aren’t wearable tech-specific, a similar story holds across the industry spectrum—pilot purgatory remains a common dead end for companies pursuing wearable technologies like smart glasses and mixed reality headsets.

“Every sales cycle looks like this: Evaluation, pilot, deployment, scale-up. What’s exciting is that we have hundreds of evaluations and pilots, and a large handful now moving into full-scale, large deployments for their enterprises.” – Sanjay, RealWear

Though enterprise wearables are new tech, we’re beyond the first mover stage. At this point, there have been hundreds of pilots by early enterprise adopters for newcomers to learn from. Over the last several years, companies big and small in all areas of industry have tested wearables, making mistakes, establishing some best practices, and even making it to the rollout phase. Solution providers have also learned lessons. At RealWear, according to Sanjay, “the more pilots we do the faster they go.”  So, why do pilots fail? One root problem is the use case itself.

Step number one to nailing a pilot is finding a high-value, hole-in-one use case, and the best place to start is with those closest to the problem, i.e. real workers.


Step 1: Choose a viable use case

“The biggest pitfall is when there’s a customer [looking] for an AR wearable to solve a problem that may not exist. We’ve found that in the conservative world of industrial, pragmatic applications that provide value now as opposed to eye candy demos of AR are the way to go. When we get engaged with the operations, quality or training executive who owns the profit and loss for the specific problem, that’s when things go fast—solving for a specific pain point that yields measurable ROI.  We need to be talking to the executive that owns a seven-figure dollar problem that they must address in under 6 months.” – Sanjay, RealWear

Start simple by matching a known business problem or need to a wearable solution. To identify a “good” problem, you can, of course, look at past safety data, quality statistics, etc. to figure out where the business incurs the greatest risk of injury and profit loss; you should also brainstorm with actual end users by going out into the field or onto the factory floor and speaking with respected frontline workers.

Ask employees what tools and methods they use to access task-based information, get help from others, verify or record their work, and interact with customers on the job. Do they have any complaints about the tools they use? Have they come up with any makeshift solutions or hacks to speed up their work or make themselves more comfortable? When is vital information not at the ready or delivered to workers in an inconvenient, inefficient manner? Are you using the best training methods for a multigenerational, changing workforce? Try to pinpoint sources of error, fatigue and injury, paid travel and rework, downtime and customer dissatisfaction; and consider inserting a wearable. And if you have the resources, consider setting up a kind of hub for employees to try out new devices on their own.

Choosing a use case around a clear business problem will help you determine an appropriate wearable form factor and guide you to the right software partner. The enterprise wearable tech ecosystem has matured to the point where most hardware companies have multiple software partners and many software solutions are cross-device/platform. If working with a hardware provider like RealWear, consult with them to find a software match for your use case.


Step 2: Determine requirements

“[Our] type of customer, which is medium to heavy industrial, is very concerned about not violating any of their sacrosanct safety standards. We’ve also seen a heightened awareness in IT security…My biggest advice is to involve IT from the start, rather than hiding your project from IT in the hope that it will go faster…Try to understand and address IT’s objections as soon as possible, even if it takes a few months, because when IT has weighed in as an internal stakeholder, you’ll have IT pulling for you. Remember that wearables are part of IT’s jurisdiction as they’re connected to the enterprise.” – Sanjay, RealWear

Security reviews following software selection are often the biggest hold-up in the pilot phase. It’s important not to lose momentum, so engage with IT right away. Give them a sense of ownership, as Sanjay said, and they’ll work hard to make the solution compliant with the business’s needs. IT’s support will also be critical when it’s time to scale up down the road.

In this step, work with IT as well as EHS (Environmental Health & Safety) to determine all the operational factors you would need to account for in order to deploy the technology. This includes security as well as usability, safety, connectivity, mobile device management, and training. Sanjay perfectly summarizes the process of setting up a pilot: “It’s really to say if we had to deploy this headwear, how would we do it?”

How would you integrate the tech into existing processes, systems and facilities? Determine the limits and requirements of the workplace and use case, including:

  • How many devices you will need to test and where the funds will come from
  • Who will participate and what are their needs (comfort, safety, ease of use)
  • How you will measure the results (what KPIs you will track) and for how long
  • Any aspects of the work environment itself that might interfere with use
  • The scope of current MDM platforms and policies
  • Industry safety requirements

Given these factors, what needs to be addressed, worked around or changed before the pilot begins? A common workaround, for instance, has been to deploy a wearables-only wireless network when the existing network’s security protocols are incompatible with the new tech.
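To keep this step concrete, the requirements above can be treated as a go/no-go gate before the pilot starts. Below is a minimal sketch in Python of such a pre-pilot checklist; the requirement names and sign-offs are invented for illustration and don’t reflect any actual vendor tool:

```python
# Hypothetical pre-pilot readiness gate: every requirement from the
# checklist above must be signed off before the pilot begins.
PILOT_REQUIREMENTS = {
    "device_budget_approved": True,
    "participants_identified": True,
    "kpis_defined": True,
    "site_survey_done": False,      # e.g. Wi-Fi dead zones on the floor
    "mdm_policy_reviewed": True,
    "safety_signoff_ehs": True,
}

def pilot_ready(requirements: dict) -> tuple:
    """Return overall readiness plus the list of open items."""
    open_items = [name for name, done in requirements.items() if not done]
    return (len(open_items) == 0, open_items)

ready, blockers = pilot_ready(PILOT_REQUIREMENTS)
print("Ready to pilot:", ready)   # False until every item is closed
print("Open items:", blockers)    # ['site_survey_done']
```

The point of the structure is simply that each stakeholder (IT, EHS, operations) owns specific entries, so nothing is discovered mid-pilot.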


Step 3: Wrap it up in 6 months or less

“Time. Kills. All. Pilots. The longer it takes, the more risk that something will happen: The budget goes away, a shiny new object steals the focus, there’s an organizational change, or your sponsor changes roles or jobs. If it takes more than six months, it’s almost by definition not going to succeed. A successful pilot should take three months. What we recommend is to have entry and exit criteria defined and agreed in writing up-front while designing your pilot.” – Sanjay, RealWear

Before pressing “Go,” prepare to measure results and gather feedback. Work with all stakeholders to define the pilot objectives and agree on a method for measuring success. Prepare the workers involved, as well, clearly explaining to them the potential benefits, assuaging concerns, and providing a channel for honest feedback. Hopefully you chose a use case based on a problem the entire organization wants to solve.
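Entry and exit criteria agreed in writing, as Sanjay recommends, can be made mechanical so there’s no debate when the pilot ends. A minimal sketch, with hypothetical KPI names and thresholds chosen purely for illustration:

```python
# Hypothetical exit criteria agreed with stakeholders before the pilot.
# Each entry maps a KPI to the minimum improvement (as a fraction) required.
EXIT_CRITERIA = {
    "first_time_fix_rate": 0.10,   # at least 10% improvement
    "mean_task_time": 0.15,        # at least 15% reduction
}

def evaluate_pilot(baseline: dict, pilot: dict, criteria: dict) -> dict:
    """Compare pilot KPIs against baseline; positive change = improvement."""
    results = {}
    for kpi, threshold in criteria.items():
        if kpi == "mean_task_time":
            # lower is better for time-based KPIs
            change = (baseline[kpi] - pilot[kpi]) / baseline[kpi]
        else:
            change = (pilot[kpi] - baseline[kpi]) / baseline[kpi]
        results[kpi] = {"change": round(change, 3), "passed": change >= threshold}
    return results

baseline = {"first_time_fix_rate": 0.70, "mean_task_time": 42.0}
pilot    = {"first_time_fix_rate": 0.80, "mean_task_time": 33.6}
print(evaluate_pilot(baseline, pilot, EXIT_CRITERIA))
```

Agreeing on the formula itself up front (what counts as baseline, how long to measure) is as important as the numbers.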

Common pilot killers:

  • Not knowing what problem you’re trying to solve (going tech-first)
  • An overly complex use case
  • Unrealistic expectations
  • Lack of top management and IT support
  • Inadequate employee training on the devices
  • Too much time: You want a quick win to prove the business case and justify next steps

A successful pilot should expose security vulnerabilities and opportunities for improvement to work out and apply in the rollout phase. I asked Sanjay from RealWear if he could share any examples of improvements made to the HMT-1 as a result of pilot feedback:

“The core hardware hasn’t really changed, but the software and accessories have evolved. On the accessories side, as one example, we started out with a head strap to attach the device to your head and clips for different types of hardhats…We eventually came up with a succession of different baseball cap mounting options, but we didn’t have a way to accommodate an existing baseball cap without damaging it.” – Sanjay, RealWear

In that case, workers wanted to be able to use the HMT-1 with their own baseball caps, so RealWear had to innovate, figuring out a way to combine form, function and user preference. The company recently came up with a special clip that achieves this. In another example, Sanjay recalled customers having trouble entering Wi-Fi passwords with RealWear’s voice keyboard. While the voice tech was great for words and commands, it was less suited to entering secure, enterprise-standard passwords. In response, RealWear is preparing to release a new voice keyboard with a radically improved user experience for entering complex text. The company also built more functionality into its smartphone companion app, allowing users to enter a Wi-Fi SSID and password and generate a QR code that the HMT-1 scans with its camera right out of the box, on the very first power-up. “From using another device to configure, we’re moving towards a single sign-on in the Cloud, which will take away the need for passwords altogether. That has been a lot of learning from end users and customers.” – Sanjay, RealWear
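As an aside, QR-based Wi-Fi provisioning of the kind Sanjay describes is commonly built on the WIFI: payload convention popularized by the ZXing barcode project; RealWear’s exact implementation isn’t documented here, but a companion app could generate a payload along these lines:

```python
def wifi_qr_payload(ssid: str, password: str, auth: str = "WPA") -> str:
    """Build the conventional WIFI: QR-code payload string.

    Special characters (\\ ; , : ") must be backslash-escaped per the
    ZXing convention; a headset camera scanning the resulting QR code
    can then join the network without any on-device typing.
    """
    def esc(s: str) -> str:
        for ch in ('\\', ';', ',', ':', '"'):
            s = s.replace(ch, '\\' + ch)
        return s
    return f"WIFI:T:{auth};S:{esc(ssid)};P:{esc(password)};;"

print(wifi_qr_payload("PlantFloor-5G", "c0rrect;horse"))
# WIFI:T:WPA;S:PlantFloor-5G;P:c0rrect\;horse;;
```

The string would then be rendered as a QR image by any standard barcode library on the phone.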

 

About Sanjay Jhawar:

Sanjay Jhawar is Co-founder, President and Chief Product Officer at RealWear, makers of the world’s first head-mounted tablet computer, a wearable that completely frees the hands of industrial workers. Known as a strategist, innovator and leader for over 25 years, Sanjay has a deep product background in mobile devices, including smartphones and wearables, mobile SaaS cloud services, client apps, accessories and core network infrastructure. Prior to RealWear, Sanjay served on the senior executive teams at three tech startups:

  • VP/GM Solutions and Marketing at Sonim Technologies, maker of the world’s toughest mobile phones and smartphones for industrial and public safety users, a private company that quadrupled revenues to $115M over a 3-year period during Sanjay’s tenure
  • SVP Marketing and Product Management at BridgePort Networks, which invented the telecom technology that seamlessly hands voice calls and messages over between Wi-Fi and cellular networks
  • VP Marketing, Bus Dev and Product Management at Sendit AB in Sweden, a mobile email pioneer acquired by Microsoft in 1999 for $128M

Sanjay also product-managed the world’s first Java-based smartphone at Motorola and co-founded the WAP Forum, the standards body for the early mobile Internet. Sanjay started his career at IBM and has also spent time in venture capital in Milan and Boston, and in consulting. He holds a Master’s with Honors in Electrical Engineering from Cambridge University.

 

The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 5th annual EWTS will be held October 9-10, 2018 at The Fairmont in Austin, TX. For more details, please visit the conference website.


Augmented World Expo (AWE), the world’s #1 AR+VR conference and expo, comes to Munich, Germany on October 18-19, 2018. CXOs, designers, developers, futurists, analysts, investors and top press will gather at the MOC Exhibition Center to learn, inspire, partner and experience first-hand the most exciting industry of our times. Tickets now available at www.aweeu.com.

Over 20 Use Cases of Smart Glasses, VR Headsets, and Smartwatches at Airports

If going through airport security is a flyer’s biggest pain, then capacity is the airport manager’s living nightmare: Airports around the world today are hard-pressed to process more passengers and cargo than their terminals were originally designed to manage, and projected air traffic growth indicates no coming relief. Most American airports were built between the 1950s and 1970s. Take Chicago O’Hare International Airport: By 1975, O’Hare was the world’s busiest airport, handling 37 million passengers a year. In 2017, more than double that number – 79.8 million people – traveled through O’Hare, along with 1.9 million tons of cargo!

Capacity issues have led to a multibillion-dollar infrastructure crisis in the airport industry, not to mention lowered expectations on the part of airlines and airline passengers (airports’ two main customers). It’s not enough for the industry to work on quickly processing travelers and avoiding delays; improvements are needed across the end-to-end travel journey as well, including the terminal experience and the flight between destinations. The pressure is on for airports to invest in new technologies that improve the efficiency of airport processes and reduce service disruptions, thereby allowing passengers to spend less time in queues and more time enjoying airport facilities.


Ideas on the ground and on board:

The airport industry first began toying with wearable technology with the release of the original Google Glass in 2013. Early on, a number of airlines trialed smart glasses at the boarding gate and offered digital boarding passes for consumer smartwatches. More recently, the use of wearable augmented and virtual reality devices by airport and airline technicians to train, perform maintenance, and receive remote support has gained traction. Additionally, the growing popularity of AR and VR in architecture, engineering and construction has implications for the future of airport renovations and new airport design. Other ideas floating around look to a future in which travelers regularly use wearables and even lightweight smart glasses to receive real-time flight notifications, directions to their gate, and pre-flight shopping and dining promotions.

In IATA’s 2017 Global Passenger Survey, 85% of those surveyed indicated they would be willing to give up more personal data in exchange for faster process checks and more personalized service at the airport. As consumers become increasingly receptive to sharing wearable-generated biometric data and are exposed to augmented reality via smartphones, ideas like replacing traditional travel documents with personal wearables and implementing AR wayfinding in airports seem less and less far-fetched.

The history of wearable technologies in the airport industry:

From supporting airport ground operations with AR to in-flight VR entertainment, the airport industry has experimented with wearable technologies throughout the travel experience. In fact, airports and airlines gave us some of the earliest – and most imaginative – use cases of Google Glass Explorer Edition, arguably the device that set enterprise wearables in motion. Let’s look back:


Early trials:

Virgin Atlantic’s 2014 trial at London Heathrow Airport – in collaboration with SITA – included both Google Glass and the Sony SmartWatch 2. Staff at the airline’s premium entrance at Heathrow used the devices to view individual passenger information and real-time travel updates. This allowed agents to greet first-class passengers by name, process them quickly for their flights, and provide the most up-to-date travel information. The following year, Virgin partnered with Sony to equip its Heathrow engineers with the Sony SmartWatch 3 and Sony’s SmartEyeglass to test out real-time job notifications and live video streaming to remote expert technicians.

Around the same time, Vueling, Iberia, and Air Berlin launched smartwatch boarding passes for early Pebble and Samsung smartwatches. EasyJet and British Airways followed with Apple Watch apps allowing travelers to receive real-time flight updates and board their planes with just a flick of the wrist.

Japan Airlines made another early attempt to prove Google Glass in maintenance, with airline personnel wearing Glass on the tarmac at Honolulu Airport so that experienced staff at headquarters could inspect planes remotely. Airports got into the game as well, including Copenhagen Airport, where duty managers used Google Glass to document issues and answer travelers’ questions on the spot, and London City Airport, which considered how Glass might be leveraged in its operations. Allegiant Systems, a software company, also developed a proof of concept in which airline staff used Vuzix smart glasses to create a more personalized passenger experience. Scenarios included using the glasses at security, at the gate, and at the aircraft door to identify passengers (via facial recognition) and to view the preferences of frequent first-class fliers in the air.

While these trials made an early splash for wearables, most did not amount to full-blown adoption. This was especially true in the case of smartwatches. Smart glasses did, however, enable workers to keep eye contact and better engage with customers.


Later use cases:

By 2017, the idea of using smart glasses to improve airport processes no longer seemed so futuristic. That year, SITA worked with Helsinki Airport to explore visualizing airport operations with the Microsoft HoloLens. Using the feed from its Day of Operations software (already in use by Helsinki Airport), SITA reproduced the airport operational control center (AOCC) in mixed reality. This made for a new way of visualizing and analyzing the airport’s complex operational data (aircraft movement, retail analytics, etc.) to make decisions. It also allowed remote viewing of the AOCC in real time.

Along with delays, heavy commercial passenger and cargo traffic can produce unexpected changes in an airport’s operations that put its facilities to the test. Cincinnati/Northern Kentucky International Airport (CVG), which sees 6.7 million passengers a year, turned to wearable technology when quality metrics revealed that the state of the airport’s restrooms had a great impact on traveler satisfaction. In what became a successful use case, CVG installed counting sensors in its restrooms and gave housekeeping staff Samsung Gear S3 smartwatches running Hipaax’s TaskWatch platform. The sensor data helped to better direct staff resources: instead of following a standard cleaning schedule, housekeepers were notified in real time via smartwatch when a nearby restroom required attention.
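The pattern behind the CVG use case (a sensor count crosses a threshold and nearby staff get a smartwatch notification) is easy to sketch generically. The threshold and names below are invented for illustration and are not Hipaax’s actual logic:

```python
# Generic sketch of sensor-driven task dispatch, in the spirit of the
# CVG restroom use case: a per-location visit counter triggers a task
# once traffic since the last service crosses a threshold.
CLEANING_THRESHOLD = 150  # visits since last service (invented number)

class Location:
    def __init__(self, name: str):
        self.name = name
        self.visits_since_service = 0

    def record_visit(self):
        """Count one visit; return a dispatch message when service is due."""
        self.visits_since_service += 1
        if self.visits_since_service >= CLEANING_THRESHOLD:
            self.visits_since_service = 0
            return f"Dispatch: service needed at {self.name}"
        return None

restroom = Location("Concourse B restroom")
alerts = [m for _ in range(300) if (m := restroom.record_visit())]
print(alerts)  # two alerts over 300 simulated visits
```

In a real deployment the message would be pushed to the smartwatch of whichever staff member is closest, rather than printed.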

Out from behind the scenes in 2017, AR and VR began to make more public appearances in the airport industry. Heathrow Airport worked with Ads Reality to create an augmented reality app for entertaining and distracting children – some of them first-time travelers – during the long wait to board a flight. As an added benefit, tracking the triggering of the AR markers through the airport’s five terminals also tracked foot traffic, revealing busy areas where customer experience could be improved. Qantas Airways was actually the first to introduce VR headsets, partnering with Samsung in 2015 to bring the devices to select first-class cabins and lounges for travelers to virtually experience some of Australia’s greatest attractions (like the Great Barrier Reef). The airline has since released a multi-platform mobile app showing off Australia’s beautiful scenery, with the goal of inspiring consumers to book with Qantas.

Using VR as a sales tool has been popular at other airlines, too, including Lufthansa and KLM Royal Dutch Airlines, which offer VR experiences of destinations and the aircraft itself to encourage seat upgrades. The KLM Flight Upgrader is a VR experience that lets people on budget flights “pretend” to fly KLM, complete with an in-flight movie, a favorite newspaper, and a virtual meal served by a caring crew. Singapore Airlines, Etihad Airways and Finnair have also experimented with VR to show off their airplanes, cabin classes, and travel destinations. Very recently, Air New Zealand announced a partnership with Magic Leap and Framestore to develop MR content highlighting New Zealand as a travel destination. The airline has also trialed HoloLens for displaying key passenger information, like preferred meal choices and emotional state, to flight crew, and Google Pixel Buds Bluetooth earphones to help employees with live translation onboard and in the airport terminal.


Most recent

Late 2017 saw larger and more ambitious trials of wearable technologies at airports. Changi Airport, one of the busiest in Asia, announced plans to pilot 600 pairs of smart glasses among its staff to improve the accuracy, efficiency and safety of cargo and luggage handling. Using their cameras to read visual markers and labels on luggage and containers, the glasses project information like loading instructions on top of the user’s real-world view, shortening loading time by as much as 15 minutes. This creates a competitive advantage for Changi’s airline customers, while video streaming allows real-time monitoring of ramp handling operations.

Hamad International Airport signed a Memorandum of Understanding with SITA, providing a framework to trial biometrics together for seamless identity management across all key passenger touch points at the Doha airport, along with robotics, blockchain, AR and VR.

Though Copenhagen Airport was actually the first to provide an AR wayfinding tool back in 2013, Gatwick Airport installed 2,000 beacons to enable the same in 2017. At Gatwick, through which 45 million people travel every year, passengers can use their smartphones to view AR directions to wherever they need to go. Helping people navigate the airport prevents minor disruptions resulting in late departures and missed flights. It’s also the perfect use case for consumer AR glasses, allowing you to travel heads-up to your gate with your hands just on your luggage.

SkyLights, maker of immersive, cinematic in-flight entertainment (IFE), has content partnerships with the likes of 20th Century Fox, DreamWorks, and BBC. Last year, Air France and Corsair trialed SkyLights’ Bravo Evo VR headset in some business class cabins. In the spring of this year, Emirates and Etihad announced their own trials of the new Allosky headset in select first- and business-class lounges. Japan Airlines and JetFly have also tested the headset, which can store up to 40 high-def films, including five VR titles. Such VR entertainment could transform the cabin experience.

In the last few months, both Philippine Airlines and Lufthansa have revealed they’re using VR for training. Lufthansa is just the latest in the aviation world to consider VR for pilot training. The German airline already uses VR to teach flight attendants how to search the aircraft for foreign objects and is now seeking to keep up with the growing attrition rate among its 10,500 pilots. Philippine Airlines is applying the technology to cabin crew training, which, unlike flight simulators, has evolved very little over the years. The first batch of cabin crew trainees to use VR are now being deployed to select aircraft.


Future

Whereas the use cases for wearable technologies in industry – on the construction site, in the factory, etc. – are clear, consumer-facing industries like retail, financial services, and travel are less certain about how to go digital. There’s no shortage of experimentation: In the last five years alone, the airport industry has turned to wearables to make boarding more convenient, improve the in-flight experience, better understand airport operations in order to correlate events and manage staff, speed up flight inspection and turnaround, entice consumers to upgrade their travel, distract those waiting for flights, and more.

Wearable and immersive tech is accelerating across the industry, most recently popping up in air traffic control and even carving out new revenue streams, as in the case of First Airlines, the world’s first virtual airline, based in Ikebukuro. Actual consumer-facing use cases, however, are not really sticking, but what has been consistent from trial to trial ever since Virgin Atlantic’s gate agents first put on Google Glass is that feedback is largely positive—consumers generally support technology that will speed up and simplify the airport experience. IATA’s Global Passenger Survey confirmed this last year. Passengers may not be aware of wearable notifications flying across airport hubs, but they do notice when airline employees look them in the eye, know the answers to all their questions, and predict their beverage choice before the cart reaches their row.

Everything Enterprise XR Announced at AWE USA 2018

The scope of the Augmented World Expo is large to say the least—six tracks, a huge expo divided into pavilions, a Playground of entertaining immersive experiences, workshops, and more. As opposed to EWTS’ enterprise focus, AWE truly gathers everyone interested in defining and progressing the future of XR in every aspect of life; and BrainXchange was happy to partner with the show’s producers to help plan the industry event.

There were many announcements at the 9th AWE and some really cool tech on the expo floor (mixed reality backpack, anyone?). For our followers interested in the business and industrial applications of wearable XR technologies, we’ve separated enterprise from consumer in recapping the major developments (many still in beta) that came out of last week’s event:


Kopin

One of the most anticipated announcements was for the Kopin Golden-i Infinity: A compact and lightweight, gesture- and voice-controlled smart screen that attaches magnetically to turn any pair of suitable eyewear into an AR display. The Golden-i is powered by an Android or Windows mobile device – thereby offloading the heavy lifting – and can connect to apps using a USB-C cable. It’s intended for enterprise use and will arrive by the third quarter of this year at a price of around $899.


Qualcomm

Qualcomm revealed the Snapdragon XR1 Platform, the first chip made specifically for standalone XR devices. The new processor features special optimizations for better interactivity, power consumption and thermal efficiency, and could potentially reduce the cost of entry for new AR/VR hardware developers. Qualcomm also released a reference design that has already influenced forthcoming standalone devices from Vive, Meta, Vuzix and Pico.


Vuzix

In addition to taking the stage alongside Qualcomm to reveal the new Snapdragon XR1, Vuzix announced a partnership with Plessey Semiconductor and a shipping date of June 1st for the Blade AR Smart Glasses. Both partnerships will affect Vuzix’s next-gen smart glasses (expected in 2019) by increasing processing power and upgrading the display engine. During his keynote presentation, Lance Anderson also called on developers to help augmented reality move forward by creating practical and entertaining apps for the Vuzix Blade, the first fashion-friendly smart glasses for both work and play.


RealWear

AWE attendees were introduced to the HMT-1Z1, the first commercially available, ruggedized head-mounted AR computer certified for use in potentially explosive work environments (ATEX Zone 1 and C1/D1). The intrinsically safe wearable computer presents no ignition risk, allowing all workers to go hands-free and take advantage of the efficiency benefits of the HMD, and will ship on June 15th.


eSight

SPEX, a new division of eSight Corporation, showcased its first AR headset platform offering “breakthrough enhanced vision” in commercial, industrial and medical scenarios that require precision vision. The lightweight HMD has no release date as of yet but has been described as comfortable, providing an augmented view of the world without obstructing the user’s natural vision.


Atheer

Atheer announced the latest version of its AR platform, which includes secure group collaboration so that multiple remote experts can provide live video guidance and support across the supply chain (think of manufacturers with multiple suppliers). The company also widened the range of business processes supported by the Atheer AR Workflow Engine to include dynamic warehouse pick lists, contextual task guidance, checklists, linked workflows, surveys, and note-taking for seamless process documentation.


Epson

Epson released the Moverio AR SDK for its line of Moverio Smart Glasses, which adds new capabilities like 3D object tracking using CAD data and 2D image tracking to the former SDK. The update enables the creation of 3D content for Moverio glasses and can detect various objects from 3D CAD files (no need for QR codes or other markers) as well as track multiple 2D images on a 3D plane. Epson is accepting applications for beta testers to help identify bugs.


Kaaya Tech

Kaaya Tech’s HoloSuit, a motion-capture suit featuring haptic feedback for full immersion, was showcased at AWE. The suit comes in two models: a basic one with 26 sensors and a higher-end version with 36. Rather than games and entertainment, Kaaya Tech sees its technology being used in physical training simulations for industrial jobs, factory line work and the operation of heavy machinery.


ODG

ODG demonstrated a working model of an AR oxygen mask it has been developing with FedEx. The mask, named SAVED for Smoke Assured Vision Enhanced Display, has a heads-up AR display to help pilots make a safe landing despite smoke filling up the plane. In the near future, ODG plans to offer the technology to civil and commercial aircraft manufacturers and pilots as well as the military.


Scope AR

Scope AR debuted a new AR platform offering real-time remote assistance and augmented reality smart instructions. The all-in-one solution combines Scope AR’s video calling app, Remote AR, with its AR content creation library, WorkLink, to enable increased levels of collaboration and guidance.


Toshiba

At AWE, Toshiba demonstrated its dynaEdge AR Smart Glasses with two new applications resulting from recently announced partnerships with Applied Computer Services (ACS) and Ubimax. ACS’ Timer Pro Storyboard software for video training and the Ubimax Frontline application suite are now both available on the dynaEdge.


Meta

AWE attendees got a live, on-stage demo of the Meta Viewer, the first software application for the Meta 2 headset that lets users view 3D CAD models in AR. Currently in beta, the app promises to save time and reduce costs in the product development process—everyone in the development chain (designers, salespeople, etc.) will be able to use Meta Viewer to collaborate and interact with 3D designs without any special technical skills.


RE’FLEKT 

The company has added Sync – “the first software solution to automatically create edge-based tracking from CAD data” – to REFLEKT ONE, its suite of AR/MR app development tools. Sync is designed to further simplify the transformation of existing technical documentation and CAD data into AR/MR manuals and enterprise applications. With Sync, RE’FLEKT claims AR apps for maintenance, training and operations can be built completely in-house. Companies can save time and money and do not have to share their proprietary CAD and other data with a third party.

 


 



Augmented World Expo (AWE), the world’s #1 AR+VR conference and expo, comes to Munich, Germany on October 18-19, 2018. CXOs, designers, developers, futurists, analysts, investors and top press will gather at the MOC Exhibition Center to learn, inspire, partner and experience first-hand the most exciting industry of our times. Apply to exhibit, submit a talk proposal and buy Super Early Bird tickets now at www.aweeu.com.

Manufacturing 4.0: Checking In with Expert Peggy Gulick of AGCO

A true enterprise wearable tech pioneer, Peggy Gulick, Director of Digital Transformation, Global Manufacturing at AGCO Corporation, spearheaded one of the most successful use cases of Google Glass in enterprise to date. Where others saw challenges, Peggy and her team saw opportunities to turn a device that was then (2013) struggling to find a purpose into a powerful lean manufacturing tool. We last interviewed Peggy in July of 2016, before she first graced the EWTS stage. Since then, AGCO has become a poster child of Glass Enterprise, the second generation of Google Glass developed with the input of enterprise visionaries like Peggy; and Peggy herself has become a star speaker, her story undoubtedly inspiring many others. Below, Peggy answers our questions about the state of manufacturing today:

 

BrainXchange: What are the greatest challenges faced by manufacturers today?

PG: All manufacturers that I have spoken to seem to face similar challenges: rising employer costs (many related to healthcare) and the need to reduce operational costs while projecting longer-term strategic plans. In addition, the expectations that employees and surrounding communities place on employers have changed. Employees expect more from their employers, including a sense of purpose. Communities expect both social and environmental contributions.

In the midst of this, there is a gap between the qualified labor available and the high-tech skill sets required to meet new operational budgets and strategic plans to increase quality and reduce time and cost to market.

Automation, Industry 4.0, the Internet of Things and big data are all being touted as responses to these shared challenges, yet most organizations have not figured out how to incorporate them into current business processes. Although these new technologies can provide relief to manufacturers, they still face a perception challenge: they are seen as replacing rather than augmenting human workers.


BrainXchange: What are the effects of automation and big data in manufacturing?

PG: Currently, there are two types of companies benefitting from big data. The first is, of course, big data companies themselves, which provide everything from expanded infrastructure to the storage, management, processing and analytics of massive amounts of collected information. The second is the strategic few organizations that have found ways to incorporate the data into problem solving and to deliver the right information to the critical point of decision making. By treating big data and automation as dependent, collaborative solutions, both as drivers of continuous improvement and lean manufacturing processes, we have been able to determine the elements most likely to impact the outcomes that matter most to our product and process quality, productivity and safety. Big data, unless transformed into actionable information, is meaningless.


BrainXchange: Is AGCO experiencing a “skilled labor crunch”?

PG: Yes, but we are addressing it through investment in our employees, both current and potential (apprentices). Mechatronics, an assembly academy, scholarships and on-the-job training, combined with a work environment that allows employees to contribute and feel a sense of purpose, have allowed us to retain and recruit successfully. Our employees are motivated by the organization’s concern for quality products, quality processes and employee safety, not cost-reduced workforces.


BrainXchange: How might smart glasses and Augmented Reality help address some of the above challenges?

PG: Smart glasses and augmented reality have been deployed in our manufacturing operations to further our continuous improvement efforts across the site. The use of wearable technology helps eliminate motion, over-processing, defects and even transportation. Excessive travel to workstations to retrieve work instructions and bills of material is eliminated. Defects are minimized thanks to comprehensive (pictures, videos) and easy-to-access work instructions. Our plant makes highly complex, low-volume agricultural equipment, and wearable tools help minimize the over-processing caused by rework due to misguided assembly. When workers can do their jobs smarter, faster and safer, it resonates throughout the entire culture. As labor crunches take hold, it is more and more important for companies to offer the tools and training required to create, grow and retain their employees. Smart glasses have helped us do that.


BrainXchange: What tools do AGCO workers currently use to do their jobs? How are new workers currently trained?

PG: All of our assembly and assembly quality gate employees attend 40 hours of Assembly Academy followed by 40 hours of Lean Work-cell training. In addition to reading blueprints and interpreting supplemental information, assemblers must be proficient with hand, power and assembly tools. Since employees are now expected to use wearable tools, including smart eyewear (Google Glass), to access work instructions and quality checklists, wearable tools are introduced immediately in the learning academies. Wearable tools not only inform but also capture and flow pertinent information (including pictures, text and video) for non-conformance issues and missed thresholds.

It was critical to the success of wearables to acknowledge that not all employees are equal in training and skills. As employees’ skills mature in specific operations, our wearable applications allow for personalized levels of instruction, tailored based on algorithms of training and experience.

The wearable tools themselves are easy to implement and support. Most employees are excited to wear the technology and realize the benefits quickly.


BrainXchange: Where do you see the greatest opportunities for smart glasses in the manufacturing plant?

PG: Our product design team finds great value in virtual reality glasses. Not only do they broaden a team’s ability to “see” what others are thinking, but they also allow design teams to interact remotely, all in virtual glass, all seeing the same product and projected design strategies.

As a problem-solving organization and culture, we have weighed the value of wearable smart glasses in many areas, including welding, paint preparation, assembly, quality, technical services, material management and even plant tours. The first thing we have discovered is that the projected value of replacing current tools, whether paper work orders or terminal work instructions, with smart glasses is twice what we initially thought. The results have been so beneficial in some areas that we retested, thinking it was a mistake. It is important to note that every pilot we have conducted has been in response to a defined problem. And after 5 whys, fishbones and cross-functional involvement, sometimes even a kaizen, smart glasses become part of the proposed solution, with metrics attached. Knowing that smart glasses are a lean tool, and not an industry requirement or cool factor, we have reported a 30% reduction in processing times, a 50% reduction in the amount of time employees train on the job (new hire and cross-functional) and reductions in quality and safety incidents that we are still calculating. The greatest value for the glasses has been in assembly and quality, both needing easy and quick access to hands-free instructions. As a manufacturer of complexly configured products, we have discovered that training by smart glasses is the grand slam. New product launches, multi-operation training and new-hire training are easily administered and audited for success.


BrainXchange: How do smart glasses further lean manufacturing?

PG: Simple. Lean is all about waste elimination. Smart glasses, when implemented for the right reasons, reduce waste. The use of wearable solutions was discovered as we did what we do best every day: solve problems (4,873 problem solutions were implemented by employees in 2016).

Introducing Google Glass to our manufacturing floor was not about disruptive technology or even competitive advantage. The glasses were introduced as a solution to make employees’ jobs easier and safer while driving higher quality in our products and our processes. In the end, we have accomplished both.


We are delighted that Peggy will be speaking again at EWTS 2018 this October, and cannot wait to hear how AGCO’s Google Glass success story has progressed. 

 

The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.


Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.

 

Photo credit: Google X

Enterprise Wearables: What to Consider in Choosing Hardware

Watch Ramon Llamas, Research Manager at IDC, take leaders at General Dynamics Electric Boat, Gulfstream Aerospace, Duke Energy, Rogers-O’Brien Construction, and Atheer through the hardware considerations of wearables in enterprise. Whether used in designing submarines or to provide power and gas to millions of people, an enterprise wearable device has to suit both the environment in which it is used and the user or wearer in terms of ergonomics and performance.  

 

 

Key Learnings and Takeaways:

Hardware considerations are first and foremost in beginning one’s wearable journey in the enterprise, and often the customer is internal—a group of workers. Ramon does a great job of eliciting key words of advice from the panelists, like this takeaway from Ken Fast of General Dynamics Electric Boat: Things can take a long time to implement in a large company, so maintain a kind of childlike excitement about the technology.

Ken develops solutions to support those who design and build nuclear submarines at General Dynamics. He points out that employees who only need a few minutes of textual instruction before doing, say, half an hour of work don’t really need a heads-up display—a tablet is fine. But for employees requiring constant guidance, Augmented Reality glasses are desirable to feed them information at every step of a process. The shortcomings of current hardware options are significant here: while AR glasses would be ideal, many models are ill-fitting at the moment. You can imagine that if the information or data shown to the worker has to align precisely with the real world (e.g., installation information), glasses that slip or move around won’t do.

Drew Holbrook of Gulfstream Aerospace advises listeners to keep pushing through the roadblocks. In addition to looking at emerging technologies for engineering, marketing and training, Drew works with designers at Gulfstream to bring Virtual Reality tools to Gulfstream’s customers to help them visualize their private jets. To do that, the technology has to look as real as possible, with good resolution and color clarity.

Each aircraft Gulfstream sells is unique; the cabin configuration, interior design, materials, paint job, etc. are all selected by the customer. The concept is to take a VR headset to the client to let them experience the design. In this case, user experience and performance are important, for the client’s virtual experience will reflect upon Gulfstream’s brand. (Drew also mentioned device tracking: when implementing wearables into your operations, consider how you’ll track and maintain them once you’ve scaled up the solution. What happens when a device fails or is dropped?)

Don’t believe everything you see in videos until you try it yourself, warns Aleksandar Vukojevic. Having tested many HMDs at Duke Energy, Aleksandar has determined that hardware choice ultimately depends on the use case and what kind of information will be displayed. If remote communication is the main use case, for instance, factors like resolution and speed matter. The working environment presents its own barriers that affect hardware selection, especially in electric utilities, where jobs are dangerous, safety-rated glasses are required, and connectivity is an issue. Since electrical workers must keep metal objects a certain distance from power lines, the components in most smart eyewear currently rule out its use in the field. (Again, environment and user.)

Keep it simple is another piece of wisdom, this one from Todd Wynne of Rogers-O’Brien, who suggests focusing on making the wearer’s life easier and safer. For Todd, that person is the construction worker with a wrench in hand who needs easily consumable information to make quality decisions within a tight building schedule.

On a construction site, documentation and safety are critical. Building doesn’t stop for rain; there’s dust everywhere and things break easily. Workers need to stay hydrated and be constantly aware of their surroundings and movement. Key wearable hardware considerations, therefore, are the user interface and display (easy to use, glanceable) as well as ruggedness and form factor (non-intrusive). The device has to be invisible; workers should be able to forget about it.

A software partner can really help in the hardware evaluation process. Theo Goguely from Atheer recommends going about it systematically, creating a kind of matrix or graph of all possible devices on one axis and all use cases within your organization on another in order to find your sweet spots. There isn’t one best piece of hardware for a business—within an organization, different devices will be best for different use cases and the device won’t necessarily be a wearable.

It all comes down to where and by whom the technology is to be used: A $3,000 HoloLens headset isn’t necessary to pick a box off a warehouse shelf; a smaller monocular device is more appropriate. It’s not a specs race, so figure out the device that will give the “guy on the ground” just the information he needs. And if the technology isn’t there yet (e.g., you need it to be intrinsically safe), begin exploring your software options and training on a tablet if possible, or target a different use case in a less demanding environment while the hardware catches up to its potential.

 


 

Smart Glasses, AR, VR and MR: Head-Worn Devices in the Enterprise

Watch Picavi’s Johanna Bellenberg talk about head-worn devices with the very people implementing the technology at Walmart, GE Transportation, Gensler, USPS, and FM Global. The group shares the insights, “aha” moments, and limitations realized in implementing AR/VR glasses and headsets, and comes to a consensus on the value of these technologies, especially for employee training.

 

 

AR/VR is helping the Postal Service meet the demands of a changing digital world, in which its 20-year-old fleet of vehicles needs fixing and replacing and a growing number of part-time employees need fast training. Passing information from carrier to carrier via a physical book containing details of every route isn’t an efficient method, not with millions of delivery points each day. Using AR/VR for vehicle maintenance and to cut new-employee training time by 50% is what it takes to keep the Postal Service alive.

As there isn’t a solid use case yet for HMDs in the retail world, Walmart is using VR at its training academies to simulate exceptional customer experience problems you wouldn’t want to create in a real store and shopping events that only happen once a year. VR is ideal as you “can get multiple reps over and over.” For Walmart, how associates feel on the floor is important. While allowing them to be hands-free and heads-up in stores might help them engage more confidently with customers, VR training goes a long way towards increasing their confidence before they have to face shoppers.

FM Global, a commercial property risk insurer that counts one out of every three Fortune 1000 companies as a customer, is using AR for remote engineering surveys of client facilities and VR as a selling tool. If political restrictions make it difficult to send out a field engineer, FM Global sends a pair of smart glasses to the customer, having a remote expert guide the customer through the task. VR has also proven to be a compelling medium for convincing policyholders to take the proper measures in case of a flood or fire by showing them the potential damage.

At GE Transportation, training doesn’t always mean a brand new person needing to learn a brand new process, not when you’re dealing with 20,000 locomotive SKUs that ship all over the world. So, GE is using AR/VR to design and build kits of locomotive parts for operators, thinking through the presentation of these kits and how they align to manufacturing or service processes. From a plant layout perspective, VR is also incredibly useful for designing and planning operations. 

Finally at Gensler, visualization technologies are impacting how architects design and develop structures of every kind. The architecture and design firm is also considering how these tools will impact the places it designs as those buildings and environments mature. The environments we work in are increasingly contributing to the jobs we do, so Gensler is thinking about the future: AR/VR will influence the structures we design (not just help design them) because of the way they will fundamentally change how we consume information.

Just in Time: AR/VR Spark a Digital Renaissance in Aviation and Aerospace

About 20 years ago, Boeing, the world’s largest aerospace company, identified the need for a hands-free, heads-up technology in its operations. Flash forward to 2014, when a device fitting this vision (Google Glass) finally appeared on the scene. Today, the aviation and aerospace industries are experiencing a digital renaissance, and the timing is critical for several reasons:

Demand is high

Demand is being driven by two factors: 1) Rapidly aging fleets that need to be replaced or maintained at great cost; and 2) New, more technologically advanced aircraft needed to stay competitive. (Boeing, for one, has a backlog of some 5,000 planes it is under contract to build.) Next-generation aircraft boast features like advanced avionics, noise reduction capabilities, improved interior cabin designs, and greater fuel efficiency. Aviation and aerospace companies are under pressure to ramp up production to replace customers’ older fleets and supply them with state-of-the-art vehicles. And, of course, as demand for new aircraft rises so too does the need to operate and maintain those crafts.

A talent gap is creating a need for fast, low-cost training

As in pretty much all manufacturing sectors, the aviation and aerospace industries are dealing with a skilled labor crunch as experienced workers retire and leave the workforce, taking careers’ worth of knowledge with them. By some estimates, the aerospace industry will need to attract and train nearly 700,000 new maintenance technicians alone by 2035. Jobs are being created, and baby boomers retiring, faster than new workers can fill or replace them. Aerospace manufacturers and suppliers are therefore looking to innovative technologies to maximize the productivity of their existing workforces and quickly onboard new workers.

The stakes are high: Operations are complex, downtime is costly, safety is crucial, and the market is competitive

Building aircraft (commercial airplanes, military jets, spacecraft, etc.) and the engines and propulsion units that drive them involves extremely complex processes in which thousands of moving parts are assembled in precise order, carefully inspected, and maintained for years. Speed is desirable to meet demand and for competitive advantage, yet there can be no compromise or negligence when it comes to accuracy and safety—after all, we’re talking about aircraft that transport hundreds of passengers across oceans or even dodge enemy missiles at over 1,000 mph. Boeing, Airbus, Lockheed Martin and other large firms are all vying to sell to the U.S. Department of Defense, NASA and large airlines (the aviation, aerospace and defense industries’ biggest U.S. customers), so errors and downtime are, of course, expensive and bad for business, and can also greatly affect human lives.


To accelerate production, close the talent gap, reduce errors, limit downtime, and improve safety, the leading aviation and aerospace companies are employing wearable technology, especially smart (Augmented Reality) glasses. In general, smart glasses are good for complex industrial processes that are very hands-on, time-consuming, error-prone, and loaded with information—processes like wiring an electrical system or installing the cabin of an airplane. AR glasses and VR headsets are proving useful in aircraft assembly, quality and safety inspection, field maintenance and repair, and training. The technology is providing aviation and aerospace workers with instant, hands-free access to critical information, and reducing training requirements for technicians and operators alike. Here’s how some of the aerospace giants are applying wearable tech in their operations:

Airbus

In 2015, the French aerospace company teamed up with Accenture on a proof of concept in which technicians at Airbus’ Toulouse plant used industrial-grade smart glasses to reduce the complexity of the cabin furnishing process on the A330 final assembly line, decreasing the time required to complete the task and improving accuracy.

Without smart glasses, operators would have to go by complex drawings to mark the position of seats and other fittings on the cabin floor. With Augmented Reality, a task that required several people over several days can be completed by a single worker in a matter of hours, with millimeter precision and zero errors.

Airbus went ahead with this application: Technicians today use Vuzix smart glasses to bring up individual cabin plans, customization information and other AR items over their view of the cabin marking zone. The solution also validates each mark that is made, checking for accuracy and quality. The aerospace giant is looking to expand its use of smart glasses to other aircraft assembly lines (e.g., mounting flight equipment on the No. 2 A330neo) and other Airbus divisions.

Boeing

Every Boeing plane contains thousands of wires that connect its different electrical systems. Workers construct large portions of this wiring – “wire harnesses” – at a time—a seemingly monumental task demanding intense concentration. For years, they worked off PDF-based assembly instructions on laptops to locate the right wires and connect them in the right sequence. This requires shifting one’s hands and attention constantly between the harness being wired and the “roadmap” on the computer screen.

In 2016, Boeing carried out a Google Glass pilot with Upskill (then APX Labs), in which the company saw a 25% improvement in wire harness assembly performance. Today, Boeing is using smart glasses powered by Upskill’s Skylight platform to deliver heads-up, hands-free instructions to wire harness workers in real time, helping them work faster with an error rate of nearly zero. Technicians use gesture and voice commands to view the assembly roadmap for each order in their smart glasses display, access instructional videos, and receive remote expert assistance.

Boeing believes the technology could be used anywhere its workers rely on paper instructions, helping the company deliver planes faster. AR/VR are also significantly cutting training times and assisting with product development. For instance, HoloLens is proving useful in the development of Starliner, a small crew transport module for the ISS.

Boeing’s Brian Laughlin will lead a thought-provoking closing brainstorm on Day One of EWTS Fall 2017

GE Aviation

General Electric is using Augmented Reality and other IoT technologies in multiple areas of its far-ranging operations. At GE Aviation, mechanics recently tested a solution consisting of Upskill’s AR platform on Glass Enterprise Edition and a connected (WiFi-enabled) torque wrench.

The pilot involved 15 mechanics at GE Aviation’s Cincinnati manufacturing facility, each receiving step-by-step instructions and guiding visuals via Glass during routine engine assembly and maintenance tasks. At any step requiring the use of the smart wrench, the Skylight solution ensured the worker tightened the bolt properly, automatically verifying and recording every torqued nut in real time.

GE Aviation mechanics normally use paper- or computer-based instructions for tasks, and have to walk away from the job whenever they need to document their work. With smart glasses, workers were 8-12% more efficient, able to follow instructions in their line of sight and automatically document steps thanks to the device’s built-in camera. And reducing errors in assembly and maintenance saves GE and its customers millions of dollars.

Lockheed Martin

In early 2015 it emerged that Lockheed Martin was trialing Epson Moverio BT-200 glasses with partner NGRAIN to provide real-time visuals to its engineers during assembly of the company’s F-35 fighter jets and ensure every component is installed in the right place. Previously, only a team of experienced technicians could do the job; with Augmented Reality, an engineer with little training can follow renderings with part numbers and ordered instructions, seen as overlay images through his/her smart glasses, right on the plane being built.

In the trial, Lockheed engineers were able to work 30% faster and with 96% accuracy. Those workers were learning by doing on the job as opposed to training in a classroom environment, which amounted to less time and cost for training. And although increased accuracy means fewer repairs, the AR solution could be used to speed up the repair process, too, from days- to just hours-long, with one engineer annotating another’s field of view. At the time, however, Lockheed acknowledged that getting the technology onto actual (secured) military bases would be difficult.

Lockheed is also interested in Virtual Reality, seeing AR/VR as key to lowering acquisition costs (all costs from a ship’s design and construction to the vessel’s decommissioning). The company is applying VR to the design of radar systems for navy ships. The challenge lies in integrating the radar system with a ship’s other systems, which requires very precise installation. VR can help identify errors and issues during the design stage and prevent expensive corrections.

Using HTC Vive headsets, engineers can virtually walk through digital mock-ups of a ship’s control rooms and assess things like accessibility to equipment and lighting. Lockheed is also using Microsoft’s HoloLens to assist young naval engineers with maintenance tasks at sea—much more effective than a dense manual.

*Learn more about this application from Richard Rabbitz of Lockheed Martin Rotary Mission Systems (RMS) at EWTS Fall ‘17

Lockheed is reportedly saving $10 million a year from its use of AR/VR in the production line of its space assets as well, using devices like the Oculus Rift to evaluate human factors and catch engineering mistakes early. For the Orion Multi-Purpose Crew Vehicle and GPS III satellite system, Lockheed ran virtual simulations in which a team of engineers rehearsed assembling the vehicles in order to identify issues and improvements. A network platform allows engineers from all over to participate, saving the time and money of traveling.

Last but not least, Lockheed Martin is also actively developing and testing commercial industrial exoskeletons. Keith Maxwell, the Senior Product Manager of Exoskeleton Technologies at Lockheed, attested to this at the Spring 2017 EWTS. The FORTIS exoskeleton is an unpowered, lightweight suit, the arm of which – the Fortis Tool Arm – is available as a separate product for operating heavy power tools with less risk of muscle fatigue and injury.


While Augmented Reality has been around for decades in the form of pilots’ HMDs, only now has the technology advanced enough to become a standard tool of engineers, mechanics and aircraft operators across aviation and aerospace operations. In a high-tech industry like aerospace, AR/VR are critical for keeping up production during a mass talent exodus from the workforce. Workers won’t need years of experience to build a plane if they have on-demand access to instructions, reference materials, tutorials and expert help in their field of view.

 

The Fall Enterprise Wearable Technology Summit 2017, taking place October 18-19, 2017 in Boston, MA, is the leading event for wearable technology in enterprise. It is also the only true enterprise event in the wearables space, with speakers and audience members hailing from top enterprise organizations across the industry spectrum. Consisting of real-world case studies, engaging workshops, and expert-led panel discussions on topics such as enterprise applications for Augmented and Virtual Reality, head-mounted displays, and body-worn devices, plus key challenges, best practices, and more, EWTS is the best opportunity to hear and learn from organizations that have successfully utilized wearables in their operations.

How Your Business Can Prepare for an Augmented Reality Future

Whether you believe Apple’s latest announcements mark the arrival of mainstream Augmented Reality or still think mass use of AR is years away, smart (AR) glasses are the future. The question is how long we will hold onto our smartphones (and, yes, which device and/or platform will tip the technology in the consumer market’s favor).

Just as glasses are the ultimate form factor for workers in factories, out in the field, in the O.R., etc., heads-up and hands-free is ideal for consumers. The biggest problem with our phones is that we carry them everywhere and are constantly looking down at them. AR will not only provide better contextual information to enrich our daily lives; it will also revive an element of society that today can feel somewhat foreign compared to texting or email (especially to Millennials): face-to-face human interaction. (FaceTime doesn’t count.)

So why aren’t people more eager to free their hands and gaze from a hand-held screen? Smartwatches seem to have broken into the mainstream or are at least accepted by consumers. What is it about putting on a pair of glasses? It’s not just aesthetics and privacy concerns. In enterprise, you identify a problem in the workplace – some source of inefficiency – that AR can address; but when the work day is done, what is the problem that AR would fix, that would motivate us to finally give up our phones beyond sheer convenience or entertainment? I can only guess as it’s outside my area of expertise.

Nevertheless, one day AR glasses will be acceptable outside the workplace, and once that happens a whole new world of enterprise applications will open up—applications that depend upon consumers owning and wearing glasses and headsets, if not quite as constantly as they carry their smartphones now.

 

So, what can enterprises do in the meantime, while waiting for consumer AR glasses to take off?

1) Provide the experience for the customer or partner, like “HaaS” (hardware as a service) or an in-store demo. Some architects, realtors, automotive companies, major retailers and even airlines are already doing this, and some manufacturers are supplying customers with smart glasses to facilitate remote equipment troubleshooting and customer support.

2) Share the benefits of smart glasses with the customer/partner. For example: an HVAC worker wearing smart glasses on a job to let the customer see the problem or service in real time; a store salesperson doing the same to help an online shopper make a purchasing decision; a flight attendant viewing information about a passenger to provide better, more personalized service; doctors wearing glasses with patients; etc.

Or 3) Start with a mobile app or create a 360-degree video with the intent of making it heads-up in AR or VR in the future. While this can be very expensive (a 360˚ video can cost anywhere between $10,000 and $100,000 to produce, according to Forrester Research), it puts the organization in the best position to capitalize on these technologies in different form factors and environments down the road. Until then, the videos can be shared on social media, at pop-up events, on the company website, etc.

 

Some example use cases:

Hyundai

In dealerships across Australia, Hyundai has introduced the Hyundai AR Showroom app for the iPad, a sales tool for dealers to show car shoppers the built-in safety and performance features of the “all-new i30.”

The app, created by Auggd, allows the salesperson to demonstrate features of Hyundai’s reinvented hatchback that are normally difficult to explain in a showroom environment (without having multiple vehicles on the floor). By holding up an iPad in front of the real i30, shoppers can manipulate a 3D model overlay of the car; they can change its appearance and accessory options, and view animations of safety features like autonomous emergency braking and lane-keeping assist.

It seems Hyundai has been making an effort to get both its customers and representatives familiar with Augmented Reality. In early 2016, the South Korean automaker created an AR owner’s manual for some of its more popular models. The manual app and new Hyundai AR Showroom app could easily transition to glasses or a headset in the future for a more immersive and effective experience. These apps are also providing Hyundai with valuable consumer insights.

Wayfair

This Boston-based online furniture and home goods retailer envisions its customers one day shopping for Wayfair products at home using Mixed Reality headsets. In the meantime, the company’s R&D team Wayfair Next has created WayfairView, a mobile app that leverages Google’s Augmented Reality technology Tango along with Wayfair’s growing library of 3D product models. The app lets users view full-scale virtual models of furniture and décor in their homes with an AR-capable smartphone; they can look at items from multiple angles, see whether a piece of furniture will fit in a room, etc. before buying.

For over a year now, Wayfair has been visualizing millions of its home products in 3D. The models are currently used in the shopping app and on the company’s website but are ultimately destined for a headset.

*Mike Festa, Director of Wayfair Next, will speak at EWTS Fall 2017

Excedrin

Virtual Reality is a powerful storytelling medium, which is why it makes for great marketing as well as an effective job training tool. After the success of last year’s online “Migraine Experience” campaign, in which users could experience migraine symptoms like blurry vision and flashing lights through AR filters, Excedrin created “Excedrin Works,” a new VR video campaign shot from the point of view of real migraine sufferers at work.

The 2016 AR campaign saw close to 400,000 social engagements. The latest VR one is expected to be even more engaging, driving home the medication brand’s purpose and driving sales. By appealing to human emotions, Excedrin is hoping viewers will understand how crippling migraines can be and why its product is necessary.

The two VR videos, created with Weber Shandwick and Hogarth, can be found on Excedrin’s website and YouTube channel. To round out the campaign, the company is also running several documentary-style videos on TV and social media, and collaborating with race car driver Danica Patrick to share her history of migraines.

Tesco

The British supermarket chain has dropped a few hints that Virtual Reality is the future of shopping at Tesco. Way back in 2011, the company partnered with Cheil Worldwide to “open” a virtual supermarket in South Korea: An entire wall of a Korean subway station was made to appear like rows of shelves in a market, containing Tesco products with QR codes that commuters could scan to buy groceries on their phones. (After a long workday, it would be nice to get the food shopping done while waiting for your train—Tesco even arranged for deliveries to take place the same night.)

The subway experiment provided Tesco with insight for growing its business in South Korea. Around 2014, the grocery chain again used VR for R&D, aiming to improve its marketing and how it merchandised and reorganized stores. The company collaborated with Figure Digital on an Oculus Rift demo video called “Tesco Pelé” in which customers wearing VR headsets shop in a virtual supermarket, the layout of which represented an actual Tesco store design up for review. At the end of the simulation, the wearer steps onto a pro soccer field.

The possibilities here include, of course, virtual grocery shopping and consumer research; but the Pelé element (famous soccer player) suggests opportunities for corporate sponsorships, as well.

Lowe’s 

Like Wayfair, Lowe’s wants to be ready for the day when consumers use their own AR glasses and VR headsets. In Fall 2016, the home improvement chain debuted Lowe’s Vision, an app powered by Tango that lets customers measure any room in their homes and design it with virtual Lowe’s products using the Lenovo Phab 2 Pro phone.

In Spring 2017, Lowe’s began piloting Lowe’s Vision: In-Store Navigation, another Tango-powered app, in two of its stores. This second AR app makes it easier to shop for your home improvement project: Customers can use any Tango-enabled smartphone (or demo one with a sales associate) to search for products, read reviews, create shopping lists, and find the most efficient route to items throughout the store with the help of digital directions overlaid onto the real world.
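Lowe’s hasn’t published how the app computes that “most efficient route,” but the underlying problem is a small routing exercise over the shopper’s list. A minimal sketch of one common approach – a greedy nearest-neighbor ordering over item locations on a store grid – might look like this (the item names and coordinates are made up for illustration):

```python
# Hypothetical sketch of in-store route ordering; not Lowe's actual algorithm.
# Item positions are invented (aisle, bay) grid coordinates.

def manhattan(a, b):
    """Walking distance on a grid of aisles."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def pick_route(start, items):
    """Greedy nearest-neighbor route: repeatedly visit the closest
    remaining item. Fast and simple, though not always optimal."""
    route, here, remaining = [], start, dict(items)
    while remaining:
        name = min(remaining, key=lambda n: manhattan(here, remaining[n]))
        route.append(name)
        here = remaining.pop(name)
    return route

store_map = {"paint": (1, 4), "screws": (1, 1), "tile": (5, 2)}
print(pick_route((0, 0), store_map))  # → ['screws', 'paint', 'tile']
```

A production system would also weight aisle layouts and one-way paths, but the greedy ordering captures the basic idea of turning a shopping list into digital turn-by-turn directions.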

One of the first AR/VR ideas to come out of Lowe’s Innovation Labs was the Holoroom in 2014/15. Now available in select stores, it’s essentially an in-store how-to section where shoppers can put on the HTC Vive headset and practice home improvement projects like tiling a bathroom in virtual reality.

Lowe’s is onto something: exposing its customers to emerging technologies that carry over from their homes into actual Lowe’s stores, helping them with their home improvement projects from start to finish.

 

So how can your business prepare for an AR future? This is a time for innovation. Augmented and Virtual Reality represent new paradigms for sharing and taking in information. The same factors that make the technology ideal for workers – heads-up and hands-free, immersive, proven to be a superior learning method – can work for your customers and partners. Figure out their pain points just as you would in determining a great use case for your workforce: How might AR/VR make it easier or more appealing for consumers to interact with your brand, seek your services, or buy (and use) your product? Consider the scenario in which the business provides AR glasses for the customer/partner, as well as the future one in which consumers have access to their own devices. What can you do now to begin forming a bridge between those two scenarios?

 


 

photo credit: dronepicr Kölner Dom aus Lego Gamescom via photopin (license)

Why the Logistics Industry is Going Hands-Free

The logistics industry has been thinking hands-free for years now. In my research for this blog post, I came across an article from 2007 on the use of voice headsets and arm-mounted computers in the warehouse. More recently, ABI Research found that 61% of logistics companies it surveyed are adopting wearable technologies as part of their technology innovation strategy. In addition to logistics companies, enterprises in other verticals are using wearables within their warehouse or supply chain operations. Below are some of the top use cases:

 

DHL

The number one wearable use case in the logistics industry today is arguably vision picking with Augmented Reality glasses like Google Glass and the Vuzix M300. DHL has been exploring wearables with its customers and in different units of its business for several years. In 2014 with the help of Ubimax, DHL Supply Chain and DHL customer Ricoh carried out a successful vision picking pilot in a warehouse in the Netherlands.

For the pilot, staff went about their picking duties, taking cues from simple graphics and text displayed in smart glasses to navigate the warehouse and locate each pick. The glasses allowed for hands-free order picking, which sped up the picking process and reduced errors.

Using Ubimax’s vision picking solution, DHL and Ricoh realized a 25% efficiency increase over the course of the three-week trial. Exel, a unit of Deutsche Post DHL Group, achieved similar results the following year when it gave smart glasses to workers in two of its U.S. warehouses. In August 2016, DHL Supply Chain announced it was expanding its “Vision Picking Program,” with additional pilot sites established across Europe and the U.S.

In addition to picking and e-fulfillment, DHL sees potential in using AR and smart eyewear in other areas, including transportation, last mile delivery, and training of seasonal or temporary workers. In November 2016, Fujitsu announced a partnership with DHL Supply Chain UK to develop innovative services around wearable technology and the Internet of Things.

*Justin Ha, Director of Solutions Design at DHL Supply Chain, will be speaking at EWTS Fall 2017.

UPS

Way back in 2011, UPS adopted a wearable package-scanning system consisting of a ring scanner plus a small wrist- or hip-worn terminal, both by Motorola Solutions. The goal was to speed up package loading, prevent misloads, and improve package tracking and data reliability. UPS rolled out tens of thousands of these devices. Of course, today there is more sophisticated technology: Smart glasses, often paired with ring scanners (for items on very low or high-up shelves), are the new wearable scanning system and the new interface for logistics software.

In 2015, it was reported that UPS was testing smart glass technology to reduce the amount of labeling on packages. Instead of two labels on every package (an address label and a second label identifying the delivery route and truck), a single barcoded address label could be used that – when scanned with Google Glass – would tell the package sorter the box’s destination. This simplifies the job and keeps workers’ hands free.

Currently, UPS is developing and rolling out a Virtual Reality driver-training program at nine of its training facilities to simulate the uncertainties and challenges of city driving. Wearing an HTC Vive or other VR headset, students move through a virtual streetscape, using voice commands to identify road hazards. The VR training modules are designed for package delivery drivers, but in the future UPS plans to expand the technology’s use to tractor-trailer drivers.

FedEx 

Every second counts when you handle millions of packages a day, which is why the shipping giants were early adopters of wireless technologies and why they continue to pursue the latest in mobile—for the opportunity to shave off seconds from the delivery process.

Since 2000, FedEx parcel handlers have been equipped with ring scanners connected via Bluetooth to a device worn on the forearm. Similar to the system used at UPS, the wireless solution scanned each package and provided tactile feedback when a parcel was placed in the wrong container.

On top of supply chain efficiency, FedEx is also interested in wearable technology for overall workforce safety. Its aircraft have been equipped with heads-up displays (HUDs) to improve pilots’ situational awareness during night flights and bad weather, and the logistics giant is exploring wearable wellness monitoring.

Crane Worldwide Logistics

From faster picking to better posture: Crane Worldwide Logistics, a large third-party logistics company, tried out a wearable device by KINETIC to reduce the number of ergonomic injuries among its workforce.

Back injuries, strains and sprains are the most frequent and costly injuries in warehouses and other industrial workplaces. REFLEX is a discreet wearable worn on one’s belt or waistband that automatically detects unsafe postures, providing instant feedback to the wearer whenever a high-risk motion occurs. In so doing, the solution helps teach workers how to move safely or use “good biomechanics” on the job.

Using REFLEX, Crane was able to reduce the number of unsafe postures at its Houston, TX distribution facility (where the KINETIC pilot took place) by 84%. The “most improved” worker saw a 96% reduction, from 320 bad postures in a day to just 12.
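Those two figures are easy to reconcile: the per-worker number follows directly from the before-and-after counts reported for the pilot. A quick check of the arithmetic:

```python
# Sanity-checking the reported reduction for the "most improved" worker
before, after = 320, 12           # unsafe postures per day, before vs. after
reduction = (before - after) / before
print(round(reduction * 100, 2))  # → 96.25, i.e. the ~96% reported
```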

Bechtle 

Bechtle is one of Europe’s leading IT service providers. In January 2016, after extensive piloting, the company announced the deployment of Vuzix M100 Smart Glasses for vision picking at its distribution center in Neckarsulm, Germany.

Warehouse employees began using smart glasses running the mobile SAP AR Warehouse Picker app and connected to Bechtle’s WMS as an alternative to handheld scanners in select picking processes. The hands-free solution, featuring QR code scanning and voice recognition technology, guided the wearer through the picking process step-by-step without the need for any manual input of information.
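The hands-free flow described above – scan, confirm, move to the next pick – is essentially a small event loop driven by QR code scans instead of typed input. A minimal sketch of that logic (the function and data here are illustrative; this is not SAP’s actual AR Warehouse Picker API):

```python
# Illustrative pick-by-vision loop: each pick is confirmed by scanning the
# bin's QR code, so the worker never enters data manually. Hypothetical code,
# not a real SAP or Vuzix interface.

def run_pick_list(pick_list, scan_events):
    """Walk a pick list; a pick completes only when the scanned QR code
    matches the expected bin. Returns (completed items, wrong-bin scans)."""
    scans = iter(scan_events)
    completed, errors = [], 0
    for item, expected_bin in pick_list:
        while True:
            scanned = next(scans)
            if scanned == expected_bin:
                completed.append(item)  # glasses would show "pick OK, next item"
                break
            errors += 1  # glasses would show a "wrong bin" warning

    return completed, errors

picks = [("headset", "BIN-07"), ("cable", "BIN-12")]
scans = ["BIN-07", "BIN-11", "BIN-12"]  # one wrong-bin scan in between
print(run_pick_list(picks, scans))  # → (['headset', 'cable'], 1)
```

The point of the design is that the confirmation step doubles as error-proofing: a wrong bin is caught at scan time, before the item ever leaves the shelf.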

This was the first of many potential use cases for smart glasses that Bechtle intended to pursue. The company believed the benefits of Augmented Reality could be reaped most quickly when applied to a simple, labor-intensive process like small-parts picking, and it planned to expand the use of wearables to additional workflows in receiving, complex delivery orders and more.

Sennheiser 

In November 2016, global service provider Arvato partnered with Picavi to launch a vision picking project for audio company Sennheiser. For the purposes of the pilot, a separate pick process was identified in order to evaluate Picavi’s Pick-by-Vision solution in a controlled environment.

Initial feedback from warehouse employees was positive. Having all essential task-based information displayed in front of their eyes through smart glasses allowed pickers to keep both hands on the job, which minimized errors and helped them stack the pallets faster. Workers also found the new pick solution intuitive to use and comfortable to wear while moving around the warehouse.

 

It seems a consensus has been reached after all these vision picking pilots: smart glasses are setting a new bar in the classic order picking process. Augmented Reality has proved superior to basic handheld scanners and tiring voice-picking systems.

Beyond order picking, AR glasses can replace traditional tools in receiving, packing, shipping and replenishment–all areas of the warehouse or distribution center. A wearable device could conceivably “accompany” a package from the moment an order is received to the moment it’s loaded onto the truck for delivery, ensuring a smooth and accurate flow of goods all along the supply chain as well as the safety of all pickers, packers, drivers and other package handlers.

 


 

photo credit: vic_206 DHL Air / Airbus A300B4-622R(F) / EI-OZM via photopin (license)

The Inevitable Rise of Google Glass 2.0

The use cases mentioned in Wired’s breaking story about Google Glass 2.0 are supreme examples of Google Glass’ success in the workplace. AGCO, Boeing, DHL and GE are certainly major companies validating the benefits of Glass to enterprise. Their stories have been shared here on EnterpriseWear as well as at every Enterprise Wearable Technology Summit.

(See Wearables in Manufacturing: Interview with AGCO’s Peggy Gulick; Wearables in Industry: Interview with GE’s Sam Murley; and Wearables in Logistics: Now or Later?)

But there have been numerous use cases by big and small companies alike since Glass made its ill-fated consumer debut. Not all of those early explorations were developed further; some of the first experiments were simply small, short pilots that were subsequently dropped because the tech wasn’t ready or because the company lacked the resources, connections or patience of a Boeing or GE. But it was those cases that taught Google a big lesson, encouraging the company to direct its attention to the enterprise and get to work on what would ultimately become Google Glass Enterprise Edition.

While companies like GE and Boeing have been clandestinely using Google Glass EE for a while now, it’s worth looking back at some of the earliest – and incredibly imaginative – test runs of Google Glass Explorer Edition:

Airports & Airlines

  • In one of the most publicized early trials, Virgin Atlantic agents at London’s Heathrow Airport used Google Glass to process first-class passengers for their flights while maintaining eye contact with them.
  • At Copenhagen Airport, the device was used by airport duty managers to document issues and answer travelers’ questions on the spot.
  • Japan Airlines had personnel on the tarmac at Honolulu Airport wear Glass so that staff at headquarters could perform remote visual inspections of planes and send instructions.

Doctors

  • Dr. Rafael Grossmann was the first to use Google Glass during live surgery.
  • Glass was tested at Stanford University Medical Center to guide residents through surgery, at UC San Francisco to broadcast surgeries for faculty and students to watch, and at UC Irvine Medical Center to monitor anesthesia residents.
  • At Beth Israel Deaconess Medical Center, four ER doctors used the glasses in lieu of tablets to get real-time clinical information.
  • Dr. Peter Chai used the technology in the emergency department to facilitate remote consultations in dermatological cases.
  • Several physicians and administrators at Mayo Clinic tested Glass in different specialties and departments for viewing patient info, documenting injuries, and learning.
  • Indiana University Health Methodist Hospital used Glass as an aid in a tumor removal and abdominal wall reconstruction procedure. IU Health’s Paul Szotek also livestreamed a hernia repair with the device.
  • Chicago-based MedEx had its paramedics use Glass to communicate with specialists from the ambulance and show ER doctors the status of incoming patients in real time.

*Dr. Szotek will talk about his experiences since that first livestream at the Fall 2017 Enterprise Wearable Technology Summit.

Rogers-O’Brien Construction

  • The Texas-based general contractor used Google Glass to capture, share and collaborate on jobsite information hands-free. It was an early foray for the company, which has since experimented and adopted all kinds of emerging technologies including VR headsets and partial exoskeletons.

*Todd Wynne and Joe Williams of Rogers-O’Brien are also speaking at the fall event.

Car Companies

  • In a pilot project at one of BMW’s U.S. plants, Google Glass was tested for quality assurance, used by workers to document potential defects and improve communication between the quality testers and development engineers.
  • GM experimented with the device in quality inspection and as a tool for viewing procedural instructions. 

Food Industry

  • Several restaurant chains have tested Glass for training purposes: KFC tried out the device to record tutorials and play them back for new recruits. Similarly, Capriotti’s Sandwich Shop used Glass to record new workers’ performance and the lunchtime rush, hoping to spot areas for improvement.

QAD

  • The global ERP software company used short video interviews recorded with Glass to introduce new employees to team members outside of the corporate office.

Las Vegas Air Conditioning

  • The HVAC company was one of the first to have its technicians wear Google Glass on jobs, to live stream their work for the customer to see.

Sullivan Solar Power

  • The Southern California company’s field technicians wore Glass to safely (hands-free) view specs and plans while installing solar panels atop homes and businesses.

Schlumberger

  • The oilfield service company tried out 30 pairs of Google Glass to provide hands-free intelligence to workers in the field, improving their safety and efficiency.

Active Ants

  • Stock pickers at the Dutch e-fulfillment company were able to reduce their error rate by 12% and increase their speed by 15% using Glass.

San Francisco’s de Young Museum

  • One of the first museums to integrate Google Glass into an art exhibit: Visitors used the tech to gain more insight into the artist and featured works in de Young’s 2014/15 Keith Haring show.

Fennemore Craig (now Lamber Goodnow)

  • Two attorneys at the personal injury law firm used Google Glass to win cases, loaning the device to clients so they could record a day in their lives post-injury.

 

Find out just how much Google Glass has progressed – both the hardware and applications – since those early days at the upcoming Fall 2017 Enterprise Wearable Technology Summit, where real end-users will speak about their “secret” deployments of the technology.

 

photo credit: jurvetson Sergey Brin Solves for X via photopin (license)