Everything Enterprise XR Announced at AWE USA 2018

The scope of the Augmented World Expo is large to say the least—six tracks, a huge expo divided into pavilions, a Playground of entertaining immersive experiences, workshops, and more. As opposed to EWTS’ enterprise focus, AWE truly gathers everyone interested in defining and progressing the future of XR in every aspect of life; and BrainXchange was happy to partner with the show’s producers to help plan the industry event.

There were many announcements at the 9th AWE and some really cool tech on the expo floor (mixed reality backpack, anyone?). For our followers interested in the business and industrial applications of wearable XR technologies, we’ve separated enterprise from consumer in recapping the major developments (many still in beta) that came out of last week’s event:


One of the most anticipated announcements was for the Kopin Golden-i Infinity: a compact, lightweight, gesture- and voice-controlled smart screen that attaches magnetically to turn any pair of suitable eyewear into an AR display. The Golden-i is powered by an Android or Windows mobile device – thereby offloading the heavy lifting – and connects to apps via a USB-C cable. It’s intended for enterprise use and will arrive by the third quarter of this year at a price of around $899.


Qualcomm revealed the Snapdragon XR1 Platform, the first chip designed specifically for standalone XR devices. The new processor features special optimizations for better interactivity, power consumption and thermal efficiency, and could potentially reduce the cost of entry for new AR/VR hardware developers. Qualcomm also released a reference design that has already influenced forthcoming standalone devices from VIVE, Meta, Vuzix and Pico.


In addition to taking the stage alongside Qualcomm to reveal the new Snapdragon XR1, Vuzix announced a partnership with Plessey Semiconductor and a shipping date of June 1st for the Blade AR Smart Glasses. Both the Qualcomm and Plessey partnerships will shape Vuzix’s next-gen smart glasses (expected in 2019) by increasing processing power and upgrading the display engine. During his keynote presentation, Lance Anderson also called on developers to help augmented reality move forward by creating practical and entertaining apps for the Vuzix Blade, the first fashion-friendly smart glasses for both work and play.


AWE attendees were introduced to RealWear’s HMT-1Z1, the first commercially available, ruggedized head-mounted AR computer certified for use in potentially explosive work environments (ATEX Zone 1 and C1/D1). The intrinsically safe wearable computer presents no ignition risk, allowing workers in hazardous areas to go hands-free and take advantage of the efficiency benefits of an HMD. It will ship on June 15th.


SPEX, a new division of eSight Corporation, showcased its first AR headset platform offering “breakthrough enhanced vision” in commercial, industrial and medical scenarios that require precision vision. The lightweight HMD has no release date as of yet but has been described as comfortable, providing an augmented view of the world without obstructing the user’s natural vision.


Atheer announced the latest version of its AR platform, which includes secure group collaboration so that multiple remote experts can provide live video guidance and support across the supply chain (think of manufacturers with multiple suppliers). The company also widened the range of business processes supported by the Atheer AR Workflow Engine to include dynamic warehouse pick lists, contextual task guidance, checklists, link workflows, surveys, and note-taking for seamless process documentation.


Epson released the Moverio AR SDK for its line of Moverio Smart Glasses, adding new capabilities like 3D object tracking using CAD data and 2D image tracking to the previous SDK. The update enables the creation of 3D content for Moverio glasses and can detect objects from 3D CAD files (no need for QR codes or other markers) as well as track multiple 2D images on a 3D plane. Epson is accepting applications for beta testers to help identify bugs.

Kaaya Tech

Kaaya Tech’s HoloSuit, a motion capture suit featuring haptic feedback for full immersion, was showcased at AWE. The MoCap suit comes in two models: a basic one with 26 sensors and a higher-end version with 36. Beyond games and entertainment, Kaaya Tech sees its technology being used in physical training simulations for industrial jobs, factory line work and the operation of heavy machinery.


ODG demonstrated a working model of an AR oxygen mask it has been developing with FedEx. The mask, named SAVED for Smoke Assured Vision Enhanced Display, has a heads-up AR display to help pilots make a safe landing despite smoke filling up the plane. In the near future, ODG plans to offer the technology to civil and commercial aircraft manufacturers and pilots as well as the military.


Scope AR debuted a new AR platform offering real-time remote assistance and augmented reality smart instructions. The all-in-one solution combines Scope AR’s video calling app Remote AR with its AR content creation library WorkLink to enable greater collaboration and guidance.


At AWE, Toshiba demonstrated its dynaEdge AR Smart Glasses with two new applications resulting from recently-announced partnerships with Applied Computer Services (ACS) and Ubimax. ACS’ Timer Pro Storyboard software for video training and the Ubimax Frontline application suite are now both available on the dynaEdge.


AWE attendees got a live, on-stage demo of the Meta Viewer, the first software application for the Meta 2 headset that lets users view 3D CAD models in AR. Currently in beta, the app promises to save time and reduce costs in the product development process—everyone in the development chain (designers, salespeople, etc.) will be able to use Meta Viewer to collaborate and interact with 3D designs without any special technical skills.


RE’FLEKT has added Sync – “the first software solution to automatically create edge-based tracking from CAD data” – to REFLEKT ONE, its suite of AR/MR app development tools. Sync is designed to further simplify the transformation of existing technical documentation and CAD data into AR/MR manuals and enterprise applications. With Sync, RE’FLEKT claims, AR apps for maintenance, training and operations can be built completely in-house. Companies can save time and money and do not have to share their proprietary CAD and other data with a third party.


Image source: Wareable


The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 5th annual EWTS will be held October 9-10, 2018 at The Fairmont in Austin, TX. For more details, please visit the conference website or download the EWTS 2018 Brochure.

Augmented World Expo (AWE), the world’s #1 AR+VR conference and expo, comes to Munich, Germany on October 18-19, 2018. CXOs, designers, developers, futurists, analysts, investors and top press will gather at the MOC Exhibition Center to learn, inspire, partner and experience first-hand the most exciting industry of our times. Apply to exhibit, submit a talk proposal and buy Super Early Bird tickets now at www.aweeu.com.

Manufacturing 4.0: Checking In with Expert Peggy Gulick of AGCO

A true enterprise wearable tech pioneer, Peggy Gulick, Director of Digital Transformation, Global Manufacturing at AGCO Corporation, spearheaded one of the most successful use cases of Google Glass in enterprise to date. Where others saw challenges, Peggy and her team saw opportunities to turn a device that was then (2013) struggling to find a purpose into a powerful lean manufacturing tool. We last interviewed Peggy in July of 2016, before she first graced the EWTS stage. Since then, AGCO has become a poster child of Glass Enterprise, the second generation of Google Glass developed with the input of enterprise visionaries like Peggy; and Peggy herself has become a star speaker, her story undoubtedly inspiring many others. Below, Peggy answers our questions about the state of manufacturing today:


BrainXchange: What are the greatest challenges faced by manufacturers today?

PG: All manufacturers I have spoken to seem to face similar challenges: rising employer costs (many related to healthcare) and the need to reduce operational costs while projecting longer-term strategic plans. In addition, the expectations placed on employers by employees and the communities in which they operate have changed. Employees expect more from their employers, including a sense of purpose. Communities expect both social and environmental contribution.

In the midst of this, there is a gap between the available qualified labor and the high-tech skill sets required to meet new operational budgets and strategic plans to increase quality and reduce time and cost to market.

Automation, industrial revolution 4.0, Internet of Things and big data are all being touted as responses to these shared challenges, yet most organizations have not figured out how to incorporate them into current business processes. Although these new technologies can provide relief to manufacturers, they continue to face perception challenges, being seen as replacing rather than augmenting human workers.

BrainXchange: What are the effects of automation and big data in manufacturing?

PG: Currently, there are two types of companies benefitting from big data. One is, of course, the big data companies themselves, whose offerings range from expanded infrastructure to the storage, management, processing and analytics of massive amounts of collected information. The second is the strategic few organizations that have found ways to incorporate the data into problem solving and to deliver the right information to the critical point of decision making. By treating big data and automation as dependent and collaborative solutions, both as drivers of continuous improvement and lean manufacturing processes, we have been able to determine the elements most likely to impact the outcomes that matter most – to our product and process quality, productivity and safety. Big data, unless transformed into actionable information, is meaningless.

BrainXchange: Is AGCO experiencing a “skilled labor crunch?”

PG: Yes, but we are addressing it through investment in our employees, both current and potential (apprentices). Mechatronics, assembly academy, scholarships and on the job training combined with a work environment that allows employees to contribute and feel a sense of purpose has allowed us to retain and recruit successfully. Our employees are motivated by the organization’s concern for quality products/processes and employee safety, not cost-reduced workforces.

BrainXchange: How might smart glasses and Augmented Reality help address some of the above challenges?

PG: Smart glasses and augmented reality have been deployed in our manufacturing operations to further our continuous improvement efforts across the site. The use of wearable technology helps eliminate motion, over-processing, defects and even transportation. Excessive travel to workstations to retrieve work instructions and bills of material is eliminated. Defects are minimized thanks to comprehensive (pictures, videos) and easy-to-access work instructions. Our plant makes highly complex, low-volume agricultural equipment. Wearable tools help minimize over-processing caused by the need to rework due to misguided assembly. When workers can do their job smarter, faster and safer, it resonates throughout the entire culture. As labor crunches set in, it is more and more important for companies to offer the tools and training required to create, grow and retain their employees. Smart glasses have helped us do that.

BrainXchange: What tools do AGCO workers currently use to do their jobs? How are new workers currently trained?

PG: All of our assembly and assembly quality gate employees attend 40 hours of Assembly Academy followed by 40 hours of Lean Work-cell training. In addition to reading blueprints and interpreting supplemental information, assemblers must be proficient with hand, power and assembly tools. Since employees are now expected to use wearable tools including smart eyewear (Google Glass) to access work instructions and quality checklists, wearable tools are introduced immediately in the learning academies. Wearable tools not only inform but also capture and flow pertinent information (including pictures, text and video) for non-conformance issues and missed thresholds.

It was critical to the success of wearables to acknowledge that all employees are not equal in training and skills. As employees’ skills mature, specific to operations, our wearable applications allow for personalized levels of instructions, tailoring them based on algorithms of training and experience.
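The tiered-instruction idea Peggy describes can be sketched in a few lines of Python. This is purely an illustrative sketch, not AGCO's actual logic: the function name, thresholds and tier labels are all assumptions.

```python
# Hypothetical sketch of skill-tailored work instructions: the level of
# detail shown in the glasses is derived from a worker's training hours
# and experience with the operation. Thresholds are invented.

def instruction_level(certified_hours: int, operations_completed: int) -> str:
    """Pick an instruction tier for a given worker and operation."""
    if certified_hours < 40 or operations_completed < 10:
        return "full"      # step-by-step text, pictures and video
    if operations_completed < 100:
        return "summary"   # condensed checklist with key quality points
    return "minimal"       # quality checkpoints only

print(instruction_level(20, 5))     # new hire -> full
print(instruction_level(120, 500))  # veteran -> minimal
```

The point of the sketch is simply that the same wearable application can serve both a new hire and a veteran by branching on a per-worker skill profile rather than shipping one static instruction set.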

The wearable tools themselves are easy to implement and support. Most employees are excited to wear the technology and realize the benefits quickly.

BrainXchange: Where do you see the greatest opportunities for smart glasses in the manufacturing plant?

PG: Our product design team finds great value in virtual reality glasses. Not only do they broaden the ability for a team to “see” what others are thinking, but they allow design teams to remotely interact, all in virtual glass, all seeing the same product and projected design strategies.

As a problem-solving organization and culture, we have weighed the value of wearable smart glasses in many areas, including welding, paint preparation, assembly, quality, technical services, material management and even plant tours. The first thing that we have discovered is that the projected value of replacing current tools, whether it be paper work orders or terminal work instructions, with smart glasses is 2x what we initially thought. The results have been so beneficial in some areas that we have retested, thinking it was a mistake. It is important to note that every pilot we have conducted has been in response to a defined problem. And, after 5 whys, fishbones and cross-functional involvement, sometimes even a kaizen, smart glasses are a part of the proposed solution with metrics associated. Knowing that smart glasses are a lean tool, and not an industry requirement or cool factor, we have reported 30% reduction in processing times, 50% reduction in amount of time employees train on the job (new hire and cross functional) and reduced quality and safety incidents that we are still calculating. The greatest value for the glasses has been in assembly and quality, both needing easy and quick access to hands-free instructions. As a manufacturer of complexly configured products, we have discovered that training by smart glasses is the grand slam. New product launches, multi-operation and new hire training are easily administered and audited for success.

BrainXchange: How do smart glasses further lean manufacturing?

PG: Simple. Lean is all about waste elimination. Smart glasses, when implemented for the right reasons, reduce waste. The use of wearable solutions was discovered as we did what we do best every day: solve problems (4,873 problem solutions implemented by employees in 2016).

Introducing Google Glass to our manufacturing floor was not intended as a disruptive technology play or even a competitive advantage. The glasses were introduced as a solution to make employees’ jobs easier and safer while driving higher quality in our products and processes. In the end, we have accomplished both.

We are delighted that Peggy will be speaking again at EWTS 2018 this October, and cannot wait to hear how AGCO’s Google Glass success story has progressed. 


The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.

Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.


Photo credit: Google X

Enterprise Wearables: What to Consider in Choosing Hardware

Watch Ramon Llamas, Research Manager at IDC, take leaders at General Dynamics Electric Boat, Gulfstream Aerospace, Duke Energy, Rogers-O’Brien Construction, and Atheer through the hardware considerations of wearables in the enterprise. Whether used in designing submarines or in providing power and gas to millions of people, an enterprise wearable device has to suit both the environment in which it is used and the user or wearer in terms of ergonomics and performance.



Key Learnings and Takeaways:

Hardware considerations are first and foremost in beginning one’s wearable journey in the enterprise, and often the customer is internal—a group of workers. Ramon does a great job of eliciting key words of advice from the panelists, like this takeaway from Ken Fast of General Dynamics Electric Boat: Things can take a long time to implement in a large company, so maintain a kind of childlike excitement about the technology.

Ken develops solutions to support those who design and build nuclear submarines at General Dynamics. He points out that employees who only need a few minutes of textual instruction before doing, say, half an hour of work don’t really need a heads-up display—a tablet is fine. But for employees requiring constant guidance, Augmented Reality glasses are desirable to feed them information at every step of a process. The shortcomings of current hardware options are significant here because while AR glasses would be ideal, many models are ill-fitting at the moment. You can imagine that if the information shown to the worker has to align precisely with the real world (e.g. installation info), glasses that slip or move around won’t do.

Drew Holbrook of Gulfstream Aerospace advises listeners to keep pushing through the roadblocks. In addition to looking at emerging technologies for engineering, marketing and training; Drew works with designers at Gulfstream to bring Virtual Reality tools to Gulfstream’s customers to help them visualize their private jets. To do that, the technology has to look as real as possible, with good resolution and color clarity.

Each aircraft Gulfstream sells is unique; the cabin configuration, interior design, materials, paint job, etc. are all selected by the customer. The concept is to take a VR headset to the client to let him experience the design. In this case, the user experience and performance are important, for the client’s virtual experience will reflect upon Gulfstream’s brand. (Drew also mentioned device tracking—when implementing wearables into your operations, consider how you’ll track and maintain them once you’ve scaled up the solution. What happens when a device fails or is dropped?)

Don’t believe everything you see in videos until you try it yourself, warns Aleksandar Vukojevic. Having tested many HMDs at Duke Energy, Aleksandar has determined that hardware choice ultimately depends upon the use case and what kind of information will be displayed. If remote communication is the main use case, for instance, factors like resolution and speed matter. The working environment presents its own barriers that affect hardware selection, especially in electric utilities where jobs are dangerous, safety-rated glasses are required, and connectivity is an issue. Since electrical workers must keep metal objects a certain distance from power lines, the metal components in current devices rule out most smart eyewear. (Again, environment and user.)

Keep it simple is another piece of wisdom, this one from Todd Wynne of Rogers-O’Brien, who suggests focusing on making the wearer’s life easier and safer. For Todd, that person is the construction worker with a wrench in hand who needs easily consumable information to make quality decisions within a tight building schedule.

On a construction site, documentation and safety are critical. Building doesn’t stop for rain; there’s dust everywhere and things break easily. Workers need to stay hydrated and be constantly aware of their surroundings and movement. Key wearable hardware considerations, therefore, are user interface and display (easy to use, glanceable) as well as ruggedness and form factor (non-intrusive). The device has to be invisible; workers should be able to forget they’re wearing it.

A software partner can really help in the hardware evaluation process. Theo Goguely from Atheer recommends going about it systematically, creating a kind of matrix or graph of all possible devices on one axis and all use cases within your organization on another in order to find your sweet spots. There isn’t one best piece of hardware for a business—within an organization, different devices will be best for different use cases and the device won’t necessarily be a wearable.

It all comes down to where and by whom the technology is to be used: A $3,000 HoloLens headset isn’t necessary to pick a box off a warehouse shelf; a smaller monocular device is more appropriate. It’s not a specs race, so figure out the device that will give the “guy on the ground” just the information he needs. And if the technology isn’t there yet (i.e. you need it to be intrinsically safe), begin exploring your software options and training on a tablet if possible, or target a different use case in a less demanding environment while the hardware catches up to its potential.
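The device-by-use-case matrix Theo Goguely recommends can be sketched as a simple scoring table: rate each (device, use case) pair and look for the sweet spots. The devices, use cases, scores and the `best_device` helper below are all illustrative assumptions, not Atheer's actual methodology.

```python
# Hypothetical evaluation matrix: devices on one axis, use cases on the
# other, with a subjective fit score per pair (made-up numbers).

scores = {
    ("monocular HMD", "warehouse picking"): 9,
    ("monocular HMD", "3D design review"): 2,
    ("MR headset", "warehouse picking"): 3,
    ("MR headset", "3D design review"): 9,
    ("tablet", "warehouse picking"): 5,
    ("tablet", "3D design review"): 4,
}

def best_device(use_case: str) -> str:
    """Return the highest-scoring device for one use case."""
    candidates = {d: s for (d, uc), s in scores.items() if uc == use_case}
    return max(candidates, key=candidates.get)

print(best_device("warehouse picking"))  # -> monocular HMD
print(best_device("3D design review"))   # -> MR headset
```

Note how the "best" answer differs per use case, and one of the winners isn't even a wearable: exactly the point made above.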




Smart Glasses, AR, VR and MR: Head-Worn Devices in the Enterprise

Watch Picavi’s Johanna Bellenberg talk about head-worn devices with the very people implementing the technology at Walmart, GE Transportation, Gensler, USPS, and FM Global. The group shares the insights, “aha” moments, and limitations realized in implementing AR/VR glasses and headsets, and reaches a consensus on the value of these technologies, especially for employee training.



AR/VR is helping the Postal Service meet the demands of a changing digital world, in which its 20-year-old fleet of vehicles needs fixing and replacing and more and more part-time employees need fast training. Passing information from carrier to carrier via a physical book containing information on every route isn’t an efficient method, not with millions of delivery points each day. Using AR/VR for vehicle maintenance and to cut training time for new employees in half is what it takes to keep the Postal Service alive.

As there isn’t a solid use case yet for HMDs in the retail world, Walmart is using VR at its training academies to simulate exceptional customer experience problems you wouldn’t want to create in a real store and shopping events that only happen once a year. VR is ideal as you “can get multiple reps over and over.” For Walmart, how associates feel on the floor is important. While allowing them to be hands-free and heads-up in stores might help them engage more confidently with customers, VR training goes a long way towards increasing their confidence before they have to face shoppers.

FM Global, a commercial property risk insurer that counts one out of every three Fortune 1000 companies as a customer, is using AR for remote engineering surveys of client facilities and VR as a selling tool. If political restrictions make it difficult to send out a field engineer, FM Global sends a pair of smart glasses to the customer, having a remote expert guide the customer through the task. VR has also proven to be a compelling medium for convincing policyholders to take the proper measures in case of a flood or fire by showing them the potential damage.

At GE Transportation, training doesn’t always mean a brand new person needing to learn a brand new process, not when you’re dealing with 20,000 locomotive SKUs that ship all over the world. So, GE is using AR/VR to design and build kits of locomotive parts for operators, thinking through the presentation of these kits and how they align to manufacturing or service processes. From a plant layout perspective, VR is also incredibly useful for designing and planning operations. 

Finally at Gensler, visualization technologies are impacting how architects design and develop structures of every kind. The architecture and design firm is also considering how these tools will impact the places it designs as those buildings and environments mature. The environments we work in are increasingly contributing to the jobs we do, so Gensler is thinking about the future: AR/VR will influence the structures we design (not just help design them) because of the way they will fundamentally change how we consume information.

Just in Time: AR/VR Spark a Digital Renaissance in Aviation and Aerospace

About 20 years ago, Boeing, the world’s largest aerospace company, identified the need for a hands-free, heads-up technology in its operations. Flash forward to 2014, when a device fitting this vision (Google Glass) finally appeared on the scene. Today, the aviation and aerospace industries are experiencing a digital renaissance, and the timing is critical for several reasons:

Demand is high

Demand is being driven by two factors: 1) Rapidly aging fleets that need to be replaced or maintained at great cost; and 2) New, more technologically advanced aircraft needed to stay competitive. (Boeing, for one, has a backlog of some 5,000 planes it is under contract to build.) Next-generation aircraft boast features like advanced avionics, noise reduction capabilities, improved interior cabin designs, and greater fuel efficiency. Aviation and aerospace companies are under pressure to ramp up production to replace customers’ older fleets and supply them with state-of-the-art vehicles. And, of course, as demand for new aircraft rises so too does the need to operate and maintain those aircraft.

A talent gap is creating a need for fast, low-cost training

As in pretty much all manufacturing sectors, the aviation and aerospace industries are dealing with a skilled labor crunch as experienced workers retire and leave the workforce, taking their careers’ worth of knowledge with them. By some estimates, the aerospace industry will need to attract and train nearly 700,000 new maintenance technicians alone by the year 2035. Jobs are being created, and baby boomers retiring, faster than new workers can fill the gaps. Aerospace manufacturers and suppliers are therefore looking for innovative technologies to maximize the productivity of their existing workforces and quickly onboard new workers.

The stakes are high: Operations are complex, downtime is costly, safety is crucial, and the market is competitive

Building aircraft (commercial airplanes, military jets, spacecraft, etc.) and the engines and propulsion units that drive them involves extremely complex processes in which thousands of moving parts are assembled in precise order, carefully inspected, and maintained for years. Speed is desirable to meet demand and for competitive advantage, yet there can be no compromise or negligence when it comes to accuracy and safety—after all, we’re talking about aircraft that transport hundreds of passengers across oceans or even dodge enemy missiles at over 1,000 mph. Boeing, Airbus, Lockheed Martin and other large firms are all vying to sell to the U.S. Department of Defense, NASA and large airlines (the aviation, aerospace and defense industries’ biggest U.S. customers), so errors and downtime are, of course, expensive and bad for business, and can also greatly affect human lives.

To accelerate production, close the talent gap, reduce errors, limit downtime, and improve safety, the leading aviation and aerospace companies are employing wearable technology, especially smart (Augmented Reality) glasses. In general, smart glasses are good for complex industrial processes that are very hands-on, time-consuming, error-prone, and loaded with information—processes like wiring an electrical system or installing the cabin of an airplane. AR glasses and VR headsets are proving useful in aircraft assembly, quality and safety inspection, field maintenance and repair, and training. The technology is providing aviation and aerospace workers with instant, hands-free access to critical information, and reducing training requirements for technicians and operators alike. Here’s how some of the aerospace giants are applying wearable tech in their operations:


In 2015, Airbus teamed up with Accenture on a proof of concept in which technicians at the French aerospace company’s Toulouse plant used industrial-grade smart glasses to reduce the complexity of the cabin furnishing process on the A330 final assembly line, decreasing the time required to complete the task and improving accuracy.

Sans smart glasses, operators would have to go by complex drawings to mark the position of seats and other fittings on the cabin floor. With Augmented Reality, a task that once required several people over several days can be completed by a single worker in a matter of hours, with millimeter precision and zero errors.

Airbus went ahead with this application: Technicians today use Vuzix smart glasses to bring up individual cabin plans, customization information and other AR items over their view of the cabin marking zone. The solution also validates each mark that is made, checking for accuracy and quality. The aerospace giant is looking to expand its use of smart glasses to other aircraft assembly lines (ex. in mounting flight equipment on the No. 2 A330neo) and other Airbus divisions.


Every Boeing plane contains thousands of wires that connect its different electrical systems. Workers construct large portions of this wiring – “wire harnesses” – at a time—a seemingly monumental task demanding intense concentration. For years, they worked off PDF-based assembly instructions on laptops to locate the right wires and connect them in the right sequence. This requires shifting one’s hands and attention constantly between the harness being wired and the “roadmap” on the computer screen.

In 2016, Boeing carried out a Google Glass pilot with Upskill (then APX Labs), in which the company saw a 25% improvement in performance in wire harness assembly. Today, the company is using smart glasses powered by Upskill’s Skylight platform to deliver heads-up, hands-free instructions to wire harness workers in real time, helping them work faster with an error rate of nearly zero. Technicians use gesture and voice commands to view the assembly roadmap for each order in their smart glasses display, access instructional videos, and receive remote expert assistance.

Boeing believes the technology could be used anywhere its workers rely on paper instructions, helping the company deliver planes faster. AR/VR are also significantly cutting training times and assisting with product development. For instance, HoloLens is proving useful in the development of Starliner, a small crew transport module for the ISS.

*Boeing’s Brian Laughlin will lead a thought-provoking closing brainstorm on Day One of EWTS Fall 2017

GE Aviation

General Electric is using Augmented Reality and other IoT technologies in multiple areas of its far-ranging operations. At GE Aviation, mechanics recently tested a solution consisting of Upskill’s AR platform on Glass Enterprise Edition and a connected (WiFi-enabled) torque wrench.

The pilot involved 15 mechanics at GE Aviation’s Cincinnati manufacturing facility, each receiving step-by-step instructions and guiding visuals via Glass during routine engine assembly and maintenance tasks. At any step requiring the use of the smart wrench, the Skylight solution ensured the worker tightened the bolt properly, automatically verifying and recording every torqued nut in real time.
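The verify-and-record pattern described above can be sketched in a few lines. This is a minimal illustration only: names like `TorqueStep` and the tolerance values are invented for the example and are not part of Upskill’s Skylight platform or GE’s actual solution.

```python
from dataclasses import dataclass, field

@dataclass
class TorqueStep:
    bolt_id: str
    target_nm: float        # specified torque in newton-meters (invented spec)
    tolerance_nm: float     # allowed deviation from spec

@dataclass
class TorqueLog:
    records: list = field(default_factory=list)

    def verify(self, step: TorqueStep, wrench_reading: float) -> bool:
        """Check a connected-wrench reading against spec and record it."""
        ok = abs(wrench_reading - step.target_nm) <= step.tolerance_nm
        self.records.append((step.bolt_id, wrench_reading, ok))
        return ok

log = TorqueLog()
step = TorqueStep(bolt_id="B-14", target_nm=35.0, tolerance_nm=0.5)
assert log.verify(step, 35.2)       # within tolerance: step passes
assert not log.verify(step, 33.0)   # under-torqued: worker would be re-prompted
```

The point of the pattern is that verification and documentation happen in the same step, so nothing depends on the mechanic remembering to write anything down.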

GE Aviation mechanics normally use paper- or computer-based instructions for tasks, and have to walk away from the job whenever they need to document their work. With smart glasses, workers were 8-12% more efficient, able to follow instructions in their line of sight and automatically document steps thanks to the device’s built-in camera. And reducing errors in assembly and maintenance saves GE and its customers millions of dollars.

Lockheed Martin

In early 2015 it came out that Lockheed Martin was trialing the Epson Moverio BT-200 glasses with partner NGRAIN, to provide real-time visuals to its engineers during assembly of the company’s F-35 fighter jets and ensure every component is installed in the right place. Previously, only a team of experienced technicians could do the job, but with Augmented Reality an engineer with little training can follow renderings with part numbers and ordered instructions seen as overlay images through his/her smart glasses, right on the plane being built.

In the trial, Lockheed engineers were able to work 30% faster and with 96% accuracy. Those workers were learning by doing on the job as opposed to training in a classroom environment, which amounted to less time and cost for training. And although increased accuracy means fewer repairs, the AR solution could be used to speed up the repair process, too, from days to just hours, with one engineer annotating another’s field of view. At the time, however, Lockheed acknowledged that getting the technology onto actual (secured) military bases would be difficult.

Lockheed is also interested in Virtual Reality, seeing AR/VR as key to lowering acquisition costs (all costs from the design/construction phase of a ship to when the vessel is decommissioned). The company is applying VR to the design of radar systems for navy ships. The challenge lies in integrating the radar system with a ship’s other systems, which requires very precise installation. VR can help identify errors and issues during the design stage and prevent expensive corrections.

Using HTC Vive headsets, engineers can virtually walk through digital mock-ups of a ship’s control rooms and assess things like accessibility to equipment and lighting. Lockheed is also using Microsoft’s HoloLens to assist young naval engineers with maintenance tasks at sea—much more effective than a dense manual.

*Learn more about this application from Richard Rabbitz of Lockheed Martin Rotary Mission Systems (RMS) at EWTS Fall ‘17

Lockheed is allegedly saving $10 million a year from its use of AR/VR in the production line of its space assets, as well, by using devices like the Oculus Rift to evaluate human factors and catch engineering mistakes early. For the Orion Multi-Purpose Crew Vehicle and GPS 3 satellite system, Lockheed ran virtual simulations in which a team of engineers rehearsed assembling the vehicles in order to identify issues and improvements. A network platform allows engineers from all over to participate, saving the time and money of traveling.

Last but not least, Lockheed Martin is also actively developing and testing commercial industrial exoskeletons. Keith Maxwell, the Senior Product Manager of Exoskeleton Technologies at Lockheed, attested to this at the Spring 2017 EWTS. The FORTIS exoskeleton is an unpowered, lightweight suit, the arm of which – the Fortis Tool Arm – is available as a separate product for operating heavy power tools with less risk of muscle fatigue and injury.

While Augmented Reality has been around for decades in the form of pilots’ HMDs, only now has the technology advanced enough to become a standard tool of engineers, mechanics and aircraft operators across aviation and aerospace operations. In a high-tech industry like aerospace, AR/VR are critical for keeping up production during a mass talent exodus from the workforce. Workers won’t need years of experience to build a plane if they have on-demand access to instructions, reference materials, tutorials and expert help in their field of view.


The Fall Enterprise Wearable Technology Summit 2017 taking place October 18-19, 2017 in Boston, MA is the leading event for wearable technology in enterprise. It is also the only true enterprise event in the wearables space, with the speakers and audience members hailing from top enterprise organizations across the industry spectrum. Consisting of real-world case studies, engaging workshops, and expert-led panel discussions on such topics as enterprise applications for Augmented and Virtual Reality, head-mounted displays, and body-worn devices, plus key challenges, best practices, and more; EWTS is the best opportunity for you to hear and learn from those organizations who have successfully utilized wearables in their operations. 

How Your Business Can Prepare for an Augmented Reality Future

Whether you believe Apple’s latest announcements mark the arrival of mainstream Augmented Reality or still think mass use of AR is years away, smart (AR) glasses are the future. The question is how long we will hold onto our smartphones (and, yes, which device and/or platform will tip the technology in the consumer market’s favor).

Just as glasses are the ultimate form factor for workers in factories, out in the field, in the O.R., etc., heads-up and hands-free is ideal for consumers. The biggest problem with our phones is that we carry them everywhere and are constantly looking down at them. AR will not only provide better contextual information to enrich our daily lives, but it will also revive an element of society that today can feel somewhat foreign compared to texting or email (especially to Millennials): face-to-face human interaction. (FaceTime doesn’t count.)

So why aren’t people more eager to free their hands and gaze from a hand-held screen? Smartwatches seem to have broken into the mainstream or are at least accepted by consumers. What is it about putting on a pair of glasses? It’s not just aesthetics and privacy concerns. In enterprise, you identify a problem in the workplace – some source of inefficiency – that AR can address; but when the work day is done, what is the problem that AR would fix, that would motivate us to finally give up our phones beyond sheer convenience or entertainment? I can only guess, as it’s outside my area of expertise.

Nevertheless, one day AR glasses will be acceptable outside the workplace, and once that happens a whole new world of enterprise applications will open up—those applications that depend upon consumers owning/wearing glasses and headsets, and not necessarily as often as they carry their smartphones now.


So, what can enterprises do in the meantime, while waiting for consumer AR glasses to take off?

1) Provide the experience for the customer or partner, like “HaaS” (hardware as a service) or an in-store demo. Some architects, realtors, automotive companies, major retailers and even airlines are already doing this, and some manufacturers are supplying customers with smart glasses to facilitate remote equipment troubleshooting and customer support.

2) Share the benefits of smart glasses with the customer/partner. Ex. an HVAC worker wearing smart glasses on a job to let the customer see the problem or service in real time; a store salesperson doing the same to help an online shopper make a purchasing decision; a flight attendant viewing information about a passenger to provide better, more personalized service; doctors wearing glasses with patients, etc.

Or 3) Start with a mobile app or create a 360-degree video with the intent of making it heads-up in AR or VR in the future. While this can be very expensive (a 360˚ video can cost anywhere between $10,000 and $100,000 to produce, according to Forrester Research), it puts the organization in the best position to capitalize on these technologies in different form factors and environments down the road. Until then, the videos can be shared on social media, at pop-up events, on the company website, etc.


Some example use cases:


In dealerships across Australia, Hyundai has introduced the Hyundai AR Showroom app for the iPad, a sales tool for dealers to show car shoppers the built-in safety and performance features of the “all-new i30.”

The app, created by Auggd, allows the salesperson to demonstrate features of Hyundai’s reinvented hatchback that are normally difficult to explain in a showroom environment (without having multiple vehicles on the floor). By holding up an iPad in front of the real i30, shoppers can manipulate a 3D model overlay of the car; they can change its appearance and accessory options, and view animations of safety features like autonomous emergency braking and lane-keeping assist.

It seems Hyundai has been making an effort to get both its customers and representatives familiar with Augmented Reality. In early 2016, the South Korean automaker created an AR owner’s manual for some of its more popular models. The manual app and new Hyundai AR Showroom app could easily transition to glasses or a headset in the future for a more immersive and effective experience. These apps are also providing Hyundai with valuable consumer insights.


This Boston-based online furniture and home goods retailer envisions its customers one day shopping for Wayfair products at home using Mixed Reality headsets. In the meantime, the company’s R&D team Wayfair Next has created WayfairView, a mobile app that leverages Google’s Augmented Reality technology Tango along with Wayfair’s growing library of 3D product models. The app lets users view full-scale virtual models of furniture and décor in their homes with an AR-capable smartphone; they can look at items from multiple angles, see whether a piece of furniture will fit in a room, etc. before buying.
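A core value of full-scale AR placement is answering "will it fit?" before buying. Here is a minimal sketch of that check, with all dimensions and the clearance margin invented for illustration; it is in no way WayfairView's actual logic.

```python
def fits(item_cm, space_cm, clearance_cm=2):
    """True if an item (width, depth, height) fits a measured space,
    leaving a small clearance on every dimension."""
    return all(i + clearance_cm <= s for i, s in zip(item_cm, space_cm))

sofa = (210, 95, 88)       # width, depth, height in cm (made-up product)
alcove = (220, 100, 240)   # space measured by the AR app (made-up room)

assert fits(sofa, alcove)
assert not fits(sofa, (200, 100, 240))   # alcove too narrow
```

An AR app does this with the device's own depth measurements, but the decision at the end reduces to a comparison like the one above.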

For over a year now, Wayfair has been visualizing millions of its home products in 3D. The models are currently used in the shopping app and on the company’s website but are ultimately destined for a headset.

*Mike Festa, Director of Wayfair Next, will speak at EWTS Fall 2017


Virtual Reality is a powerful storytelling medium, which is why it makes for great marketing as well as an effective job training tool. After the success of last year’s online “Migraine Experience” campaign, in which users could experience migraine symptoms like blurry vision and flashing lights through AR filters, Excedrin created “Excedrin Works,” a new VR video campaign from the P.O.V. of real migraine sufferers at work.

The 2016 AR campaign saw close to 400,000 social engagements. The latest VR one is expected to be even more engaging, driving home the medication brand’s purpose and driving sales. By appealing to human emotions, Excedrin is hoping viewers will understand how crippling migraines can be and why its product is necessary.

The two VR videos, created with Weber Shandwick and Hogarth, can be found on Excedrin’s website and YouTube channel. To round out the campaign, the company is also running several documentary-style videos on TV and social media, and collaborating with race car driver Danica Patrick to share her history of migraines.


The British supermarket chain has dropped a few hints that Virtual Reality is the future of shopping at Tesco. Way back in 2011, the company partnered with Cheil Worldwide to “open” a virtual supermarket in South Korea: An entire wall of a Korean subway station was made to appear like rows of shelves in a market, containing Tesco products with QR codes that commuters could scan to buy groceries on their phones. (After a long workday, it would be nice to get the food shopping done while waiting for your train—Tesco even arranged for deliveries to take place the same night.)

The subway experiment provided Tesco with insight for growing its business in South Korea. Around 2014, the grocery chain again used VR for R&D, wanting to improve its marketing and how it merchandised and reorganized stores. The company collaborated with Figure Digital on an Oculus Rift demo video called “Tesco Pelé” in which customers wearing VR headsets shop in a virtual supermarket, the layout of which represented an actual Tesco store design up for review. At the end of the simulation, the wearer steps onto a pro soccer field.

The possibilities here include, of course, virtual grocery shopping and consumer research; but the Pelé element (famous soccer player) suggests opportunities for corporate sponsorships, as well.


Like Wayfair, Lowe’s wants to be ready for the day when consumers use their own AR glasses and VR headsets. In Fall 2016, the home improvement chain debuted Lowe’s Vision, an app powered by Tango that lets customers measure any room in their homes and design it with virtual Lowe’s products using the Lenovo Phab 2 Pro phone.

In Spring 2017, Lowe’s began piloting Lowe’s Vision: In-Store Navigation, another Tango-powered app, in two of its stores. This second AR app makes it easier to shop for your home improvement project: Customers can use any Tango-enabled smartphone (or demo one with a sales associate) to search for products, read reviews, create shopping lists, and find the most efficient route to items throughout the store with the help of digital directions overlaid onto the real world.
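Finding "the most efficient route to items" is, at its simplest, an ordering problem over the shopping list. The sketch below uses a greedy nearest-neighbor heuristic over invented aisle coordinates; it illustrates the idea only and is not Lowe's or Tango's actual routing.

```python
import math

def route(start, items):
    """Order shopping-list items so each next stop is the closest
    remaining one (greedy nearest-neighbor heuristic)."""
    pos, remaining, path = start, dict(items), []
    while remaining:
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        pos = remaining.pop(name)
        path.append(name)
    return path

# Hypothetical store-map coordinates for three items
shopping_list = {
    "paint":  (2, 10),
    "screws": (1, 1),
    "lumber": (9, 3),
}
print(route((0, 0), shopping_list))   # ['screws', 'lumber', 'paint']
```

A production system would route along actual aisle paths rather than straight lines, but the greedy ordering captures why in-app navigation saves walking time.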

One of the first AR/VR ideas to come out of Lowe’s Innovation Labs was the Holoroom in 2014/15. Now available in select stores, it’s essentially a how-to section in the store where shoppers can put on the HTC Vive headset and practice home improvement projects like tiling a bathroom in virtual reality.

Lowe’s is onto something in exposing its customers to emerging technologies that transition from their homes into actual Lowe’s stores, helping them with their home improvement projects from start to finish.


So how can your business prepare for an AR future? This is a time for innovation. Augmented and Virtual Reality represent new paradigms for sharing and taking in information. The same factors that make the technology ideal for workers – heads-up and hands-free, immersive, proven to be a superior learning method – can work for your customers and partners. Figure out their pain points just as you would in determining a great use case for your workforce. How might AR/VR make it easier or more appealing for consumers to interact with your brand, seek your services, buy (and use) your product, etc.? Consider the scenario in which the business provides AR glasses for the customer/partner as well as the future one in which consumers have access to their own devices. What can you do now to begin forming a bridge between those two scenarios?




photo credit: dronepicr Kölner Dom aus Lego Gamescom via photopin (license)

Why the Logistics Industry is Going Hands-Free

The logistics industry has been thinking hands-free for years now. In my research for this blog post, I came across an article from 2007 on the use of voice headsets and arm-mounted computers in the warehouse. More recently, ABI Research found that 61% of logistics companies it surveyed are adopting wearable technologies as part of their technology innovation strategy. In addition to logistics companies, enterprises in other verticals are using wearables within their warehouse or supply chain operations. Below are some of the top use cases:



The number one wearable use case in the logistics industry today is arguably vision picking with Augmented Reality glasses like Google Glass and the Vuzix M300. DHL has been exploring wearables with its customers and in different units of its business for several years. In 2014 with the help of Ubimax, DHL Supply Chain and DHL customer Ricoh carried out a successful vision picking pilot in a warehouse in the Netherlands.

For the pilot, staff went about their picking duties, taking cues from simple graphics and text displayed in smart glasses to navigate the warehouse and locate each pick. The glasses allowed for hands-free order picking, which sped up the picking process and reduced errors.

Using Ubimax’s vision picking solution, DHL and Ricoh realized a 25% efficiency increase over the course of the three-week trial. Exel, a unit of Deutsche Post DHL Group, achieved similar results the following year when it gave smart glasses to workers in two of its U.S. warehouses. In August 2016, DHL Supply Chain announced it was expanding its “Vision Picking Program,” with additional pilot sites established across Europe and the U.S.

In addition to picking and e-fulfillment, DHL sees potential in using AR and smart eyewear in other areas, including transportation, last mile delivery, and training of seasonal or temporary workers. In November 2016, Fujitsu announced a partnership with DHL Supply Chain UK to develop innovative services around wearable technology and the Internet of Things.

*Justin Ha, Director of Solutions Design at DHL Supply Chain, will be speaking at EWTS Fall 2017.


Way back in 2011, UPS adopted a wearable package scanning system consisting of a ring scanner plus a small wrist- or hip-worn terminal, both by Motorola Solutions. The goal was to speed up the time it takes to load packages, prevent misloads, and improve package tracking and data reliability. UPS rolled out tens of thousands of these devices. Of course, today there is more sophisticated technology: Smart glasses, often paired with ring scanners (for items on very low or high-up shelves), are the new wearable scanning system and the new interface for logistics software.

In 2015, it was reported that UPS was testing smart glass technology to reduce the amount of labeling on packages. Instead of two labels on every package (an address label and a second label identifying the delivery route and truck), a single barcoded address label could be used that – when scanned with Google Glass – would inform the package sorter of the box’s destination. This simplifies the job and allows workers to be more hands-free.

Currently, UPS is developing and rolling out a Virtual Reality driver training program at nine of its training facilities, to simulate the uncertainties and challenges of city driving. Wearing an HTC Vive or other VR headset, students will go through a virtual streetscape, using voice commands to identify road hazards. The VR training modules are designed for package delivery drivers but in the future UPS plans to expand the tech’s use to tractor trailer workers.


Every second counts when you handle millions of packages a day, which is why the shipping giants were early adopters of wireless technologies and why they continue to pursue the latest in mobile—for the opportunity to shave off seconds from the delivery process.

Since 2000, FedEx parcel handlers have been equipped with ring scanners connected via Bluetooth to a device worn on their forearms. Similar to the system used at UPS, the wireless solution scanned each package and provided tactile feedback when a parcel was placed in the wrong container.

On top of supply chain efficiency, FedEx is also interested in wearable technology for the overall safety of the workforce. Its aircraft were equipped with heads-up displays (HUDs) to improve pilots’ situational awareness during night flights and bad weather conditions; and the logistics giant is exploring wearable wellness monitoring.

Crane Worldwide Logistics

From faster picking to better posture: Crane Worldwide Logistics, a large third-party logistics company, tried out a wearable device by KINETIC to reduce the number of ergonomic injuries among its workforce.

Back injuries, strains and sprains are the most frequent and costly injuries in warehouses and other industrial workplaces. REFLEX is a discreet wearable worn on one’s belt or waistband that automatically detects unsafe postures, providing instant feedback to the wearer whenever a high-risk motion occurs. In so doing, the solution helps teach workers how to move safely or use “good biomechanics” on the job.

Using REFLEX, Crane was able to reduce the number of unsafe postures at its Houston, TX distribution facility (where the KINETIC pilot took place) by 84%. The “most improved” worker saw a 96% reduction, from 320 bad postures in a day to just 12.


Bechtle is one of Europe’s leading IT service providers. In January 2016, after extensive piloting, the company announced the deployment of Vuzix M100 Smart Glasses for vision picking at its distribution center in Neckarsulm, Germany.

Warehouse employees began using smart glasses running the mobile SAP AR Warehouse Picker app and connected to Bechtle’s WMS as an alternative to handheld scanners in select picking processes. The hands-free solution, featuring QR code scanning and voice recognition technology, guided the wearer through the picking process step-by-step without the need for any manual input of information.
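The guided, no-manual-input pick loop described above can be sketched as follows. The loop structure is the point; the bin and SKU identifiers are invented, and this is not the SAP AR Warehouse Picker API.

```python
def run_pick_order(order, scan):
    """Walk a pick order line by line; `scan` simulates the glasses'
    QR reader confirming the wearer is at the right bin."""
    picked = []
    for bin_id, sku, qty in order:
        while scan(bin_id) != bin_id:   # wrong bin scanned: keep prompting
            pass
        picked.append((sku, qty))       # pick confirmed, hands stayed free
    return picked

# Hypothetical two-line order; the lambda stands in for a scanner that
# always finds the requested bin.
order = [("A-03-2", "SKU-114", 2), ("B-11-5", "SKU-207", 1)]
result = run_pick_order(order, scan=lambda b: b)
assert result == [("SKU-114", 2), ("SKU-207", 1)]
```

Because confirmation comes from scanning the bin's QR code rather than typing or tapping, the worker never has to put anything down between picks.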

This was the first of many potential use cases for smart glasses that Bechtle intended to pursue. The company believed the benefits of Augmented Reality could be reaped most quickly when applied to a simple, labor-intensive process like the picking of small parts, though it planned to expand the use of wearables to additional workflows in receiving, complex delivery orders and more.


In November 2016, global service provider Arvato partnered with Picavi to launch a vision picking project for audio company Sennheiser. For the purposes of the pilot, a separate pick process was identified in order to evaluate Picavi’s Pick-by-Vision solution in a controlled environment.

Initial feedback from warehouse employees was positive. Having all essential task-based information displayed in front of their eyes through smart glasses allowed pickers to keep both hands on the job, which minimized errors and helped them stack the pallets faster. Workers also found the new pick solution intuitive to use and comfortable to wear while moving around the warehouse.


After all these vision picking pilots, a consensus seems to have emerged: smart glasses are setting a new bar in the classic order picking process. Augmented Reality has proved superior to basic handheld scanners and tiring voice picking systems.

Beyond order picking, AR glasses can replace traditional tools in receiving, packing, shipping and replenishment–all areas of the warehouse or distribution center. A wearable device could conceivably “accompany” a package from the moment an order is received to the moment it’s loaded onto the truck for delivery, ensuring a smooth and accurate flow of goods all along the supply chain as well as the safety of all pickers, packers, drivers and other package handlers.




photo credit: vic_206 DHL Air / Airbus A300B4-622R(F) / EI-OZM via photopin (license)

The Inevitable Rise of Google Glass 2.0

The use cases mentioned in Wired’s breaking story about Google Glass 2.0 are supreme examples of Google Glass’ success in the workplace. AGCO, Boeing, DHL and GE are certainly major companies validating the benefits of Glass to enterprise. Their stories have been shared here on EnterpriseWear as well as at every Enterprise Wearable Technology Summit.

(See Wearables in Manufacturing: Interview with AGCO’s Peggy Gulick; Wearables in Industry: Interview with GE’s Sam Murley; and Wearables in Logistics: Now or Later?)

But there have been numerous use cases by big and small companies alike since Glass made its ill-fated consumer debut in 2013. Not all those early explorations were developed further; some of the first experiments were simply small, short pilots that were subsequently dropped because the tech wasn’t ready or because the company may not have had the resources, connections or patience of a Boeing or GE. But it was those cases that taught Google a big lesson, encouraging the company to direct its attention to the enterprise and get to work on what would ultimately become Google Glass Enterprise Edition.

While companies like GE and Boeing have been clandestinely using Google Glass EE for a while now, it’s worth looking back at some of the earliest – and incredibly imaginative – test runs of Google Glass Explorer Edition:

Airports & Airlines

  • In one of the most publicized early trials, Virgin Atlantic agents at London’s Heathrow Airport used Google Glass to process first-class passengers for their flights while maintaining eye contact with them.
  • At Copenhagen Airport, the device was used by airport duty managers to document issues and answer travelers’ questions on the spot.
  • Japan Airlines had personnel on the tarmac at Honolulu Airport wear Glass so that staff at headquarters could perform remote visual inspections of planes and send instructions.


  • Dr. Rafael Grossmann was the first to use Google Glass during live surgery.
  • Glass was tested at Stanford University Medical Center to guide residents through surgery, at UC San Francisco to broadcast surgeries for faculty and students to watch, and at UC Irvine Medical Center to monitor anesthesia residents.
  • At Beth Israel Deaconess Medical Center, four ER doctors used the glasses in lieu of tablets to get real-time clinical information.
  • Dr. Peter Chai used the technology in the emergency department to facilitate remote consultations in dermatological cases.
  • Several physicians and administrators at Mayo Clinic tested Glass in different specialties and departments for viewing patient info, documenting injuries, and learning.
  • Indiana University Health Methodist Hospital used Glass as an aid in a tumor removal and abdominal wall reconstruction procedure. IU Health’s Paul Szotek also livestreamed a hernia repair with the device.
  • Chicago-based MedEx had its paramedics use Glass to communicate with specialists from the ambulance and show ER doctors the status of incoming patients in real time.

*Dr. Szotek will talk about his experiences since that first livestream at the Fall 2017 Enterprise Wearable Technology Summit.

Rogers-O’Brien Construction

  • The Texas-based general contractor used Google Glass to capture, share and collaborate on jobsite information hands-free. It was an early foray for the company, which has since experimented and adopted all kinds of emerging technologies including VR headsets and partial exoskeletons.

*Todd Wynne and Joe Williams of Rogers-O’Brien are also speaking at the fall event.

Car Companies

  • In a pilot project at one of BMW’s U.S. plants, Google Glass was tested for quality assurance, used by workers to document potential defects and improve communication between the quality testers and development engineers.
  • GM experimented with the device in quality inspection and as a tool for viewing procedural instructions. 

Food Industry

  • Several restaurant chains have tested Glass for training purposes: KFC tried out the device to record tutorials and play them back for new recruits. Similarly, Capriotti’s Sandwich Shop used Glass to record new workers’ performance and the lunchtime rush, hoping to spot areas for improvement.


  • The global ERP software company used short video interviews recorded with Glass to introduce new employees to team members outside of the corporate office.

Las Vegas Air Conditioning

  • The HVAC company was one of the first to have its technicians wear Google Glass on jobs, to live stream their work for the customer to see.

Sullivan Solar Power

  • The Southern California company’s field technicians wore Glass to safely (hands-free) view specs and plans while installing solar panels atop homes and businesses.


  • The oilfield service company tried out 30 pairs of Google Glass to provide hands-free intelligence to workers in the field, improving their safety and efficiency.

Active Ants

  • Stock pickers at the Dutch e-fulfillment company were able to reduce their error rate by 12% and increase their speed by 15% using Glass.

San Francisco’s de Young Museum

  • One of the first museums to integrate Google Glass into an art exhibit: Visitors used the tech to gain more insight into the artist and featured works in de Young’s 2014/15 Keith Haring show.

Fennemore Craig (now Lamber Goodnow)

  • Two attorneys at the personal injury law firm used Google Glass to win cases, loaning the device to clients so they could record a day in their lives post-injury.


Find out just how much Google Glass has progressed – both the hardware and applications – since those early days at the upcoming Fall 2017 Enterprise Wearable Technology Summit, where real end-users will speak about their “secret” deployments of the technology.


photo credit: jurvetson Sergey Brin Solves for X via photopin (license)

The 6 Most Popular Applications for Wearables in Enterprise


  1. Heads-up, Hands-free Information
    1. Easy access to information from an ERP system (using touch, gesture or voice commands; a heads-up or glanceable display)
    2. Step-by-step instructions for building, assembling, fixing and inspecting; plus safety procedures and other company protocols
    3. Hands-free documentation of information

Paper manuals and lists, bulky tablets and PC stations are not ideal for workers who need their hands free and eyes on the job. Instructions for assembling the wing of an airplane, servicing an elevator, or inspecting a vehicle can be displayed via smart glasses, overlaid on top of the actual assembly or machine. For short checklist items, a smartwatch could be used, and employees can verify each step of their work.
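The step-then-verify pattern behind heads-up instructions can be sketched with a small generator: the display shows one instruction at a time and only advances when the wearer confirms. The step texts are invented for illustration; no particular vendor's instruction format is implied.

```python
STEPS = [
    "Torque inboard bracket bolts to spec",
    "Seat wiring connector J4 until it clicks",
    "Photograph the finished joint for the record",
]

def checklist(steps):
    """Yield one instruction at a time; advance only on confirmation
    (e.g. a voice command or gesture from the wearer)."""
    for i, text in enumerate(steps, 1):
        confirmed = yield f"Step {i}/{len(steps)}: {text}"
        while not confirmed:
            confirmed = yield f"Re-check step {i}: {text}"

c = checklist(STEPS)
prompt = next(c)
assert prompt.startswith("Step 1/3")
prompt = c.send(True)            # wearer says "confirm"
assert prompt.startswith("Step 2/3")
```

Because each confirmation is an explicit event, the same loop that paces the worker also produces a verified record of the work, with no separate documentation step.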

  2. Remote Expert/SME/Support/Assistance/Guidance/Troubleshooting/Collaboration
    1. Many terms for this but essentially telepresence: Using the front-facing camera and microphone in a pair of smart glasses to share one’s view of a situation with a centrally- or remotely-located expert via live audio and (point-of-view) video
    2. Enhancing service efficiency in the field: Mainly applies to emergency situations, i.e. when a piece of equipment breaks down or a field worker encounters a problem he or she is unable to diagnose or resolve
    3. Saves time and money: The problem can be fixed in just a few hours (less downtime), and there’s no need to pay for an SME to travel to the worksite
    4. Having a second pair of eyes, from Skype-like collaboration (including ability to annotate the user’s field of view) to two people interacting in a 3D world through immersive technologies
    5. Virtual meeting spaces

When a printing press breaks down in a newspaper factory, production halts. In any time-critical business, downtime means lost profit and unhappy customers. Rather than wait for a technician to arrive, the operator could show the broken machine to one of the manufacturer’s techs using smart glasses. The tech would be able to direct the operator around the machine, identify the issue, and verbally or visually guide him or her through the fix. If a new part is required, the order can be placed immediately.

  3. Design Visualization
    1. For design conception, collaboration, and communication
    2. Helping an architect, engineer or designer develop an idea without the use of 2D paper drawings or expensive/wasteful 3D models; allowing him/her to inhabit the design as it’s refined (opens up new possibilities for experimentation)
    3. Helping two or more designers from all over collaborate on the same project: Creating a shared experience of the design to develop ideas, work out flaws and make more informed decisions affecting the rest of the project
    4. Design reviews: Identifying potential issues before significant investments are made and before building commences; saves time and money (less rework)
    5. Helping all stakeholders, including clients, contractors and future operators of a building, to visualize the design in a format they can understand
    6. Product development (ex. an automobile or product packaging)
    7. Planning: Configuring the layout of a factory, construction site or other facility to accommodate all necessary equipment, people and vehicles (making sure all equipment will fit)

Collaborating on a building project is challenging. There are many stakeholders, all of whom need to be working off the same design from wherever they’re based. These include the architect who designs the building to meet a client’s needs and preferences; along with engineers, construction crews, various contractors, government agencies, inspectors, and the public.

A building is first imagined in 3D then translated into 2D and finally executed in the real environment. Miscommunications and misunderstandings are common: Clients typically don’t understand building plans, and the project is always in a state of flux as designs are revised and building progresses. Using one platform like AR or VR from design to construction can help everyone involved visualize the final product.

  4. Training
    1. Onboarding/initial training as well as continuous training of employees (when new equipment is installed, a new problem encountered, a change in process)
    2. Hands-on, on-the-job, just-in-time training through step-by-step instructions, AR content, or a remote teacher (faster learning, lasting results, fewer errors)
    3. Recording training videos on real jobs by experienced workers; capturing best practices
    4. AR/VR simulations of different training scenarios
    5. Reduces training requirements (industry job certifications are costly)

When updates are made to a factory and new equipment is installed, both new and seasoned workers need to learn how to operate it. The vendor might supply a training video or send personnel to the factory to conduct the training, but AR/VR instructions and simulations have been found to be a more effective medium for learning than lectures or demonstrations. Training isn’t just time-consuming; it’s also expensive: For instance, training an airplane pilot can cost $1,500 an hour due to costly on-plane training and full-motion simulations. VR can greatly reduce these costs. Any business might also use VR to onboard new employees, introducing them to the company culture and basic procedures.

  5. Sales
    1. Improving customer service by helping customers visualize designs; enabling them to remotely view or virtually experience products and services; providing contactless payments, proof of service, and remote access to an in-store salesperson
    2. Creating a more personalized customer experience to increase customer satisfaction and close more sales (using wearable tech to view client information at the point of sale or service)
    3. Giving customers behind-the-scenes access by streaming video from a job or allowing them to shop remotely (builds trust, shorter sales process, reduces returns)
    4. Innovative marketing, advertising and customer engagement strategies
    5. Using AR/VR in the sales pitch; bringing the design or sales pitch to the customer
    6. Becomes a major differentiator for the business (can market use of the tech to improve brand reputation, revamp company image, compete, and attract new business)

AR and VR allow for new, convenient, and highly persuasive shopping opportunities: A car buyer could virtually test drive a vehicle from his living room or go to a dealership and view different vehicle options using AR glasses. A homeowner could picture how her new kitchen will look before renovation begins, or watch on as the HVAC worker fixes the air conditioner in her home from her desk at work. A homebuyer could go on virtual home tours or have the architect create a VR experience of a house design for him to view, allowing for faster input and approval.

  6. Safety
    1. A big part of improving safety in the workplace is making sure workers do things correctly and follow proper procedures (providing guidance via a remote expert or instructions in the worker’s FOV). In the case of consumer product manufacturing, eliminating errors improves consumer safety (prevents recalls)
    2. Wearable sensors: Come in a variety of form factors (bracelets, patches, watches, clothing items, work gear) and track a wide range of metrics that influence the user’s safety on the job
    3. Tracking biometrics, ergonomics, and environmental conditions; analyzing the collected data (in real time); and alerting workers when risk levels are reached via wearable devices (text or haptic alerts)
    4. Behavior modification: Gaining insight from wearable sensor data to influence behavior (ex. giving feedback or sending haptic notifications to teach workers to lift correctly and safely); using exoskeletons or AR to lessen the physical and cognitive stress of a job
    5. Efficient workforce management: Tracking employees (their location, health factors like fatigue and motion, exposure) to keep them out of hazardous areas, optimize shift times, and make sure proper PPE is worn and tasks are performed safely from an ergonomics perspective

Examples include tracking sleep and preventing sleep-deprived workers from operating vehicles or heavy machinery; using a wearable GPS tracker to make sure employees have the proper paperwork to work in different areas of a job site; and analyzing machine data to predict equipment malfunctions and alert workers before they happen.

(See Using Wearable Tech for Workplace Safety and 3 Great Use Cases of Wearables for EHS)


About EWTS Fall 2017:

The Fall Enterprise Wearable Technology Summit 2017, taking place October 18-19, 2017 in Boston, MA, is the leading event for wearable technology in enterprise. It is also the only true enterprise event in the wearables space, with speakers and audience members hailing from top enterprise organizations across the industry spectrum. Consisting of real-world case studies, engaging workshops, and expert-led panel discussions on topics such as enterprise applications for Augmented and Virtual Reality, head-mounted displays, and body-worn devices, plus key challenges, best practices, and more, EWTS is the best opportunity to hear and learn from the organizations that have successfully utilized wearables in their operations.

Applications and Hardware: 18 Guiding Questions for Your Business

Trying to get started on your enterprise wearables journey? How do you determine a good use case or figure out which technology is right for your work environment and workforce? Here are some helpful guiding questions and key considerations for identifying potential business cases:

  1. Are your standard work procedures still paper-based? Where in your operations do workers rely on paper instructions, manuals, lists, schematics or forms? Is the use of paper-based tools a source of inefficiency? Is it feasible to digitize this information? Would a wearable mode of delivery be more effective?
  2. Do workers use smartphones, tablets or other hand-held devices to carry out processes? Is it a problem that these devices are fragile and not hands-free in your work environment or in certain scenarios? Do they cause accidents? Do workers always carry these devices with them? Do they break often? Can you deliver the same information hands-free?
  3. Which processes require delivering work instructions to employees on the line, in the field, or with a customer? Where is that information located? Could you make it more readily available (bring it closer) to the worker? How do workers access or receive task-critical information? Is this method ergonomically in line with the task? Is it real-time info? Where would delivering instructions in a heads-up, hands-free manner make a difference? What about information from legacy systems?
  4. Where would employees benefit from frequent reminders about standard work or safety procedures? How about task prompts (ex. pick the next part, ask a customer something, put on the correct PPE)? Could you push simple but critical data, alerts and prompts to workers via a glanceable wearable device or with haptic technology?
  5. Which processes or tasks require workers to travel to get to a problem, access information, file a report or seek help? How much time does this add to the process? Can you cut down travel time with wearable tech?
  6. When do you have to bring in an SME? Could you instead train the employee in the field to fix the issue, to do what the SME would do? Would providing the field tech with on-demand, step-by-step instructions, videos, and/or remote support enable him or her to diagnose and solve the problem independently?
  7. Where does machine or vehicle downtime cost the business greatly? What delays repair? Is there a way to insert wearables into the process to reduce downtime and maintain productivity?
  8. Which processes require documentation or record keeping (for compliance, proof of service, quality inspection reports)? How do workers currently document issues? Can you make this easier, more accurate, and faster with smart glasses? If you mainly want to make use of a hands-free, front-facing camera (to document a job, perform audits, etc.), do you need an advanced AR headset? Do you want to communicate this data in real time, therefore requiring strong, reliable connectivity?
  9. Where are your customers located? Are they standing beside the worker as the job is performed? Do they need to approve of a design before manufacturing or building commences? Do they need to travel to your location to view a design or product, or do the designers/salespersons travel to them? Do they want progress updates? Do they trust you?
  10. What sales tools do you currently employ? Would immersive visualization be a more persuasive tool with your client base? How long does it take from sales pitch to closure? Would you close more deals on the spot using AR/VR? Could you provide customers with the right info at the right time through a wearable app to make their experience better? What information would that be? Would interacting with your customers through their wearable devices or having your employees use wearables when dealing with the customer (for remote visual sales, viewing personalized data from a CRM) improve customer service and increase sales?
  11. How are workers currently trained? Using videos, PowerPoints, written tests? How long does that take? How much does it cost? Is it effective? Do they require retraining? Would an AR/VR simulation be a better method? Could you provide on-the-job, just-in-time training with smart glasses?
  12. Do your employees carry heavy loads or work in non-ergonomic positions (ex. looking up, bending, lifting, overstretching, twisting)? Would behavior modification with a wearable sensor and alert solution improve safety? In assessing exoskeleton technology, consider price and usability: Do the workers’ comp savings make up for the cost of the devices? How practical is the tech on the factory floor? How much time does it take to put on and take off the device? Is it possible to give one to each worker or do they have to share?
  13. Who is the end user? What are their pain points? Ask them outright. Do they need the use of both hands? When they encounter a problem, do they have to leave their work area to tell someone? Do they have to communicate the issue multiple times? What are their most error-prone tasks? Are they at risk for repetitive motion injuries? Do they have to remember procedures or recall a lot of information? Could you remove the cognitive and/or physical stress of their jobs with a wearable solution?
  14. Is a large segment of your workforce approaching retirement age? How are you going to replace those workers? Can you use wearable tech to capture their knowledge and expertise, and to recruit new workers?
  15. How long is your design review process? Who is involved, and where are they located? How do people work together on a design? Is the customer included in the process? What slows down the process (communications, use of physical or 2D models)? Would immersive visualization or a virtual meeting space help collapse design time and avoid rework? Would it save money to manipulate a design virtually instead of building iterative physical models? Where will you find the content for virtual design?
  16. Where does your business suffer from poor planning and communication, from design to execution, operation and maintenance?
  17. Do you have facilities or offices all over the country/world? How do people in different locations communicate or collaborate with one another? When a piece of equipment goes down, how long does it take before the right person can correct the issue? Where is the right person or expert located? See-what-I-see might be the answer, but users must be able to connect from the work site to stream video.
  18. Does your business involve a high level of customization or variability? Do you manufacture custom-order products? How do you deal with customization in the factory? Do workers have to be cross-trained, remember a lot of information, or rotate jobs frequently? Can wearable technology help minimize this complexity?

Lastly, study other use cases, and not necessarily ones from your industry. Events like EWTS are great because most enterprises have strikingly similar business problems and requirements. It might surprise you that an aerospace company can learn a lot from a surgeon (even Mars scientists are using AR to collaborate from all over the globe). First, identify the business problem; then find a technology solution that matches the need.
