A year and a half or so ago I wrote a post about the demise of Google Glass. I noted, however, that this outcome was only temporary:
I don’t think Glass failed at all. I think it was a test launch, and I have no doubt Google got most of what it needed from its interesting social experiment.
A recent discussion with a friend who is the CEO of the startup Proceedix brought this post back to mind, after he pointed me to this video, posted by the industrial manufacturer, AGCO:
In the video, we begin to see what I thought would eventually come: the adoption of Google’s invention in the industrial space. While Glass had its issues in the consumer sphere, a lot of smart people understood its potential impact across various industries, and that potential is starting to be realized. In manufacturing, maintenance, process control, and similar fields, Glass has the ability to redefine the way things are done and the relationship between remote and centralized information sharing. Moreover, the people I have spoken with on this topic suggest that Google itself knows this, of course, and that when Glass 2.0 arrives it will be focused as much on commercial and industrial applications as Glass 1.0 was on consumers.
In 2014, I wrote that “something like Glass will revolutionize human-computer interaction much as the iPhone did,” and I still believe this to be the case. The AGCO video only hints at the potential of this class of technologies, and other companies are working with the same goals in mind. The French company AMA, for example, has focused on medical and industrial applications. The U.S. company APX focuses on field service and maintenance. Proceedix, headquartered in Belgium, is focused on building a process control layer that complements visualization technologies, including Glass.
The results of this innovation around Glass are already at work, and not just at AGCO. A 2015 IndustryWeek article highlighted the collaboration between Fisher Dynamics and a software company called Plex to introduce Glass into Fisher’s manufacturing sites. The impact of Glass, notes one Fisher executive, is subtle but really does change the way in which workers interact with their environment:
“As you get used to it, you’re scanning without having to make a movement,” Vince said. “If you do something wrong, if you scan the wrong data identifier, it’s telling you right away, ‘Scan again.’ And if you have a color-coded key” – say, green for a successful scan, red for a failed one – “you’re just kind of glancing at the colors. Quick feedback, a little eye glance to see in your periphery if it’s good or bad without looking away.”
Tollafield and Vince said they estimate each hands-free scan is about a second faster than the more traditional hand-held scan. That might not sound like much time – until you consider Fisher scans thousands of bar codes every shift. If one person scans 2,000 codes in an eight-hour shift and trims one second with each, that works out to more than 33 minutes per day.
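The arithmetic behind that estimate is easy to check. A minimal sketch, using only the figures quoted above (2,000 scans per shift, one second saved per scan):

```python
# Back-of-the-envelope check of the quoted time-savings estimate.
# Both input figures come straight from the article's quote.
scans_per_shift = 2000
seconds_saved_per_scan = 1

total_seconds = scans_per_shift * seconds_saved_per_scan
minutes_saved = total_seconds / 60

print(f"{minutes_saved:.1f} minutes saved per shift")  # prints "33.3 minutes saved per shift"
```

At 2,000 seconds per shift, the savings come to just over 33 minutes, matching the figure in the article.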
Glass is also used to monitor floor efficiency at a glance, and almost exclusively with colors. A handful of iBeacon sensors – originally Apple-developed technology – are spread across the floor. If you’re wearing connected Glass and walk into each iBeacon’s cone, you’ll receive notification about whether the corresponding work station is running efficiently (green), adequately (yellow) or poorly (red).
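The color-coding logic described above can be sketched in a few lines. This is a hypothetical illustration, not Fisher’s actual system: the efficiency thresholds and station names are my own assumptions, since the article doesn’t specify them.

```python
# Hypothetical sketch of the green/yellow/red status logic described
# in the article. Thresholds and station names are illustrative
# assumptions, not details from Fisher's deployment.

def station_status(efficiency: float) -> str:
    """Map a work station's efficiency (0.0 - 1.0) to a display color."""
    if efficiency >= 0.90:
        return "green"   # running efficiently
    if efficiency >= 0.70:
        return "yellow"  # running adequately
    return "red"         # running poorly

# When the wearer walks into a beacon's range, Glass would look up
# that station's current efficiency and show the matching color.
stations = {"press-01": 0.95, "weld-03": 0.78, "trim-02": 0.55}
for name, efficiency in stations.items():
    print(name, station_status(efficiency))
```

The point of the design is that the wearer never reads a number: a glance at the color in their periphery is the entire interaction.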
As the article makes clear, the Glass team at Fisher has just started to explore what can be done with the technology: “We’re able to start thinking about other features we could build in that aren’t existing functions, but are new functions,” he said.
While it’s easy to imagine many of the implications of Glass 2.0, one of the more subtle ones concerns risk and insurance. This was pointed out to me by a radiologist friend, who noted how the risk/insurance profiles of new doctors would change if a more experienced one could see a procedure exactly as it occurred. The same dynamic could apply in settings as diverse as aircraft maintenance, security, and even child-care.
Furthermore, the new generation of technologies will allow information to flow not just from wearer to spectator but in the reverse direction as well. Indeed, this is part of what Proceedix is aiming to do: provide specific instructions and data to a Glass wearer that are not only contextually correct but also hierarchically structured, so that they can be accessed on an as-needed basis. This is a much better model than sending a tech into the field with a static manual.
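The hierarchical, as-needed model can be sketched with a simple nested data structure. This is a minimal illustration of the idea, not Proceedix’s actual API; the procedure and step names are invented.

```python
# Minimal sketch of a hierarchically structured, as-needed instruction
# model, as described in the paragraph above. All procedure and step
# names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Step:
    summary: str                  # short line shown on the display
    detail: str = ""              # expanded only if the wearer asks
    substeps: list["Step"] = field(default_factory=list)

procedure = Step(
    "Replace hydraulic filter",
    substeps=[
        Step("Depressurize the system",
             detail="Open bleed valve; wait for gauge to read zero."),
        Step("Swap the filter cartridge",
             substeps=[Step("Remove housing cap"),
                       Step("Seat new cartridge, torque to spec")]),
    ],
)

# The wearer sees only the current step's summary; detail text and
# substeps are fetched on demand rather than paged through linearly.
print(procedure.summary)
print(procedure.substeps[0].detail)
```

The contrast with a static manual is the drill-down: the device surfaces one step at a time, and the deeper layers exist only for the wearer who needs them.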
We should expect to see Glass and its competitors come out of the shadows and into the mainstream over the next two years. There will be fits and starts, such as Toshiba’s recent delay in launching its Wearview TG-01, which was focused on the industrial space. These issues notwithstanding, I think that when Glass 2.0 arrives, it and its competitors will have tremendous impact in fields as diverse as medicine, teaching, field services, and manufacturing. Kudos again to Google for undertaking such a bold public experiment with the first version of its product. Those who thought Glass was dead are mistaken. We are not even at the start of the changes this new way of seeing the world will bring.