The trick to saving human factory jobs might be teaming up with the machines

Sometimes the man-machine enhancements are physical, and sometimes they’re mental. Sometimes it’s a Venn diagram that includes both aspects, as a skilled human worker collaborates with robotics and AI to complete a task.

We spoke yesterday about the security of the industrial Internet of Things with John Spooner, senior IoT analyst at 451 Research in Boston, and he also had a fair amount to say on the more positive concept of humans and machines teaming up to build things better. “Artificial intelligence and augmented humans includes a whole range of stuff, from implants that improve your memory and vision to exoskeletons that would help you ambulate or lift,” he told us.

Electric eyes on the prize

On the physical side of the equation, consider the example of Ford Motor Company, which has rolled out to its global factories wearable exoskeletons developed by Ekso Bionics. The 9.5-pound EksoVest adjusts to fit workers ranging in height from 5 feet to 6 feet 4 inches (about 1.5 to 1.9 meters), and it provides 5 to 15 pounds of lift assistance per arm (about 2 to 7 kilograms). That may not sound like much, but it adds up to a tremendous boost when a worker may be performing the same repetitive lifting motion thousands of times every day.
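Some back-of-the-envelope arithmetic shows how those modest per-lift numbers compound over a shift. The lift count below is a hypothetical round figure for illustration, not a Ford statistic:

```python
# Cumulative assistance from the EksoVest figures above.
LB_TO_KG = 0.453592

assist_low_lb, assist_high_lb = 5, 15   # per arm, per lift
lifts_per_shift = 1_000                  # hypothetical shift count

low_total = assist_low_lb * 2 * lifts_per_shift    # both arms
high_total = assist_high_lb * 2 * lifts_per_shift

print(f"Cumulative assistance per shift: {low_total:,}-{high_total:,} lb "
      f"({low_total * LB_TO_KG:,.0f}-{high_total * LB_TO_KG:,.0f} kg)")
```

Even at the low end of the assistance range, the vest offloads thousands of pounds of cumulative strain per worker per shift.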

On the cognitive side, Nvidia Corp. and the American College of Radiology in April announced a collaboration that leverages Nvidia’s Clara AI toolkit to speed the analysis of masses of radiological images.

Nvidia Clara comprises libraries for data and image processing, AI model processing, and visualization. The AI toolkit includes libraries for data annotation, model training, model adaptation, model federation, and large-scale deployment. The new tools allow the ACR’s 38,000 radiologists to customize their own algorithms to spot symptoms in a mass of images.
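The "model federation" component is worth unpacking: it lets each institution train on its own images and share only model weights, which a coordinator then averages into a global model. A minimal sketch of that averaging step, with all function and variable names invented for illustration (this is not Nvidia Clara's actual API):

```python
# Each hospital trains locally and contributes only weights, never patient images.
def federated_average(site_weights):
    """Average per-parameter weights contributed by each site."""
    n_sites = len(site_weights)
    n_params = len(site_weights[0])
    return [sum(w[i] for w in site_weights) / n_sites for i in range(n_params)]

# Three hypothetical hospitals, each with a locally trained 4-parameter model.
hospital_a = [0.2, 0.8, -0.1, 0.5]
hospital_b = [0.4, 0.6, 0.1, 0.3]
hospital_c = [0.3, 0.7, 0.0, 0.4]

global_model = federated_average([hospital_a, hospital_b, hospital_c])
```

The design point is privacy: radiological images stay inside each institution, and only the numeric parameters cross the network.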

Spooner elaborates on the process: “In order to use AI effectively, you have to learn how to use it, then have to build it into your processes… How do we train radiologists to be more efficient at reading ten X-rays instead of a hundred?” he asked. “Somebody still needs to look at the image and say, ‘That looks like a tumor, a torn meniscus, thinning of the cartilage.’” Software is getting better at identifying and diagnosing conditions, but it’s not there yet—we still need human doctors looking over things.

Still, a human-machine diagnostician team can make a pretty powerful combination. “Say somebody comes into the ER with broken ribs from a car accident,” Spooner continued. “The medical team needs to find the most pressing problem… But with this technology, an hour or two later, AI may come back to say, ‘Once those ribs are taken care of, you might want to check these other couple spots to biopsy.’ The software tools may identify subtle signs of life-threatening conditions totally outside the immediate condition the human observer is focused on.”

Unlike the seeming inevitability of fully autonomous cars and trucks on the road, it seems less likely that AI diagnosticians will completely negate the role of human physicians. “The idea that AI is going to replace a doctor is ludicrous,” Spooner said, “but what it can do is provide in many cases very good recommendations or decision support to make an esoteric assessment very quickly. It can take in ginormous amounts of data and give probabilities… Combined with someone who has 20 years of experience, you’re going to get the right decision. That’s where it’s really powerful.”
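Spooner's "probabilities plus decision support" framing can be sketched in a few lines: the model returns per-condition probabilities, anything over a review threshold is surfaced, and the physician makes the final call. The condition names, scores, and threshold here are invented for the example:

```python
# Flag any finding above a review threshold for the physician's attention.
REVIEW_THRESHOLD = 0.30

scan_findings = {
    "rib fracture": 0.97,       # the presenting injury
    "pulmonary nodule": 0.41,   # incidental finding, follow-up suggested
    "pleural effusion": 0.08,   # below threshold, not surfaced
}

flagged = sorted(
    (c for c, p in scan_findings.items() if p >= REVIEW_THRESHOLD),
    key=lambda c: -scan_findings[c],
)
print("For physician review:", flagged)
```

Note that the software never decides; it ranks and surfaces, leaving the diagnostic judgment to the person with 20 years of experience.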

NextAR from Progea is a good example of a human-machine interface that enhances both human perception and physical capability. Tapping into applications for wearable augmented reality devices like Microsoft’s HoloLens, Epson’s Moverio smart glasses, or Google Glass, operators on factory floors can work with real-time information transmitted to and from Progea’s Movicon supervision servers. By linking these technologies, supervisors can steer intelligent machines’ behavior directly, improving efficiency, productivity, and safety.

All about the bottom line

The technologies involved may be cutting-edge, but the decisions about when to deploy a human or a machine follow the same basic economic principles that have been in place since the first Industrial Revolution. “Automation has been an ongoing process that continues for a really long time,” said Spooner. “It’s evolutionary and ongoing.”

But there may be holdouts, people who prefer to think of things strictly in terms of “robots versus humans” or “robots are taking our jobs.” It doesn’t have to be that way: there is a path for humans and machines to team up in a manufacturing context so that the result is more fulfilling work rather than unemployment for the human. “It’s all about augmenting humans rather than replacing them,” Spooner explained. “It’s expensive creating a machine that can do what a person can do, but it’s not nearly as expensive to enhance a person’s ability to do a job one way or another. The question becomes: How do you make John better at doing his job, whether he’s writing a research report or replacing a transmission?”

There will be holdouts, and there will probably also be manufacturers that view things strictly in dollar terms and make mistakes, but if the transition is handled carefully, everyone stands to benefit. “There are going to be people who don’t want to do it and people who are going to replace those holdouts,” Spooner said. “Yes, it will take time, and it’s not easy. But in a sense, people had to be taught to use a spreadsheet or word processor—those things are augmentative tools as well, and we take them for granted now as support for human work.”

What’s more, Spooner sees the rise of the machines as a way of identifying precisely what humans do best. “Tesla found it’s very hard to install seats using smart machines,” he said. “It went back to having people do it because they can bend and shift in ways that robots can’t. What we’re talking about is making it easier and less wearing to work longer, and making it easier to project costs and demand… Are you going to get to the point where a futuristic system builds a car in 3D? Maybe someday, but not for the short term. Instead of autonomous technology, the ‘better together’ aspect is what these companies are shooting for.”

© 2019 Condé Nast. All rights reserved.