A Blog by Jonathan Low


May 1, 2018

Jobs That Artificial Intelligence Will Create or Expand

Daniela Hernandez reports in the Wall Street Journal:

AI Builders, Customer-Robot Liaisons, Robot Managers, Data Labelers, Drone-Performance Artists, Safety and Test Drivers, AI Lab Scientists. For AI to properly understand the world, it needs humans to explain what things are - identifying objects in images, labeling which part of an image is a face, for instance - or parsing sentences to teach it what phrases mean. That need for human involvement is “the part that people underestimate the most.”
As machines get smarter, there is a persistent fear in the minds of economists, policy makers and, well, everybody: Millions of people will be left obsolete and jobless.

But the effects of artificial intelligence are likely to be a lot more complex than that.


Yes, jobs will be lost, and many people will be forced to learn new skills to keep up in this new environment. But experts say the picture has a surprisingly big silver lining.

For one thing, AI opens up opportunities for many new jobs to be created—some that we can imagine and many we probably can’t right now. Already, many companies are discovering that they need a range of new workers to keep their smart new world running smoothly—and not just software developers but also managers, operators and artistic designers.


Meanwhile, the machines aren’t smart enough right now, and probably won’t be for some time, to do all the things that people can, so they will need humans to supervise them. As a result, many jobs will be transformed rather than eliminated, as people rework their old roles to include collaborating with AI in some form. Consulting firm McKinsey & Co. predicts that investments in technology, including AI and automation, could add 20 million to 50 million jobs globally by 2030. Calculating job loss is more complicated, according to the firm, because in many cases people won’t lose their jobs outright, but instead will switch occupations.

In one study, the firm found that somewhere between 75 million and 375 million people may need to switch jobs by 2030 due to adoption of automation.

Here’s a look at some jobs that will be created—or transformed—by these new smart technologies.

AI Builders

As more products, services and research come to depend on AI, there will naturally be a greater need for people who can develop the underlying systems that make AI work. What’s more, other fields will need people with knowledge of how to integrate their work with AI to fit into this new world.
Consider how AI has changed things at just one company: iRobot Corp., a maker of self-guided robotic mops and vacuums. As recently as five years ago, the Bedford, Mass.-based company was more focused on hiring hardware developers to build its machines’ physical bodies. The robots of the past had software brains that were much less complex, consisting of roughly 100 lines of code.

That number has jumped to millions. To keep pace with the higher workload, iRobot has roughly quadrupled its staff of software engineers focused on consumer robots to approximately 125 last year from 30 in 2015. The majority are involved in making iRobot’s products smarter through more advanced AI and computer-vision systems, according to Russ Campanello, executive vice president for human resources and corporate communications.

“Before, the robot would have to physically bump into something” to know there was a barrier, he says. “Now the robot can see,” thanks to more sensors.

The robot can also tell its owner when it has completed a cleaning job, how long the task took and where it cleaned. In addition, the machine gives its human owner feedback on where the dirtiest parts of the home are and when filters need to be changed.
Building out those capabilities requires iRobot to hire more Ph.D.-level scientists with expertise in AI, navigation, computer vision and robotics, who are able to not just build the company’s products but also conduct research that will lead to improvements in future devices, according to Mr. Campanello.
To find skilled candidates, iRobot has extended its recruitment efforts globally because such talent is scarce and in high demand. The result is an arms race among companies to find the best talent. According to jobs site Indeed, demand for AI-related roles has more than doubled in the past three years. The two most sought-after: machine-learning engineers and computer-vision engineers.
During a recent interview with a potential hire, Mr. Campanello jokes, he felt like the candidate was interviewing him. When parents ask Mr. Campanello what second language their children should learn, he suggests Python, a programming language popular in machine learning.
Customer-Robot Liaisons
For all the promise of AI systems, getting employees to accept them can be tough—especially when they involve changes as potentially intrusive as bringing robots into the workplace. That is why some companies that make AI applications use so-called customer success managers to help ease clients into working with the systems, answering complaints and making adjustments as needed. The role is currently among the most sought-after AI-related positions on jobs site ZipRecruiter.
Will Catron is one of those managers, at Cobalt Robotics Inc. in Palo Alto, Calif. Mr. Catron, 36, is in charge of ensuring clients are happy with the robots the company has rented out as security guards on graveyard-shift and weekend duty.
He starts by getting a “good handle” on how comfortable clients are interacting with robots, he says. “No one’s onboarded a robot.” He helps them get acquainted with other Cobalt employees with whom they’ll be interacting.
Mr. Catron spends most of his workday monitoring usage reports that Cobalt’s robots generate and interfacing with customers through calls, text, email and on-site visits. A lot of the job is building relationships and understanding what people want, he says.
Once, he went over to a client site to oil a squeaky wheel on a robot. On another occasion, Mr. Catron helped a client who was moving to a new location to secure a last-minute certificate of insurance, required by the client’s new landlord to have the robot on patrol. That prevented the client from having a gap in its robot-security service.

Robot Managers
It is a constant theme in AI work: Even though AI can be amazingly smart at some jobs, its judgment can be very limited compared with a person’s. Hence the job of robot specialist: someone who oversees the work of machines to make sure they’re doing their jobs properly, and intervenes if the AI asks for help in a tricky situation.
That need for human involvement is “the part that people underestimate the most,” says Mehdi Miremadi, a partner at McKinsey who focuses on automation.
At Cobalt, for instance, Shiloh Nordby works nights and weekends overseeing several robotic security guards at multiple sites across the San Francisco Bay Area, through feeds that capture video and other data from the machines’ sensors. That information is displayed on a computer at Cobalt’s Palo Alto offices. If the robot notices a pipe leak or an intruder, it alerts Mr. Nordby, who then susses out the situation and calls the proper authorities for backup, if needed.
His responsibilities also include ensuring the robot guards don’t behave awkwardly when people are around, such as by lurking over a person’s desk or barreling through a crowd during shifts, he says. In those circumstances, he overrides the machine’s autopilot to redirect it. In fact, Cobalt hired 36-year-old Mr. Nordby, a former restaurant manager, for his social finesse and administrative skills rather than his technical chops.
The job is “about talking to people,” including employees and other on-site security guards at client sites, and “making them feel comfortable,” says Mr. Nordby. He also helps manage the other specialists in Cobalt’s crew.
Data Labelers
For AI to properly understand the world, it needs humans to explain what things are—meaning, the data that the AI absorbs need to be labeled. That could mean identifying objects in images—labeling which part of the image is a face, for instance—or parsing sentences to teach it what phrases mean.
Many workers are tasked with doing just that, looking over information and marking it for a computer. Companies such as self-driving-car developers and large tech firms can have “hundreds and hundreds of folks, sometimes even more, sitting and labeling data,” says McKinsey’s Mr. Miremadi.
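To make the idea concrete, here is a minimal sketch of what a single labeled example might look like; the field names, file name and annotator ID are invented for illustration, not any company's actual format.

```python
# Illustrative sketch only: a minimal, hypothetical labeling record of the
# kind a human annotator might produce for a computer-vision system.
# All field names and values here are invented for illustration.

def make_label(image_id, region, category, annotator):
    """Package one human judgment about an image region as training data."""
    return {
        "image_id": image_id,
        "region": region,        # (x, y, width, height) bounding box
        "category": category,    # what the human says is in that region
        "annotator": annotator,  # who labeled it, for quality auditing
    }

# A labeler marks a face in one photo; thousands of such records,
# aggregated across many images, become the AI's training signal.
label = make_label("office_042.jpg", (120, 80, 64, 64), "face", "worker_7")
dataset = [label]

print(label["category"])  # prints "face": the answer the model learns from
```

The point of the sketch is that the "intelligence" starts as thousands of small human judgments like this one, recorded in a form a machine can consume.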
Sometimes the data being labeled is fairly simple. At Cobalt, for instance, the robots’ computer-vision systems can mistake the photographs and posters of people that abound in office settings for trespassers. Employees, including engineers and robot specialists, have flagged these as false positives, so over time the machines have learned that such images aren’t potential threats.
Other labeling jobs are more subtle. At the Pacific Northwest National Laboratory in Richland, Wash., Donna Flynn has been spending a lot of time lately labeling images of clouds from laser-based radar, or lidar, so that the lab’s AI can learn to identify them more accurately. The idea is to use these data to improve scientific understanding of cloud formation, which, she says, should help improve climate and weather-prediction models.
Each image is roughly two million pixels, and for each pixel she must label whether it includes a piece of cloud or not—a complex job because debris, haze and dust can show up in the images and make them tough to decode.
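Pixel-level labeling of this sort can be pictured as building a binary mask over the image. The tiny grid and the simple brightness cutoff below are invented for illustration; real lidar cloud masking is far more involved, which is exactly why it needs an expert rather than a threshold.

```python
# Illustrative sketch: pixel-by-pixel labeling as a binary "cloud mask".
# The 3x4 toy "image" and the crude brightness threshold are invented for
# illustration; they stand in for the expert judgment a human labeler makes.

def label_cloud_mask(image, threshold):
    """Mark each pixel 1 (cloud) or 0 (clear)."""
    return [[1 if pixel >= threshold else 0 for pixel in row]
            for row in image]

lidar_returns = [
    [0.1, 0.8, 0.9, 0.2],
    [0.0, 0.7, 0.6, 0.1],
    [0.1, 0.2, 0.1, 0.0],
]

mask = label_cloud_mask(lidar_returns, threshold=0.5)
cloudy_pixels = sum(sum(row) for row in mask)
print(cloudy_pixels)  # prints 4: four of the twelve pixels labeled as cloud
```

A real image has about two million such pixels, and debris, haze and dust mean no fixed cutoff works, so each ambiguous pixel falls back on the labeler's expertise.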
Ms. Flynn, who has a background in atmospheric science and electrical engineering, has almost 20 years of experience working with lidar data. Interpreting the images requires not just deep knowledge of the atmosphere, but also understanding the intricacies of how lidar instruments work, she says.

Drone-Performance Artists
Drones are becoming a fixture at sporting events, where they provide overhead shots of the action. But they’re also starting to work their way into the arts, where they act as dynamic light installations and flying props. And there is a growing need for artists who can customize those drones to suit the needs of different performances.
Industrial designer Léa Pereyre fashions costumes for drones at Verity Studios AG, a small Zurich-based company that puts together drone shows for concerts, musicals, circuses and sporting events. She makes them look like birds and flowers, for instance, and has experimented with giving drones scales.
Before Ms. Pereyre, 25, joined Verity she helped design robots that taught children how to code. In her new role, she had to learn to work within the physical constraints of the drones she was dressing. That meant taking into consideration not just aesthetic qualities but also physical limitations, like weight, aerodynamics and a drone’s battery life.
Costumes, for instance, can get stuck in the drone’s propellers. For one show, Verity was outfitting drones as lamp shades. The decorative fringe on the circumference couldn’t be too long, or it would get sucked in. The team also had to minimize the noise the drones made.
“This is a crazy opportunity, because I have a blank slate and I can develop whatever I want this field to be,” Ms. Pereyre says. The company also has on staff drone choreographers who use Verity-built software to control the movements of dozens of autonomous flying machines. The drones use an indoor-positioning system similar to GPS to know where they are in space.
The choreographers are engineers by training, but also have a “strong artistic bent,” says Raffaello D’Andrea, Verity’s founder. He was the company’s original choreographer, but has since added two others.
AI Lab Scientists
Smart software is remaking drug development, sifting through vast amounts of information much more quickly than humans could, and coming up with new directions for medical research. That offers new opportunities for experts, like data scientists and computational biologists, to teach AIs about the life sciences or chemistry so that computers can surface novel ideas. Then there are the technicians who test the results that AI comes up with to see which are valid and which aren’t. Those results get fed back into AI machines to make them smarter.
BenevolentAI Ltd. uses AI to identify new molecules to develop into therapies for neurodegenerative disorders and other diseases based on analyses of medical journals, clinical trials and genetic and chemical databases.
The company’s “ideas come from a machine brain, rather than a human brain,” says Ken Mulvany, BenevolentAI’s founder.
But experimental biologists must test AI-suggested molecules in tissue and animal models, prerequisites to clinical testing and approval in humans. Last year, the company had about 100 unverified computer-generated hypotheses, but not enough scientists to validate them, according to Mr. Mulvany.
The London-based company recently acquired a lab-testing company, adding 50 scientists to its staff to test molecules AI software suggests. The data from these real-world experiments then feeds back into the company’s algorithms to improve them. Mr. Mulvany hopes this approach will make his company’s research-and-development pipeline more efficient.
Safety and Test Drivers
There are lofty predictions about the future of self-driving vehicles and how they will spread across the automotive landscape. But most of the vehicles aren’t fully capable of working on their own just yet—and that means opportunities for people who help the vehicles do their jobs safely. These workers act as a second set of eyes in tricky situations and can take over driving chores if necessary.
May Mobility Inc., a maker of self-driving microbuses, employs safety drivers who oversee the machines while they’re transporting people to and from offices and parking structures. The drivers provide feedback when a vehicle encounters a situation it is unsure how to handle—for instance, if a sensor isn’t quite sure of the color of a traffic light or if double-parked vehicles are obstructing its right of way and it needs to merge into oncoming traffic.
“The driver authorizes the vehicle to drive through” if it is safe, says Edwin Olson, chief executive of the Ann Arbor, Mich., vehicle maker. “It is much more of a permission system than a remote control.”
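The distinction Dr. Olson draws can be sketched in a few lines of code. This is a hypothetical illustration of the idea, not May Mobility's software: the human grants or denies permission for a specific maneuver, rather than steering remotely.

```python
# Illustrative sketch of a "permission system": the vehicle pauses on an
# uncertain situation and proceeds only once a human safety driver
# authorizes it. All names here are invented for illustration.

def decide(situation, confident, human_approves):
    """Return the vehicle's action for one maneuver."""
    if confident:
        return "proceed"  # normal autonomous driving, no human involved
    # Uncertain case (an ambiguous traffic light, double-parked cars):
    # stop and ask the safety driver rather than guess.
    if human_approves(situation):
        return "proceed"
    return "hold"

action = decide("double-parked vehicles blocking lane",
                confident=False,
                human_approves=lambda s: True)
print(action)  # prints "proceed": the human authorized this one maneuver
```

The design choice is that the human's answer applies to one situation at a time; the vehicle still executes the driving itself.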
The company is about to triple its number of safety drivers as it expands its service. In addition to drivers, May Mobility hires test engineers, typically undergraduate-level employees, who devise scenarios for shuttles, like navigating a double-lane rotary. The company also has a maintenance crew responsible for everything from cleaning and charging cars to downloading data the vehicles collect while driving to “doing a dance in front of the car” to test whether its sensors detect objects, Dr. Olson says.
“These vehicles are not human-free,” he adds. “There are absolutely people behind the scenes.”
