A Blog by Jonathan Low

 

Mar 28, 2019

What Happens When Smart Tech Marries Dumb Machines

In a merger of technologies, the lowest common denominator always wins. JL

Daniel Michaels reports in the Wall Street Journal:

Investigators probing two deadly crashes of Boeing Co. 737 MAX airliners are grappling with a hybrid of old and new technology, where a complex piece of software controls hydraulic pumps and motors similar to those used when Lyndon Johnson was president.
The plane, first designed in the 1960s and modernized three times, is caked with successive generations of technology superimposed on each other. Digital retrofits to older equipment like the 737 MAX’s anti-stall system—known as MCAS and suspected of having contributed to the crashes that together claimed 346 lives—are increasingly common. From smart-home devices that control oil-burning furnaces to mainframe computers that oversee decades-old power grids, digital controls are popping up everywhere around the mechanical world.
Software has underpinned the internet’s virtual world from inception, of course, and has shown both its potential and vulnerability. Now, with the cost and size of digital sensors plunging and the ability to transmit data ballooning, more physical objects than ever are getting linked through software. Even before the Internet of Things becomes a pervasive reality, tech experts and public-safety professionals are fretting over the intersection of virtual and real in what they call cyber-physical security.
The worry is that engineers are putting mechanical systems under the command of computers and algorithms without fully understanding the consequences. Problems include confusion about how controls work, software bugs leading to physical accidents and, most worryingly, cyberattacks on infrastructure like power stations or chemical plants that could cause catastrophes.
Cyber-physical systems are “embedded in virtually all aspects of our lives,” said Christos Papadopoulos, program manager of cyber-physical systems at the Science and Technology Directorate of the Department of Homeland Security. DHS has since 2013 sought to spot and address potential weaknesses in cyber-physical systems, initially involving cars, medical equipment, building controls and power grids, in a broad collaboration with academic institutions and research institutes.
Federal investigators have also pursued wrongdoers who exploited retrofit weaknesses, from Russian hackers who targeted U.S. utilities to Volkswagen AG engineers who fraudulently passed U.S. emissions tests by doctoring control software that had been added to diesel-engine designs.
Cyber crime and malware have long plagued the virtual world, though data breaches, theft and extortion rarely cause direct physical harm. Software bugs can also arise in physical equipment that is designed from scratch with digital controls, like electric cars, medical equipment and drones. But creators of those systems link hardware and software from the outset, and engineers test the products with both in mind. Retrofitted equipment, experts say, is rarely vetted so thoroughly.
“There’s a bigger temptation not to test things when you’re just making a little change by adding automation,” said Justin Cappos, a professor of computer science at New York University’s Tandon School of Engineering. With every adaptation, the potential for problems might accumulate without anyone noticing. “You’re kind of boiling the frog,” he said.
Mr. Papadopoulos said that because adding security to older systems is often impossible, DHS is assessing technologies to protect their communication links, “to create an isolation layer and intercept attacks before they reach vulnerable devices.”
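The approach he describes amounts to putting a protective gateway in front of devices that cannot themselves be patched. As a rough sketch only, in Python, of what such an isolation layer might do (the device address, port and command names below are invented for illustration, not anything DHS or Mr. Papadopoulos has specified), a filtering proxy forwards a short whitelist of known-good commands and drops everything else:

    # Hypothetical sketch of an "isolation layer" for a legacy controller.
    # The address, port and command set are invented for this example.
    import socket

    LEGACY_DEVICE = ("192.0.2.10", 502)          # unpatchable controller (example address)
    ALLOWED_COMMANDS = {b"READ_STATUS", b"READ_TEMP", b"SET_SPEED"}

    def handle(client: socket.socket) -> None:
        request = client.recv(1024).strip()
        command = request.split(b" ")[0]
        if command not in ALLOWED_COMMANDS:
            client.sendall(b"REJECTED\n")        # intercepted before reaching the device
            client.close()
            return
        with socket.create_connection(LEGACY_DEVICE, timeout=5) as device:
            device.sendall(request + b"\n")      # only vetted traffic is forwarded
            client.sendall(device.recv(1024))
        client.close()

    def main() -> None:
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.bind(("0.0.0.0", 1502))           # clients talk to the proxy, never the device
        server.listen()
        while True:
            conn, _ = server.accept()
            handle(conn)

    if __name__ == "__main__":
        main()

The design point is that the vulnerable equipment never faces the network directly; anything not on the whitelist is stopped at the gateway.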
Prof. Cappos, who is participating in the automotive strand of the DHS project, said car makers awoke to their vulnerability several years ago after high-profile hackings that took control of newly connected vehicles.
“A lot of other industries haven’t had the same wake-up call,” he said.
The aviation industry was an early adopter of computers to control physical equipment, though it has always proceeded slowly and under intense scrutiny. Investigators are now probing whether Boeing’s integration of the automated Maneuvering Characteristics Augmentation System, or MCAS—which in certain circumstances pushed the nose down and confused pilots—met industry standards.
Automation offers huge benefits, even for decades-old mechanical equipment. Computers can run most machinery faster, more precisely and more efficiently than humans. While automation can cost workers jobs, it can also eliminate drudgery and danger. But integrating computers into “dumb” machines poses challenges.
Britt Storkson, a designer of electronic controls for industrial pumping equipment in The Dalles, Ore., says careless computer retrofitting of mechanical gear has become “a serious problem you see in industry all the time.” He has seen computer processors lock up and shut down heating and cooling equipment, conveyor belts and industrial-process systems.
“It’s not that the software doesn’t work, it’s that it doesn’t work in all conditions,” Mr. Storkson said. And since software development is divorced from equipment design, “software developers don’t know, if I type in this command, what’s going to be the impact down the line.”
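One general defense against that disconnect, offered here as a hedged sketch rather than anything Mr. Storkson builds, is to have the control software clamp every command to the machine’s physical envelope before actuating anything. The pump limits and function names below are hypothetical:

    # Hypothetical sketch: validate a speed command against the physical limits
    # of a retrofitted pump before acting on it. Limits and names are invented.
    MAX_RPM = 1750            # rating plate on the decades-old motor
    MAX_RAMP_RPM_PER_S = 50   # avoid slamming the mechanical drivetrain

    def safe_set_speed(current_rpm: float, requested_rpm: float, elapsed_s: float) -> float:
        """Return the speed actually commanded, clamped to what the hardware tolerates."""
        if not 0 <= requested_rpm <= MAX_RPM:
            raise ValueError(f"requested {requested_rpm} rpm is outside 0..{MAX_RPM}")
        max_step = MAX_RAMP_RPM_PER_S * elapsed_s
        # Never change speed faster than the drivetrain was designed for,
        # no matter what the software "down the line" asks for.
        if requested_rpm > current_rpm + max_step:
            return current_rpm + max_step
        if requested_rpm < current_rpm - max_step:
            return current_rpm - max_step
        return requested_rpm

    # A sudden request to jump from 600 to 1,700 rpm is ramped, not executed at once.
    print(safe_set_speed(current_rpm=600, requested_rpm=1700, elapsed_s=1.0))  # 650.0

Clamping at the last layer before the hardware means a bug or unexpected condition upstream cannot push the equipment past what it was designed to take.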
Accidents can cause expensive damage, but not nearly on the scale of hacking. Martyn Thomas, an emeritus professor of IT at Gresham College in London who specializes in safety-critical systems, notes that traditionally in the physical world, safety plans are based on the assumption that things fail by chance. If two elements must fail before conditions get dangerous, the probability of a catastrophic accident shrinks further.
“But malware is designed to make everything fail at once,” Prof. Thomas noted. “The insecurity of cyberspace changes everything.”
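His point can be put in rough numbers. Under the traditional assumption that safeguards fail randomly and independently, the chance of both failing together is the product of their individual probabilities; malware aimed at both at once breaks that independence. The failure rates below are illustrative, not drawn from any real safety case:

    # Illustrative numbers only: why the independence assumption matters.
    p_primary = 1e-4   # assumed chance the primary safeguard fails on demand
    p_backup = 1e-4    # assumed chance the independent backup also fails

    # Traditional safety case: failures are random and independent.
    p_both_random = p_primary * p_backup
    print(f"independent failures: {p_both_random:.0e}")     # 1e-08

    # Malware that targets both safeguards makes the failures correlated:
    # once the attack succeeds, both layers fail together.
    p_attack_succeeds = 1e-2                                 # assumed
    print(f"coordinated attack:   {p_attack_succeeds:.0e}")  # 1e-02

With these made-up figures, the coordinated attack is a million times more likely to defeat both layers than chance alone, which is what “changes everything.”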
Over recent years, malware attacks such as WannaCry and NotPetya have hit medical scanners in the U.K., A.P. Moller-Maersk A/S shipping facilities around the world and the manufacturing, research and sales operations of pharmaceuticals giant Merck & Co. Aviation hasn’t faced notable hacks because aircraft run specialized software with extensive security protections.
While hackers can attack both new and retrofitted digital equipment, systems with network links added years later are harder to protect, said David Grout, chief technology officer for Europe at cybersecurity specialist firm FireEye Inc.
Malware dubbed Triton in 2017 almost disabled the industrial-safety software in a Saudi Arabian petrochemicals plant, potentially allowing hackers to control the facility and release toxic chemicals.
Triton was only discovered when a plant manager had to reboot equipment three times and wondered why. FireEye suspects state-backed hackers—likely from Russia—for the attack.
“Their objective was to show the world that they are there and can take action if they want to,” said Mr. Grout.
