A Blog by Jonathan Low

 

Oct 22, 2018

How Artificial Intelligence May Help Alleviate Tech's Productivity Crisis

With all this powerful, fast, intelligent new technology, how come so many enterprises are not more productive?

The 'productivity paradox' has bedeviled business leaders and economists since the dawn of the dotcom era. But research suggests they may have underestimated the human and organizational costs of putting that technology to optimal use.

One of the advantages AI may offer is the ability to address exactly those process and learning challenges that stand in the way of better performance. Successful organizations are focusing not on immediate whiz-bang 'answers' but on integrating new advances for sustainable competitive advantage. JL


Joe McKendrick reports in Forbes:
Technology is often brought in to automate existing processes, which merely speeds up what the business is already doing. While a new iteration of software or hardware may offer more capacity, efficiency, or performance, those gains are partly offset by the time users have to spend learning to use it. And glitches often bedevil the transition. AI may be a catalyst for productivity because it eases collaboration in workplaces, allows applications to learn and improve without needing to roll out a new version, and can extract the right data to "help employees make better-informed decisions."
So far, the impact of information technology on overall productivity has been a mixed bag, and at times disappointing. IT has been reshaping workplaces in a big way since the 1980s, yet there appears to be little to show for all this progress -- many argue that technology may even inhibit productivity growth.
There are many reasons why the proliferation of technology doesn't automatically translate to productivity growth. For one, "technological disruption is, well, disruptive," Harvard's Jeffrey Frankel observed in a recent World Economic Forum report. "It demands that people learn new skills, adapt to new systems, and change their behavior. While a new iteration of computer software or hardware may offer more capacity, efficiency, or performance, those advantages are at least partly offset by the time users have to spend learning to use it. And glitches often bedevil the transition." Add to that the fact that individuals and organizations are besieged by security issues and cyberattacks, and things get even more gummed up. Finally, there's the fact that people are inundated with information and distractions by the minute.
Irving Wladawsky-Berger recently weighed in on this question in a Wall Street Journal piece, observing that technology is often brought in to automate existing processes -- which essentially just speeds up what the business is already doing. That was the dilemma with the first wave of IT in the 1980s and into the 1990s, and is likely what we're experiencing now. The lesson the first time around -- the era of what Wladawsky-Berger identifies as the "Solow Paradox" -- was that "companies realized that using IT to automate existing processes wasn’t enough," he points out. "Rather, it was necessary that organizations leverage technology advances to fundamentally rethink their operations, and eliminate or re-engineer business processes that did not add value to the fundamental objectives of the business."

That's even more the case these days, as organizations pour more money into the promise of digital transformation, expecting overnight rewards. "We are experiencing a kind of Solow Paradox 2.0, with the digital age more around us than ever except in the productivity statistics. There are several reasons for this lag. First of all, we’re in the early deployment years of major recent innovations, including cloud computing, IoT, big data and analytics, robotics, and AI and machine learning."
Will AI eventually increase productivity? Overall, it's still unknown, but there are benefits that can quickly be realized at the individual or departmental level. A recent report out of Constellation Research states that AI may help boost personal productivity in a number of profound ways. AI advances the traditional software model, "allowing applications to learn and improve over time, without needing to roll out a new version," relates Alan Lepofsky, the report's author. "AI-enhanced software can assist in a variety of processes, from automating mundane tasks such as scheduling meetings to filtering through thousands of documents in order to recommend the best content."
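To make that last idea concrete, here is a minimal sketch of one way software might sift a pile of documents and recommend the best match for a question: score each document against the query using TF-IDF vectors and cosine similarity. The corpus, query, and approach are illustrative assumptions on my part, not anything drawn from the Constellation report; a production system would rely on far richer models and data.

```python
# A minimal sketch of "filter thousands of documents, recommend the
# best content." The corpus and query are hypothetical toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Quarterly productivity report for the sales organization",
    "Meeting notes: migrating legacy processes to the cloud",
    "Security review of the new collaboration platform",
]
query = "how did the cloud migration affect productivity"

# Represent the documents and the query as TF-IDF vectors
# over a shared vocabulary.
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])

# Rank documents by cosine similarity to the query, best match first.
scores = cosine_similarity(query_vector, doc_vectors).ravel()
for i in scores.argsort()[::-1]:
    print(f"{scores[i]:.2f}  {documents[i]}")
```

Even a toy ranker like this captures the shape of the feature: the software reads everything so the employee doesn't have to.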
AI may be a catalyst for productivity because it eases collaboration in workplaces. Here are some of Lepofsky's ideas on how this is happening:

AI promotes more natural interaction. "Perhaps the subtlest yet important manifestation of AI is how people can now interact with devices and applications in ways that mimic human interaction," Lepofsky states, pointing to user-friendly interfaces such as natural-language processing as an example.
AI helps to automatically categorize information. Until recently, tagging information or images was a manual process done by a dedicated few. "AI greatly assists in this process, either using image recognition to add tags to pictures or scanning documents to extract keywords," he observes.
AI automates recommendations. "As AI learns our patterns and preferences, tools can start to recommend answers or replies for us." These recommendations will eventually become automatic actions, Lepofsky says.
AI inspires creativity. "Not everyone has an eye for color, fonts, layout or other important elements of design. What if your applications could perform those functions for you?" Employees can become storytellers, he adds.
AI extracts insights. "One of the greatest benefits of AI is its ability to look at massive data sets and find patterns and trends," says Lepofsky. AI can extract the right and relevant background data to "help knowledge workers or first-line employees make better-informed decisions and recommendations."
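As a toy illustration of that last point, the sketch below scans a small series of weekly figures and flags the one that deviates sharply from the trend, using a simple z-score test. The numbers and the two-standard-deviation threshold are invented for this example; real insight extraction would involve trained models over far larger data sets.

```python
import numpy as np

# Hypothetical weekly sales figures; week 6 contains an unusual spike.
weekly_sales = np.array([102, 98, 105, 101, 99, 174, 103, 97])

# Standardize each value against the mean of the series.
mean, std = weekly_sales.mean(), weekly_sales.std()
z_scores = (weekly_sales - mean) / std

# Surface any week sitting more than two standard deviations from the mean.
for week, (value, z) in enumerate(zip(weekly_sales, z_scores), start=1):
    if abs(z) > 2:
        print(f"Week {week}: {value} units (z = {z:.1f}) looks anomalous")
```

Spotting one spike in eight numbers is trivial; the point is that the same logic, scaled up, is what lets AI surface patterns no human has time to look for.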
Technology appears to be more overwhelming than productivity-boosting. AI may help sort things out. Here's hoping.
