A Blog by Jonathan Low


Apr 22, 2017

The Technological Forces That Are Changing the Nature of Work

Technology is accelerating change by democratizing access to it, offering greater scale, faster and at lower cost.

The result, as the following article explains, is that the nature of value, and what is required to achieve it, is shifting, potentially making basic characteristics of humanity we take for granted scarcer and more valuable themselves. JL

Greg Satell comments in Digital Tonto:

Work used to be stable because technology was stable. Functionality evolved slowly, which allowed businesses to maintain their business models for decades. As digital technology seeps into every product and endeavor, value is shifting from those who can perform tasks efficiently to those who can work with others to design jobs for machines, which means that we now need to hire, manage and train for new skills. As automation produces greater abundance, humanity itself is becoming the scarce, and most valuable, resource.
Work used to be pretty simple. You got up in the morning, did your job and came home at the end of the day. Most people spent their whole career doing pretty much the same thing for the same employer. They were judged by their skill, diligence and seniority and, at the end of it all, they looked forward to a peaceful retirement.
Today, those seem like quaint notions. Nobody spends an entire career doing the same job the same way anymore. In fact, a recent study at Oxford found that nearly half of the jobs in the US are at risk of being automated. A report by Deloitte also finds that technology is significantly changing how organizations function.
These trends are often attributed to artificial intelligence and machine learning, but that misses a huge part of the story. The truth is that the real shift has less to do with any single branch of information technology and more to do with how three digital forces are beginning to pervade everything else. As it turns out, our digital future is all too human.

1. Acceleration

In The Second Machine Age, MIT’s Erik Brynjolfsson and Andrew McAfee tell a story about the invention of chess. As legend has it, the emperor was so impressed with the game that he invited its creator to name his reward. The inventor’s request seemed modest; he simply told the emperor:
‘All I desire is some rice to feed my family.’ Since the emperor’s largess was spurred by the invention of chess, the inventor suggested they use the chessboard to determine the amount of rice he would be given. ‘Place one single grain of rice on the first square of the board, two on the second, four on the third, and so on,’ the inventor proposed, ‘so that each square receives twice as many grains as the previous.’
For the first half of the chessboard, the emperor had to pay about 2³² grains of rice (roughly four billion), about the equivalent of one field, but as the doubling continued, the total amount owed far exceeded all the rice that existed in the world. That, in essence, is the concept of accelerating returns. When growth is exponential, even seemingly insignificant trends can become predominant.
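The arithmetic behind the legend is easy to verify. The following Python sketch works it out directly: square k holds 2^(k-1) grains, so the running total after n squares is 2^n − 1.

```python
def grains_total(n_squares: int) -> int:
    """Total grains of rice owed after n chessboard squares,
    doubling each square: 1 + 2 + 4 + ... + 2**(n-1) = 2**n - 1."""
    return 2 ** n_squares - 1

first_half = grains_total(32)   # first 32 squares of the board
full_board = grains_total(64)   # all 64 squares

print(first_half)   # 4294967295 -- about four billion grains
print(full_board)   # 18446744073709551615 -- vastly more rice than exists
```

The first half comes to about four billion grains, a manageable harvest; the second half multiplies that by another four billion, which is the essence of exponential growth: the early squares look trivial, and the late squares dominate everything.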
This phenomenon is commonly known as Moore’s Law, the observation that Intel co-founder Gordon Moore made about the doubling of transistors on a microchip roughly every two years, but it’s grown far beyond that. Today, as digital technology pervades everything else, we can see similar trends in everything from solar panels to gene sequencing.

2. Democratization

The decades after World War II saw a number of technological revolutions. During the 1950s and 60s, the discovery of the genetic code and nuclear energy, the rise of digital computing and humanity’s first forays into space all became realities. These achievements, unimaginable to previous generations, created completely new realms of possibility.
It was also an era of increasing scale. Companies mass produced products that were mass marketed to mass markets. Corporate strategy focused on wringing ever more efficiency out of the value chain, because lowering your cost basis meant that you could reinvest in assets that would create even more efficiency and unlock a virtuous cycle.
Those days are now over. Competitive advantage in a networked age is no longer the sum of all efficiencies, but the sum of all connections. To take just one example, consider the IBM 360 series of computers, which dominated the 1960s and 70s. It was vertically integrated, and every piece of hardware and software had to come from IBM. Today, however, IBM's Bluemix cloud computing platform is built on top of open source software and offers services from competitors.
The reason is that even the resources of a massive organization like IBM are no longer enough to compete on their own. Unless it can tap into the talents of thousands of developers across the world and make its technology accessible enough for others to build products on top of it, the company will fall behind its rivals.
This trend is constantly reducing the technical knowledge you need to create with technology. Want to build a website? Platforms like Wix will have you up and running in minutes. Want to build a mobile app with artificial intelligence capabilities? Platforms like Mendix are so simple even those with no coding expertise can get in on the action.

3. Convergence

The final force shaping technology today is convergence. A computer is no longer something in a room somewhere doing calculations for back office functions like payroll or inventory, or even a smaller machine that sits on a desk to help automate basic office work. Today, computing is more like electricity, an invisible force used to power other machines.
Moreover, the ubiquity and abundance of computing is what’s enabling the new technologies that are driving the 21st century, such as genomics, nanotechnology and robotics. These, in turn, are driving transformational change in scientific labs, factory floors and marketplaces.
IBM’s Angel Diaz observes that this convergence has had a profound effect on the computing industry. “To truly change the world today we need more than just clever code. We need computer scientists working with cancer scientists, with climate scientists and with experts in many other fields to tackle grand challenges and make large impacts on the world,” he says.
But the impact is even larger on other industries. As the bits of computer code pervade the atoms of our workplaces, we’re increasingly living in an automated age and the nature of work is becoming less about performing tasks and more about using technology to collaborate with other people.

Moving From Disruption To Collaboration

Let’s return to where we began. Work used to be fairly stable because technology was fairly stable. Products changed with the times, but mostly because of changing tastes and styles. Functionality evolved slowly, which allowed businesses to maintain their business models for decades and, in some cases, even longer.
Yet business models no longer last. The twin forces of acceleration and democratization made it possible for a couple of guys in a garage somewhere to compete with the world’s most powerful corporations. All of a sudden, the American dream morphed from getting a corner office in an executive suite to building a startup and achieving a billion dollar valuation.
The future, however, belongs to the third trend, convergence. As digital technology seeps into every product, service and endeavor, no one organization has all the skills and knowledge it needs to compete and collaboration itself is becoming a competitive advantage.
That, in turn, is changing the nature of work. To win in the new economy, you no longer need the best people, you need the best teams. Value is shifting from those who can perform tasks efficiently to those who can work with others to design jobs for machines, which means that we now need to hire, manage and train for new skills, such as empathy and social sensitivity.
As automation produces ever greater abundance, humanity itself is becoming the scarce, and therefore most valuable, resource.

