
The Productivity Paradox and Its Link to Technology Innovation

From PCs to smartphones, office tech has always promised to make us more productive. But time and again, it’s come with unexpected side effects. In 1982, Time magazine skipped its annual tradition of naming a “Man of the Year” to instead crown the personal computer as the “Machine of the Year.” The Apple II had been released only a half-decade earlier, and the subsequent introduction of the VisiCalc spreadsheet software in 1979 seemingly convinced the managerial class, all at once, of the business potential of computers. Soon, IBM released its own PC, which went on to become both widely copied and wildly popular. The journalist who wrote the Time feature noted in his article that he had typed his contribution on a typewriter. By the next year, the magazine’s newsroom had switched to word processors. The revolution in workplace productivity had begun.

At least, this is the simple version of the tale we tell. A closer look at what happened next, and in the decades that followed, complicates matters. We’re used to the idea that new office technologies make us strictly more productive, but the history of workplace tools teaches us that the quest to make common activities more efficient can yield unexpected side effects. This was true of the first PCs, and it likely explains the uneasy relationship we have with a more recent office innovation: email.

Not long after the arrival of the PC, experts began to question the miraculous nature of this suddenly ubiquitous device. In 1991, an article in The New York Times quoted an economist who pointed out that although companies continued to spend heavily on technology, “white-collar productivity has stagnated.” He concluded at the time: “No longer are chief executives confident that throwing computers at their office staffs will result in greater efficiency.”

The data supported these concerns. A study of the years 1987 to 1993, conducted by economists Daniel Sichel and Stephen Oliner, estimated that computer technology contributed at most 0.2 percentage points a year to inflation-adjusted business output growth, during a period when overall output grew by 1.9 percent a year. In other words, computers accounted for roughly a tenth of the growth that did occur. A contemporaneous article summarized these findings bluntly: “The impact of computers on recent productivity growth has been vastly overstated.”

Productivity growth in most of the world’s rich countries has been dismal since around 2004. Especially vexing is the sluggish pace of what economists call total factor productivity—the part that accounts for the contributions of innovation and technology. In a time of Facebook, smartphones, self-driving cars, and computers that can beat a person at just about any board game, how can the key economic measure of technological progress be so pathetic? Economists have tagged this the “productivity paradox.”
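For context, total factor productivity is not observed directly; it is backed out as a residual, the growth in output left over after accounting for growth in measured inputs such as capital and labour. The sketch below is the standard textbook growth-accounting formula under a Cobb-Douglas assumption; the symbols and the numbers in the worked example are purely illustrative and are not figures from the studies cited in this post.

\[
\frac{\Delta A}{A} \;=\; \frac{\Delta Y}{Y} \;-\; \alpha\,\frac{\Delta K}{K} \;-\; (1-\alpha)\,\frac{\Delta L}{L}
\]

Here \(Y\) is output, \(K\) is capital, \(L\) is labour, \(\alpha\) is capital’s share of income, and \(A\) is total factor productivity. With made-up numbers: if output grows 1.9 percent a year, capital grows 3 percent with a share of 0.3, and labour grows 1 percent with a share of 0.7, then TFP growth is \(1.9 - 0.3 \times 3 - 0.7 \times 1 = 0.3\) percentage points a year. Because it is a residual, TFP absorbs everything the measured inputs fail to explain, which is one reason the mis-measurement explanation below is taken seriously.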

What could explain the productivity paradox? Four broad explanations are commonly offered:

  • Mis-measurement – The gains are real but our current measures miss them.
  • Redistribution – There are private gains, but they come at the expense of other firms and individuals, leaving little net gain.
  • Time lags – The gains take a long time to show up.
  • Mismanagement – There are no gains because of the unusual difficulties in managing IT or information itself.

According to an article in The Economist, research has shown that productivity growth did not accelerate until 40 years after the introduction of electric power in the early 1880s. This was partly because it took until 1920 for at least half of American industrial machinery to be powered by electricity. By the same logic, we shouldn’t expect any spikes in productivity until computer use across the major economies reaches at least a comparable 50 percent penetration rate.

What’s happening now may be a “replay of the late 80s,” says Erik Brynjolfsson, an MIT economist. Breakthroughs in machine learning and image recognition are “eye-popping”; the delay in implementing them only reflects how much change they will entail. “It means swapping in AI and rethinking your business, and it might mean whole new business models,” he says.

In this view, AI is what economic historians consider a “general-purpose technology,” like the steam engine, electricity, and the internal-combustion engine. Those inventions eventually transformed how we lived and worked, but businesses had to be reinvented, and complementary technologies had to be created, before the breakthroughs could be exploited. That took decades.

The debate over the productivity paradox is understandable, given the expectations of productivity gains from computerisation that are embedded in our culture. But it is arguable that the changes under way will inevitably play out over a much longer time frame than current measurements can cover, and that the real payoff cannot be expected until major elements of organisational and social learning are complete. The measurement, management, and learning arguments can be combined into a broad case for fundamental social transformation: older systems of measurement and management fail as the entire regime of production is altered, and learning becomes far more experimental and risky. The slow accumulation of knowledge will probably, eventually, produce major productivity benefits.

Illustrating the potential of AI as a general-purpose technology, Scott Stern of MIT’s Sloan School of Management describes it as a “method for a new method of invention.” An AI algorithm can comb through vast amounts of data, finding hidden patterns and predicting possibilities for, say, a better drug or a material for more efficient solar cells. It has, he says, “the potential to transform how we do innovation.”

But he also warns against expecting such a change to show up in macroeconomic measurements anytime soon. “If I tell you we’re having an innovation explosion, check back with me in 2050 and I’ll show you the impacts,” he says. General-purpose technologies, he adds, “take a lifetime to reorganize around.”

Even as these technologies appear, huge gains in productivity aren’t guaranteed, says John Van Reenen, a British economist at Sloan. Europe, he says, missed out on the dramatic 1990s productivity boost from the IT revolution, largely because European companies, unlike US-based ones, lacked the flexibility to adapt.

We must accept that these better methods will not emerge spontaneously. Instead, we have to seek them out, put them in writing, experiment to get things right, and be willing to put up with some of the inconvenience and loss of flexibility these efforts might create. All of this, of course, is harder than simply waiting for engineers to build even faster email clients. But ultimately, it’s the only way to ensure that continued innovations in workplace communication improve, rather than hold back, our ability to make progress on the things that matter.

Given our current situation, how do you know which of your colleagues or employees are best suited to this new scenario we find ourselves in? Finding the right talent, the best fit for the job and for your organisation, can be a very challenging task. It is now important to find out whether your managers and your teams are well equipped to work together from various locations, and that requires deep knowledge of their personalities, strengths, weaknesses, interests, work styles and other characteristics. Our technology and solutions do that work for you, helping you discover whether your people are resilient in times of hardship, whether they are autonomous, and whether they are team players, all without face-to-face contact. Because our platform is cloud-based, everyone can use it from home as well. Humanity finds itself at a crossroads for many reasons right now, so why not help people discover and develop themselves from the comfort of their own homes?

Request a free demo:


Sources:

https://hbr.org/1986/07/the-productivity-paradox
https://www.technologyreview.com/2018/06/18/104277/the-productivity-paradox/
https://www.wired.com/story/email-slack-productivity-paradox/