The workers were furious. Believing that new mechanical looms threatened their jobs, they broke into factories, seized the machinery, dragged it into the street and set it on fire, all with widespread public support and even the tacit approval of the authorities.

That was in 1675. And those English textile workers were neither the first nor the last in a long procession of workers alarmed by the potential damage to jobs from labor-saving devices. Several centuries earlier, the adoption of the fulling mill had caused an uproar among workers forced to find other occupations. Almost exactly 60 years ago, Life magazine warned that the advent of automation would make jobs scarce – instead, employment exploded.

Now, the launch of ChatGPT and other generative AI platforms has unleashed a tsunami of hyperbolic hand-wringing, this time about the fate of white-collar workers. Will paralegals – or maybe even some lawyers – become redundant? Will AI diagnose some diseases faster and better than doctors? Will my next guest essay be ghostwritten by a machine? A breathless press has already begun to chronicle the first job losses.

Unlike most past rounds of technological improvement, the advent of AI has also spawned a small arsenal of non-economic fears, from misinformation to privacy to the fate of democracy itself. Some seriously suggest that AI could have a more devastating effect on humanity than nuclear war.

While I acknowledge the need for substantive guardrails, I will leave those valid concerns to others. When it comes to the economy, including jobs, history’s reassuring lessons (albeit with some warning signs) are inescapable. The problem today is not that we have too much technology; it’s that we have too little.

We have had forms of artificial intelligence, broadly defined, for millennia. The abacus, supposedly invented in Babylonia more than 4,000 years ago, replaced more laborious methods of mathematical calculation, saving time and therefore reducing work.

When I began my career in finance in the early 1980s, we had only hand-held calculators to aid our numerical analysis, which we painstakingly wrote out in pencil on large sheets of paper (hence the term “spreadsheets”) and which were then typed up by a secretarial pool. Any change meant redoing the entire spreadsheet. Now everything happens with the click of a mouse.

Less than three decades ago, research might require hours of combing through dusty library volumes; now, it takes a few strokes on a keyboard. Not surprisingly, the number of librarians has been flat since 1990, while total employment has grown by more than 40 percent.

Other job categories have almost completely disappeared. When was the last time you spoke to a telephone operator? Or rode in an elevator run by an attendant? In place of these and so many other now-defunct tasks, a wide variety of new job categories has been created. A recent study co-authored by the MIT economist David Autor found that approximately 60 percent of jobs in 2018 were in occupations that did not exist in 1940.

And so the great American jobs machine rolled on. In the decades since Life magazine decried the robot invasion, employment has kept expanding – 20.2 million jobs added in the past decade alone – and today the unemployment rate sits at 3.6 percent, a hair above its 50-year low. Even the number of Americans employed in finance has exploded, as computers, Excel and other technologies have made them far more productive.

Higher labor productivity translates into higher wages and cheaper goods, which mean more purchasing power, which stimulates more consumption, which induces more production, which creates new jobs. That, basically, is how growth has always happened.

This makes AI absolutely necessary, not merely nice to have. We can achieve sustained economic progress and rising living standards only by increasing how much each worker produces. Technology – whether in the form of looms or robots or artificial intelligence – is central to that goal.

Generative AI – dazzling and scary as it may be, given its potential to be a singularly transformative innovation – is just another step along the continuum of progress. Were our ancestors any less astonished when they first witnessed other exceptional inventions, such as a telephone emitting a voice or a light bulb illuminating a room?

In the heyday of business innovation – between 1920 and 1970 – productivity rose at a 2.8 percent annual rate. Since then, except for a brief acceleration between 1995 and 2005 (the modern computer revolution), annual growth has averaged a modest 1.6 percent. Compounded, the difference is stark: at 2.8 percent, output per worker doubles in about 25 years; at 1.6 percent, it takes roughly 44 years. To pessimists, the slowdown confirms their view that the most potent technological advances are behind us. To me, it means full speed ahead on AI.

What constitutes “full speed ahead” remains to be seen. For every believer that AI will prove revolutionary, there is a skeptic who doubts it will be a game changer. My best guess is that it will help push productivity up, but not back to the halcyon rates of the last century.

Certainly, the benefits of productivity growth do not always reach workers as fully and effectively as we would like. In recent decades, even our meager productivity growth has largely failed to filter down to workers: since 1990, labor productivity has risen by 84 percent, but average real (inflation-adjusted) hourly compensation has increased by only 56 percent.

The compensation that workers missed out on went largely into corporate profits, fueling a stock market boom and record income inequality. Why the disconnect? There are a variety of contributors, from declining union membership to imports to anti-labor practices by companies, such as non-compete clauses for hourly workers.

Government can help ameliorate these dislocations. For more than a century, redistribution—yes, that can be a dirty word in America—has been a necessary part of managing the fruits of industrial and technological improvements.

The progressive income tax, introduced in 1913, was designed, in part, to offset the vast income inequality generated during the Gilded Age. Further industrial advances and widening income inequality in the 1920s helped spur a range of New Deal policies, from new protections for labor to the introduction of Social Security.

Today, we can easily see the consequences of Washington failing to hold up its end of the bargain. Disaffected white factory workers in the Midwest with stagnant or declining real wages became supporters of Donald Trump (despite the fact that his policies favored the wealthy). With only 22 percent of Americans saying our country is on the right track, America feels more divided politically and socially than at any time in my 70 years of life.

We have done a poor job of preparing Americans for the transition from a manufacturing economy to one dominated by services. We have to do a better job this time.

If artificial intelligence proves as transformative as its acolytes (and some antagonists) believe, we could face a vast need for better education and training. The impact will fall not just on factory workers but on Americans across industries and up and down the job ladder, from financial analysts and coders to graphic designers, customer service agents and call center workers.

A recent report from Goldman Sachs, among the most optimistic of the techno bulls, concluded that AI can help return our productivity growth rate to the halcyon days of the mid-20th century. I, for one, fervently hope that the Goldman report proves correct and that AI unleashes a new era of technological and economic progress – and that we take the right steps to ensure that the rewards are widely shared.
