Certainly you’ve heard the news by now: inequality is rising, the middle class is being hollowed out, and wages are stagnant, fueling populist backlashes from both the right and the left around the world and giving rise to our current moment of rising political partisanship and falling social trust. And in the background of this landscape of socioeconomic angst lies the specter of automation. Advances in artificial intelligence and robotics have displaced millions of workers, and in the coming years as many as half of all jobs may be made obsolete by automation, even in previously safe, high-skilled industries like law, medicine, and (gulp) finance.

This, at least, is the impression you might get if you’ve happened to read through, say, the New York Times, the Wall Street Journal, the Economist, Atlantic Monthly, the Washington Post, Bloomberg, or the BBC anytime recently, to name just a few examples. Of course, labor-saving technological innovations that threatened certain people’s jobs have been an issue society has dealt with for hundreds of years, so what’s different now? To many, automation by algorithms represents a paradigm shift away from previous modes of industrialization that tended to support mass employment. In his bestselling book Homo Deus: A Brief History of Tomorrow, historian Yuval Noah Harari (yes, it’s a book about the future written by a historian) lays out the basic case for mass technological unemployment:

Ever since the Industrial Revolution erupted, people feared that mechanisation might cause mass unemployment. This never happened, because as old professions became obsolete, new professions evolved, and there was always something humans could do better than machines. Yet this is not a law of nature, and nothing guarantees it will continue to be like that in the future. Humans have two basic types of abilities: physical and cognitive. As long as machines competed with humans merely in physical abilities, there were countless cognitive tasks that humans performed better. So as machines took over purely manual jobs, humans focused on jobs requiring at least some cognitive skills. Yet what will happen once algorithms outperform us in remembering, analysing and recognising patterns?

He goes on to list several examples of cognitive tasks that were once considered “uniquely human,” such as chess-playing, facial recognition, and driving a vehicle, that are now, or soon will be, done better by computers. As that list of tasks computers can do grows ever larger, there will be fewer things left for us humans to do, until there isn’t any need for human labor at all. At that point, anybody who doesn’t have enough wealth to support themselves will be utterly dependent on the government or other benefactors to provide their daily bread. This is the part of the conversation where the idea of a Universal Basic Income (UBI) usually comes in. A UBI is simply a system in which the government gives money to all of its citizens, no strings attached, simply for being alive, usually in an amount just enough to keep a person above the poverty level. It’s an idea that’s been around for a long time, and has been endorsed in some form by as varied a list of people as Milton Friedman, Martin Luther King Jr., and Richard Nixon, but it is now usually mentioned in the same breath as automation, often by the very people who are doing the automating, like Elon Musk and Mark Zuckerberg. As algorithms inevitably put more and more people out of work, so the thinking goes, we need to radically reimagine how the common person gets by in a world that no longer needs work. Perhaps nobody today makes this claim more boldly than Andrew Yang, a New York businessman turned longshot 2020 presidential candidate whose entire platform is to bring a UBI to the United States, without which “the future without jobs will come to resemble… the desperate scramble for resources of Mad Max.” In a recent interview, Yang makes clear he doesn’t think this is an abstract concern for generations in the future, but the most pressing issue of our time:

The technological changes that are going to completely disrupt many, many millions of American jobs are no longer speculative; they are here with us today. The reason why Donald Trump won the election of 2016 is that we automated away 4 million manufacturing jobs in Ohio, Michigan, Pennsylvania, Wisconsin – the swing states he needed to win – between 2000 and 2015. And it’s about to get much, much worse because we’re going to triple down on eliminating the most common jobs in the US economy, which are, in order: administrative and clerical work… retail and sales workers, food service and food prep workers, truck drivers and transportation, and manufacturing.

As you may have guessed from the title of this post, I am somewhat less concerned that there’s a real problem here, and am overall mostly optimistic about the future that technological progress will bring. Over the next few posts I will take on the zeitgeist of this peculiar moment in history we find ourselves in, exploring the relation between technology, economic growth, finance, globalization, freedom, and the human condition. First up, why robots probably won’t bring about the desperate scramble for resources of Mad Max.

Extraordinary Claims and Ordinary Data

In looking for evidence for the hypothesis “automation is causing unemployment,” a good place to start might be the unemployment rate. Turns out it’s pretty darn low right now. As low as it’s been in decades.

[Chart: US Unemployment Rate, data by YCharts]

Seems like most people can find work right now in spite of all the robots. But isn’t the problem that not everybody can find full-time work? Hasn’t technology pushed whole classes of workers into the gig economy, where they can only find part-time work driving for Uber or something? The official US unemployment rate is technically the U-3 rate, which counts people who are not working but are actively looking for work. The U-6 rate adds in people who are working part-time or are marginally attached to the labor force but want more work. Turns out U-6 is also currently well below its average since we started measuring it in the 1990s.
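If you want to pull these numbers yourself, here is a minimal sketch in Python, assuming the pandas-datareader package and the FRED series codes UNRATE (U-3) and U6RATE (U-6); the YCharts figures above should reflect the same underlying BLS data.

```python
# Minimal sketch: compare the latest U-3 and U-6 unemployment rates to their
# historical averages. Assumes pandas-datareader and the FRED series codes
# UNRATE (U-3) and U6RATE (U-6); U6RATE only goes back to 1994.
import pandas_datareader.data as web

rates = web.DataReader(["UNRATE", "U6RATE"], "fred", start="1994-01-01")

latest = rates.dropna().iloc[-1]
averages = rates.mean()

for series in ["UNRATE", "U6RATE"]:
    print(f"{series}: latest {latest[series]:.1f}% vs. average since 1994 {averages[series]:.1f}%")
```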

[Chart: US U-6 Unemployment Rate, data by YCharts]

So far the technological unemployment hypothesis is not off to a good start. But most intelligent observers concerned with this issue are aware of these facts and would point instead to two other indicators in support of their case: labor force participation and wage growth.

The US labor force participation rate increased throughout most of the second half of the 20th century as women increasingly started working outside the home. But it started declining around the turn of the century, fell significantly following the financial crisis, and has basically not recovered since.

[Chart: US Labor Force Participation Rate, data by YCharts]

At least some of this decline represents the archetypal Rust Belt factory worker or coal miner we heard so much about in the last election cycle, whose job is now done by a machine while they remain unable to find work. Many worry this was just the first wave in a coming tsunami of jobs that will be lost to automation, leaving devastated communities where many will have little choice but to drop out of the labor force entirely, often going onto permanent disability if there are no better alternatives. But how statistically meaningful is this trend? We’re talking about a difference of a few percentage points between the all-time high and where we are right now, roughly 67% vs 63%, hardly the sort of difference that should make you rush to declare the laws of economics have changed. Nor is all of the change the result of these sorts of discouraged workers. In fact, we know that most of it is not. JPMorgan shows this decomposition of the decline in the participation rate since the Great Recession.

While some of the early declines were a cyclical result of the simple fact that the last recession was a particularly severe one and many people remained unemployed well after their unemployment benefits ran out, most of that has been worked off by now. But most of the people who have stopped working since then have chosen to do so because they retired, as represented by the “Aging” portion of the chart. The US population is getting older, and the Baby Boomers have begun slowly retiring, which they will continue to do for several more years, dragging the labor force participation rate down with them. This is less “robot apocalypse” and more “mundane fact that economists have known about for decades.” That little 1% slice of “Other” has to include all your discouraged workers put out of a job by HAL 9000, AND make room for other totally benign things like students staying in school longer, mothers (and fathers) deciding to stay home with the children, and people who made a killing off Bitcoin and are now living off the grid. 1% of America still means a few million people – enough to swing a national election in some cases – but it’s hardly enough to draw confident conclusions about a fundamental change to the nature of employment.

Okay, so maybe it’s not that we’re quite running out of jobs (yet), but the jobs are just pretty crappy. Despite continued economic growth and rising stock markets, wages in this country for the middle and lower classes just haven’t grown much for a long time, while the rich continue to get richer. Below I plot the growth in real household income for each quintile of the income distribution, plus the top 5 percent.

We see divergence at every level of the income distribution. High income earners have grown their incomes in recent decades at a fast clip, while the poorest have struggled not to lose ground. The bottom half of America is still making no more than it did in 2000. This fact is not so easily dismissed as a statistical illusion, and it is at least consistent with an interpretation in which capitalists have been able to benefit from the increased efficiency of their capital through automation, reducing demand for labor while paying themselves more. This could then be just a prelude to a world in which the fully generalizable form of capital that is artificial intelligence drives the demand for labor down further and further, until wages aren’t worth the work anymore and people really are pushed out of the labor force en masse.
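For the curious, here is a rough sketch of how a chart like this can be put together, assuming you have exported the Census Bureau’s historical household income tables (mean income by quintile plus the top 5 percent, in constant dollars) to a local CSV; the file name and column names below are hypothetical placeholders.

```python
# Rough sketch: cumulative real household income growth by income group,
# indexed to a common base year. Assumes a local CSV exported from the Census
# Bureau's historical income tables; file and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

groups = ["lowest", "second", "third", "fourth", "highest", "top5pct"]
incomes = pd.read_csv("household_income_by_group.csv", index_col="year")

# Index each group's real income to its level in the base year so the chart
# shows cumulative percentage growth rather than dollar levels.
base_year = 1967
growth = 100 * (incomes[groups] / incomes.loc[base_year, groups] - 1)

growth.plot()
plt.title("Cumulative growth in real household income by income group")
plt.xlabel("Year")
plt.ylabel(f"Growth since {base_year} (%)")
plt.show()
```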

Before diving further into the data let’s take a step back and consider the basic economics of this scenario. Does the concept of mass technological unemployment/wage stagnation even make economic sense? And if so, what predictions would it imply? As mentioned above, technology has been automating particular jobs away for over two hundred years, with no long-run effect on the unemployment rate and a rising standard of living. The reason for this, an economist would say, is that human labor and industrial/automated capital are complements, or at least have been since the industrial revolution. Complements are goods or services that reinforce the value of one another, like pencils and erasers. Substitutes, on the other hand, are goods or services that are used instead of one another, like pencils and typewriters. In the case of complements, if one of the goods becomes more plentiful and cheaper, the price of the other will tend to rise, all else equal, as the increased availability of the less-scarce good increases demand for the more-scarce good. This seems to be what has happened with human labor over the last couple hundred years. As technological machinery has become better and cheaper, human labor has generally become more valuable, since it takes a lot longer to increase the supply of humans than it does to increase the supply of industrial goods.
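To make the distinction concrete, here is a standard textbook formalization (my own illustration, not anything from the authors quoted above): a constant elasticity of substitution (CES) production function, in which a single parameter, the elasticity of substitution σ, governs whether cheaper, more abundant capital lifts or squeezes labor:

$$Y = \left(\alpha K^{\rho} + (1-\alpha) L^{\rho}\right)^{1/\rho}, \qquad \sigma = \frac{1}{1-\rho}.$$

The competitive wage equals the marginal product of labor, $w = (1-\alpha)\,(Y/L)^{1-\rho}$, so labor’s share of total income is

$$\frac{wL}{Y} = (1-\alpha)\left(\frac{L}{Y}\right)^{\rho}.$$

When $\sigma < 1$ (that is, $\rho < 0$), capital and labor are gross complements: capital deepening raises both the wage and labor’s share, which is the last two centuries in a nutshell. When $\sigma > 1$ ($\rho > 0$), they are gross substitutes: output and even wages can keep growing, but labor’s share of the pie shrinks, which is the pattern the automation pessimists have in mind.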

The tricky thing is that for any particular job, capital may very well be a substitute for labor. The mechanical looms that vaulted Britain into the industrial age around the turn of the 19th century did indeed displace lots of skilled weavers. But for every weaver who lost a job, a new one opened up as a power-loom mechanic, or a textile store clerk, or a merchant shipper, or something else entirely. Capital substituted for particular forms of labor while complementing labor in general, and this pattern has persisted ever since. But is a sort of “phase change” possible in which capital becomes a fully general substitute for labor?

Yes. Consider what happened to horses. Throughout the 19th century, horses complemented the process of industrialization, and their “employment” in factories, transportation, and agriculture increased. The value of horses rose and their population increased exponentially. By 1900 there were an estimated 100,000 horses in New York City, at one point giving rise to the “Great Horse Manure Crisis of 1894.” Here’s what 5th Avenue looked like back then:

Just imagine the smell.

But you know what happened next for the horse, of course. Henry Ford came along with mass-produced motor vehicles that could do almost anything a horse could, except more reliably and at lower cost. The horse population collapsed and by the 1930s the only growing equine industry was glue production.

What comparable sort of innovation could totally replace humans in a similar manner? Thus far, every time a computer or machine has made a particular job in the economy redundant, another one has simply popped up having to do with designing or programming more computers or machines, or with analyzing, interpreting, and improving their output, like a giant game of whack-a-mole with the job market. Human desires are virtually infinite and human intelligence is virtually infinitely generalizable, so as soon as we solve one problem we simply move on to the next one, and that means there is always work to be done. In order to shut humans out entirely, machines would need to close the loop on this process once and for all, meaning we would need computers as fully intelligent, creative, and flexible as the human mind, computers that could program the next generation of computers. This concept is referred to as Artificial General Intelligence, and it’s widely recognized by people in the know that once we achieve it we will have entered a whole new ball game. Opinions seem to range from “it’ll be the most important event in human history” to “it’ll be the most important event in the history of the universe.” In a previous post I talked about the economic implications of such an event, predicting that it would shift our present rate of economic growth from a few percent per year to several thousand percent per year. Concurrent with this would be skyrocketing returns to capital and, indeed, a plummeting demand for human labor that drives wages to essentially zero.

How far away are we from this technological singularity? It’s really anyone’s guess. The human brain is a very, very complicated machine, and the more we learn about it the farther it seems we have to go before matching its sophistication in our own silicon chips. My own best guess as a non-expert who’s just read a lot on the topic is that we’ll probably have artificial general intelligence (or, of comparable import, human mind uploads) in 50-80 years. But it could be a bit sooner than that, or it could be centuries away; no one really knows. Depending on how old you are or what your personal philosophy is, this might mean you think this scenario is either not even worth considering, or one of the most important issues humanity currently faces. But in any case it doesn’t seem to be at all relevant to today’s labor markets, nor will it be for the short-to-intermediate future. There seems to me to be no real theoretical support for the notion that automation per se is driving people out of the labor force or holding down wages.

In fact, the technological unemployment hypothesis makes a specific prediction that we can falsify with the available data. Namely, if it were the case that automation was starting to cause capital to substitute for labor in general, then we would expect to see accelerating economic growth alongside falling or stagnating wages. Now, there is an active debate among economists about whether recent economic growth rates have changed from what we saw throughout the 20th century, but the controversy is over whether economic growth has been slowing down or not; virtually no one is claiming that economic growth in recent years has been faster than it used to be. Whether or not growth rates today are lower than they were in past decades turns out to be a tricky question that I will explore in a future post, but we can definitively disprove the technological unemployment hypothesis by comparing international data over recent history. The OECD collects wage data from all 36 of its member countries. Below I plot the average growth in wages for each country (excluding Turkey due to insufficient data) alongside its per-capita GDP growth rate from 2000 to 2017. If the technological unemployment hypothesis is true, then the graph should be negatively sloped, as economies that make the most of automation experience rapid economic growth while suppressing wages.

Source: OECD

In fact, we see the exact opposite. There is an extremely strong positive relationship between wage growth and GDP growth, and the slope is nearly one-for-one. That is, a one-percentage-point increase in the GDP growth rate corresponds roughly to a one-percentage-point increase in the wage growth rate. In the United States, true to the pessimistic laments from the media, wage growth has been somewhat lower than GDP growth, suggesting lopsided benefits to capitalists, but several other countries experienced the reverse: wage growth that outpaced that of the aggregate economy.
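If you want to rerun this comparison yourself, here is a rough sketch, assuming you have exported the relevant OECD series (average real wages and real per-capita GDP for 2000 and 2017) to a local CSV; the file name and column names are hypothetical placeholders. The same few lines, with the participation rate swapped in for wages, also cover the robustness check that follows.

```python
# Rough sketch: cross-country regression of average annual wage growth on
# per-capita GDP growth, 2000-2017. Assumes a local CSV exported from OECD
# data with hypothetical columns: country, wage_2000, wage_2017, gdp_2000, gdp_2017.
import pandas as pd
from scipy import stats

df = pd.read_csv("oecd_wages_gdp.csv")

years = 2017 - 2000
# Annualized growth rates over the period, in percent per year.
df["wage_growth"] = 100 * ((df["wage_2017"] / df["wage_2000"]) ** (1 / years) - 1)
df["gdp_growth"] = 100 * ((df["gdp_2017"] / df["gdp_2000"]) ** (1 / years) - 1)

slope, intercept, r, p, se = stats.linregress(df["gdp_growth"], df["wage_growth"])
print(f"slope = {slope:.2f}, correlation = {r:.2f}, p-value = {p:.4f}")
# A slope near 1 with a strong positive correlation is the one-for-one
# relationship described above; a negative slope would have favored the
# technological unemployment hypothesis.
```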

As a robustness check we can look at what other data may say, for example, the labor force participation rate. Like the United States, most other developed countries have also experienced falling labor force participation. The technological unemployment hypothesis would predict that faster-growing economies experience larger declines in labor force participation. What do the data say?

Source: OECD

Once again, we find that the robots-are-taking-our-jobs view of the world predicts the wrong sign on the data. Faster-growing economies are associated with higher labor force participation. The relation isn’t as strong as with wages (not that we should have expected it to be), but it is robustly positive and more consistent with the interpretation that labor and capital are still generally complements rather than substitutes. Looking at still other data we might expect to be relevant, such as the rate of capital formation or the capital-to-income ratio, we find broadly the same pattern. So far as I can tell, there is basically no empirical support for the notion that robots are taking our jobs.

Working Life in an Automated World

So what does the future of jobs look like? It should come as no surprise that this is something the folks at the Bureau of Labor Statistics spend a good deal of time thinking about. Here are their projections for the next decade’s fastest-growing occupations:

Nearly all of these jobs fall into one of two categories: those that help automate greater swaths of our economy, and those that involve tasks that do not readily lend themselves to automation. Among the former, the prototypical example is software engineering. As I stated above, someday computers will be able to program themselves, and then things will start changing pretty radically, but that day is probably still several decades away. Until then, we will need an ever-growing supply of human programmers to create and maintain our information economy, along with professionals who can analyze and interpret the ocean of data that this economy generates, hence the growing demand for statisticians, mathematicians, and the like. Even professionals who are merely highly proficient with data management systems like Microsoft Excel or other enterprise software platforms will likely remain in high demand for many years to come (as with the “operations research analysts” ranked at #18), as will the designers who build the interfaces between humans and machines. All these jobs will constitute the lion’s share of tomorrow’s “white collar” labor market (even as an increasing proportion wear t-shirts and hoodies), and they will not be confined to industries we conventionally think of as high-tech. The financial industry already employs a huge number of programmers and other computer experts, as, increasingly, do firms in retail, transportation & logistics, and other industries. Indeed, development of proprietary software appears to be the competitive advantage separating the most successful firms in every industry from the rest today.

In the second category are jobs that cannot be easily automated because they generally involve physically manipulating awkward, non-standard systems and objects that don’t play well with giant factory robots. Most commonly, these awkward objects are the bodies of other human beings. Thus, we see large forecast growth for nurses, physical therapists, masseurs, and related occupations. While software will likely continue to augment the skills of these practitioners – so that, for example, a physical therapist may be able to use big data analytics to better customize your treatment regimen – it will still probably be a long time before cyborg technology is advanced enough to deliver the “human touch” these jobs require. Similarly, there are jobs that involve building and maintaining non-human physical devices that, because of their location and/or non-standard structure, will be better serviced by human hands for some time to come. Solar panel installation, at the top of the list, is a great example. While manufacturing solar panels is exactly the sort of information technology that is highly amenable to automation (leading to exponential increases in efficiency), actually installing and maintaining those panels involves traveling all over the place, often to far-flung locations, and integrating the panels and circuitry into a building and other physical infrastructure that is unique to every site. The solar industry already employs more workers in the US than the coal, gas, and oil industries combined, and these numbers will only increase as solar grows to become the dominant source of energy in the coming years.

So yes, jobs that involve little creativity and lots of routine tasks, as in manufacturing, retail, transportation, and food service, are going to go away. But they will be replaced by jobs involving either lots of creativity (especially in information technology) or lots of non-routine tasks (especially in health & human services and “idiosyncratic capital”). This hardly seems like a bad outcome. If you look at these fastest-growing professions, you’ll notice that they encompass a wide range of skill levels and compensation; they include working-class, middle-class, and professional-class occupations. While many have bemoaned the fact that Americans can no longer get a factory job straight out of high school and live a decent, middle-class life, they can get a job as a massage therapist, medical assistant, or site technician without a great deal of formal education and still earn a decent living, and for the most part these jobs are less dangerous and more fulfilling to boot. There should be plenty of work for everyone for a long while to come, and plans to deal with the unemployed masses in the robot apocalypse are probably premature.

Okay, but why then has America been underperforming the rest of the world in wage growth while experiencing increasing income inequality, if not because of the robots? In my next post I will explore what does explain this problematic development, pointing not at our machines but at the mirror. Much of what has been going on in the US economy and around the world in recent decades can be explained by our own and other countries’ commitment to America’s founding principle: freedom.

Disclosures: This post is solely for informational purposes. Past performance is no guarantee of future returns. Investing involves risk and possible loss of principal capital. No advice may be rendered by RHS Financial, LLC unless a client service agreement is in place. Please contact us at your earliest convenience with any questions regarding the content of this post. For actual results that are compared to an index, all material facts relevant to the comparison are disclosed herein and reflect the deduction of advisory fees, brokerage and other commissions and any other expenses paid by RHS Financial, LLC’s clients. An index is a hypothetical portfolio of securities representing a particular market or a segment of it used as indicator of the change in the securities market. Indexes are unmanaged, do not incur fees and expenses and cannot be invested in directly.