Want to know the true value of AI, NFTs, and other much-touted technologies? Ignore the news and look at the harsh judgment of the market.
By Lee Vinsel and Jeffrey Funk
If you believe the headlines, we’ve been living through a decade of historic technological breakthroughs. In 2013, USA Today named Uber the tech company of the year for solving “basic market dilemmas” of supply and demand. Soon after, Erik Brynjolfsson and Andrew McAfee published The Second Machine Age, arguing that society was on the verge of a revolution fueled by robots and artificial intelligence. Right through the gloom of the Covid pandemic, tech evangelists kept pushing the same narrative, adding blockchain and cryptocurrencies to the list of advances that would soon unleash a vast wave of new wealth.
But the great revolution always seems to be just around the corner. Despite record venture capital funding, the latest technologies have so far had only modest economic impact.
It’s easy to dismiss each of those exuberant projections as business as usual, one more episode in the long history of technology hyperbole. As scholars who study technology, we see something more troubling happening. Earlier waves of hype, such as the one that accompanied the dot-com boom of the 1990s, were grounded in fundamental technological advances. The current moment is different: New technologies simply aren’t making our lives much more efficient or making our economy that much better. We appear to be living in an unproductive bubble.
By its very nature, hype makes it difficult to distinguish illusion from reality. For a reality check, we therefore look to the judgment of the markets themselves, taking measure of products, market sizes, and profits. The resulting numbers suggest that the current tumble in stock prices, tech stock prices in particular, is more than a temporary correction. It is a symptom of a deep disconnect: Tech hype has been distorting people’s behaviors and distracting us from one of the most fundamental economic problems of our time.
Thinking clearly about technological progress versus technological hype requires us to consider the question of why people buy and adopt new technologies in general. A type of academic analysis called the technology acceptance model identifies two notable factors: perceived ease of use and perceived usefulness. That is, we embrace new technologies when they seem easy enough to use and when we believe they will help us do something worthwhile.
As economist Robert Gordon documented in his majestic tome, The Rise and Fall of American Growth, Americans adopted a wide variety of technologies that we now take for granted during the “special century” of economic growth between 1870 and 1970. This period featured the advent of pivotal technologies including steel, running water, machine tools, assembly lines, concrete structures, electric lights and appliances, automobiles, airplanes, pharmaceuticals, computers . . . the list goes on and on. Many of these technologies enabled individuals and organizations to get more work done with less effort, increasing productivity—the ratio of output to input. As a result, prices for manufactured goods and services fell while output soared. Per capita income in the United States increased by a factor of six between 1870 and 1973, in large part because of these changes.
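As a back-of-envelope check on Gordon’s figure, a sixfold rise over the 103 years from 1870 to 1973 implies a compound annual growth rate of roughly 1.75 percent—modest-sounding, but remarkable when sustained for a century. A minimal sketch of that arithmetic (our illustration, not Gordon’s calculation):

```python
def implied_annual_growth(factor: float, years: int) -> float:
    """Compound annual growth rate implied by total growth `factor` over `years`."""
    return factor ** (1 / years) - 1

# Per capita income rose sixfold between 1870 and 1973.
rate = implied_annual_growth(6, 1973 - 1870)
print(f"{rate:.2%}")  # roughly 1.75% per year, compounded
```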
For reasons still not fully understood, productivity growth came crashing to a halt in the economic difficulties of the 1970s, and it remained low through the 1980s and early 1990s. Then, from 1994 to 2004, productivity briefly rose again, probably in response to the newly commercialized internet, personal computers, and enterprise software that allowed more precise management of business and manufacturing. For a while the dot-com hype seemed justified, but since 2004, productivity growth has been low again, just like in the ’70s and ’80s.
Much of today’s tech hype is designed to create the impression that the good times of productivity growth are back, or that they never really went away. Innumerable stories tout the looming transformative impact of artificial intelligence (AI), drones, self-driving cars, and whatnot. From our perspective, these technologies aren’t being adopted in transformative ways, largely because they are not allowing us to get that much more done than we did before. Still, measuring productivity is difficult, and people who buy into technology hype can argue that our methods for quantifying productivity are missing important changes. For example, some economic measures, like GDP, do not account for apps that we use for free, like YouTube, Instagram, Facebook, Google Maps, or Waze.
The poor need better housing, health care, education, and transportation, not an NFT of a goofy cartoon monkey.
We believe that looking at markets themselves allows us to see through the confusion. Regardless of whether new technologies are truly boosting productivity, if corporations or individuals perceived them to be useful, they would buy them. In reality, we find that people aren’t buying these technologies in large numbers.
One way to cut through the hype is to look at market sizes, a measure of total revenue generated by sales in a given industry, and compare the new technologies to successful ones that came before. For simplicity, let’s look back to the digital technologies from the 1990s dot-com boom. By 2000, revenue generated by e-commerce, internet hardware, and internet software had reached $446 billion, $315 billion, and $282 billion respectively (in 2020 dollars).
Today’s much-hyped technologies do not compare favorably at all: video streaming ($70 billion), big data/algorithms ($46 billion, including companies like Salesforce), smart homes ($20 billion, United States only, including companies like Nest), artificial intelligence ($17 billion), virtual reality ($16 billion), augmented reality ($11 billion in 2019), commercial drones ($6 billion in 2018), and blockchain ($1.9 billion in 2020). The most valuable of these, video streaming, which serves up pornography, Netflix binges, and cat videos, is highly unlikely to lead to productivity growth; indeed, often enough it distracts us from our work.
When we take the financial measure of individual new technology firms, things look even worse. An ongoing analysis of start-ups by University of Florida economist Jay Ritter has shown that the percentage of start-ups that were unprofitable during the year prior to their IPO (initial public offering of stock shares) has increased from about 20 percent in the early 1980s to more than 80 percent in the last few years. New firms are simply much less profitable than they used to be. Our evaluation finds that more than 90 percent of today’s big start-ups (those valued at $1 billion or more before they went public) have run cumulative losses over their existence. Uber, the former tech company of the year, has run losses of $29.5 billion. Most of these companies may never climb out of the holes they’ve dug.
A major problem with tech hype (and the journalists who enable it) is that it encourages false optimism. Nobel laureate Robert Shiller describes this “irrational exuberance” in terms of narratives. Investors don’t just look at cold, hard facts such as market sizes and profits; they also follow stories that emphasize intense technological change and big benefits from those changes. One of the most exuberant narratives created the current start-up and tech bubble and sustained it even as the promised changes and benefits failed to materialize.
The deeper, even more essential problem with tech hype is that it obscures serious underlying economic issues. Elected officials, civil servants, university professors, and citizens alike have been seduced by technologies that promised sweeping social benefits and economic growth. In the meantime, leaders have neglected fundamentals—low-quality jobs, income stagnation, and inadequate housing—that are the real causes of economic suffering.
According to United Way’s ALICE program, about 40 percent of working households in the United States now struggle to make ends meet. One big reason: Since the 1970s, and especially since the 1990s, new technologies have not led to the creation of major job-producing industries, despite waves of hype about AI, genetic engineering, nanotechnology, and robotics. If any of these technologies had birthed new industries the way boosters said they would, our economy would be in different shape. Tech hype can be seen as a way to distract us from these failures.
These tough economic realities affect both urban and rural locations and all races and ethnicities, but some more than others. Harvard sociologist William Julius Wilson examined the impact of joblessness on urban Black populations in his 1996 book, When Work Disappears. Too many jobs are low-paying, low-skill positions that do not enable families to thrive; even today, when wages are rising in a tight labor market, incomes are still falling behind inflation. Princeton economists Anne Case and Angus Deaton have demonstrated that whites without college educations in the United States are increasingly dying “deaths of despair,” including suicide, alcoholism, and drug overdoses.
The worst may be yet to come. During the dot-com bust of 2000–2002, many firms went bankrupt and markets hemorrhaged value, but we also ended up with important companies like Amazon. E-commerce was rapidly becoming a part of daily life—even for relatively poor people, even as the dot-com bubble was bursting. When the hype clears and our current, unproductive bubble bursts, we will probably be left with less.
For example, the urban lifestyle has been propped up by subsidized, unsustainable services, like ride sharing and food and grocery delivery, that are now becoming more expensive even as old services, such as taxis and urban supermarkets, have partly disappeared. Uber burned through investor money like mad, in large part by keeping ride prices artificially low. To become profitable, the company is now increasing its rates and will have to keep doing so until what is left is a pricey service used primarily by the well-to-do. The same applies to the other “sharing economy” app-based companies. What will happen when their services become unaffordable for many people, or when many of these companies disappear entirely? Bankruptcy is a growing possibility as share prices plummet.
In the face of these hard economic realities, the tech-hype machine has latched on to the saddest technologies yet: nonfungible tokens (NFTs), blockchain-based Web 3.0, and Facebook’s “metaverse.” We authors had long wondered what would come along after self-driving cars, AI, and such had lost their luster. We never imagined the answer would be so ridiculous. The poor need better housing, health care, education, and transportation, not an NFT of a goofy cartoon monkey.
Looking at market measures offers a way to see the reality behind the tech hype, but economic data alone cannot show how we can do better. We have a few suggestions.
First and most important, influential figures, including political and journalistic leaders, need to pull back from the dramatic claims of interested parties and examine the larger technological picture. That means talking honestly about which industries are actually improving productivity and creating stable, high-wage jobs. President Obama, who, like many Democratic politicians, is pretty friendly with Silicon Valley, allowed himself to be taken in by the hype, publicly worrying that AI would soon render many jobs obsolete. If he had asked advisers for realistic guidance on how AI was affecting the economy, he would have seen quite a different picture. He should have focused more on realistic solutions to the deep social problems he cared about—such as poor housing, lack of transportation, and a warming climate—and on ways that these problems can be addressed through technological change.
Academics need to do some strategic rethinking of their own. Universities have issued wild claims about the impact of AI and robots on jobs, often using quantitative methods that are divorced from economic reality. Courses on AI, blockchain, and other new technologies have proliferated, hinting at a conflict of interest between issuing academic projections and earning academic income from courses. Yes, universities have to respond to the market, but they also have the power to influence the market.
Investors and business leaders need to take their social responsibility more seriously. As part of their tech hype, today’s capital pitches and press releases often express a tone of do-goodism, promoting themselves as “agents of change.” It would be better for them to focus on fundamentals like changing market sizes, productivity growth, and job creation, which are what will get us out of trouble if anything will.
And every reader can ask a simple question to avoid being taken in by tech hype: How could this new technology make a positive impact on people’s lives? Consider a boring yet important historical example, machine tools. Emerging in the late 1800s, these tools led to cheaper automobiles, bicycles, construction equipment, and farm equipment, which in turn made food cheaper. Even if they didn’t buy machine tools, average Americans could easily see that their lives were improved through the impact of those tools on productivity. How could cryptocurrencies, NFTs, and the metaverse possibly make the lives of nonusers better? Unfortunately, we authors have trouble finding university professors—even business, economic, and engineering professors—who are asking such simple yet important questions.
Getting back to the fundamentals of technologies and economies will require us to escape our current unproductive bubble of tech hype. Our bet is that, sadly, this process will be painful and will really come about only when the bubble implodes.
This story originally appeared on OpenMind, a digital magazine tackling science controversies and deceptions.