The Next Age of Invention


For City Journal, Joel Mokyr writes: The statement “everything that could be invented has been invented” is frequently misattributed to the late-nineteenth-century American patent commissioner Charles Holland Duell. The Economist once credited him with the remark, and sites such as “kool kwotes” still reproduce it. In fact, Duell believed the opposite. “In my opinion,” he wrote at the turn of the century, “all previous advances in the various lines of invention will appear totally insignificant when compared with those which the present century will witness. I almost wish that I might live my life over again to see the wonders which are at the threshold.” While this prediction turned out to be on the money, the belief that “the end of invention” is near is very much alive in our age, despite ample evidence of accelerating technological progress.

“Most states today realize that peaceful interstate competition in the marketplace requires staying current with the most advanced technology—but terrorists and rogue states want to stay current, too, for very different reasons…”

Pessimism is most prevalent among economists such as Northwestern University professor Robert J. Gordon, who expects growth to slow to a small fraction of what it was in the past. Gordon predicts that the disposable income of the bottom 99 percent of Americans will grow at just 0.2 percent per year—one-tenth the average rate of U.S. economic growth in the twentieth century. Innovation, he maintains, will not be enough to offset the headwinds that will buffet Western industrialized economies in the next half-century—aging populations, declining educational achievement, and rising inequality. And he is not alone in this dismal view. In The End of Science, published in 1996, journalist John Horgan declared that “the modern era of rapid scientific and technological progress appears to be not a permanent feature of reality, but an aberration, a fluke. . . . Science is unlikely to make any significant additions to the knowledge it has already generated. There will be no revelations in the future comparable to those bestowed upon us by Darwin or Einstein or Watson and Crick.”
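To put Gordon’s figure in perspective, here is a rough back-of-the-envelope comparison of what compounding at 0.2 percent versus a twentieth-century-style rate would mean over a lifetime. This is a minimal sketch, not a calculation from the article: the 2 percent benchmark and the 50-year horizon are illustrative assumptions chosen only to make the gap concrete.

```python
# Illustrative comparison of cumulative income growth at Gordon's projected
# 0.2% per year versus an assumed ~2% twentieth-century-style average,
# over an arbitrary 50-year horizon (both assumptions, not article figures).

def compound(rate: float, years: int) -> float:
    """Cumulative growth factor after `years` of annual growth at `rate`."""
    return (1 + rate) ** years

for rate in (0.002, 0.02):
    factor = compound(rate, 50)
    print(f"{rate:.1%} per year for 50 years -> incomes grow about {factor:.2f}x")

# Approximate output:
#   0.2% per year for 50 years -> incomes grow about 1.11x
#   2.0% per year for 50 years -> incomes grow about 2.69x
```

Under these assumptions, half a century of growth would raise incomes by roughly a tenth rather than more than doubling them, which is the scale of the slowdown the pessimists have in mind.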

“The argument that ‘if we don’t do this, someone else will’ should prove more powerful than the concerns of groups that regard a new technology with suspicion.”

Certainly, it is difficult to know exactly in which direction technological change will move and how significant it will be. Much as in evolutionary biology, all we know is history. Yet something can be learned from the past, and it tells us that such pessimism is mistaken. The future of technology is likely to be bright.

The first thing to note is that the twentieth century experienced probably as many headwinds, albeit of a different kind, as Gordon foresees for the twenty-first. Industrialized nations fought two massive world wars and experienced the Great Depression, the Cold War, and the rise of totalitarian regimes in much of Europe and Asia. In the past, such catastrophes might have been enough to set economies back for hundreds of years or even to condemn entire societies to stagnation or barbarism. Yet none of them could stop the power of ever-faster innovation in the twentieth century to stimulate rapid growth in much of the industrialized and industrializing world.

Keep in mind, too, that economic growth, measured as the growth of income per capita (corrected for inflation), is not the best measure of what technological change does. True, technology increases productivity by making it possible to produce goods and services more efficiently (at lower cost). But much of what it does is to put on the market new products (or vastly improved ones) that may be quite inexpensive relative to their benefits. Many of the most important inventions of the late nineteenth and twentieth centuries are things that we would not want to do without today; yet they had little effect on the national accounts because they were so inexpensive: aspirin, lightbulbs, water chlorination, bicycles, lithium batteries, wheeled suitcases, contact lenses, digital music, and more.

Further, our outdated conventions of national income accounting fail to capture fully the many ways in which technology can transform human life for the better. For instance, national income calculations do not count “leisure” as a valuable good. People who are not working are not producing, and this is simply “bad,” in Gordon’s view, because they are not adding to economic output. But it may well be that a leisurely life is the best “monopoly profit,” as Nobel Prize winner John Hicks noted as early as 1935. And thanks to new technology, leisure—even involuntary leisure such as unemployment—can be more enjoyable than ever before. At little cost, anyone can now watch a bewildering array of sports events, movies, and operas from the comfort and safety of a living room on a high-definition flat-screen TV. If the technology of the twentieth century did anything, it vastly augmented our ability to have a good time when we are not working. Yet, while the average individual in an industrialized country nowadays has far more leisure hours and many more enjoyable things in his or her life than the typical person did a century ago, such things hardly show up in the national income statistics.

Many pessimists make their predictions by extrapolating from current technological trends. Jan Vijg, a distinguished geneticist with an interest in history, is disappointed that airplanes do not fly any faster than they did 50 years ago and that the basic design of our automobiles has not changed. He notes that improvements still take place but tend to be marginal and subject to diminishing returns. Some of the most convenient devices that appeared in the twentieth century, such as air-conditioning and antibiotics, are already in use everywhere. Maybe the low-hanging technological fruit has all been picked, as George Mason University economist Tyler Cowen has said, and there is little left to invent.

But consider those things that ignited rapid progress in the past. The technological historian Derek Price has emphasized that the tools that technology makes available to science help determine the rate of progress. Typically, we tend to see scientific discoveries as a causal factor in technology: as physics and chemistry improve, inventors can design new products and materials. But the reverse is equally true: as scientists get better tools (made, say, by instrument makers and lens grinders), they can advance knowledge, which in turn leads to technological progress. This creates a virtuous circle that has been responsible in the past for the miraculous, technology-driven events that created modern economies. It is not easy to pinpoint when that virtuous circle started, but one salient event occurred in the seventeenth century, when microscopes and telescopes first emerged and enabled scientists to see what no human had ever seen. The development of the barometer led to the discovery of the atmosphere, soon to be followed by steam (that is, atmospheric) engines. The process accelerated after 1750. Another example: the greatest breakthrough in nineteenth-century medicine—the discovery of the germ theory of disease—was made possible by improved microscopes, which reduced optical aberration. Modern economic growth would surely have fizzled out had it not been for the way science and technology reinforced each other.

If this historical model holds some truth, the best may still be to come for modern societies. Only in recent decades has science learned to use high-powered computing and the storage of massive amounts of searchable (and thus accessible) data at negligible costs. The vast array of instruments and machines that can see, analyze, and manipulate entities at the sub-cellular and sub-molecular level promises advances in areas that can be predicted only vaguely. But these tools, to beat Cowen’s metaphor into the ground, allow us to build taller ladders to pick higher-hanging fruit. We can also plant new trees that will grow fruits that no one today can imagine.

A second reason technological progress will continue unabated has to do with the emergence of a competitive global marketplace, which will encourage the spread of new technology from its originating locations to other users who do not wish to be left behind…

City Journal Winter 2014

Joel Mokyr is a professor of economics and history at Northwestern University and author, most recently, of The Enlightened Economy: Britain and the Industrial Revolution.

