14 December 2022

THE EUREKA MYTH & WHY THE AGE OF AMERICAN PROGRESS ENDED

 ". . .In theory, the values of progress form the core of American national identity. The American dream is meant to represent that exception to the rule of history: Here, we say, things really do get better

For much of the 19th and 20th centuries, they did. Almost every generation of Americans was more productive, wealthier, and longer-lived than the one before it. In the past few decades, however, progress has faltered—and faith in it has curdled. Technological progress has stagnated, especially in the nonvirtual world. So have real incomes. Life expectancy has been falling in recent years. . .

3 light bulbs on black conveyor belt on blue background

WHY THE AGE OF AMERICAN PROGRESS ENDED

Invention alone can’t change the world; what matters is what happens next.

. . The 10,000-year story of human civilization is mostly the story of things not getting better: diseases not being cured, freedoms not being extended, truths not being transmitted, technology not delivering on its promises. Progress is our escape from the status quo of suffering, our ejection seat from history—it is the less common story of how our inventions and institutions reduce disease, poverty, pain, and violence while expanding freedom, happiness, and empowerment.

It’s a story that has almost ground to a halt in the United States.

Brighten their holiday. Enrich their everyday.Give The Atlantic
3 light bulbs on black conveyor belt on blue background

WHY THE AGE OF AMERICAN PROGRESS ENDED

Invention alone can’t change the world; what matters is what happens next.

The Scourge of All Humankind

If you were, for whatever macabre reason, seeking the most catastrophic moment in the history of humankind, you might well settle on this: About 10,000 years ago, as people first began to domesticate animals and farm the land in Mesopotamia, India, and northern Africa, a peculiar virus leaped across the species barrier. Little is known about its early years. But the virus spread and, whether sooner or later, became virulent. It ransacked internal organs before traveling through the blood to the skin, where it erupted in pus-filled lesions. Many of those who survived it were left marked, disfigured, even blind.

As civilizations bloomed across the planet, the virus stalked them like a curse. Some speculate that it swept through ancient Egypt, where its scars appear to mar the mummified body of Pharaoh Ramses V. By the fourth century A.D., it had gained a foothold in China. Christian soldiers spread it through Europe during the 11th- and 12th-century Crusades. In the early 1500s, Spanish and Portuguese conquistadors conveyed it west across the Atlantic, where it ravaged native communities and contributed to the downfall of the Aztec, Mayan, and Inca empires.

By the end of the 1500s, the disease caused by the virus had become one of the most feared in the world. About a third of those who contracted it were dead within weeks. The Chinese called it tianhua, or “heaven’s flowers.” Throughout Europe, it was known as variola, meaning “spotted.” In England, where doctors used the term pox to describe pestilent bumps on the skin, syphilis had already claimed the name “the great pox.” And so this disease took on a diminutive moniker that belied the scale of its wretchedness: smallpox.

Over time, different communities experimented with different cures. Many noticed that survivors earned lifetime immunity from the disease. This discovery was passed down through the generations in Africa and Asia, where local cultures developed a practice that became known as inoculation—from the Latin inoculare, meaning “to graft.” In most cases, people would stick a sharp instrument into a smallpox-infected pustule to collect just a little material from the disease. Then they would stick the same blade, wet with infection, into the skin of a healthy individual. Inoculation often worked—pustules would form at the injection site, and a low-grade version of the disease would typically follow. But the intervention was terribly flawed; it killed about one in every 50 patients.

Not until the early 1700s did a chance encounter in the Ottoman Empire bring the process to Britain, and bend the axis of history. In 1717, Lady Mary Wortley Montagu, an English aristocrat living in Constantinople with her husband, a diplomat, heard about inoculation from her acquaintances in the Ottoman court. Circassian women, from the Caucasus Mountains and in great demand for the Turkish sultan’s harem, were inoculated as children in parts of their bodies where scars would not easily be seen. Lady Montagu asked the embassy surgeon to perform the procedure on her son—and upon her return to London a few years later, on her young daughter.

Word spread from court physicians to members of the College of Physicians to doctors across the continent. Within a few years, inoculation had become widespread in Europe. But many people still died of smallpox after being deliberately infected, and in some cases inoculation transmitted other diseases, like syphilis or tuberculosis.

One boy who went through the ordeal of inoculation was Edward Jenner, the son of a vicar in Gloucestershire, England. He trained as a physician in the late 1700s, and carried out these rough smallpox inoculations regularly. But Jenner also sought a better cure. He was taken by a theory that a disease among cows could provide cross-immunity to smallpox.


In the spring of 1796, Jenner was approached by a dairymaid, Sarah Nelmes, who complained of a rash on her hand. She told Jenner that one of her cows, named Blossom, had recently suffered from cowpox. Jenner suspected that her blister might give him the opportunity to test whether cowpox was humanity’s long-awaited cure.

May 14, 1796, was a golden day in the history of science but a terrifying one for a certain 8-year-old boy. Jenner drew a blade, slick with ooze from a cowpox blister, across the arm of James Phipps, the brave and healthy son of his gardener.

After a week, young James developed a headache, lost his appetite, and came down with chills. When the boy had recovered, Jenner returned with a new blade—this one coated with the microbial matter of the smallpox virus. He cut the boy with the infected lancet. Nothing happened. The boy had been immunized against smallpox without encountering the disease.

Jenner would go down in history as the person who invented and administered a medical cure for one of the deadliest viruses in world history. Then he invented something else: a new word, from the Latin for “cow,” that would be carried down through the centuries alongside his scientific breakthrough. He called his wondrous invention a vaccine.

The Eureka Myth

Let’s pause the story here. Jenner’s eureka moment is world-famous: cherished by scientists, rhapsodized by historians, and even captured in oil paintings that hang in European museums.

For many, progress is essentially a timeline of the breakthroughs made by extraordinary individuals like Jenner. Our mythology of science and technology treats the moment of discovery or invention as a sacred scene. In school, students memorize the dates of major inventions, along with the names of the people who made them—Edison, light bulb, 1879; Wright brothers, airplane, 1903. The great discoverers—Franklin, Bell, Curie, Tesla—get best-selling biographies, and millions of people know their names.

This is the eureka theory of history. And for years, it’s been the story I’ve read and told. Inventors and their creations are the stars of my favorite books about scientific history, including The Discoverers, by Daniel Boorstin, and They Made America, by Harold Evans. I’ve written long features for this magazine holding up invention as the great lost art of American technology and the fulcrum of human progress.

But in the past few years, I’ve come to think that this approach to history is wrong. Inventions do matter greatly to progress, of course. But too often, when we isolate these famous eureka moments, we leave out the most important chapters of the story—the ones that follow the initial lightning bolt of discovery. Consider the actual scale of Edward Jenner’s accomplishment the day he pricked James Phipps in 1796. Exactly one person had been vaccinated in a world of roughly 1 billion people, leaving 99.9999999 percent of the human population unaffected. When a good idea is born, or when the first prototype of an invention is created, we should celebrate its potential to change the world. But progress is as much about implementation as it is about invention. The way individuals and institutions take an idea from one to 1 billion is the story of how the world really changes.

And it doesn’t always change, even after a truly brilliant discovery. The 10,000-year story of human civilization is mostly the story of things not getting better: diseases not being cured, freedoms not being extended, truths not being transmitted, technology not delivering on its promises. Progress is our escape from the status quo of suffering, our ejection seat from history—it is the less common story of how our inventions and institutions reduce disease, poverty, pain, and violence while expanding freedom, happiness, and empowerment.
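(A quick check of that figure, taking the world population as exactly one billion:

\[
1 - \frac{1}{10^{9}} = 0.999999999 = 99.9999999\%
\]

One vaccinated person out of a billion leaves precisely that share of humanity untouched.)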

It’s a story that has almost ground to a halt in the United States.

In theory, the values of progress form the core of American national identity. The American dream is meant to represent that exception to the rule of history: Here, we say, things really do get better. For much of the 19th and 20th centuries, they did. Almost every generation of Americans was more productive, wealthier, and longer-lived than the one before it. In the past few decades, however, progress has faltered—and faith in it has curdled. Technological progress has stagnated, especially in the nonvirtual world. So have real incomes. Life expectancy has been falling in recent years.

What went wrong? There are many answers, but one is that we have become too enthralled by the eureka myth and, more to the point, too inattentive to all the things that must follow a eureka moment. The U.S. has more Nobel Prizes for science than the U.K., Germany, France, Japan, Canada, and Austria combined. But if there were a Nobel Prize for the deployment and widespread adoption of technology—even technology that we invented, even technology that’s not so new anymore—our legacy wouldn’t be so sterling. Americans invented the first nuclear reactor, the solar cell, and the microchip, but today, we’re well behind a variety of European and Asian countries in deploying and improving these technologies. We were home to some of the world’s first subway systems, but our average cost per mile for tunnel projects today is the highest in the world. The U.S. did more than any other nation to advance the production of the mRNA vaccines against COVID-19, but also leads the developed world in vaccine refusal.

. . . One regrettable feature of history is that it sometimes takes a catastrophe to fast-forward progress. The U.S. directly advanced airplane technology during World War I; radar, penicillin manufacturing, and nuclear technology during World War II; the internet and GPS during the Cold War; and mRNA technology during the pandemic. A crisis is a focusing mechanism. But it is up to us to decide what counts as a crisis. The U.S. could announce a Warp Speed for heart disease tomorrow, on the theory that the leading cause of death in America is a national crisis. We could announce a full emergency review of federal and local permitting rules for clean-energy construction, with the rationale that climate change is a crisis. Just as it did in the ’60s with smallpox, the U.S. could decide that a major disease in developing countries, such as malaria, deserves a concerted global coalition. Even in times without world wars and pandemics, crises abound. Turning them into national priorities is, and has always been, a political determination.

A Question of Culture

Operation Warp Speed was ingenious, admirable, and wildly successful. But despite all that, it was not enough.

Having overcome the hurdles of scientific breakthrough, technological invention, and rapid distribution, the mRNA vaccines faced a final obstacle: cultural acceptance. And the skepticism of tens of millions of American adults proved too much for the vaccines to overcome. This is the third lesson of the smallpox story—culture is the true last-mile problem of progress. It doesn’t matter what you discover or invent if people are unwilling to accept it.

In 2021, the U.S. took an early global lead in vaccine distribution, thanks to the accelerated development of vaccines under President Donald Trump and their timely delivery under President Joe Biden. By April, we had distributed more shots per capita than almost any other country in the world. But by September, according to one estimate, the U.S. had fallen to 36th in national vaccination rates, behind Mongolia and Ecuador. The problem wasn’t supply, but demand. Tens of millions of American adults simply refused a free and effective vaccine in the middle of a pandemic.

Michael Bang Petersen, a Danish researcher who led a survey of attitudes in Western democracies about COVID-19, told me that America’s history of vaccine skepticism—and of conspiracy theories surrounding vaccines—of course predates the coronavirus pandemic. And although American vaccine resistance has several sources, including the cost of some vaccines and our legacy of medical racism, Petersen told me that one of the most important factors today is “the level of polarization between Democratic and Republican elites.” Vaccine rejection remains higher among Republican adults than any other measured demographic, including age, education level, gender, and ethnicity.

In the 19th century, state and church leaders across Europe and the Americas typically praised the smallpox vaccine in unison. But in the 21st century, a dwindling number of subjects enjoy such universal elite endorsement. Despite the historical assumption that moments of tragedy bring a country together, the pandemic efficiently sorted Americans into opposing camps—for and against lockdowns, for and against vaccines. Nearly 90 percent of Americans told the Pew Research Center that the pandemic has made the country more divided.

Americans are deeply polarized; that much is obvious. Less obvious, and more important for our purposes, is how polarization might complicate material progress today. One big problem the country faces is that as coastal, educated elites have come to largely identify as Democrats, Republicans have come to feel ignored or condescended to by the institutions populated by the former group. As if recoiling from the rise of a liberal scientific and managerial class, the GOP has become almost proudly anti-expertise, anti-science, and anti-establishment. Cranks and conspiracy theorists have gained prominence in the party. It is hard to imagine scientific institutions flourishing within right-wing governments averse to both science and institutions. But this is only part of the problem, culturally speaking.

The other part is that some Democrats—many of whom call themselves progressives—have in meaningful ways become anti-progress, at least where material improvement is concerned. Progress depends on a society’s ability to build what it knows. But very often, it’s progressives who stand against building what we’ve already invented, including relatively ancient technology like nuclear power or even apartment buildings. Cities and states run by Democrats have erected so many barriers to construction that blue metro areas are now where the housing crisis is worst. The five states with the highest rates of homelessness are New York, Hawaii, California, Oregon, and Washington; all are run by Democrats. Meanwhile, it is often left-leaning environmentalist groups that use onerous rules to delay the construction of wind and solar farms that would reduce our dependency on oil and gas. The left owns all the backpack pins denouncing the oil industry, but Texas produces more renewable energy than deep-blue California, and Oklahoma and Iowa produce more renewable energy than New York.

One possible explanation is that progressives have become too focused on what are essentially negative prescriptions for improving the world, including an emphasis on preservation and sacrifice (“reduce, reuse, recycle”) over growth (“build, build, build”). At the extreme, this ascetic style leads to calls for permanent declines in modern living standards, a philosophy known as “degrowtherism.” The aim is noble: to save our descendants from climate change by flying less, traveling less, buying less, and using less. But it is a profound departure from progressivism’s history, which is one of optimism about the ability of society to improve lives on a big scale through bold action. It’s self-defeating to tell voters: “My opponent wants to raise your living standards, but I promise I won’t let that happen.” It’s far better—and, arguably, more realistic—to tell voters that building more renewable power is a win-win that will make energy cheaper and more abundant.

When you add the anti-science bias of the Republican Party to the anti-build skepticism of liberal urbanites and the environmentalist left, the U.S. seems to have accidentally assembled a kind of bipartisan coalition against some of the most important drivers of human progress. To correct this, we need more than improvements in our laws and rules; we need a new culture of progress.

The Trust Gap

A famous theme in American history is adaptability, and justifiably so. When something isn’t working, we’ve typically been game to try something new. In the summer of 2022, Biden signed a series of laws, including the CHIPS and Science Act and the Inflation Reduction Act, that included hundreds of billions of dollars for building microchips, solar panels, electric cars, and infrastructure, green and otherwise. In an address touting this approach, Treasury Secretary Janet Yellen branded it “modern supply-side economics.” Contrasted with the Reagan-era phrase, which referred to cutting taxes to stimulate the economy, her speech focused more on direct investments in American manufacturing and improving America’s ability to build what it invents. In October, Brian Deese, a senior adviser to Biden, announced the administration’s plans to deliver a modern industrial strategy that would help “spur mature technologies to deploy more quickly [and] pull emerging innovations to market faster.”

No one can say for sure how well Biden’s specific plans will work—and a decade from now, critics will undoubtedly find particular initiatives that failed or wasted money. Still, we might be moving from the eureka theory of progress to an abundance theory of progress, which focuses on making our best ideas affordable and available to everyone. Overall, this new direction of federal policy seems promising.

Still, it doesn’t solve the problem of cultural unreadiness for progress, a problem that afflicts the left and right differently, but that ultimately comes down to trust. Every form of institutional trust is in free fall. Fewer than half of Republicans say they have faith in higher education, big businesses, tech firms, media, the entertainment industry, and unions. Among Democrats, too, confidence in government has declined. Why is social trust so important to progress? In a country where people don’t trust the government to be honest, or businesses to be ethical, or members of the opposite party to respect the rule of law, it is hard to build anything quickly and effectively—or, for that matter, anything that lasts.

One of the most important differences between invention and implementation is that the former typically takes place in private while the latter is necessarily public. The first practical silicon-solar-cell technology was developed in a corporate lab in New Jersey. Building a solar farm to generate electricity requires the sustained approval of officials and local residents—in other words, it requires people to genuinely believe that they will benefit, at least collectively, from changes to their lived environment.

I want to tell you that there is a simple agenda for restoring trust in America, but I don’t think I can do that. When discussing barriers to the construction of nuclear-power plants or the pace of drug development, one can play the part of a bottleneck detective—identifying obstacles to progress and working to overcome them through clever policy tweaks. But Americans’ growing mistrust of institutions and one another is rooted in the deepest hollows of society: in geographical sorting that physically separates liberals and conservatives; in our ability to find ideological “news” that flatters our sensibilities but inhibits compromise.

In 2022, the medical journal The Lancet published an analysis of which variables best predicted the rates of COVID infection across 177 countries. Outside wealth, one of the most powerful variables was trust in government among the public. “Trust is a shared resource that enables networks of people to do collectively what individual actors cannot,” the authors of the Lancet paper wrote. When I first read their definition, I stared at it for a while, feeling the shock of recognition. I thought of how much that could serve as a definition of progress as well: a network of people doing collectively what individual actors cannot. The stories of global progress tend to be the rare examples where science, technology, politics, and culture align. When we see the full ensemble drama of progress, we realize just how many different people, skills, and roles are necessary.

The last needle to be applied against smallpox, before its eradication almost half a century ago, carried a dose of vaccine smaller than a child’s pupil. Four hundred years fit inside that droplet. The devotion of D. A. Henderson’s disease-eradicating team was in it. So were the contributions of Benjamin Rubin and the Spanish boys, as well as the advocacy of Henry Cline and the discovery by Edward Jenner, and before him the evangelism of Lady Montagu, and the influence of Circassian traders from the Caucasus Mountains, who first brought the practice of inoculation to the Ottoman court. An assembly line of discovery, invention, deployment, and trust wound its way through centuries and landed at the tip of a needle. Perhaps there is our final lesson, the one most worth carrying forward. It takes one hero to make a great story, but progress is the story of us all.


This article appears in the January/February 2023 print edition with the headline “The Eureka Theory of History Is Wrong.”
