This article first appeared in the Brown Journal of World Affairs, Volume XXVII, Issue II, Spring/Summer 2021.
In March 1985, on winter break from graduate studies in Britain, I visited my first illiberal and non-democratic country. It was a fortuitous moment to be dropping in on the mothership of the alternative system, the Soviet Union, that had provided decades of dystopian fodder for the likes of Orwell, Solzhenitsyn, Koestler, and Kundera.
Mikhail Gorbachev had been named General Secretary of the Communist Party two weeks earlier. His photo, with the famous port-wine birthmark on his forehead, adorned the cover of the Time magazine I picked up at Heathrow airport. Although some of our reading material was confiscated on arrival, the Gorby cover slipped through. A couple of evenings in, a group of us gathered outside Leningrad’s Winter Palace, once the home of the czars and a key venue in the 1917 Bolshevik Revolution. We met three young Russians. It was my birthday. They invited us back to their apartment to listen to “underground” music, drink vodka, and talk.
Our host, a mournful young man named Aleksander, yearned to be a graphic artist—but that was not what the totalitarian state had in mind for him, he told us. The most interesting jobs were reserved for the children of the nomenklatura, the party elite. I showed Aleksander and his friends the Time magazine cover. They had never seen the Gorbachev birthmark and didn’t believe it was real at first. Who was manipulating the truth for effect, airbrushing in or out the meaningless scar? Eventually, they agreed it was more likely the propagandists on their side. They envied our system, which to them was about the freedom to be informed and to make choices for oneself, including simple and highly personal decisions, like becoming a graphic artist. To this day, I keep a notebook with a list of songs Aleksander played for us that night, written in his beautiful Cyrillic script.
I have always felt an uneasy kinship with Eastern Europe. My family came to Canada from various corners of Lithuania, Romania, Poland, and the Soviet Union. My maternal grandparents, with whom we shared a house for the first six years of my life, arrived in 1929 from eastern Ukraine with four kids in tow and my mother to come. We were even more fortunate than they could have imagined. Shortly after they traded in the dictatorship of the proletariat for a shot at a better life abroad, Stalin hardened the borders getting out of the Soviet Union, and Depression-ravaged North America hardened the borders getting in.
In the old country, peril came packaged in sizes large and small. One of my grandfather’s brothers had gone to dental school. As a graduation present, local thugs tied him to the back of a horse and dragged him through the streets of their small town. He died before he could ever hang his diploma. My grandparents—and those of us descended from them—escaped the most calamitous years of what Yale University historian Timothy Snyder came to dub “The Bloodlands.” On 6 July 1941, the Nazis arrived in the town my family had left behind a dozen years earlier. Three weeks later, their first massacre of Jews began.
My mother’s first cousin, Isaac, survived by joining the Red Army. Isaac’s unit fought in Germany in 1945. I met him in Israel four decades later. A chemical engineer, he rose up through the ranks of his plant and sat on the Riga city council. He told me that he had become a communist at the outbreak of “the Great Patriotic War.” He stopped believing in communism in 1942, after the political commissar in his army unit wanted him to rat out comrades. “But you were a communist for decades afterward. You even sat on the Riga city council,” I protested. “Ah,” nodded Isaac, who had lived experience with inhumanity and emerged, like so many Eastern Europeans, fatalistic and sardonic. “It was kind of like the Mafia—once you got in, it was very hard to get out.”
In late 1987, two years after the conversations with Aleksander and Isaac, my employer, Canada’s The Globe and Mail, asked me to return to Europe as a foreign correspondent, the junior reporter in a two-person traveling bureau based in London. I spent most of 1989 to 1991 shuttling to political hotspots— Poland, where the Solidarity trade union was negotiating a power-sharing arrangement; Hungary, where the crushed dreams of the failed 1956 revolution were stirring once again; East Germany, where the Berlin Wall had become the most potent symbol of the world’s divisions: a metaphor from afar, a concrete prison enclosure to those within.
After the wall fell, I became fixated on a nearby white cross bearing the name “Chris Gueffroy.” A 20-year-old waiter, Gueffroy was the last of 192 people killed by East German border guards while trying to escape to freedom—shot in the heart just nine months before this particular manifestation of illiberalism was toppled.
And then there was Romania. A few days before Christmas that year, I received a call from my editor in Toronto dispatching me to cover a simmering revolt against long-time dictator Nicolae Ceausescu. I had been in godforsaken Romania in March 1988, curious about life inside arguably the most repressive and creepy country of the Eastern bloc. I had been struck by how thoroughly civil society had been crushed. People believed that one-third of their neighbors worked a side hustle for the secret police, the Securitate. Unsure which third, they kept to themselves. Electricity and hot water were in short supply, as was food, which the government preferred to peddle abroad for hard currency.
People short of sustenance and unsure of their ability to bathe shied away from exercise. Ceausescu, choosing to measure his grandeur by population growth, had banned contraception and abortion. Many Romanians refrained from sex as well for fear of subjecting children to hunger; others gave up their children to squalid orphanages. Social capital had been thoroughly depleted.
I teamed up with a Reuters correspondent I met on the flight to Belgrade. Driving into Timisoara, where the revolution began, we were stopped every hundred yards or so at makeshift checkpoints by buoyant residents armed with ax handles, broomsticks, and the odd knife or hunting rifle. It struck me that civil society was being reconstituted right there at the barricades—as inspiring a sight as I could imagine.
At one post, we were informed of fighting ahead and told that we would be unable to make it to the hotel downtown. A teenage boy named Adrian piped up in French with an offer to let us stay overnight at his family’s spartan flat. He was exultant, shaking his head over how, a few days earlier, he would have been arrested for talking to us, and now we were accompanying him home. Adrian reminded me of Aleksander in Leningrad. His revolution was very personal. He dreamed of seeing the world. He wanted the state to get out of his way so he could enjoy the freedom to travel. He particularly longed to visit France.
The rise of liberalism became very personal for me, too. The borders—both physical and political—that had impeded Adrian, Aleksander, and others crumbled, opening new frontiers to hundreds of millions. Arbitrary measures would be circumscribed by transparent systems of law. One-party rule would give way to competitive elections and an independent judiciary. Sloganeering and propagandist state media organs would be jostled aside by the entrepreneurial energies of a free press.
As a journalist in the region while the wave of liberalization came to fruition, I can remember being struck by the deep hunger I continually encountered for reliable and trustworthy information with which people could inform their futures. They were determined to make up for lost time by quickly creating a free-wheeling press and an invigorated political discourse. For a while, the precepts of a healthy liberal democracy flourished. Eventually, an equal and opposite authoritarian reaction materialized in places such as Russia, Hungary, and Poland, and ultimately seeped into Western democracies as well. Today, illiberalism has made a comeback, aided and abetted by a powerful digital information system that has delivered not only the freedoms individuals craved, but also reams of newly styled misinformation.
It is far too early to know how the emergent clash between the forces of change and forces of resistance will play itself out. But following the early trail of breadcrumbs takes us in some discouraging directions. Large swathes of the body politic feel disrupted beyond their coping capacity by a more aggressive post-1990 globalization, the rise of China, and the exploitation of the Internet by those either hostile to facts and truth or indifferent to them. The resulting turbulence has sparked a classic struggle between those embracing the potentialities of change and those desperate to ward off its destabilizing nature.
In early 1991, I recall someone remarking that the twentieth century would be remembered for three years above all: 1914, 1945, and 1989. Throughout the Soviet bloc, there was a bounce in the steps of the “Adrians” and “Aleksanders” and those young people who, unlike Chris Gueffroy, had outlasted the Iron Curtain. Liberalism gathered momentum with Nelson Mandela’s release from prison in February 1990, along with subsequent peace talks in the Middle East and Ireland. The West was set to reap dividends, too, with Cold War missiles beaten into ploughshares and defense budgets freed up for the likes of retraining programs. The End of History was nigh.
But 1989 didn’t live up to its potential. What a friend in Berlin once described to me as die Mauer im Kopf—the wall of the mind—is back.
Confidence in liberalism’s ability to deliver stability and security has eroded, not just in some rotten boroughs of the old Soviet empire, but in places as far-flung as Turkey, India, China, Venezuela, the Philippines, Brazil and, to a worrying extent, even in democracies such as liberalism’s shining city on the hill, the United States of America.
As it turns out, three decades on, we need to talk anew about liberalism and illiberalism: how we got from there to here, the role of economics, geopolitics, and—particularly for me—the great new information innovations of my lifetime with their mixed blessing of more free expression and less common purpose.
We find ourselves in 2021, caught in an epic contest between borders and frontiers, often physical, sometimes legal, always digital. Borders exist to encumber change: new ideas, new people, new identities, new ways of doing things. Populists patrol these borders in the name of those fearful of being left behind. They appeal to an old-fashioned sense of solidarity built on ethnonational and religious identity. They double down on traditional values in the face of elite metropolitanism and the “otherness” of immigrants. Hungary’s governing Fidesz party has pushed through multiple constitutional changes entrenching the role of Christianity in preserving nationhood, the latest of which, in December 2020, requires children to be raised with a Christian interpretation of gender roles.
Borders are fixed edifices of thought that favor the known over the unknown, social stability over social mobility, and the familiarity of the present over the perceived risks of shooting for a better tomorrow.
In contrast, frontiers are the redoubts of society’s adventurers, those jazzed by new possibilities. They embrace the inherent good of science and technology, a free press, and diverse points of view. The people of the frontier revel in a never-ending symphony of inquiry, knowing truth can only be approached, never truly attained. This is a critical differentiator: border people believe truth is in plain sight and that answers are not just knowable, they are known.
Liberalism is the antithesis of tribalism—or at least it is meant to be. Its embrace of excellence leaves it generally open-minded about identity, whether national origin, ethnicity, color, sexual orientation, or gender expression. Its sense of inclusion has been less robust, however, when it comes to religion and region. If you want to reduce voting behavior to a single variable, place is a good bet. In smaller U.S. counties, with fewer opportunities and less diversity, Donald Trump attracted two-thirds of votes; in the largest, home to Richard Florida’s creative cities, just one-third went his way.
The pandemic has provided succor to the keepers of borders. There is much talk about measures to produce greater national self-sufficiency and of further restrictions on the movements of people in order to hold the globalization of germs at bay (conveniently forgetting that the Black Death killed as many as 200 million people across pre-jet-travel fourteenth-century Eurasia).
More convincing to me, though, is how the COVID-19 pandemic has litigated the case for the frontier. A liberal friend wrote me recently “that the very things so many people thought needed ‘resetting’ and ‘reevaluating’ at the beginning of the pandemic—globalization, free markets, supply chains, corporate innovation, technology—are the very things that got us through the crisis, generated a vaccine, and may provide us the tools to fight climate change.” We will see how this plays out.
My favorite pandemic testimony to frontier greatness concerns the German biotech firm BioNTech, which teamed up with the global pharmaceutical giant Pfizer to develop the world’s first approved COVID-19 vaccine. BioNTech is the product of a first-generation German husband-and-wife team of Turkish origin. Turks in Germany have struggled for decades to overcome borders to citizenship and equal opportunity. The BioNTech founders are frontier standard-bearers, plain and simple. They have re-validated that throwing down the welcome mat to others actually enriches nations and humanity.
Of course, the real world is not inhabited at the far ends of competing poles of thought. Borders cannot exist without frontiers, and frontiers cannot thrive without borders. Freedom without order is anarchy. Order without freedom is fascism. The success of liberalism—particularly democratic liberalism—requires that a delicate balance is always maintained, that everyone has cause to feel that opportunities are available to them, and that the state will extend a helping hand should they stumble. When this balance falls out of sync, we end up where we find ourselves today: with progress under attack by those who perceive, often with reasonable cause, that they are being left behind. The political arbitrageurs of anxiety and resentment—the Donald Trumps and Viktor Orbáns—are always at the ready to harvest these fears and twist emotions to serve their own purposes.
To address the rise of illiberal democracy, we need to understand the mindset of the places where the border looms large—and tend, as much as possible, to the underlying grievances. First, though, we have to figure out how we got from 1989 to today.
The “End of History” was a famous—perhaps infamous—essay (later turned into a book, The End of History and the Last Man) written as communism was collapsing. In it, U.S. political scientist Francis Fukuyama postulated that Western liberal democracy had emerged as the ultimate form of human government. He did not imagine a straight line to Utopia—in fact, history was not ending so much as making clear its endpoint. There would be plenty of bumps along the way.
Fukuyama has been cursed ever since with having to explain the nuance, and to issue a mea culpa or two. After the election of Donald Trump and the Brexit vote, he admitted to his concerns about resurgent populism and a post-fact political order. “Twenty-five years ago,” he told The Washington Post in 2017, “I didn’t have a sense of a theory about how democracies can go backward.”
To be fair, the triumphalism that flowed from 1989–90 was generations in the making, and the case for democratic success looked convincing. Some over-exuberance can be forgiven. In retrospect, though, the seeds of today’s illiberalism were being planted even as celebrations went on. For Western democracies, three developments stand out in weakening the liberal project.
Firstly, globalization rewrote its rulebook with the Uruguay Round of trade talks in the years bracketing the putative end of history. Ultimately, a new and improved World Trade Organization (WTO) subsumed the long-serving yet less ambitious General Agreement on Tariffs and Trade (GATT). The new rules reflected the rise of the services economy and the Thatcher-Reagan mindset of placing enhanced power in the hands of corporations. This, in turn, hobbled the ability of nation states to constrain that marketplace. In the words of Harvard economist Dani Rodrik, globalization was invited to reach behind national borders in a departure from the post-war Bretton Woods border-to-border arrangement.
Secondly, China’s spectacular rise, including its acceptance into the WTO in 2001, challenged the notion of a unipolar world order almost as soon as it gained a foothold. China’s initial drive to become the “factory to the world” drew manufacturing investment and jobs away from wealthier Western countries. Global companies embraced the gospel of efficiency, leaving behind rustbelts in place of middle-class, blue-collar communities.
Thirdly, Tim Berners-Lee unveiled the World Wide Web in 1990—ushering in an organized modern Internet that quickly went mainstream, becoming the communications tool of choice for billions and connecting them to one another as never before. Instant and universal communication technologies insinuated themselves into every nook and cranny of economic, political, and social life. The impacts were often positive, sometimes negative, and always disruptive.
The first two of these three mega-changes helped ignite populism. The third acted as an unprecedented accelerant. Government was already suffering a loss of public esteem before the Cold War was won, but its best arguments—upholding national security and providing reformist alternatives to communism—were further diminished. Now, a more laissez-faire market and a libertarian communications system eroded the relevance of the public sector even more. For the private sector, good citizenship became just a cost of doing business, inequality rose, and globalized business leaders became increasingly detached from the communities that once grounded them. Eventually, this brought us to the calamity of the Great Recession, which saw the machinations of Wall Street lay Main Street low.
All of this happened in the context of an information and communications technology revolution with the Internet at its core. In the 1990s, conventional wisdom had it that people no longer needed governments to mediate their interactions and exchanges or even to promote growth. The market could make do just fine. Where once the movement of money was tied to gold reserves held by governments, now it moved across the globe with a simple click. Diplomats came to be seen as just another set of increasingly redundant intermediaries. Google knew where infections were breaking out before public health officials, as the new-style platform companies enjoyed access to far more data than governments could ever have imagined.
The most monumental technological change in centuries had offered up a competing means of organizing governance and entirely new categories of commerce. In time, Main Street’s long and troubled relationship with Wall Street would be superseded by new data-driven interlopers such as Amazon. At first, though, things looked rosy.
By the late 1990s, I had become excited by the historic opportunity of the Internet. Back in Canada for a number of years, I was helping manage The Globe and Mail newsroom through an intense newspaper war against a conservative upstart called the National Post. Catching my breath on a family holiday in the summer of 1999, I couldn’t help but wonder whether we were focused on the wrong danger. I told my publisher I feared that, over the next few years, we would discover that the Internet was our existential threat. The Globe didn’t even have an online news site. He asked that I stick with the newspaper war, but eventually gave me the green light to launch globeandmail.com.
I viewed the Internet as the second coming of the printing press, a worthy democratizing successor to the communications technology that had broken the Catholic Church’s monopoly over knowledge and set the West on a 500-year course of ever-greater liberal thought and action. As a consumer of news, I had been gifted with greater choice than ever. Communications no longer lived on a one-way street (producer to reader, listener, or viewer). People spoke back and spoke with one another. The news media business, built on a model of consumer markets protected by geography and advertisers with no better means to reach those markets, did not yet comprehend the imminent peril. In this early part of the century, the Internet looked less like an alternative system than an extension of the old one. The takeover of the media terrain by user-generated content, with all of its consequences, was still to come.
A journalistic-driven free press had provided a critical twentieth-century check and balance in our politics. In democracies, the media was frequently criticized from the left and right for being too closely attached to the overall system and its elite worldview. But it often acted independently within the confines of that system, holding to account those abusing their powers and privileges. In 1974, it helped bring down Watergate-era President Richard Nixon. Its sensibilities were liberal and democratic. There is a reason why authoritarians turn early in their ascent to the task of corralling or controlling the media: a free press poses an existential threat. And so, journalists were always among the first arrested after a coup d’état. And this has not relented. According to the human rights group Reporters Sans Frontières (Reporters Without Borders), 937 journalists were killed on the job in the past decade, increasingly outside war zones. Journalists have been under siege in places as disparate as Russia, Hong Kong, Mexico, the Philippines, Turkey, and India. Donald Trump railed against journalists as enemies of the state. The Saudis brutally murdered Jamal Khashoggi not just for being an opponent of the regime, but for being an opponent with a column in the hometown paper of the capital city of the United States.
Regardless of platform, journalism was—and is—a charter member of the frontier. So, too, was the Internet, at least in its first couple of decades. The Internet leveled borders wherever it went. Political dissenters discovered new and powerful digital organizing tools at their disposal, as witnessed in the 2009 Iranian election protests, the 2011 Arab Spring, and the 2014 Ukrainian uprisings. On its better days, the Internet also gave us Black Lives Matter and the #MeToo movement, both shredding chronic lies of powerful players. For my part, I held out hope for a holy alliance between the rigor of classic journalism and the openness of the Internet. We saw the one-two punch at work in the joint takedown of sexual predator Harvey Weinstein. Yet, even early on, something was clearly amiss. There was an intemperate quality to communicating over the web. The Globe and Mail would receive 300 letters to the editor on a given day and publish just a fraction. When we introduced reader comments, it was striking how the 299th best letter was more measured and thoughtful than the third-best comment in a string of online posts. The medium was the message, as Canadian media guru Marshall McLuhan had once proclaimed. It became evident within a few years that the enormous freedom the Internet conferred was offset by an erosion of commonweal, shared sense of community, and purpose among citizens.
As it continued to evolve, the new system stumbled on a business model that allowed it to live off the avails of free content (photos, memes, blog posts, Wikipedia entries) generated and shared by audiences themselves. Legacy news publishers had to pay trained journalists to produce their content; the new platforms had it handed to them for free. Legacy media values and practices like balance and verification never migrated to the new social media system. In 2016, I asked a Facebook executive why so many posts could be found on the social media giant purporting Barack Obama was born in Kenya. I was told it was not Facebook’s job to tell people what to believe. Larger issues were at stake. Democratic societies can’t survive with a nostrum of truth neutrality.
All media have structural biases—the Internet included. Its big competitive advantage—even more than the user-generated content—is the capacity to collect and apply data instantly. Looking for a 50-something-year-old Trump supporter who loves football and wears nylons? The Googles and Facebooks of the world can find you that. Social media want to keep people online as long as possible, so they can learn more about you to serve up additional and better targeted ads. They feed you information that caters to predispositions and reinforces prejudices that they have deduced from past Internet behavior. As a result, people increasingly become isolated in so-called filter bubbles from information that might be dissonant or even challenging.
Among the biggest changes, as the new system became a huge profit machine and the old one fell into impoverishment, was that the so-called Overton Window developed cracks and then broke into fragments. The Overton Window was a concept developed in the mid-1990s by U.S. policy analyst Joseph Overton to explain the spectrum of political ideas considered, at any given point, to fall within the bounds of acceptable debate. Politicians, academics, think tanks, and journalists served as gatekeepers of what ideas were considered legitimate and would get aired. Debates about segregated or integrated schools might fall inside the Overton Window, but white supremacy would not. Acceptability was always in motion. Opposition to the Vietnam War in the United States was considered on the far fringe of the Overton Window in the early 1960s but went mainstream later in the decade. Same-sex marriage would make a similar journey three decades later. The Internet has expanded the Overton Window into near meaninglessness.
Anything can get through: the good, the bad, and the ugly—from previously stifled progressives to the most repugnant racists. Gatekeepers’ influence has been radically reduced. Extremists and purveyors of mischievous falsehoods have climbed through the window at will—at least they did until the January 6, 2021 attack on the U.S. Capitol.
Either way, in a world of limitless expression with few quality controls, the medium pushes against the sharing of a common and coherent chronicle of current events, a necessary condition of democratic debate. The medium also demonstrates a preference for such attributes as emotion over reason, the immediate over the considered, condemnation over forgiveness, and the globally scalable over the locally relevant. Fake articles, it was soon discovered, were more popular than factual ones, and they were no longer confined to supermarket checkouts. As the third decade of the twenty-first century began, this had become the “medium is the message” culture of the digital information system.
Beginning in 2010, Columbia University antitrust scholar Tim Wu painted a picture of an Internet losing public purpose in his ground-breaking books, The Master Switch and, later, The Attention Merchants. He bemoaned an Internet that had been colonized by a handful of companies dedicated to business models largely devoid of a civic ethos and lacking in incentive for truthfulness. A laundry list of common online harms makes for depressing reading: misogyny, racism, anti-Semitism, Islamophobia, white supremacy, homophobia, disinformation, alternate facts, bullying, revenge porn, phony consumer reviews, seniors’ fraud, counseling of suicide, conspiracy theories, attacks on electoral integrity, incitement to violence—on it goes. From the Salem witch trials through slavery and McCarthyism, America has always struggled with the dark recesses of its soul. But there is something newly troubling in the marriage of paranoia and stupidity that characterizes much of today’s Internet propaganda, from QAnon conspiracies to the thoroughly-refuted yet widely-embraced lie that the election was stolen from Donald Trump.
The Internet’s extraordinary dominance of democratic discourse and the new digital public square has upset a long-held and delicate balance in the liberal and democratic equation. “The confluence of features and norms on Facebook affects the interactions people have with each other, creating a communications ecosystem that facilitates negative and stereotyped evaluations of the Americans with whom people disagree,” political scientist Jaime Settle wrote in Frenemies: How Social Media Polarizes America. Added Brookings Institution senior fellow Tom Wheeler, “Digital technology is gnawing at the core of democracy by dividing us into tribes and devaluing truth.”
Where they had initially appeared to be blazing new frontiers, it now looks as if Internet companies have effectively erected new invisible and insidious borders.
Filter bubbles are borders. Unknowable algorithms are borders. Targeted ads are borders. Falsehoods are borders. The concentrated power of Big Tech is a border. The so-called FAANGs—Facebook, Amazon, Apple, Netflix, and Google—had a combined stock market value on 31 December 2020 of nearly $6 trillion, larger than the annual GDP of any single nation in the world except the United States and China (which, incidentally, are busily trying to divide the web into extensions of their narrow national security ambitions).
Writing in American Purpose in 2020, 30 years on from The End of History and the Last Man and the vision of a near-borderless world, Fukuyama sought to better understand the concerted attacks on liberalism and democracy by authoritarian states like Russia and China and by elected populists scattered across the globe. It is the fate of the liberal in liberal democracy that really concerns him. He describes classical liberalism as “a system for peacefully managing diversity in pluralistic societies.” This liberalism is more a process, a means rather than an end. It is a way of governing that rejects absolutism in favor of modest compromises.
Its sense of moderation is to be preferred over quests for definitive answers—from both left and right—with lofty ambitions that too often simply divide and conquer. Fukuyama points out that renewed nationalism, a manifestation of illiberalism, has historically tended to confer rights “only on their favoured group, defined by culture, language or ethnicity.” This is the type of nationalism that led step-by-unseen-step to the unmitigated tragedy of 1914, the first of the twentieth century’s notable years. Today, exclusionary forms of state-sponsored nationalism, amplified over the Internet, are part of the governing playbook in such disparate places as Modi’s India, Xi’s China, Erdogan’s Turkey, Orbán’s Hungary, and Kaczyński’s Poland. It animated the Trump presidency and figures prominently in the appeals of populist pretenders for power in Western European democracies.
Fukuyama’s portrayal of liberalism, however, provides an incomplete picture of the challenge ahead. It is a politics-of-the-possible liberalism—call it community liberalism—intent on the good and fair functioning of modern pluralist democracies. Yet there is also personal liberalism to consider, one more associated with liberalism as an economic creed. Importantly, it has lost its popular footing to the most pervasive inequalities in a hundred years and the suppression of social mobility. Democracies wither, and populists rise if privilege is locked in or if significant segments of the population feel locked out. You can see it in the sorry state of the counties Donald Trump won in 2016 and 2020, with their lower incomes and limited hopes. These are places discomfited by change and overwhelmed by anger and pain. And, one can only infer, they are places active on social media.
As many scholars have observed, a natural tension exists between the liberal and the democratic in liberal democracy. The former embraces personal freedom, the latter collective will. The liberalism/illiberalism debate needs to be positioned not as good and bad or right and wrong, but as about trade-offs and balance, about finding an inclusive and just equilibrium between such values as liberty and equality; change and tradition; disruption and order; reason and passion; individualism and community; and frontiers and borders. Liberalism without a political constituency simply is not sustainable. People need to see themselves on both sides of the equation.
Classically, this calls for an agenda of reform to compete with the radical exclusionary propositions of populist leaders. This agenda would be economic, of course—not to build borders against change but to give people the tools and confidence for life on the frontier. The agenda would also address the quandary of an Internet that has repeatedly failed to protect individuals and society from harm. Between Silicon Valley’s libertarianism and China’s social control, there exists plenty of space for a reformist, public interest agenda.
In 1985, New York University media scholar Neil Postman, in his book Amusing Ourselves to Death, wrote a telling critique that featured a McLuhanesque analysis of how the medium of television was dumbing down society and public life. TV was dominated by the image, and the image lacked the sort of nuance inherent in the printed word and vital to understanding complex phenomena and finding compromise. The content of TV news, he argued, becomes subordinated to the entertainment imperative, and citizens become the poorer for it. That phenomenon has grown even more pronounced as social media operators apply behavioral science to the programming of algorithms designed to retain attention.
A good part of Postman’s genius lay in his comparison of the different ways in which George Orwell’s 1984 and Aldous Huxley’s Brave New World imagined totalitarian governments suppressing the liberal experiment. Huxley and Orwell, the former writing before the Second World War and the latter after, did not prophesy the same path to illiberalism. What Orwell feared, Postman wrote, were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared we would be exposed to too much. Orwell feared the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture. “This book,” Postman wrote, “is about the possibility that Huxley, not Orwell, was right.”
Postman did not live to see today’s Internet. But his son spoke to new forms of communication technology in an introduction to a twentieth-anniversary edition of the classic, arguing that the questions his father was asking applied to all new media technologies. “What happens to us when we become infatuated with and then seduced by them? Do they free us or imprison us? Do they improve or degrade democracy? Do they make our leaders more accountable or less so? Do they make us better citizens or better consumers? Are the trade-offs worth it?”
Despite my feeling a strong attachment to the free speech culture in which I grew up as a journalist, the answers seem clear enough after Charlottesville, Christchurch, and the Capitol. An Internet that is both publicly unfettered and serving narrow private interests has, by and large, not made for better citizens and has degraded liberal democratic society. Public authorities should be leading conversations with their citizens about how to hold the new powers-that-be responsible for what they render.
Liberalism is not about absolutes. Free speech can be safeguarded within a regulated Internet.
As the public sphere moves online, we must take heed that division and falsehood cannot be allowed to carry equal weight to truth; that myth and rumor cannot trump truth and evidence; that malevolent actors cannot hijack our public concourses of communications to pollute reasoned debate.
We must be cognizant that the number one threat to democracy today comes from disorientation, distraction, and the erosion of commonweal—all symptoms of modern Internet culture. “The further a society drifts from truth the more it will hate those who speak it,” as a line often attributed to Orwell puts it. Democracy and liberalism recede when truth and falsehood no longer grapple in the open because they have been assigned to different filter bubbles. They recede when the public square is subdivided, and its keepers do not see the ongoing pursuit of truth as their responsibility.
Thomas Jefferson, one of the founding fathers of American democracy, said that he would rather have newspapers without governments than governments without newspapers. That was because the only way for voters to render intelligent judgments and counter false arguments, he felt, was for citizens to have access to a common base of facts. Jefferson even called on governments “to contrive” that this would be so. A contrivance is another name for policy. In his day, the most popular contrivance was a postal subsidy, which allowed newspapers to circulate more widely. Today, distribution is fine. It is the financing of professional news and the control of online harms that cry out for policy contrivances. The proof of public harm is in the pudding of those who remain convinced beyond reason that the 2020 U.S. presidential election was stolen. They are operating from a thoroughly different information base. They reside in particular filter bubbles and echo chambers, which play to their fears and prejudices and limit exposure to counterarguments. The Internet has transformed in the first two decades of the twenty-first century from our greatest frontier to a phalanx of border posts controlled by privately interested actors.
Truth is an inescapable foundation of the liberal order and is in a precarious state in today’s liberal democracy. Our ability to address difficult issues such as climate change or electoral integrity is compromised if truth and public opinion become confused.
I am naturally drawn to frontiers. It’s probably why I became a journalist. I am even more drawn these days to understanding why frontiers have become reviled by so many—and, most particularly, the role the Internet has played in that change. Liberalism cannot strike the simple pose of a victim. It needs to understand how it came to be seen in large quarters as oppressive. If it does not take the time to understand and address these misgivings, liberalism, and the better angels of the soul it represents, will continue to face populist headwinds, and the oxymoron called illiberal democracy will divide nations from one another and from within. The world is always in flux, and there always will be those who, for economic or cultural reasons, react against change. What is different this time around is that, along with everything else, the Internet has democratized the tools of propagandists. You no longer need Joseph Goebbels to think up your Internet memes. Hate, disinformation, and division are easily crowd-sourced. And there’s an army of the alienated ready to participate. “The propagandist,” Huxley wrote decades ago, “is a man who canalizes an already existing stream. In a land where there is no water he digs in vain.”
I sometimes wonder what became of Aleksander and Adrian. I failed to keep in touch. Do they still drink from the cup of openness and opportunity with its uneven flow? Did the state help equip them with the tools of confidence—from education and training to health care and, when necessary, income support? Or did they long ago slake their liberal thirst and retreat into the insecurity of holding onto what they know? It would be a shame of monumental proportions to have knocked down the political borders around them, given them access to the frontiers, and then allowed their ambitions to become destabilized.
History is in play in their lifetimes again. This time, illiberalism will not simply collapse under the weight of its own contradictions. Rather, liberalism needs to rebuild its constituency, as it always has, by reforming itself—this time in the context of a global and digital age.