Public Interest and Media Infrastructures

Introduction: The Pictures in Our Heads

Writing shortly after the end of the First World War—when professional journalism was in its infancy and publishers were discovering that the U.S. government had lied to them (and the public) for much of the war—Walter Lippmann formulated a key challenge that is still with us. He observed that “the world that we have to deal with politically is out of reach, out of sight, out of mind. It has to be explored, reported, and imagined.”[1] Given that our political worlds exist at scales beyond our direct experience—within privately owned and technologically obscured social media platforms—how can we better explore, report, and imagine public life today by holding these platforms publicly accountable?

Lippmann’s insight was that people were increasingly living in complex webs of relations that were big and powerful and complicated, impossible to escape, and deeply dependent on media. Journalists’ job was to create the “pictures inside the heads of…human beings, the pictures of themselves, of others, of their needs, purposes, and relationships”[2] that could make people feel, know, and act in particular ways. What irked Lippmann was that, in its effort to secure public support for the war, the U.S. government had not only fabricated casualty counts and lied about battles but, in doing so, it had also manipulated people into feeling solidarity, outrage, and patriotism. It had helped to form false, imagined images of the world that the state then co-opted and used for its own purposes. This was a double betrayal: the media could be neither a good-faith source for individual learning nor a vehicle for discovering and managing shared social conditions.

If the media lied to you, how could you trust what you or your friends thought? And why would you ever willingly sacrifice anything in the service of a larger, collective, public good, based on what the media told you?

These questions sparked decades of research into media systems. How should media be created, acted on, funded, professionalized, and held publicly accountable? Who was more or less susceptible to media manipulations? What exactly counts as “the media,” whose stories are being told, and who has access to publishing power? Although decades of Communication research tell us that propaganda and social manipulation are complex social and cultural processes that cannot be reduced to mere information transmission,[3] “the media” continues to be an ill-defined, fragile meaning-making system that makes and remakes its philosophical and professional moorings anew in every era. In other words, Lippmann’s insights remain true.

Today, the “imagined communities”[4] that social media create are lenses we use to know how to think, feel, and act. Journalists and audiences alike[5] look to these platforms to understand issues as varied as climate crises, food supply, whether to go to war, or what it means to be “Canadian” or “European.” As I write in May 2020, the pandemic makes this point especially powerfully. For millions of people, beliefs and behaviours—whether to wear a mask, get tested for Covid-19, socially isolate, trust medical experts—depend not only upon where they live and the policies of their local governments, but also on the relationships and algorithms of social media platforms.[6]

The key difference between Lippmann’s era and today, though, is that we have a very different media system, with very different power dynamics. The media is still a complex mix of people, economic interests, professional values, regulatory frameworks, and ideals of public life. But it also includes a new and largely inscrutable set of privately controlled and proprietary computational systems driven by advertising markets and optimized through machine learning algorithms. Variously called the “hybrid media system”[7] or the “networked press,”[8] today’s media system includes not only the traditional newsroom personnel, editorial judgments, and publishing channels. It also contains a messy mix of training data, user clicks, advertising metrics, surveillance systems, machine learning models, and recommendation engines. These systems are motivated by data. Indeed, at the most basic level, these systems are data—and we are that data.[9]

This media system needs a constant stream of data in order to create pictures for our heads that are personalized, predictable, and profitable. Unlike in Lippmann’s era, scale is not a problem for the media to overcome or an unfortunate side-effect of modern life. On the contrary, scale is a resource for these media systems to extract and harness, a key method for creating tailored, instantaneous images of the world that can be bought and sold.[10] These systems buy and sell people by surveilling, commodifying, and shaping their behaviours—valuing some people more than others because their data are worth more than others’.[11]

The complex systems that produce media use algorithmic processes to convert “big data” into stable stories. These are the stories that drive individual beliefs, fuel commerce, and organize collective action.[12] Today’s “liars” are not (only) states that deceive individuals and manufacture solidarity. Rather, their power is more subtle. They claim that they do not “lie” to you, but simply show you what you and others have said. They position themselves as neutral mirrors that simply reflect the best and worst of society. If deception and co-optation happen, it’s because of what you do, not what they do.[13]

Publicly and precisely critiquing how platforms position themselves is critical to the future of platform governance.

Regulators must squarely tackle the narrative of disinterest, user service, objectivity, and voluntary participation that platforms repeat. But doing so means delving into the details of how platforms work and understanding them much better than we currently do. It means conceptualizing platforms not as channels or broadcasters but as private, for-profit, invisible infrastructures of human values and computational power that even their creators often do not fully understand. Indeed, although they bear some resemblance to earlier media institutions, their form is unprecedented and overwhelmingly motivated by financial, not editorial, priorities.[14]

The pictures that these infrastructures create “work” if they are economically viable, culturally palatable, and politically plausible. Because doing so has the potential to keep users and advertisers engaged, platforms try to create as many realities as possible,[15] outsourcing the consequences of those realities to the societies that they say their technologies simply reflect. Indeed, by adopting a (profitable) marketplace model of truth in which the truth is seen to be “produced by its collision with error,” platforms reject anything other than a libertarian image of free speech.[16] This hands-off approach aligns well with platforms’ desire for data at scale. More data bring more truth, faster.

Lippmann’s concerns about lies and manipulation remain valid, but I suspect he would be shocked at platforms’ general disregard for the very idea of stable, human-created truths, and at the relatively small-scale investments that they have made in fact-checking[17] and self-governance,[18] which their public relations staffs celebrate as public commitments. Recent investigative journalism tells us that platforms know that they damage public life, but they will do nothing that upsets their business models or takes responsibility for the reality-shaping power of their algorithms.[19]

So, if social media platforms are not motivated by truth seeking, shared reality, and collective action based on knowledge and expertise—key ingredients of healthy public life—then how can we reform them toward more public ends? As a small number of powerful technology companies increasingly control the conditions under which people and computational systems make, interpret, circulate, and act upon information, how can we rescue the idea of collective, publicly accountable self-governance through communication? To address this question, we need two types of progress (the first of which is the focus of the rest of this essay).

First, the public needs far more sophisticated mastery of the inner workings and impacts of today’s media systems. If regulators could better understand the complexities, assumptions, and interconnections that shape how online news is made, commodified, and acted upon, they would be much better equipped to protect the public interest. To better implement and evaluate media policy, I want to suggest that regulators adopt and deploy the concept of “infrastructure,” explained below.

Second, although not the focus of this essay, progress on these questions requires significant political will. Technology industries often respond to regulatory threats by claiming that:

  • Their systems use proprietary knowledge that they cannot publicly disclose;
  • Their business models require large-scale data harvesting;
  • People are unwilling to pay for services that are currently underwritten by their data; and
  • Encryption technologies, transparency commitments, and controlled data disclosure obviate the need for public oversight.

They defend themselves through a mix of trade secrets, economic claims, promises of self-regulation, and technological solutionism, forestalling real public oversight.


[1] Lippmann, W. (1922). Public opinion. Free Press, p. 18.

[2] Ibid.

[3] Jack, C. (2019). Wicked content. Communication, Culture and Critique. doi:10.1093/ccc/tcz043

[4] Anderson, B. (1983). Imagined communities (Revised ed.). Verso, p. 62.

[5] McGregor, S. C. (2019). Social media as public opinion: How journalists use social media to represent public opinion. Journalism. doi:10.1177/1464884919845458

[6] Holtz, D., Zhao, M., et al. (2020). Interdependence and the cost of uncoordinated responses to COVID-19. MIT Sloan School of Management, working paper. https://mitsloan.mit.edu/shared/ods/documents/?PublicationDocumentID=7397

[7] Chadwick, A. (2017). The hybrid media system: Politics and power (2nd ed.). Oxford University Press.

[8] Ananny, M. (2018a). Networked press freedom: Creating infrastructures for a public right to hear. MIT Press.

[9] Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. NYU Press; Koopman, C. (2019). How we became our data. University of Chicago Press.

[10] Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.

[11] Benjamin, R. (2019). Race after technology. Polity.

[12] Crawford, K. (May 9, 2013). Think again: Big data. Foreign Policy. http://www.foreignpolicy.com/articles/2013/05/09/think_again_big_data?page=full

[13] Gillespie, T. (2018a). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.

[14] Elsewhere I have argued that “sometimes [platforms] are like cities, newsrooms, post offices, libraries, or utilities — but they are always like advertising firms” (Ananny, 2019b, n.p.).

[15] Silverman, C. (May 22, 2020). The information apocalypse is already here, and reality is losing. BuzzFeed News. https://www.buzzfeednews.com/article/craigsilverman/coronavirus-information-apocalypse

[16] Wu, T. (2018). Is the First Amendment obsolete? In L. C. Bollinger & G. R. Stone (Eds.), The free speech century. Oxford University Press.

[17] Ananny, M. (April 4, 2018b). The partnership press: Lessons for platform-publisher collaborations as Facebook and news outlets team to fight misinformation. Tow Center for Digital Journalism, Columbia University. https://www.cjr.org/tow_center_reports/partnership-press-facebook-news-outlets-team-fight-misinformation.php/

[18] Douek, E. (2019). Facebook’s ‘Oversight Board’: Move fast with stable infrastructure and humility. North Carolina Journal of Law & Technology, 21(1). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3365358

[19] Horwitz, J., & Seetharaman, D. (May 26, 2020). Facebook executives shut down efforts to make the site less divisive. The Wall Street Journal. https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499