Decoding the Election: Algorithms, AI, and Your Vote

By David C. Brock | October 06, 2020

In the United States, politics was intertwined with electronics from the very start. The advent of vacuum tubes in the early twentieth century marked the beginnings of our electronic world. These tubes – cousins to the incandescent light bulbs that are now becoming a rarity – controlled flows of electricity, switching and amplifying them among other things. Early tubes were used to amplify the dots and dashes of telegraph codes as they sped flows of political and economic news across continents and underneath oceans.

In the 1920s, tubes afforded the construction of radio broadcasting, and a new era in the spread of political messaging and news from the few to the many, to say nothing of cultural productions like music and theatre. By the 1930s, the electronic flows of radio became indispensable tools for politicians across the globe, from US President Franklin Delano Roosevelt’s on-air “fireside chats” on the Depression, the New Deal, and, eventually, war, to the use of radio by fascist leaders in Germany, Italy, and elsewhere.

In the 1940s, tubes allowed for an expansion of the electronic realm beyond the auditory to the visual, and early in the decade, television broadcasting exploded in popularity, with television news at the forefront. That same decade, researchers and engineers used the very same vacuum tube electronics to put electrical flows in service of mathematics, creating the first digital electronic computers. These two tube-driven developments – television and computing – came together spectacularly on November 4th, 1952, the night of the US presidential election. That evening, CBS News prominently made use of the Univac electronic digital computer throughout its live coverage of the election results. Early election returns flowing into the newsroom were fed into the Univac, in which a program calculated a prediction of the eventual winner. Its prediction was correct.

By the end of the 1950s, a new thread was braided into the tangle of television, computing, and politics: advertising. As historian Jill Lepore has shown in her recent work, all of these threads were wound together in a new firm, Simulmatics. The company used computer simulations to predict voter behavior in response to various factors, but in particular political messaging, in other words, political advertising. This strand of politics, computing, television, and advertising only strengthened over the following decades.

Just a few years after Simulmatics’ consulting for the successful Kennedy campaign, the computer researcher Ted Nelson coined the terms “hypertext” and “hypermedia” to describe literary and multimedia productions enabled by computers that would move beyond what had been possible with paper or with film, encoding networking into the heart of such productions. Links, within and without the production, would be, Nelson described, the essence of hypertexts and hypermedia.

In our digital present, the braided strand of politics, computing, advertising, and television has evolved. Television has been replaced by a more pervasive medium that has subsumed it: Nelson’s hypermedia. From smartphones to laptops, and from television screens to dashboard displays, the many screens of our digital world present to us an electronic reality of interactive hypermedia. This is particularly true of social media sites, not only linking sounds, sights, and texts in complex ways, but linking people to one another in intoxicating, and worrisome, ways. Hypermedia is the reality of today’s politics.

The importance of our political hyperreality became inescapable in 2016, first with the Brexit referendum in the UK, and then with the US presidential election and Donald Trump’s Electoral College victory. What became inescapable in 2016 now can feel overwhelming. For the 2020 US presidential election, how are we to understand the importance of our hyperreal politics? How is hypermedia being used to shape our votes? How should we understand the role of algorithms and artificial intelligence in social media platforms? What about traditional factors like newspapers and television news? How do we decode the election?

Decoding the Election

For guidance, CHM turned to a panel of experts during an online CHM Live event on September 16, 2020: Joan Donovan, a prominent internet researcher at Harvard; Robby Mook, a political consultant who managed Hillary Clinton’s 2016 presidential campaign; and David Rothschild, a political researcher with Microsoft Research. Steven Levy, bestselling author of Facebook: The Inside Story, served as an insightful moderator for the panel.

Early in the discussion, Robby Mook illuminated why our most recent elections seem more in need of decoding than ever before: hypermedia, as used today, has introduced a new opacity into politics rather than transparency.

Robby Mook discusses opacity in politics.

Joan Donovan elaborated on how this new political opacity, a product of the way digital advertising is currently treated, favors disinformation. Under campaign finance law, she notes, a digital advertisement is treated as the equivalent of a free MAGA hat or a Biden-Harris bumper sticker, and social media platforms have positioned the monitoring of these advertisements as someone else’s job.

Joan Donovan discusses opacity in politics.

For David Rothschild, however, it is important to keep this online opacity in proper perspective. While some television now has the character of interactive hypermedia, much of it retains the same passive, broadcast aspect it has had for decades. And, as Rothschild notes, it is from this aspect – broadcast TV news – that the important “marginal voters” get what little news they consume.

David Rothschild discusses the importance of television news for “marginal voters.”

Nevertheless, Steven Levy challenged the panelists to address whether President Trump’s 2016 campaign proved that building digital engagement with voters online is now the way to win an election.

Steven Levy challenges panelists about digital engagement.

For Robby Mook, while traditional factors beyond online digital engagement remain critically important, digital engagement is nevertheless dramatically altering both campaign finance and political coverage by newspaper and television reporters.

Robby Mook discusses the consequences of digital engagement.

Robby Mook on the consequences of online political donations.

Joan Donovan noted that some of the ways that online activity shapes news coverage in traditional outlets like newspapers and television are deliberate, the result of careful strategy by online actors.

Joan Donovan discusses deliberate strategies to use online activity to shape mainstream news.

David Rothschild underscored the importance of remembering that mainstream news outlets and social media companies are competitors for advertising dollars, and that this competition should be kept in mind when evaluating claims by one against the other.

David Rothschild discusses the competition between mainstream media and social media.

Steven Levy and Robby Mook wrapped up this broad discussion of political opacity and the relationships between hypermedia and traditional news by exploring a development new in 2016: the relationship between Facebook and the presidential campaigns, and whether the two had become so intertwined as to be inseparable.

Steven Levy and Robby Mook discuss the relationship between Facebook and the 2016 Presidential campaigns.

The panel concluded with an important conversation around the quality of information within hypermedia and the responsibility for the quality of our information environment. Joan Donovan laid out the broad lines of the matter.

Joan Donovan on the quality of and responsibility for our information environment.

David Rothschild underscored that these issues with our hypermedia information environment are not only questions of the consumption of news on social media platforms, but also of the production of news itself by journalists.

David Rothschild on production and consumption in the information environment.

For Robby Mook, the question of information quality is the question of liability for damage to our information environment. As with protection of the natural environment, liabilities around damage to the information environment would likely have profound consequences for all stakeholders.

Robby Mook and Steven Levy discuss liability for information quality.

At the end of the discussion, Joan Donovan offered concrete advice for what all of us can do in the 2020 election and beyond in light of decoding the intertwining of politics, computing, advertising, and hypermedia: act as information stewards for our families and communities. It is a role that promises only to increase in importance in the days ahead.

Joan Donovan on the importance of information stewards.

Watch the Full Conversation

CHM Live | Decoding the Election: Algorithms, AI, and Your Vote, September 16, 2020.

About The Author

David C. Brock is an historian of technology, CHM's Director of Curatorial Affairs, and director of its Software History Center. He focuses on histories of computing and semiconductors as well as on oral history. He is the co-author of Moore’s Law: The Life of Gordon Moore, Silicon Valley’s Quiet Revolutionary and is on Twitter @dcbrock.
