
Rebooting Big Tech

By CHM Editorial | October 06, 2021

Can We Solve Our System Error?

With technology pervading every corner of our lives, it can often feel like we’re at the mercy of the big tech companies. Suggestions to “just delete Facebook” don’t help. And letting big tech companies continue to reinforce discrimination, erode privacy, and spread disinformation can only lead to a darker future. So what do we do?

Three Stanford professors believe they can help. At a virtual CHM Live event on September 8, 2021, they shared ideas from their new book, System Error: Where Big Tech Went Wrong and How We Can Reboot. Philosopher Rob Reich, computer scientist Mehran Sahami, and political scientist Jeremy Weinstein discussed how we can shift the balance of power from big tech with moderator Marietje Schaake, international policy director at Stanford’s Cyber Policy Center.

What's the Problem?

Programming Davids have now become the Goliath.

— Rob Reich

What’s the primary “system error”? Jeremy Weinstein says that decisions with massive implications for all of us are being left to a very small number of people who work in and run big tech companies. Their platforms and tools have benefits, but they also carry tradeoffs in security and privacy, along with other downsides for people and society.

Rob Reich noted that the pandemic has exacerbated these issues as we’ve become dependent on tech platforms for work, education, and even our private lives. Explaining how we got here, Mehran Sahami said that as technology scales, what we want and what we get diverge more and more, a gap driven by technologists’ focus on quantifiable metrics: time spent on a platform like Facebook is easy to measure, but people may not benefit from consuming hours of misinformation on it.

The professors candidly admitted that Stanford’s culture of promoting disruption has contributed to the problems we see with big tech, and that the university has a responsibility to help fix them. Reich would like to celebrate civic technologists instead of young founders whose technologies don’t advance the public good. Sahami believes the university must teach students about the consequences of unbridled tech innovation and prepare them to think about how to mitigate negative effects. Weinstein thinks Stanford is a microcosm of what’s happening more broadly in tech.

Jeremy Weinstein explains the tech backlash.

Can Democracy Take Back Power?

You are also quite critical about the abdication of responsibility on the part of political and democratic leaders.

— Marietje Schaake

What can government regulators and policy-makers do to rebalance power? Reich insists that technologists’ optimization mindset cannot be applied to democracy, which is designed to referee conflicting citizen preferences in a way that is constantly adaptable. Sahami outlined how government regulation can create safe systems, as it has done in the past.

Mehran Sahami advocates for regulation.

Big Tech argues that regulation slows down innovation, which Weinstein says is essentially a rejection of the role of democracy and our political institutions. We can’t leave it up to the tech companies, as the past decade has shown. But we can, he says, find areas where our democratic institutions can achieve consensus on the most evident harms to be avoided in the near future. These are:

  1. Data privacy and the huge asymmetry of power between individuals and companies;
  2. The use of algorithms in high-stakes decision-making and the lack of transparency and fairness in how those systems are rolled out;
  3. The negative effects of AI on the workforce.

Showing progress in these areas during our current fraught political moment will demonstrate that government actually can rein in big tech.

Schaake noted that tech companies headquartered in the US affect not just Americans but also more fragile countries. In the absence of US government leadership, hate speech and incitement are causing real harm to citizens in other nations. Weinstein warned that arguing that we have to let Big Tech do what it wants or we’ll lose out to China just dodges responsibility. The US should work with allies who value justice, equality, and privacy and have open conversations about the values at stake.

What’s a Tech Employee To Do?

An error rate is not just an error rate; it is about human lives being affected.

— Mehran Sahami

Sahami advocated for a human-centered design approach within tech companies, one that starts with actually scrapping a new technology if prototypes and models show problems that can’t be fixed. There must be guidelines for how models should be measured before they are deployed at scale. Companies need to work harder to understand how people are harmed, what the benefits and tradeoffs are, and which populations are being served and which are left out.

Weinstein noted that tech professionals themselves are often concerned about these issues and feel disempowered by their employers. Ethics are treated not as critical conversations for designers and engineers, but rather like a compliance function addressed by the office of the general counsel. But, ethics must go beyond just conforming to the rule of law, and regulation can’t solve everything. There must be an ethic of responsibility in the computer science profession that’s built into the structure of a tech business. Pressure from employees can help shift big tech away from harmful practices, especially if the way they do business prevents them from attracting talent.

The big argument of the book is that there’s agency for everyone.

— Jeremy Weinstein

Reich highlighted why ethics inside a company must go beyond the general counsel, a chief ethics officer, and even an individual’s morals.

Rob Reich sees a need for professional ethics.

The panelists explored these ideas and more as they answered specific questions from the audience. Here are some of the answers. Watch the full video to see more.

What about…?

Should all computer science degree programs include a mandatory ethics course?

Reich: That’s not enough; ethics must be embedded through the entire curriculum.

Should big tech companies be broken up?

Sahami: That’s not a solution. Breaking up Facebook would exacerbate misinformation because more small companies, each less equipped to deal with it, would then be spreading it. But greater government scrutiny could lead to greater competition without breaking up big tech companies, because they would change their behavior, as Microsoft did when it was hit with an antitrust suit.

How does our system of government help us understand why we are where we are?

Weinstein: There’s been a long-running race between disruption and democracy. Going back 150 years to the early telecommunications infrastructure of the telegraph and telephone, you see that in a system that combines free-market ideas with democratic institutions providing oversight and restraint, you get extraordinary innovation in the private sector. But then externalities unfold, like the dominance of one company, because expensive infrastructure needs to be built. The big companies maintain power by influencing the political system. Then government tries to solve the problems with regulation, but the regulations don’t adjust as systems change until there’s an opening in the policy window. That’s what’s happening today, when there is bipartisan recognition of harm and the momentum to tackle the issue.

With government comes lobbyists. What can our democracy do to fix that?

Weinstein: Our system is failing by choice. We’ve made systematic decisions to erode technical expertise and knowledge in the federal government and instead rely on experts supplied by lobbyists. We’ve underinvested in building a democratic system that is capable of governing technology.

Aren’t tech decisions actually being made by the collective through their free choice to adopt these technologies rather than decisions just being imposed by big tech?

Reich: Facebook is a global platform with over three billion active users and one person (Mark Zuckerberg) who has complete control over the information on it. When you reach that scale, issues of private governance call out for scrutiny. Massive system problems should not be framed as choices that can be made by individual consumers.

How do we avoid future problems before they scale?

Sahami: Tech companies must be aware that their products are not just solving problems but also creating them. They need to try to understand what might happen before they deploy a new technology, not just release it, let it spread, and try to fix problems later. They should conduct their own studies of what is actually happening so they can understand and monitor the effects. There will be times companies don’t get it right, but they should take a view that is self-critical rather than self-aggrandizing.

Let’s all hope they get the message. Better yet, let’s reach out to our elected representatives to help them understand why and how we can reboot big tech for a better future.

Watch the full conversation

System Error | CHM Live, September 8, 2021

About The Author

CHM Editorial consists of editors, curators, writers, educators, archivists, media producers, researchers, and web designers, looking to bring CHM audiences the best in technology and Museum news.
