With technology pervading every corner of our lives, it can often feel like we’re at the mercy of the big tech companies. Suggestions to “just delete Facebook” don’t help. And letting big tech companies continue to reinforce discrimination, erode privacy, and spread disinformation can only lead to a darker future. So what do we do?
Three Stanford professors believe they can help. At a virtual CHM Live event on September 8, 2021, they shared ideas from their new book, System Error: Where Big Tech Went Wrong and How We Can Reboot. Philosopher Rob Reich, computer scientist Mehran Sahami, and political scientist Jeremy Weinstein discussed how we can shift the balance of power from big tech with moderator Marietje Schaake, international policy director at Stanford’s Cyber Policy Center.
What’s the primary “system error”? Jeremy Weinstein says that decisions with massive implications for all of us are being left to a very small number of people who work in and run big tech companies. Their platforms and tools have benefits, but they also carry tradeoffs in security and privacy, along with other downsides for people and society.
Rob Reich noted that the pandemic has exacerbated these issues as we’ve become dependent on tech platforms for work, education, and even our private lives. Explaining how we got here, Mehran Sahami said that as technology scales, what we want and what we get diverge more and more, driven by technologists’ preference for quantifiable metrics: time spent on a platform like Facebook is easy to measure, but people may not benefit from consuming hours of misinformation on it.
The professors candidly admitted that Stanford’s culture of promoting disruption has contributed to the problems we see with big tech, and that the university has a responsibility to help fix them. Reich would like to celebrate civic technologists instead of young founders whose technologies don’t advance the public good. Sahami believes the university must teach students about the consequences of unbridled tech innovation and prepare them to think about how to mitigate negative effects. Weinstein thinks Stanford is a microcosm of what’s happening more broadly in tech.
What can government regulators and policy-makers do to rebalance power? Reich insists that technologists’ optimization mindset cannot be applied to democracy, which is designed to referee conflicting citizen preferences in a way that is constantly adaptable. Sahami outlined how government regulation can create safe systems—as it's done in the past.
Big Tech argues that regulation slows down innovation, which Weinstein says is essentially a rejection of the role of democracy and our political institutions. We can’t leave it up to the tech companies, as the past decade has shown. But we can, he says, find areas where our democratic institutions can achieve consensus on the most evident harms to be avoided in the near future.
Schaake noted that tech companies headquartered in the US affect not just Americans but also more fragile countries. In the absence of US government leadership, hate speech and incitement are causing real harm to citizens in other nations. Weinstein warned that arguing that we have to let Big Tech do what it wants or we’ll lose out to China just dodges responsibility. The US should work with allies who value justice, equality, and privacy and have open conversations about the values at stake.
Sahami advocated for a human-centered design approach within tech companies that starts with actually scrapping a new technology if prototypes and models show problems that can’t be fixed. There must be guidelines around how models should be measured before deploying at scale. Companies need to work harder to understand how people are harmed, what the benefits and tradeoffs are, and which populations are being served and which are left out.
Weinstein noted that tech professionals themselves are often concerned about these issues and feel disempowered by their employers. Ethics are treated not as critical conversations for designers and engineers, but rather like a compliance function addressed by the office of the general counsel. But, ethics must go beyond just conforming to the rule of law, and regulation can’t solve everything. There must be an ethic of responsibility in the computer science profession that’s built into the structure of a tech business. Pressure from employees can help shift big tech away from harmful practices, especially if the way they do business prevents them from attracting talent.
Reich highlighted why ethics inside a company must go beyond the general counsel, a chief ethics officer, and even an individual’s morals.
The panelists explored these ideas and more as they answered specific questions from the audience. Here are some of the answers. Watch the full video to see more.
Should all computer science degree programs include a mandatory ethics course?
Reich: That’s not enough; ethics must be embedded through the entire curriculum.
Should big tech companies be broken up?
Sahami: That’s not a solution. Breaking up Facebook would exacerbate misinformation because it would then spread across many smaller companies that are less equipped to deal with it. But greater government scrutiny could lead to greater competition without breaking up big tech companies, because they would change their behavior, as Microsoft did when it was hit with an antitrust suit.
How does our system of government help us understand why we are where we are?
Weinstein: There’s been a long-running race between disruption and democracy. Going back 150 years to the early telecommunications infrastructure of the telegraph and telephone, you see that a system combining free-market ideas with democratic institutions providing oversight and restraint produces extraordinary innovation in the private sector. But then externalities unfold, like the dominance of one company, because expensive infrastructure needs to be built. The big companies maintain power by influencing the political system. Then government tries to solve the problems with regulation, but the regulations don’t adjust as systems change until there’s an opening in the policy window. That’s what’s happening today, where there is bipartisan recognition of harm and the momentum to tackle the issue.
With government comes lobbyists. What can our democracy do to fix that?
Weinstein: Our system is failing by choice. We’ve made systematic decisions to erode technical expertise and knowledge in the federal government and instead rely on experts supplied by lobbyists. We’ve underinvested in building a democratic system that is capable of governing technology.
Aren’t tech decisions actually being made by the collective through their free choice to adopt these technologies rather than decisions just being imposed by big tech?
Reich: Facebook is a global platform of over three billion active users with one person (Mark Zuckerberg) who has complete control over the information on it. When you reach that scale, issues of private governance call out for scrutiny. Massive system problems should not be framed as choices that can be made by individual consumers.
How do we avoid future problems before they scale?
Sahami: Tech companies must be aware that their products are not just solving problems but also creating them. They need to try to understand what might potentially happen before they deploy a new technology, not just let it out there to spread and try to fix it later. They should conduct studies themselves to understand and monitor what is actually happening. There will be times companies don’t get it right, but they should take a view that is self-critical rather than self-aggrandizing.
Let’s all hope they get the message. Better yet, let’s reach out to our elected representatives to help them understand why and how we can reboot big tech for a better future.