
Computers versus Crime

By CHM Editorial | November 14, 2022

In police departments and courts across the country, artificial intelligence is being used to help decide who is policed, who gets bail, who is sentenced, and who gets parole. But is it actually making our law enforcement and court systems fairer and more just?

NOVA explores these questions in its new documentary, Computers v. Crime, an investigation into the flaws of controversial criminal justice technology. On November 1, 2022, CHM hosted two participants in the film, UC Berkeley professor Hany Farid and neuroscientist Vivienne Ming, to deconstruct the biased algorithms behind the tech and discuss their impact with KQED’s Rachael Myrow.

Humans / Algorithms

An excerpt from the film featured Hany Farid’s research into whether algorithms are doing better than humans at eliminating bias.

The researchers paid non-experts to evaluate defendants by reading a short paragraph that included age, gender, and prior conviction record. There was no information about race. People were then asked, “Do you think this person will commit another crime within the next two years?” The results showed that the humans were as accurate as the commercial software used in courts today . . . and just as biased. Because people of color in America are more likely to be arrested, charged, and convicted of crimes, it turned out that prior conviction data is a proxy for race.
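To see how a proxy works in practice, consider a small, hypothetical Python sketch (not the researchers’ actual code or data): it builds a synthetic dataset in which two groups reoffend at the same underlying rate but one group is arrested more often, trains a model that never sees the group label, and then measures how often each group is wrongly flagged.

```python
# Hypothetical illustration of a proxy variable: the model never sees
# "group", but prior convictions encode it, so error rates diverge.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)          # 0 or 1; never shown to the model

# Same underlying propensity to reoffend in both groups...
reoffend = rng.random(n) < 0.3

# ...but group 1 is arrested and convicted more often for the same
# behavior, so its recorded prior-conviction counts are higher.
arrest_rate = np.where(group == 1, 0.8, 0.4)
priors = rng.binomial(5, arrest_rate * (0.2 + 0.6 * reoffend))

age = rng.integers(18, 60, n)
X = np.column_stack([priors, age])     # features: no group label anywhere

model = LogisticRegression().fit(X, reoffend)
pred = model.predict(X)

# False positive rate per group: flagged "will reoffend" but didn't.
for g in (0, 1):
    mask = (group == g) & ~reoffend
    print(f"group {g} false positive rate: {pred[mask].mean():.2f}")
```

Even though race (here, “group”) is excluded from the features, the over-arrested group ends up with a noticeably higher false positive rate, which is exactly the proxy effect the researchers describe.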

Given the historical inequities in the criminal justice system and how deeply ingrained racism is in American society, is it even possible to create software that’s not racist? Hany Farid believes it’s naïve to think we can. Worse, people tend to assume that technology is infallible despite mounting evidence that it’s spectacularly biased. He explains how that bias has been baked in.

Hany Farid describes how biased algorithms affect lives.

Despite their ubiquity and the critical decisions they’re being allowed to make, algorithms are not particularly explainable, notes Farid. They are very complex, massive systems with billions of parameters, and even those who create them don’t know exactly how they’re operating.

There’s also no one providing quality control, and regulation is not only years behind but also subject to highly-partisan, heavily lobbied lawmakers who don’t understand technology.

Vivienne Ming shared a story about how predictions can turn out to be nothing like you may have, well, predicted.

Vivienne Ming explains a problem with predicting behavior.

Critical Thinking / Pattern Recognition

The second film excerpt showed how Amazon’s attempt to correct gender bias in hiring actually produced even more biased results, despite the best intentions. Why? The algorithm was very good at recognizing patterns in historical data that correlated promotion with men, even when all gender markers were supposedly removed.

Ming says it’s impossible to scrub the history out of data and rebalance it—it’s just too complicated. She notes that the Amazon algorithm developers didn’t seem to ask themselves if they fully understood the problem they thought they were solving. Too much data, complex systems, and computer science programs that lack diversity and don’t teach critical thinking skills may be part of the problem.

Hany Farid and Vivienne Ming explain why scale can be a problem.

The third film excerpt highlighted how data from a historically over-policed Oakland neighborhood was fed into a platform that then predicted crime would occur in that area, regardless of other available data showing that illicit drug use is a city-wide problem.
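A small, hypothetical simulation (not the platform from the film) shows how this kind of feedback loop can sustain itself: two neighborhoods have identical true crime rates, but because patrols are allocated wherever past arrests were recorded, the historically over-policed neighborhood keeps generating most of the new arrest data.

```python
# Hypothetical simulation of a predictive-policing feedback loop:
# both neighborhoods have the same true crime rate, but historical
# over-policing of neighborhood A means the "prediction" keeps
# sending patrols there, which generates yet more arrest data there.
import numpy as np

rng = np.random.default_rng(1)
true_crime_rate = np.array([0.1, 0.1])   # identical in both neighborhoods
arrests = np.array([50.0, 10.0])         # historical data: A over-policed
patrols_per_day = 20

for day in range(100):
    # "Predict" where crime will be: proportional to recorded arrests.
    p = arrests / arrests.sum()
    patrols = rng.multinomial(patrols_per_day, p)
    # Police only record crime where they actually patrol.
    arrests += rng.binomial(patrols, true_crime_rate)

print("share of patrols sent to A on the last day:",
      patrols[0] / patrols_per_day)
print("recorded arrests:", arrests)
```

Because crime is only recorded where officers are sent, the skewed starting data perpetuates itself: neighborhood A keeps receiving most of the patrols, and the data never gets a chance to reveal that both areas are the same.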

So, now that research has demonstrated the problems with these systems, why are they still being used?

Both Farid and Ming believe that political issues come into play and that both policymakers and consumers must make better choices. Market-driven AI isn’t going to improve until we all get smarter about what we want.

Watch the Full Conversation

Computers v. Crime | CHM Live, November 1, 2022

NOVA's Computers v. Crime is available for streaming online and via the PBS Video App. NOVA is a production of GBH. Computers v. Crime is a NOVA Production by BlueSpark Collaborative, LLC for GBH.

This event was made possible by the generous support of the Kapor Center.

Major funding for NOVA is provided by Brilliant Worldwide, Inc., Consumer Cellular, the Corporation for Public Broadcasting, and PBS viewers. Additional funding for Computers v. Crime is provided by the George D. Smith Fund, and the NOVA Science Trust with support from Margaret and Will Hearst, and Howard and Eleanor Morgan.

About The Author

CHM Editorial consists of editors, curators, writers, educators, archivists, media producers, researchers, and web designers looking to bring CHM audiences the best in technology and Museum news.
