How to investigate your government through algorithms

Some kinds of reporting-by-the-numbers are anything but lazy. Take investigations looking into algorithms — examining the formulas the government uses to determine who is more likely to commit a crime or which buildings get prioritized for fire inspections.

Speaking at the recent International Festival of Journalism, Nick Diakopoulos, assistant professor at the University of Maryland’s Philip Merrill College of Journalism and a member of its Human Computer Interaction Lab, gave a solid primer on how to get started.

He’s been studying the wider reach of algorithms in society, government and industry for about four years, coming at it from a computer science background as a “techie who worked my way into journalism.” Boyish, bespectacled and occasionally prone to professorial turns of phrase like “algorithmic accountability,” Diakopoulos offered a look into the numbers that shape our lives.

What they are


At the most basic level, algorithms are like recipes, Diakopoulos says. They have ingredients, assembly instructions and a sequence or order for those instructions — your basic how-to method for doing something. Where the analogy falters, he says, is that unlike the sequence that results in a good plate of pasta al pomodoro, algorithms are decision-making formulas. “The crux of algorithmic power is how they make decisions, or have the potential to make decisions, potentially without any human involvement.”

These break down broadly into four types of decisions: prioritization, classification, association and filtering. Familiar examples include search engines ranking some sources of information above others (prioritization); YouTube’s strainer for picking up copyrighted material (classification); the term-linking behind Google’s auto-complete, which has drawn defamation lawsuits (association); and news feeds that surface some sources while others never appear at all (filtering).
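To make those four categories concrete, here is a toy sketch in Python. None of it is drawn from any real system; the items, scores and thresholds are invented purely for illustration:

```python
# Toy illustration of the four algorithmic decision types Diakopoulos
# describes. All data, scores and thresholds here are invented.

articles = [
    {"title": "Local budget report", "score": 0.9, "topic": "politics"},
    {"title": "Celebrity gossip", "score": 0.4, "topic": "entertainment"},
    {"title": "Fire safety audit", "score": 0.7, "topic": "politics"},
]

# Prioritization: rank items so some appear above others.
ranked = sorted(articles, key=lambda a: a["score"], reverse=True)

# Classification: sort each item into a category based on a rule.
labeled = [("newsworthy" if a["score"] > 0.5 else "low-value", a["title"])
           for a in articles]

# Association: link items that share an attribute.
politics = [a["title"] for a in articles if a["topic"] == "politics"]

# Filtering: include some items and silently exclude the rest.
feed = [a for a in articles if a["score"] >= 0.7]

print(ranked[0]["title"])  # what a search engine would show first
```

Each of those four lines is a decision rule a human wrote once and a machine now applies endlessly — which is exactly where the power, and the potential for bias, lives.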

Why you should care

Far from being impartial gatekeepers or shortcuts, algorithms are designed by humans — often with built-in bias that can shape our daily lives. They are deciding what schools kids attend, who gets released on parole and who your next date is.

“It’s time to start getting skeptical about algorithms,” Diakopoulos says. “It’s time to start asking questions to learn more about how these systems function and get more details on how they work.”

That’s where algorithmic accountability — pulling back the curtain on the formulas — comes in. Diakopoulos cites a ProPublica investigation into software used in criminal cases that asks a number of seemingly benign questions — “What neighborhood do you live in?” “What’s your education level?” “Are you in touch with your family?” — to arrive at a score for flight risk or the risk of future crime. Looking at the results in 7,000 cases, reporters discovered that the resulting “risk assessments” are not only biased against blacks but also only slightly more accurate than a coin toss at predicting who will commit more crimes.
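A reporter checking claims like those might run something like the sketch below. The file name and column names are hypothetical stand-ins, not ProPublica’s actual data schema; the coin-toss baseline is simply the accuracy of random guessing:

```python
# Hedged sketch of two checks behind findings like ProPublica's:
# overall accuracy versus a coin toss, and false-positive rates by group.
# "risk_scores.csv" and all column names are hypothetical stand-ins.
import csv
import random

with open("risk_scores.csv", newline="") as f:  # hypothetical file
    rows = list(csv.DictReader(f))

def accuracy(predictions):
    """Share of cases where the prediction matched the real outcome."""
    hits = sum(1 for r, p in zip(rows, predictions)
               if p == (r["reoffended"] == "1"))
    return hits / len(rows)

tool = accuracy([r["predicted_high_risk"] == "1" for r in rows])
coin = accuracy([random.random() < 0.5 for r in rows])
print(f"tool accuracy: {tool:.2f}, coin toss: {coin:.2f}")

# Bias check: among people who did NOT go on to reoffend, how often
# were members of each group wrongly flagged as high risk?
def false_positive_rate(group_rows):
    innocent = [r for r in group_rows if r["reoffended"] == "0"]
    flagged = sum(1 for r in innocent if r["predicted_high_risk"] == "1")
    return flagged / len(innocent)

for group in {r["race"] for r in rows}:  # "race" column is hypothetical
    subset = [r for r in rows if r["race"] == group]
    print(group, f"{false_positive_rate(subset):.2f}")
```

The point isn’t the code itself but the habit: when a score decides someone’s fate, compare its predictions against what actually happened, overall and group by group.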

“Algorithmic accountability means investigating these systems and trying to understand how these quantifications affect people,” he says. His team’s investigations have led to articles including “How Google shapes the news you see about the candidates” and “Uber seems to offer better service in areas with more white people.”