‘Minority Report’ Precrime Program ‘Northpointe’ Flawed by Racial Bias


Are you familiar with Philip K. Dick’s science-fiction short story The Minority Report? If not, you should be. No one could have predicted the 1956 story would mean so much 60 years later.

In The Minority Report, law enforcement uses technology to catch criminals before they commit violent crimes. Northpointe is a real-world echo of that idea: a precrime-style algorithm currently used in parts of the United States to predict how likely defendants are to commit new crimes or reoffend.

According to Gawker, Northpointe is a Minority Report-type program used to help courts make decisions about criminal recidivism, bail eligibility, and rehabilitation programs. In a perfect world, a system like this could rid the world of crime, but it isn’t that simple. Northpointe is riddled with major flaws, including a bias against black people.

[Photo by Scott Olson/Getty Images]
ProPublica published an investigation into the effectiveness of the Northpointe algorithm. The analysis examined the risk scores of more than 7,000 people arrested in Broward County, Florida, and found that only 20 percent of those predicted to commit violent crimes actually went on to do so.

“[ProPublica also analyzed] Northpointe’s effectiveness, and found that, after controlling for variables such as gender and criminal history, black people were 77 percent more likely to be predicted to commit a future violent crime and 45 percent more likely to be predicted to commit a crime of any kind.”

As Yahoo points out, the Northpointe algorithm sounds incredibly biased. If only 20 percent of the people it flagged, regardless of race, went on to commit crimes, why were black defendants 77 percent more likely to be labeled future violent offenders?
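Part of the answer is that the two figures measure different things. The following Python sketch uses invented numbers (not ProPublica’s data or Northpointe’s scores) to show how an algorithm can be wrong about most of the people it flags overall while still flagging one group’s non-reoffenders far more often than another’s.

```python
# Purely illustrative toy numbers -- NOT ProPublica's dataset or Northpointe's scores.
# The point: "only 20% of flagged people reoffended" (precision) and
# "black defendants were far more likely to be flagged" (disparate false-positive
# rates) measure different things, so both can be true at the same time.

def rates(flagged_reoffended, flagged_did_not, not_flagged_reoffended, not_flagged_did_not):
    """Return (precision of the 'high-risk' flag, false-positive rate) for one group."""
    precision = flagged_reoffended / (flagged_reoffended + flagged_did_not)
    false_positive_rate = flagged_did_not / (flagged_did_not + not_flagged_did_not)
    return precision, false_positive_rate

# Hypothetical group A: flagged often.
prec_a, fpr_a = rates(flagged_reoffended=200, flagged_did_not=800,
                      not_flagged_reoffended=100, not_flagged_did_not=900)

# Hypothetical group B: flagged rarely.
prec_b, fpr_b = rates(flagged_reoffended=60, flagged_did_not=240,
                      not_flagged_reoffended=240, not_flagged_did_not=1460)

print(f"Group A: precision {prec_a:.0%}, false-positive rate {fpr_a:.0%}")
print(f"Group B: precision {prec_b:.0%}, false-positive rate {fpr_b:.0%}")
# Both groups show roughly 20% precision, yet group A's members who never
# reoffend are flagged about three times as often as group B's.
```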

Despite these faulty results, many jurisdictions still look to Northpointe for answers. Engadget reports that in New York, Northpointe is used to assess inmates’ parole eligibility. In Wisconsin, it is used during “each step in the prison system, including sentencing.” Sentencing reform bills pending in Congress would require the use of such risk assessments throughout the federal corrections system.

Northpointe’s calculations are based on a list of 137 questions put to defendants being screened as potentially ‘high-risk.’ The questions include the following:

“Was one of your parents ever sent to jail or prison?”

“How often did you get in fights while at school?”

The questionnaire doesn’t ask about race per se, but some questions ask about class status and poverty. Northpointe’s founder told ProPublica that the company can’t get an “accurate” assessment without including questions tied to class. The company doesn’t believe its program is biased, but because many black people and other people of color live below the poverty line, those questions put them at risk of being falsely flagged at a higher rate, while white defendants are labeled “low-risk” even when they are repeat offenders.
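Northpointe’s actual formula is not public, so the short Python sketch below is purely a hypothetical stand-in: it shows how a simple weighted questionnaire score that never mentions race can still track race indirectly when poverty-related items are included. The questions, weights, and threshold are invented for illustration.

```python
# Hypothetical risk score -- Northpointe's real formula is undisclosed,
# so these items, weights, and the threshold are invented for illustration only.

WEIGHTS = {
    "parent_incarcerated": 2.0,   # "Was one of your parents ever sent to jail or prison?"
    "school_fights": 1.5,         # "How often did you get in fights while at school?"
    "unstable_housing": 2.5,      # class/poverty-related item
    "unemployed": 2.0,            # class/poverty-related item
}

def risk_score(answers: dict) -> float:
    """Weighted sum of 0/1 questionnaire answers; higher means 'higher risk'."""
    return sum(WEIGHTS[item] * answers.get(item, 0) for item in WEIGHTS)

def label(score: float, threshold: float = 4.0) -> str:
    return "high-risk" if score >= threshold else "low-risk"

# Two hypothetical defendants with identical criminal-history answers;
# only the poverty-related items differ.
defendant_poor = {"parent_incarcerated": 1, "school_fights": 0,
                  "unstable_housing": 1, "unemployed": 1}
defendant_affluent = {"parent_incarcerated": 1, "school_fights": 0,
                      "unstable_housing": 0, "unemployed": 0}

print(label(risk_score(defendant_poor)))      # high-risk (score 6.5)
print(label(risk_score(defendant_affluent)))  # low-risk  (score 2.0)
# Because poverty correlates with race in the U.S., class-related items like these
# can push black defendants into the "high-risk" bucket at a higher rate even
# though race never appears in the questionnaire.
```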

[Photo by Johannes Simon/Getty Images]
To sum this up: Northpointe disputes the claims that its program is biased, even as it falsely classifies individuals as high-risk, relies on a formula the company will not disclose to the public, and carries significant influence over decisions made in courtrooms and prisons. Do you see a pattern here?

Philip K. Dick could not have known his ideas would come to life and manifest themselves in something like Northpointe. He imagined a future in which humans rely on machines to make choices for them and, instead of trusting their own judgment, would rather trust a flawed system.

Northpointe is slowly catching on all over the country, and before long computers will be predicting who goes to jail, how long they stay there, and whether they ever deserve to get out. But who really benefits from this? Is human judgment no longer an option? If this is to catch on nationwide, the algorithm needs serious changes to fix its flaws and produce more accurate results.
