National

Stacy M. Brown

Justice Department Algorithms Used to Predict Crime

Democrats Seek to Curb Results of 'Dirty Policing'

New York University Law Review researchers recently found that law enforcement agencies “are increasingly using predictive policing systems to forecast criminal activity and allocate police resources.”

Yet in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased and sometimes unlawful practices and policies — or dirty policing, noted the report, titled “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice.”

Eight Democratic lawmakers have taken exception to the use of algorithms that automate policing decisions, raising their concerns with the U.S. Department of Justice this week.

U.S. Reps. Yvette D. Clarke (D-N.Y.) and Sheila Jackson Lee (D-Texas) and Sens. Ron Wyden (D-Ore.), Elizabeth Warren (D-Mass.), Edward Markey (D-Mass.), Jeff Merkley (D-Ore.), Alex Padilla (D-Calif.) and Raphael Warnock (D-Ga.) asked the DOJ in a letter to help ensure that any predictive policing algorithms in use are fully documented.

They also asked the agency to ensure that the algorithms are subjected to ongoing, independent audits by experts and that a system of due process is provided for those affected by them.

“If the DOJ cannot ensure this, DOJ should halt any funding it is providing to develop and deploy these unproven tools,” the lawmakers wrote.

According to www.nextgov.com, predictive policing involves law enforcement officials using mathematical models, predictive analytics and other technology-based techniques to pinpoint potential crimes.

In their letter, the lawmakers said such methods are primarily used in two ways: to predict locations where crimes could occur within a particular window, and to predict which individuals might be involved in future illegal acts.

“Algorithms draw from historical crime data, and at times other data elements like weather patterns or gunfire detection, to produce the forecasts,” they noted.

“But, when predictive policing systems have been exposed to scrutiny, auditors have found major problems with their effectiveness and reliability,” the letter continued.

‘Dirty Data’ Perpetuates Biases

Nextgov.com reported that the lawmakers pointed to specific reviews that sparked worry and a police department’s 2020 strategic plan that mentioned implementing such technologies with Justice Department funds.

They also referenced the New York University Law Review study, which found that nine of the 13 law enforcement departments assessed used what's deemed "dirty data" – information collected through illegal policing practices – to inform the algorithms used in this work.

“When datasets filled with inaccuracies influenced by historical and systemic biases are used without corrections, these algorithms end up perpetuating such biases and facilitate discriminatory policing against marginalized groups, especially Black Americans,” the lawmakers wrote.

They requested a range of detailed information from the federal department.

The information includes whether officials have analyzed whether these technologies comply with relevant civil rights laws.

They demanded to know the names of each jurisdiction that has operated predictive policing algorithms funded by the agency and the actual software used.

The lawmakers also asked for a detailed annual accounting of all federal funding the DOJ distributed for developing and implementing predictive policing algorithms at the federal, state and local levels for fiscal years 2010 through 2020, among other details.

“Deploying predictive policing systems in jurisdictions with extensive histories of unlawful police practices presents elevated risks that dirty data will lead to flawed or unlawful predictions, which in turn risk perpetuating additional harm via feedback loops throughout the criminal justice system,” New York University Law Review researchers wrote.

“The use of predictive policing must be treated with high levels of caution and mechanisms for the public to know, assess, and reject such systems are imperative.”

Stacy M. Brown

I’ve worked for the Daily News of Los Angeles, the L.A. Times, Gannett and the Times-Tribune and have contributed to the Pocono Record, the New York Post and the New York Times. Television news opportunities have included NBC, MSNBC, Scarborough Country, the Abrams Report, Today, Good Morning America, NBC Nightly News, Imus in the Morning and Anderson Cooper 360. Radio programs like the Wendy Williams Experience, Tom Joyner Morning Show and the Howard Stern Show have also provided me the chance to share my views.
