Using algorithms to deliver justice – bias or boost?

The growing use of algorithms in the justice system raises many questions with few easy answers, the Law Society of England and Wales said as it launched a public policy commission to explore the impact of technology and data on human rights and justice.

Law Society vice president and commissioner Christina Blacklaws said: “Big data and algorithms already augment human capabilities for analysis and prediction beyond anything previous generations could have imagined.

“Their use could – and sometimes does – keep us safer, preserve scarce resources and expand the reach of increasingly stretched law enforcement.

“But the design, sale and use of algorithms to deliver justice or maintain security also raises questions about unconscious bias, ethics and rights. Further potential risks may emerge when an algorithm is developed by a business focused on profit rather than by an organisation focused on delivering justice.”

The commission will initially look at the use of AI in legal practice and, more broadly, in society by the police and prison services. For instance:

  • Durham Constabulary have used an artificial intelligence system to inform decisions about whether to keep a suspect in custody. The algorithm classifies arrestees as low, medium or high risk of reoffending, so that those forecast as moderate risk can be made eligible for a programme designed to reduce reoffending.
  • Mathematicians and social scientists in the US have developed a crime prediction tool, ‘PredPol’, in collaboration with the Los Angeles Police Department; it has since been used by Kent Police for crime prediction hotspot mapping.
  • The Metropolitan Police and South Wales Police use facial recognition technology at public events, music festivals and demonstrations to cross-reference people already on watch-lists.

Christina Blacklaws added: “The questions we will explore include: What are the financial and social costs if algorithms are skewed? When is the use of algorithms and big data appropriate? What kind of oversight do we need? How do we ensure that the data used is correct and free from bias? And how do we ensure decisions are accessible and can be reviewed or appealed?”

The Law Society commissioners will take oral and written evidence from the technology sector, government, academia, commercial actors, and legal and human rights experts to explore an overarching question: what framework for the use of big data and algorithms could protect human rights and trust in the justice system?


Credit: Harriet Beaumont | The Law Society
