Should machines be tasked with catching future criminals, or are we encoding our own biases?

To catch a (potential) criminal

In the movie Minority Report, a panel of psychics peers into the future to predict who will commit a crime. Today, using machine learning technology, courts, police departments, and parole officers are attempting to do the same.

Recommending sentences and placement

Increasingly, law enforcement agencies and courts are using computer algorithms to assess the risk that criminals will re-offend. This information is used to sentence defendants, decide where to place inmates in prison, determine which streets should be policed more heavily, and choose who should be released on parole. Several different risk calculators are available. Some fully disclose the kinds of data they use to assess risk, while other companies, such as Northpointe, which makes the risk assessment software Compas, won’t reveal how…


Link to Full Article: Should machines be tasked with catching future criminals, or are we encoding our own biases?
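For readers wondering what a "risk calculator" actually is under the hood, the sketch below fits a toy logistic-regression model that turns a few defendant attributes into a re-offense probability and a low/medium/high label. Everything here is invented for illustration: the feature names, the data, and the risk thresholds are assumptions, and this is not a reconstruction of Compas or any real tool, whose methods are proprietary.

```python
# Purely illustrative: a toy recidivism "risk score" using scikit-learn.
# Feature names, data, and thresholds are made up for this example;
# it does NOT represent how Compas or any real product works.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features: [age, number of prior offenses, months since last offense]
X_train = np.array([
    [19, 4, 2],
    [45, 0, 120],
    [31, 2, 18],
    [23, 6, 1],
    [52, 1, 60],
    [28, 3, 6],
])
# Hypothetical labels: 1 = re-offended within two years, 0 = did not
y_train = np.array([1, 0, 0, 1, 0, 1])

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new defendant: the model outputs a probability, which is then
# bucketed into the kind of low/medium/high category shown to a court.
defendant = np.array([[25, 3, 4]])
prob = model.predict_proba(defendant)[0, 1]
category = "high" if prob > 0.66 else "medium" if prob > 0.33 else "low"
print(f"estimated re-offense probability: {prob:.2f} ({category} risk)")
```

The point of the sketch is that any such model simply reflects the historical data it is trained on, which is exactly why the bias concerns raised in the article arise.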
