One of the primary goals of Los Angeles County’s child welfare system is keeping kids out of lock-up. But in this pursuit, the county took a surprising step: It used a predictive analytics tool as part of a program to identify which specific kids might end up behind bars.
The process wasn’t complicated: It involved administering and scoring a questionnaire about a child’s family, arrests, drug use, academic performance, and abuse history. But the goal was abundantly clear: separating the good kids from the potentially bad.
Though the program—officially dubbed the Los Angeles County Delinquency Prevention Pilot, or DPP—ended in 2014, a new report from the National Council on Crime and Delinquency examines how it functioned. The report not only suggests that L.A. County’s strategy was on the right track, but also that government agencies across the country should consider testing similar programs.
The report might be right, but the DPP raises some troubling issues.
When people talk about predictive analytics—whether in reference to policing, banking, gas drilling, or anything else—they’re often talking about identifying trends: using predictive tools to anticipate how groups of people or objects might behave in the future. But that’s changing.
In a growing number of places, prediction is getting more personal. In Chicago, for example, there’s the “heat list”—a Chicago Police Department project designed to identify the Chicagoans most likely to be involved in a shooting. In some state prison systems, analysts are working on projects designed to identify which particular prisoners will re-offend. In 2014, Rochester, New York, rolled out its version of L.A. County’s DPP program—with the distinction that it’s run by cops, and spearheaded by IBM—which offered the public just enough information to cause concern.
To continue reading this article by Matt Stroud, go to: http://www.psmag.com/politics-and-law/minority-report