
An algorithm that screens for child neglect raises concerns [apnews.com]

By Sally Ho and Garance Burke, AP News, April 29, 2022

Inside a cavernous stone fortress in downtown Pittsburgh, attorney Robin Frank defends parents at one of their lowest points – when they risk losing their children.

The job is never easy, but in the past she knew what she was up against when squaring off against child protective services in family court. Now, she worries she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated in the first place.

“A lot of people don’t know that it’s even being used,” Frank said. “Families should have the right to have all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

Copyright © 2023, PACEsConnection. All rights reserved.