Oregon Stops Child Neglect & Abuse Screening Algorithm That Disproportionately Flagged Black Families


Oregon will no longer use an algorithm to flag potential cases of child abuse after a report revealed the software could have a racial bias. Similar AI programs remain in use in 11 states with the goal of preventing neglect and abuse.

The Oregon Department of Human Services informed its hotline workers in an email in May that it would stop using the Safety at Screening Tool following concerns about disparities between the families the algorithm flagged.

The program will be fully retired by the end of June, and the department will move to a less automated review system.

“Making decisions about what should happen to children and families is far too important a task to give untested algorithms,” Democratic Senator Ron Wyden of Oregon said in a statement per Engadget. “I’m glad the Oregon Department of Human Services is taking the concerns I raised about racial bias seriously and is pausing the use of its screening tool.”

On April 29, the Associated Press published a comprehensive report about a similar software tool used in Pennsylvania.

The outlet “identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system.”

During the first year the program was in use in Allegheny County, it “showed a pattern of flagging a disproportionate number of Black children for a ‘mandatory’ neglect investigation, when compared with white children.”

The algorithm used information from public records to generate risk scores meant to indicate the probability of child abuse occurring in the home.

The county “initially considered including race as a variable in its predictions about a family’s relative risk but ultimately decided” against including the information in 2017.

An independent evaluation of the data found that “if the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported.”

Oregon’s Safety at Screening Tool was based on the system developed by the Allegheny County Department of Human Services and was introduced in 2018.

In a November 2019 report on the algorithm, the department stated that “the Safety at Screening Tool utilizes techniques from a field of computer science called machine learning.”

“The procedure involves using a computerized technique to discover how to associate Child Welfare administrative data elements with future outcomes of interest,” noted the DHS. “By linking data elements… regarding historical information to live information about an incoming report of abuse/neglect, it is possible to generate a prediction about whether the report will lead to a removal if the report is assigned to investigation, and/or whether screening out the report will lead to another future investigation.”

Data points included the number of children in the report and the total number of previous reports. Information about the family’s welfare status was also included in Pennsylvania’s algorithm.

“The tool comes to learn how to use administrative data elements to calculate the probability that a child will be removed from home and/or involved in a future investigation,” the department noted.
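For readers curious what that description translates to in practice, the sketch below shows one plausible shape for such a predictive risk model. It is a minimal illustration, not the actual Safety at Screening Tool: the feature set (children in the report, prior reports, public-assistance status), the made-up training rows, and the choice of logistic regression are all assumptions chosen to mirror the data elements described above.

```python
# Minimal sketch of a predictive risk-scoring model of the kind the DHS
# report describes. NOT the actual Safety at Screening Tool; the features,
# labels, and model choice are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical administrative data elements for past reports.
# Columns: number of children in the report, number of prior reports,
# whether the family received public assistance.
X_train = np.array([
    [1, 0, 0],
    [2, 3, 1],
    [3, 1, 0],
    [1, 5, 1],
    [4, 2, 1],
    [2, 0, 0],
])

# Outcome of interest: 1 if the report led to a removal and/or a future
# investigation, 0 otherwise (fabricated labels, for illustration only).
y_train = np.array([0, 1, 0, 1, 1, 0])

# "Learn how to associate administrative data elements with future
# outcomes of interest" -- here, by fitting a logistic regression.
model = LogisticRegression().fit(X_train, y_train)

# Score an incoming report: the predicted probability that it will lead
# to a removal and/or a future investigation.
incoming_report = np.array([[2, 4, 1]])
risk_score = model.predict_proba(incoming_report)[0, 1]
print(f"Predicted risk score: {risk_score:.2f}")
```

In a real deployment the training data would be years of linked administrative records rather than a handful of rows, and the resulting score would feed a screening threshold; the warning about “automation bias” below concerns how workers treat that final number.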

The report warned that using the algorithm could result in “automation bias,” leaving child welfare officials feeling pressured to rely on “predictive risk scores despite clear contradictory evidence.”

An Oregon DHS spokesman, Jake Sunderland, said the algorithm cannot be used in that state’s new screening process and was therefore “no longer necessary.”

While he did not give a specific explanation for why state officials decided to suspend the program, he said there was “no expectation that it will be unpaused soon.”

Oregon intends to switch to an assessment framework called the Structured Decision Making Model, which is currently in use in New Jersey, California, and Texas.

H/T Timcast
