An Algorithm May Be the Key to Reducing Child Abuse and Neglect

There is an interesting thing happening in Allegheny County. In 2016, the county's child welfare agency started using a predictive algorithm when screening incoming calls reporting child abuse or neglect. It is the first in the country to use the system as a back-up to human screeners – a second opinion, as the New York Times Magazine puts it – to determine whether or not a child is at high risk of being injured, abused or neglected in the future.

From the Times Magazine:

“Nationally, 42 percent of the four million allegations received in 2015, involving 7.2 million children, were screened out, often based on sound legal reasoning but also because of judgment calls, opinions, biases and beliefs. And yet more United States children died in 2015 as a result of abuse and neglect — 1,670, according to the federal Administration for Children and Families; or twice that many, according to leaders in the field — than died of cancer.”

Approximately 14,000 allegations are made in Allegheny County each year, and about 7,000 of them are screened out. The predictive algorithm helps the Department of Children, Youth and Families (CYF) determine which calls warrant closer inspection. The algorithm does not decide whether or not children are pulled from the home. Instead, it identifies the children at greatest risk; CYF then determines whether or not to pursue the allegations further.

How does the program work?

The algorithm is designed to find patterns. Child welfare agencies across the country (not just in Allegheny County) keep files on record for allegations of child abuse or neglect. It would take human screeners hours – maybe days – to comb through them all. Instead, the algorithm looks at all the data at once, “using well over 100 criteria maintained in eight databases for jails, psychiatric services, public-welfare benefits, drug and alcohol treatment centers and more.”

The exact process is proprietary, but ultimately, the program reviews the data and renders a risk assessment. It provides a number on a scale of one (safe) to 20 (high risk). Screeners report that number, and then CYF decides whether or not to investigate.
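Because the county's actual model is proprietary, here is only a rough, hypothetical Python sketch of what a scoring step like this might look like. The FamilyRecord fields, the weights and the risk_score function are all invented for illustration; the real tool reportedly draws on well over 100 criteria and is trained on historical outcomes.

```python
# Hypothetical sketch of a call-screening risk score. This is NOT the
# county's proprietary model; the features and weights below are invented
# purely to show the shape of the workflow described above.

from dataclasses import dataclass


@dataclass
class FamilyRecord:
    # Illustrative features, standing in for records pulled from jail,
    # psychiatric, public-welfare and treatment databases.
    prior_referrals: int
    jail_bookings: int
    psychiatric_episodes: int
    months_on_public_benefits: int


def risk_score(record: FamilyRecord) -> int:
    """Map a family's history to the 1 (safe) to 20 (high risk) scale."""
    # Toy weighted sum; a real model would be statistically trained.
    raw = (
        2.0 * record.prior_referrals
        + 1.5 * record.jail_bookings
        + 1.0 * record.psychiatric_episodes
        + 0.05 * record.months_on_public_benefits
    )
    # Clamp the result onto the 1-20 scale reported to screeners.
    return max(1, min(20, round(raw)))


if __name__ == "__main__":
    family = FamilyRecord(
        prior_referrals=3,
        jail_bookings=1,
        psychiatric_episodes=2,
        months_on_public_benefits=18,
    )
    print(f"Screening score: {risk_score(family)}/20")
    # The score is only a second opinion; a human screener reports it
    # and CYF still decides whether or not to investigate.
```

In the real system, as the Times describes it, that number is handed to a human screener rather than acted on automatically.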

Does the algorithm work?

Yes. Yes, it does.

Before the implementation of the program, about half of the cases Allegheny County investigated were considered low risk. This ate up valuable time that CYF could have spent working with children who were in more serious, and more immediate, danger. Over the last 18 months, the percentage of low-risk cases being investigated has dropped from half to a third.

But even more important, the number of high-risk cases being screened into the system has increased. The Times admits that the number only went up by a few percentage points – but it’s still progress.

There is one other benefit to using the program, too: it can help counteract racial bias.

“In 2015, black children accounted for 38 percent of all calls to Allegheny County’s maltreatment hotline, double the rate that would be expected based on their population. Their rate of being placed outside their home because of maltreatment was even more disproportionate: eight out of every 1,000 black children residing in the county were placed outside their home that year, compared with just 1.7 of every 1,000 white children.”

The algorithm is helping to reduce that bias. True, it may have more data available if more allegations are reported in regard to black families – but because race isn’t a factor in the program’s analysis, it has actually limited the effects of bias.

As more and more states get on board with the program, we can only hope Ohio and Kentucky will consider bringing it here. According to the Child Welfare League of America, Ohio had 172,445 child abuse and neglect referrals in 2015. Of those referrals:

  • 79,215 reports were referred for investigation
  • 23,006 children were victims of neglect (44%), physical abuse (44.3%) and/or sexual abuse (20.4%)
  • 74 children died

Kentucky had fewer referrals than Ohio – 101,094 – but a higher percentage of those cases (55,209) were investigated. CWLA found a “rate of 18.7 per 1,000 children” who were victims of neglect (92.2%), physical abuse (8.3%) and/or sexual abuse (4.8%). Sixteen of those children died.

The algorithm may not predict which children are in immediate danger, but by finding patterns, it can help reduce (and perhaps even eradicate) future abuse. And anything that helps protect kids seems like a good idea to us.

Crandall & Pera Law is a premier personal injury and medical malpractice law firm serving clients throughout Ohio and Kentucky. To learn more about our services, or to schedule a free consultation with a lawyer at one of our multiple locations, please fill out this contact form, or call us: 844-279-2889 in Ohio, and 844-279-2889 in Kentucky.
