(AP) — An artificial intelligence tool used by child welfare agencies, including in Colorado, may flag parents with disabilities, an Associated Press investigation found.

The child welfare tool is meant to predict which children may be at risk of harm and promises to lighten the workload for caseworkers. It has been used in at least Larimer and Douglas counties, with a new project underway in Arapahoe County.

By tracking such tools across the country, however, the AP found they can set families up for separation by rating their risk based on personal characteristics they cannot change or control, such as race or disability, rather than just their actions as parents.

Now, the U.S. Justice Department is investigating at least one use of the tool in Pennsylvania to determine whether the algorithm discriminates against people with disabilities or other protected groups.

Larimer County child welfare unsure of AI’s variables

The algorithm uses a number of factors to measure risk, including race, poverty rates, disability status and family size, according to the AP, which obtained the algorithm's underlying data points.
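In broad terms, the AP's reporting describes a predictive risk model that weighs such factors into a single score. Purely as an illustration, and not the developers' actual code, the sketch below shows a simple weighted-sum score in Python; every factor name, weight and the 0-20 scale is an assumption made for explanation only.

    # Hypothetical illustration of a weighted-sum risk score.
    # The real tool's variables and weights are not public in this reporting;
    # all factor names and numbers here are assumptions.

    FACTOR_WEIGHTS = {
        "prior_referrals": 0.30,
        "household_size": 0.10,
        "public_benefits_use": 0.25,     # a common proxy for poverty in such models
        "parent_disability_flag": 0.20,
        "child_age_under_3": 0.15,
    }

    def risk_score(family_record: dict) -> float:
        """Combine the factors above into a screening-style 0-20 score."""
        raw = sum(FACTOR_WEIGHTS[name] * float(family_record.get(name, 0))
                  for name in FACTOR_WEIGHTS)
        return round(raw * 20, 1)

    # Two otherwise identical families differ only on the disability flag,
    # yet receive different scores -- the concern raised in the AP's findings.
    base = {"prior_referrals": 1, "household_size": 4}
    print(risk_score(base))                                    # 14.0
    print(risk_score({**base, "parent_disability_flag": 1}))   # 18.0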

In Larimer County, one official acknowledged that she did not know what variables were used to assess local families.

“The variables and weights used by the Larimer Decision Aide Tool are part of the code developed by Auckland and thus we do not have this level of detail,” Jill Maasch, a Larimer County Human Services spokeswoman, said in an email to the AP, referring to the algorithm’s developers.

Colorado is among the states, including Pennsylvania and California, where counties have opened their data to the academic developers who build the algorithms.

Rhema Vaithianathan, a professor of health economics at New Zealand’s Auckland University of Technology, and Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work, said in an email that their work is transparent and that they make their computer models public.