We’ve written before about a computer algorithm used by the Arkansas Department of Human Services to calculate how much home care disabled people should receive from a Medicaid-funded program. It’s under challenge in court, so far with some success.

But here’s an in-depth report from The Verge on one person screwed by an assessment done by computer rather than a human being, along with a report on how the system came to be in the first place. Tammy Dobbs, who has cerebral palsy, saw her care cut from 56 to 32 hours a week.


In in-home care, the problem of allocating help is particularly acute. The United States is inadequately prepared to care for a population that’s living longer, and the situation has caused problems both for the people who need care and for the aides themselves, some of whom say they’re pressed into working unpaid hours. As needs increase, states have been prompted to look for new ways to contain costs and distribute what resources they have.

States have taken diverging routes to solve the problem, according to Vincent Mor, a Brown professor who studies health policy and is an InterRAI member. California, he says, has a sprawling, multilayered home care system, while some smaller states rely on personal assessments alone. Before using the algorithmic system, assessors in Arkansas had wide leeway to assign whatever hours they thought were necessary. In many states, “you meet eligibility requirements, a case manager or nurse or social worker will make an individualized plan for you,” Mor says.

Arkansas has said the previous, human-based system was ripe for favoritism and arbitrary decisions.

The human-based system also was ripe for providing services that people needed. More services cost more money. Gov. Asa Hutchinson’s administration, however, is all about “efficiency,” meaning less money spent. The Verge article details the personal impact.

Even Brant Fries, who heads a company that develops the algorithms, says states shouldn’t rely wholly on the computer’s findings.


He’s sympathetic to the people who had their hours cut in Arkansas. Whenever one of his systems is implemented, he says, he recommends that people under old programs be grandfathered in, or at least have their care adjusted gradually; the people in these programs are “not going to live that long, probably,” he says. He also suggests giving humans some room to adjust the results, and he acknowledges that moving rapidly from an “irrational” to a “rational” system, without properly explaining why, is painful. Arkansas officials, he says, didn’t listen to his advice. “What they did was, in my mind, really stupid,” he says. People who were used to a certain level of care were thrust into a new system, “and they screamed.”

The article describes the efforts of Kevin De Liban, a Legal Aid lawyer, to investigate the problem and eventually take it to court.

De Liban started keeping a list of what he thought of as “algorithmic absurdities.” One variable in the assessment was foot problems. When an assessor visited a certain person, they wrote that the person didn’t have any problems — because they were an amputee. Over time, De Liban says, they discovered wildly different scores when the same people were assessed, despite being in the same condition. (Fries says studies suggest this rarely happens.) De Liban also says negative changes, like a person contracting pneumonia, could counterintuitively lead them to receive fewer help hours because the flowchart-like algorithm would place them in a different category. (Fries denied this, saying the algorithm accounts for it.)
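The article doesn’t reproduce the assessment instrument itself, but the “flowchart-like” behavior De Liban describes is easy to illustrate. Below is a minimal, hypothetical sketch in Python; the branch rules, category names, and weekly hour caps are invented for illustration and are not taken from the actual interRAI tool:

```python
# Hypothetical sketch of a flowchart-style care classifier. None of these
# branch rules or hour caps come from the real (proprietary) instrument;
# they only show how adding a diagnosis can reroute someone into a
# category with FEWER allotted hours.
from dataclasses import dataclass, field


@dataclass
class Assessment:
    mobility_score: int    # higher = more impaired (invented scale)
    self_care_score: int   # higher = more impaired (invented scale)
    diagnoses: set = field(default_factory=set)


HOUR_CAPS = {"A": 56, "B": 32, "C": 20}  # invented weekly caps per category


def categorize(a: Assessment) -> str:
    # Branch 1: an acute respiratory diagnosis routes to a "clinical"
    # category before functional scores are ever considered...
    if "pneumonia" in a.diagnoses:
        return "B"
    # Branch 2: ...while severe functional impairment alone reaches the
    # highest-hours category.
    if a.mobility_score + a.self_care_score >= 8:
        return "A"
    return "C"


before = Assessment(mobility_score=5, self_care_score=4)
after = Assessment(mobility_score=5, self_care_score=4, diagnoses={"pneumonia"})

print(HOUR_CAPS[categorize(before)])  # 56: routed by functional scores
print(HOUR_CAPS[categorize(after)])   # 32: the new diagnosis reroutes the branch
```

In a flat scoring system, a new diagnosis could only add to a person’s need score; in a branching one, it can silently change which path the person takes through the tree. That is the counterintuitive effect De Liban says he observed.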

But from the state’s perspective, the most embarrassing moment in the dispute happened during questioning in court. Fries was called in to answer questions about the algorithm and patiently explained to De Liban how the system works. After some back-and-forth, De Liban offered a suggestion: “Would you be able to take somebody’s assessment report and then sort them into a category?” (He said later he wanted to understand what changes triggered the reduction from one year to the next.)

Fries said he could, although it would take a little time. He looked over the numbers for Ethel Jacobs. After a break, a lawyer for the state came back and sheepishly admitted to the court: there was a mistake. Somehow, the wrong calculation was being used. They said they would restore Jacobs’ hours.
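What caught the error was, in effect, a manual re-derivation: Fries applied the rules by hand to Jacobs’ assessment data and got a different answer than the deployed system. A systematic version of that cross-check might look like the hypothetical sketch below; the field names, categories, and rules are all invented, and the record shown is placeholder data, not Jacobs’ actual assessment:

```python
# Hypothetical audit: re-derive each category from raw assessment answers
# with a reference implementation of the scoring rules, then flag anyone
# whose recorded category differs. All names and rules here are invented.

def reference_category(answers: dict) -> str:
    """Stand-in for the scoring rules, applied by hand."""
    if answers.get("pneumonia"):
        return "B"
    if answers["mobility"] + answers["self_care"] >= 8:
        return "A"
    return "C"


records = [
    # (identifier, raw assessment answers, category the deployed system assigned)
    ("case-001", {"mobility": 5, "self_care": 4, "pneumonia": False}, "B"),
]

for case_id, answers, assigned in records:
    expected = reference_category(answers)
    if expected != assigned:
        print(f"{case_id}: system assigned {assigned}, rules say {expected}")
```

Run over every beneficiary’s file, a check like this would surface exactly the kind of “wrong calculation” the state conceded in court.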

It would later emerge that the algorithm didn’t account for diabetes issues for people with cerebral palsy. De Liban also learned about the state’s responses to errors.


But in internal emails seen by The Verge, Arkansas officials discussed the cerebral palsy coding error and the best course of action. On an email chain, the officials suggested that, since some of the people who had their hours reduced didn’t appeal the decision, they effectively waived their legal right to fight it. (“How is somebody supposed to appeal and determine there’s a problem with the software when DHS itself didn’t determine that?” De Liban says.) But after some discussion, one finally said, “We have now been effectively notified that there are individuals who did not receive the services that they actually needed, and compensating them for that shortcoming feels like the right thing to do.” It would also “place DHS on the right side of the story.”

The centerpiece of the story, Tammy Dobbs, remains in limbo, uncertain whether the changes will eventually force her into an institution.