Algorithm used by thousands of US hospitals biased against blacks
Washington (AFP) – Computer algorithms are used across thousands of hospitals in the United States to identify patients most at risk in order to provide them with extensive follow-ups.

But these programs exhibit significant racial bias in favor of whites and against blacks, according to a new study published in Science on Thursday.

Ziad Obermeyer, of the University of California, Berkeley, told AFP he stumbled onto the finding almost by chance, while analyzing data provided by a major university hospital.

The algorithm had calculated “risk scores” to identify the three percent of patients most at risk (with diabetes, congestive heart failure, emphysema etc.), who could then call a dedicated hotline, get appointments the same day or schedule home visits.

“If you looked at two people, one black, one white, with the exact same risk score, the black patient was much more likely to go on to have worse health problems over the coming year than the white patient,” Obermeyer told AFP.

The algorithm isn’t programmed to consider race. It works by analyzing health care costs generated by a patient in the past. 

“That’s the problem, because black patients, on average, generate fewer costs than white patients with the same level of health,” said Obermeyer.

That’s the result of deep-rooted inequalities that mean blacks have fewer health visits, and when they do, doctors prescribe on average less medication and order fewer tests.
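The proxy problem described above can be illustrated with a small hypothetical simulation. Everything in it is assumed for illustration only: the 30 percent spending gap, the cost formula, and the score band are made-up numbers, not figures from the study or the vendor's actual model. The point it demonstrates is the one Obermeyer describes: when the score is built from past cost, two patients with the same score can have very different underlying health needs.

```python
import random

random.seed(0)

# Hypothetical simulation (NOT the vendor's actual model): a "risk score"
# built from past spending, where one group generates less cost at the
# same underlying level of health need.

def simulate_patient(group):
    need = random.uniform(0, 10)               # true (unobserved) health need
    access = 1.0 if group == "white" else 0.7  # assumed ~30% lower spending
    cost = need * access * 1000 + random.gauss(0, 300)
    return {"group": group, "need": need, "cost": cost}

patients = [simulate_patient(g) for g in ("white", "black") for _ in range(5000)]

# Compare true health need among patients given the SAME cost-based score
# (here: past cost falling in the same band).
band = [p for p in patients if 4000 <= p["cost"] < 5000]
for g in ("white", "black"):
    sub = [p["need"] for p in band if p["group"] == g]
    print(f"{g}: average true need at equal score = {sum(sub)/len(sub):.2f}")
```

At an equal cost score, the simulated black patients are sicker on average, because the same level of need produced less spending, which is the pattern the study reports.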

“There’s an illusion that we are working with, you know, biological variables or variables that describe physiology, in an objective way,” said Obermeyer. 

“But that’s not what these data sets are, these data sets come from financial transactions between the hospital and insurance, they come from doctors writing things down or not writing things down.”

The company that markets the software has adopted a change suggested by the researchers, one expected to reduce the racial imbalance by more than 80 percent.

But as Obermeyer points out, tweaking the code is only the first step: we need better data about patients’ actual health status.
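The article does not detail what the suggested change was, but one way such a fix could work, sketched here purely as an assumption, is to train the score against a direct health measure (for instance, a count of active chronic conditions) instead of past cost. All numbers below are made up:

```python
import random

random.seed(1)

# Sketch of a relabeling fix (an assumed approach, not the vendor's actual
# change): flag the top 3% by a direct health label rather than by cost.

def patient(group):
    conditions = random.randint(0, 8)          # direct health label
    access = 1.0 if group == "white" else 0.7  # assumed spending gap
    cost = conditions * access * 1000 + random.gauss(0, 200)
    return {"group": group, "conditions": conditions, "cost": cost}

pop = [patient(g) for g in ("white", "black") for _ in range(5000)]

def top3pct(pop, key):
    # Flag everyone at or above the 97th-percentile value of `key`.
    cut = sorted(p[key] for p in pop)[int(0.97 * len(pop))]
    return [p for p in pop if p[key] >= cut]

for label in ("cost", "conditions"):
    flagged = top3pct(pop, label)
    share = sum(p["group"] == "black" for p in flagged) / len(flagged)
    print(f"label={label}: share of flagged patients who are black = {share:.2f}")
```

In this toy setup, ranking by cost flags almost no black patients, while ranking by the health label flags both groups roughly in proportion, which matches the direction of the fix the researchers describe.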

“It’s very strange that our major source of data about medicine comes from financial transactions,” he concluded. “I think the (US) health care system does not take seriously the problem of acquiring and maintaining good health information.”
