A recent blockbuster study found that software used in healthcare settings systematically provides worse care for black patients than white patients, and two senators want to know what both the industry and regulators are going to do to fix the situation.
Senators Cory Booker (D-N.J.) and Ron Wyden (D-Ore.) on Tuesday issued letters to the Federal Trade Commission, the Centers for Medicare and Medicaid Services (CMS), and the five largest US health insurers asking about bias in the algorithms used to make healthcare decisions.
"In using algorithms, organizations often attempt to remove human flaws and biases from the process," Booker and Wyden wrote. "Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases."
The senators have slightly different questions for each group. From the FTC, Wyden and Booker want to know if there is any regulation or policy being considered to address algorithmic bias in products and services, or at least if the commission has "any ongoing studies or investigations into damages done to consumer welfare by discriminatory and biased algorithms."
The letter to the CMS probes the use of potentially biased algorithms in federal healthcare, including flat-out asking: "What is the agency doing to ensure that algorithmic bias is taken into account?" The senators also ask what other federal agencies and external stakeholders are involved in making those choices and what data the CMS uses to make its decisions.
The last set of letters, basically identical, went to Aetna (CVS), Blue Cross Blue Shield, Cigna, Humana, and UnitedHealth, which among them cover more than 300 million Americans. Those are a bit more pointed, not asking whether algorithms are in use but instead how many, what specific decisions they're used to make, and how many people they impact. When insurers put them in place, the senators ask, do they consult with anyone who might be affected by bias, or with outside sources who can vet the algorithms?