Whether any of these strategies will correct pulse oximeter bias remains to be seen. But it's likely that by the time improved devices make it through regulatory approval, the bar for performance will be higher. At a meeting last week, committee members considered a proposal that would require companies to test devices on at least 24 people whose skin tones span the entirety of a 10-shade scale. The current requirement is that a trial include 10 people, two of whom have "dark pigmented" skin.
Meanwhile, health workers are wondering how to use existing tools and whether they should trust them. At the advisory committee meeting Friday, a committee member asked a representative from Medtronic, one of the largest suppliers of pulse oximeters, whether the company had considered a voluntary recall of its devices. “We are 100% confident that our devices meet current FDA standards,” said Sam Ajizian, chief medical officer of patient monitoring at Medtronic. A recall “would harm public safety because it is a fundamental device in operating rooms, intensive care units, emergency rooms, ambulances and everywhere.”
But not everyone agrees that the pros outweigh the cons. Last fall, a community health center in Oakland, California, filed a complaint against some of the largest manufacturers and sellers of pulse oximeters, asking the court to ban the sale of these devices in California until the readings are proven accurate for dark-skinned people, or until devices carry a warning label.
"The pulse oximeter is an example of the tragic harm that occurs when the nation's health care industry and the regulatory agencies that oversee it prioritize the health of white people over the realities of non-white patients," said Noha Aboelata, CEO of Roots Community Health Center, in a statement. "The history of the manufacturing, marketing, and use of racist pulse oximeters constitutes an indictment of our health care system."
More from MIT Technology Review's archives
Melissa Heikkilä's reporting showed her how AI-generated humans are "pale, male, and stale." Can we simply ask the technology to do better?
It's no surprise that technology perpetuates racism, Charlton McIlwain wrote in 2020. That's how it was designed. "The question we must face is whether we will continue to design and deploy tools that serve the interests of racism and white supremacy."
We've seen that deep learning models can perform as well as medical professionals when it comes to imaging tasks, but they can also perpetuate bias. Some researchers say the way to solve the problem is to stop training the algorithms to match experts, Karen Hao reported in 2021.
From around the web
The high levels of lead found in applesauce packets came from a single cinnamon processing plant in Ecuador. (NBC)