Would it surprise you if I told you that a popular and well-respected machine learning algorithm developed to predict the onset of sepsis has shown some evidence of racial bias?[1] How can that be, you might ask, for an algorithm that is simply grounded in biology and medical data? I’ll tell you, but I’m not going to focus on one particular algorithm. Instead, I will use this opportunity to talk about the dozens and dozens of sepsis algorithms out there. And frankly, because the design of these algorithms mimics many other clinical algorithms, these comments will be applicable to clinical algorithms generally.