A procurement contract template with AI riders, algorithms that are secure and free from bias, and medical devices that can be queried about their functionality and safety would all improve the quality of health care, Aspen Institute fellows say.
A toolkit that can reduce algorithmic bias in artificial intelligence tools for the health industry would mean better care for everyone, a data scientist said during an online demonstration of the solution.
“The unique risks in algorithmic bias come from [the] way that it allows the systematic and repeatable automation of biases to impact people on a scale that was previously impossible,” said Matthew Zhou, a tech policy fellow at the Aspen Institute, which hosted the “Improving Healthcare and Health Data Privacy” event Feb. 1. “Assumptions are baked into the technology by the designers and these assumptions should be tested to ensure they aren’t automating harm.”
Called Diagnosing Bias, the toolkit contains resources to help government health care procurement officers incorporate best practices for algorithmic accountability, said Zhou, who is also a senior data engineer at Peloton, a fitness company. The toolkit’s two main elements are a procurement template generator tool with AI contract riders and the AI Model Checklist.
The template generator provides procurement officers with a ready-made template for writing health care AI contracts that includes clauses addressing transparency, bias mitigation, security and privacy. This is not unlike request-for-proposal templates common for many other purchases, such as property, Zhou said.
The goal is to make these templates open source and freely available to procurement officers, he added.
The second tool is the checklist, which provides a set of guiding questions and transparency artifacts that procurement officers can solicit from health care AI companies at each stage of the AI design process.
Health care AI has the potential to vastly improve medicine – once bias is minimized. Zhou cited a 2019 article in the journal Science that found health care algorithms overlooked 28% of Black patients compared with white patients who had the same disease.
“That happens because the algorithm that recommended patients in this case assumed that the financial spend of a patient equated to the severity of their illnesses,” Zhou said. “Basically, the more you spent on health care in the past, the sicker the algorithm thought you were.”
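The proxy-label failure Zhou describes can be sketched in a few lines. This is a hypothetical illustration, not the actual model from the Science study: if past spending stands in for illness severity, any group that spends less on care for the same illness is scored as healthier.

```python
# Hypothetical illustration of proxy-label bias: using past health care
# spending as a stand-in for illness severity. Patients who are equally
# sick but have lower historical spending receive lower "risk" scores.

def risk_score_from_spend(past_spend_usd: float, max_spend_usd: float = 50_000) -> float:
    """Score from 0 to 1 based only on historical spending (the flawed proxy)."""
    return min(past_spend_usd / max_spend_usd, 1.0)

# Two patients with the same true severity of illness:
patient_a = {"true_severity": 0.8, "past_spend_usd": 40_000}  # greater access to care
patient_b = {"true_severity": 0.8, "past_spend_usd": 15_000}  # less access to care

score_a = risk_score_from_spend(patient_a["past_spend_usd"])  # 0.8
score_b = risk_score_from_spend(patient_b["past_spend_usd"])  # 0.3

# Equally sick patients, but the proxy ranks patient B as far less in need of care.
assert score_a > score_b
```

Because the proxy correlates with access to care rather than with illness alone, the bias is systematic: every patient in the lower-spending group is deprioritized the same way, at scale.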
The National Institute for Health Care Management Foundation offers examples: The American Heart Association’s Heart Failure Risk Score assigns three additional points to patients identified as “nonblack,” which may raise the bar for hospital admission for Black patients. The STONE score, which predicts the likelihood of kidney stones in patients who arrive at the emergency room with flank pain, adds three points for “nonblack” patients, leading clinicians away from diagnosing the condition in Black patients.
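Both examples follow the same pattern: a clinical point score with an additive race adjustment. A minimal sketch (the weights and threshold below are hypothetical, not the published Heart Failure Risk Score or STONE tables) shows how three extra points assigned only to “nonblack” patients can push those patients, and only those patients, over a decision threshold.

```python
# Hypothetical sketch of an additive race adjustment in a clinical score.
# The point values and cutoff are illustrative; they are NOT the published
# Heart Failure Risk Score or STONE scoring tables.

def adjusted_score(base_points: int, nonblack: bool) -> int:
    """Clinical points plus a +3 adjustment for patients coded 'nonblack'."""
    return base_points + (3 if nonblack else 0)

THRESHOLD = 10  # hypothetical cutoff above which the condition is suspected

# Two patients with identical clinical findings (8 base points each):
suspected_nonblack = adjusted_score(8, nonblack=True) >= THRESHOLD   # True
suspected_black = adjusted_score(8, nonblack=False) >= THRESHOLD     # False

# Identical presentations, different conclusions, driven only by the adjustment.
assert suspected_nonblack and not suspected_black
```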
“I do believe that transformation for the health care system will progress at the pace of trust,” Aneesh Chopra, cofounder and president of CareJourney and former U.S. chief technology officer, said during the demonstration day.
Zhou is one of several Aspen Institute fellows looking at digital health care innovation. Another is Daniel Bardenstein, technology strategy lead at the Cybersecurity and Infrastructure Security Agency and former cybersecurity lead for Operation Warp Speed for COVID vaccine discovery and production. His project focuses on securing digital health care devices.
Bardenstein proposes a Device Query Interface, which manufacturers could build into their equipment. It would allow hospitals to ask the devices about their cybersecurity status, with the improved visibility into device function and security translating into better patient health and safety.
“I like to think of it like a digital concierge in a hotel,” Bardenstein said. “If you needed to find out where your friends and family were staying in a hotel, instead of having to knock on every single door, there’s a concierge right up front, very easy where you can ask a question and get back very quickly an answer.”
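There is no public specification for the Device Query Interface yet; the sketch below is an assumption about what the “concierge” pattern might look like in code, with invented field names and methods. The point is that a hospital queries one interface about device security posture instead of inspecting every device individually.

```python
# Hypothetical sketch of a Device Query Interface (DQI). The API shape,
# class names and fields are assumptions -- no public DQI spec exists.

from dataclasses import dataclass, field

@dataclass
class DeviceSecurityStatus:
    device_id: str
    firmware_version: str
    patch_current: bool              # is the latest security patch applied?
    open_ports: list = field(default_factory=list)

class DeviceQueryInterface:
    """The 'concierge': answers security questions for a fleet of devices."""

    def __init__(self):
        self._devices = {}

    def register(self, status: DeviceSecurityStatus) -> None:
        self._devices[status.device_id] = status

    def query(self, device_id: str) -> DeviceSecurityStatus:
        """Ask one device about its cybersecurity status."""
        return self._devices[device_id]

    def unpatched(self) -> list:
        """Fleet-wide view: which devices are missing security patches?"""
        return [d.device_id for d in self._devices.values() if not d.patch_current]

# A hospital asks the concierge instead of "knocking on every door":
dqi = DeviceQueryInterface()
dqi.register(DeviceSecurityStatus("pump-01", "2.4.1", patch_current=True))
dqi.register(DeviceSecurityStatus("monitor-07", "1.0.9", patch_current=False, open_ports=[23]))

print(dqi.unpatched())  # ['monitor-07']
```

In this sketch, the visibility Bardenstein describes comes from devices reporting structured, machine-readable status rather than requiring manual audits of each unit.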
He also urged the Food and Drug Administration to establish a cyber baseline of mandatory protections for all medical devices. Currently, there are about 15 million connected medical devices – about 20,000 per hospital – in the United States, he said.
“While beneficial, a surprising amount of these medical devices – as much as 50% of these devices – are trivially easy to hack,” Bardenstein said. “We’re really talking about securing patients and saving lives.”
Stephanie Kanowitz is a freelance writer based in northern Virginia.