FDA says AI tools to warn of sepsis should be regulated as devices

The Food and Drug Administration on Tuesday released a list of artificial intelligence tools that should be regulated as medical devices, in some cases appearing to extend its oversight to previously unregulated software products.

In new final guidance for industry, the agency clarified that tools designed to warn caregivers of sepsis, a life-threatening complication of infection, should be subject to regulatory review. Health software vendors have been selling tools designed to flag the condition for years without obtaining FDA clearance.

Sepsis, which kills more than 200,000 people in the United States each year, is particularly difficult to detect. Several companies have developed AI tools to predict which patients are most likely to develop the disease, with the aim of helping hospitals speed up the delivery of antibiotics and save more lives.

But the tools don’t always work as advertised. STAT has published several investigations detailing the shortcomings of a widely used tool developed by Epic Systems, the nation’s largest provider of electronic health records. The investigations found that the tool frequently delivered false alarms and failed to detect the condition in advance, distracting caregivers working in emergency situations. Epic’s sepsis alert system is used by more than 180 customers in the United States and Canada.

The FDA has traditionally avoided regulating software tools integrated with electronic health records, an area considered outside the scope of regulation because the software was used primarily as a record-keeping system that posed minimal risks to patients.

But the growing sophistication of the products embedded in EHRs, and the expanding role they play in advising providers on the treatment of serious and life-threatening conditions, have prompted calls for the FDA to take a closer look at these products.

“EHR vendors need to have control over how they build these algorithms and how they check for bias,” said Harvard University biostatistician Leo Celi, who recently published a paper calling for enhanced regulation.

Celi added that even the new guidelines, which list 34 distinct product types that the FDA says should be regulated, do not create clarity, given the fine-line distinctions between product categories and the room for interpretation in the FDA’s language.

“There needs to be more public discourse and discussion among all stakeholders,” he said. “They offer that (guidance), and the line between software as a medical device and a non-device is not very clear.”

The new guidelines aren’t binding and don’t necessarily mean the FDA will soon begin regulating sepsis tools and other products flagged as devices in the document. They aim to clarify regulatory boundaries outlined in the federal 21st Century Cures Act of 2016, which carved out certain technology products from FDA review.

But the exclusions rely on definitions that are difficult to parse and may apply unevenly to a new generation of AI products flooding the market. In addition to sepsis-related products, the guidelines also state that the FDA believes it should review products that predict heart failure hospitalizations, as well as those designed to identify signs of patient deterioration or to review medical information to identify patients who may be opioid dependent.
