MONDAY, Oct. 2, 2023 (HealthDay News) — A chest radiography foundation model demonstrates racial and sex-related bias, which may lead to disparate performance across patient groups, according to a study published online Sept. 27 in Radiology: Artificial Intelligence.
Ben Glocker, Ph.D., from Imperial College London, and colleagues conducted a retrospective study using 127,118 chest radiographs from 42,884 patients in the CheXpert dataset, obtained between October 2002 and July 2017. The researchers assessed whether the features generated by a chest radiography foundation model and by a baseline deep learning model contained bias, and then performed a disease detection analysis to associate any biases with specific disparities in classification performance across patient subgroups.
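To illustrate the kind of analysis described above, the following Python sketch runs pairwise two-sample tests on a low-dimensional projection of model features across patient subgroups. It is not the authors' code; the function, its inputs, and the choice of a PCA projection with a Kolmogorov-Smirnov test are assumptions made for illustration only.

# Illustrative sketch only: pairwise subgroup comparisons on projected features.
from itertools import combinations
import numpy as np
from scipy.stats import ks_2samp
from sklearn.decomposition import PCA

def pairwise_feature_tests(features, subgroups, alpha=0.05):
    # features: hypothetical (n_patients, n_dims) array of model embeddings
    # subgroups: hypothetical per-patient label (e.g., a sex or race group)
    projection = PCA(n_components=1).fit_transform(features)[:, 0]
    subgroups = np.asarray(subgroups)
    results = {}
    for a, b in combinations(sorted(set(subgroups)), 2):
        stat, p = ks_2samp(projection[subgroups == a], projection[subgroups == b])
        results[(a, b)] = {"statistic": stat, "p_value": p, "significant": p < alpha}
    return results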
The researchers found that in the studied foundation model, 10 of 12 pairwise comparisons across biologic sex and race showed significant differences, compared with four significant tests for the baseline model. In the feature projections that primarily capture disease, significant differences were found between male and female patients and between Asian and Black patients. Relative to the average model performance across all subgroups, classification performance on the “no finding” label decreased between 6.8 and 7.8 percent for female patients, and performance in detecting pleural effusion decreased between 10.7 and 11.6 percent for Black patients.
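As a rough illustration of how such subgroup disparities can be quantified (again, not the study's code; the names and the use of AUC as the performance metric are assumptions), one can compute per-subgroup performance for a single label and its gap from the subgroup-averaged performance:

# Illustrative sketch only: per-subgroup AUC and gap from the subgroup average.
import numpy as np
from sklearn.metrics import roc_auc_score

def subgroup_performance_gaps(y_true, y_score, subgroups):
    # y_true, y_score: hypothetical labels and model scores for one finding
    # (e.g., "no finding" or pleural effusion); subgroups: per-patient group
    y_true, y_score, subgroups = map(np.asarray, (y_true, y_score, subgroups))
    aucs = {g: roc_auc_score(y_true[subgroups == g], y_score[subgroups == g])
            for g in sorted(set(subgroups))}
    mean_auc = float(np.mean(list(aucs.values())))
    return {g: {"auc": auc, "gap_from_mean": auc - mean_auc} for g, auc in aucs.items()}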
“Our bias analysis showed that the foundation model consistently underperformed compared to the reference model,” Glocker said in a statement. “We observed a decline in disease classification performance and specific disparities in protected subgroups.”
Several authors disclosed ties to industry.
Copyright © 2023 HealthDay. All rights reserved.