(HealthDay News) — An artificial intelligence (AI) tool can exclude pathology on chest radiographs with a rate of critical misses equal to or lower than that of radiologists, according to a study published online Aug. 20 in Radiology.

Louis Lind Plesner, MD, from Herlev and Gentofte Hospital in Denmark, and colleagues estimated the proportion of unremarkable chest radiographs for which AI could correctly exclude pathology without increasing diagnostic errors. Consecutive chest radiographs from 1,961 adults, obtained from Jan. 1 to 12, 2020, at four Danish hospitals, were included. Radiographs were labeled remarkable or unremarkable based on predefined findings, and radiology reports were classified the same way. A commercial AI tool was adapted to output the probability that a chest radiograph was remarkable, and this probability was used to calculate specificity at different AI sensitivities.
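The study's analysis hinges on choosing a probability threshold that guarantees a target sensitivity and then reading off the resulting specificity. A minimal sketch of that calculation, using synthetic labels and probabilities (not the study's data, and not the commercial tool's actual output), might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: labels (1 = remarkable, 0 = unremarkable) and
# model-estimated probabilities of remarkableness. These are illustrative
# only -- they do NOT reproduce the study's radiographs or AI tool.
labels = rng.integers(0, 2, size=2000)
probs = np.where(labels == 1,
                 rng.uniform(0.2, 1.0, size=2000),   # remarkable cases
                 rng.uniform(0.0, 0.7, size=2000))   # unremarkable cases

def specificity_at_sensitivity(labels, probs, target_sens):
    """Find the highest threshold whose sensitivity is >= target_sens,
    then return the specificity at that threshold."""
    pos = np.sort(probs[labels == 1])
    # Allow at most a (1 - target_sens) fraction of remarkable cases to
    # fall below the threshold; cases at or above it are flagged.
    k = int(np.floor(len(pos) * (1.0 - target_sens)))
    thr = pos[k]
    neg = probs[labels == 0]
    # Specificity: fraction of unremarkable cases correctly left unflagged.
    return float(np.mean(neg < thr))
```

For example, `specificity_at_sensitivity(labels, probs, 0.99)` gives the fraction of unremarkable radiographs the model could exclude while still catching at least 99% of remarkable ones; lowering the target sensitivity raises the achievable specificity, which is the trade-off the study's 99.9%/99.0%/98.0% operating points illustrate.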

The researchers found that based on the reference standard, 62.8% of chest radiographs were labeled remarkable and 37.2% unremarkable. The AI had specificities of 24.5%, 47.1%, and 52.7% at sensitivities of 99.9%, 99.0%, and 98.0%, respectively. With AI sensitivity fixed to match that of the radiology reports (87.2%), 2.2% of AI-missed findings versus 1.1% of report-missed findings were classified as critical, 4.1% versus 3.6% as clinically significant, and 6.5% versus 8.1% as clinically insignificant. The AI tool exhibited ≤1.1% critical misses at sensitivities ≥95.4%.

“These results should be evaluated in a prospective study due to the high potential for mitigating workload challenges in radiology departments,” the authors write.

Several authors disclosed ties to the pharmaceutical and medical device industries.

Abstract/Full Text

Editorial (subscription or payment may be required)