A systematic review of the diagnostic accuracy of artificial intelligence-based computer programs to analyze chest x-rays for pulmonary tuberculosis
| Published in | PloS one Vol. 14; no. 9; p. e0221339 |
|---|---|
| Main Authors | , , , , , , , , |
| Format | Journal Article |
| Language | English |
| Published | United States: Public Library of Science (PLoS), 03.09.2019 |
| Subjects | |
| Online Access | Get full text |
| ISSN | 1932-6203 |
| DOI | 10.1371/journal.pone.0221339 |
| Summary: | We undertook a systematic review of the diagnostic accuracy of artificial intelligence-based software for identification of radiologic abnormalities (computer-aided detection, or CAD) compatible with pulmonary tuberculosis on chest x-rays (CXRs). We searched four databases for articles published between January 2005 and February 2019. We summarized data on CAD type, study design, and diagnostic accuracy, and assessed risk of bias with QUADAS-2. We included 53 of the 4712 articles reviewed: 40 focused on CAD design methods ("Development" studies) and 13 focused on evaluation of CAD ("Clinical" studies). Meta-analyses were not performed because of methodological differences across studies. Development studies were more likely than Clinical studies to use CXR databases with greater potential for bias. Median areas under the receiver operating characteristic curve (AUC [IQR]) were significantly higher in Development studies (0.88 [0.82-0.90]) than in Clinical studies (0.75 [0.66-0.87]; p = 0.004), and with deep learning (0.91 [0.88-0.99]) than with machine learning (0.82 [0.75-0.89]; p = 0.001). We conclude that CAD programs are promising, but most work to date has addressed development rather than clinical evaluation. We provide concrete suggestions on which study design elements should be improved. |
|---|---|
| Bibliography: | Competing Interests: The authors have declared that no competing interests exist. |
| ISSN: | 1932-6203 |
| DOI: | 10.1371/journal.pone.0221339 |
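For readers unfamiliar with the summary statistics reported in the abstract, the sketch below shows one common way to compute a median AUC with its interquartile range and to compare two groups of study-level AUCs with a non-parametric test. All AUC values in the code are hypothetical placeholders, not figures taken from the review, and the review does not state which statistical test produced its p-values, so the Mann-Whitney U test here is an assumption.

```python
# Minimal sketch: median [IQR] of study-level AUCs and a non-parametric
# group comparison. All numbers below are hypothetical placeholders.
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical AUCs for two groups of studies (not data from the review)
development_aucs = np.array([0.82, 0.85, 0.88, 0.90, 0.92])
clinical_aucs = np.array([0.66, 0.72, 0.75, 0.83, 0.87])

def median_iqr(values):
    """Return (median, 25th percentile, 75th percentile) of an array."""
    q1, med, q3 = np.percentile(values, [25, 50, 75])
    return med, q1, q3

for name, aucs in [("Development", development_aucs), ("Clinical", clinical_aucs)]:
    med, q1, q3 = median_iqr(aucs)
    print(f"{name}: median AUC {med:.2f} [IQR {q1:.2f}-{q3:.2f}]")

# Non-parametric comparison of the two AUC distributions (assumed test)
stat, p = mannwhitneyu(development_aucs, clinical_aucs, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```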