
The impact of artificial intelligence on the reading times of radiologists for chest radiographs – npj Digital Medicine


The Institutional Review Board (IRB) of Yongin Severance Hospital approved this prospective study (IRB number 9-2021-0106), and all participating radiologists, who autonomously agreed to take part, provided written informed consent. Attending radiologists who agreed to have the reading times of their daily CXR interpretations collected from September to December 2021 were recruited prospectively in August 2021 (Fig. 1). Radiologists who wished to participate were eligible regardless of their experience in radiology, provided they were board-certified, employed at the hospital during the study period, and agreed to the study terms. Two authors of this study were excluded from participation to minimize bias. In our hospital, radiographs, including CXRs, are read by all radiologists regardless of subspecialty, with a recommended minimum of 500 radiographs per month. Radiologists were therefore requested to read CXRs just as they would in their routine daily practice, with a minimum requirement of 300 CXRs per month during the study period. They read CXRs independently, freely referring to electronic medical records or available previous images, while remaining blinded to their reading times.

AI application to CXR

In our hospital, commercially available AI-based lesion-detection software (Lunit Insight CXR, version 3, Lunit, Korea) has been integrated into all CXRs since March 2020. Doctors can refer to the AI results by simply scrolling down on the picture archiving and communication system (PACS), because the analyzed results are attached as a second image to the original CXR as soon as patients undergo examination. The software detects a total of eight lesions (atelectasis, cardiomegaly, consolidation, fibrosis, nodule, pleural effusion, pneumoperitoneum, and pneumothorax) and displays a contour map for lesion localization when the abnormality score exceeds the 15% operating point (Fig. 3). For detected lesions, abbreviations and abnormality scores are displayed separately on the PACS. The abnormality score represents the AI-determined probability that the lesion is present on the CXR and ranges from 0 to 100%. Among the abnormality scores of detected lesions, the highest score is used as the total abnormality score, which is listed as a separate column on the PACS. Doctors can therefore refer to the AI results whenever they wish, and radiologists can prioritize CXRs using the total abnormality score column during their reading sessions if they want. A more detailed explanation of the integration of AI into all CXRs is given in recent studies20,27. Thus, the participating radiologists had used the AI software for more than one year by the study period.
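The derivation of the total abnormality score described above can be sketched as follows. This is an illustrative Python sketch, not the vendor's implementation: the function and field names are hypothetical, and whether the 15% operating point is applied inclusively is an assumption.

```python
# Hypothetical sketch of the total-abnormality-score logic described in the text.
# Names and the inclusive >= comparison are illustrative assumptions.
LESIONS = ["atelectasis", "cardiomegaly", "consolidation", "fibrosis",
           "nodule", "pleural_effusion", "pneumoperitoneum", "pneumothorax"]
OPERATING_POINT = 15.0  # percent; flagged lesions get a contour map on the image

def summarize_ai_result(scores: dict) -> dict:
    """Return the total abnormality score (the per-lesion maximum) and the
    lesions flagged for display at the operating point."""
    total = max(scores[lesion] for lesion in LESIONS)
    flagged = [lesion for lesion in LESIONS if scores[lesion] >= OPERATING_POINT]
    return {"total_abnormality_score": total, "flagged_lesions": flagged}

# Example: a CXR where only the nodule and pleural effusion scores exceed 15%.
example = {lesion: 3.0 for lesion in LESIONS}
example["nodule"] = 42.5
example["pleural_effusion"] = 18.0
print(summarize_ai_result(example))
```

The single maximum score is what would populate the worklist column, letting radiologists sort CXRs by likely abnormality.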

Fig. 3: Integration of AI for CXRs on PACS.

a The AI result, attached as the second image to the original CXR, contains a contour map, abbreviations, and the abnormality scores of detected lesions. Doctors can refer to the AI results by simply scrolling down from the original image on the PACS. b The highest abnormality score is used as the total abnormality score of each CXR and is listed as a separate column (red square) on the PACS.

Reading time measurement in AI-unaided and AI-aided periods

Reading time was defined as the duration in seconds from a radiologist opening a CXR to transcribing the report for that image on the PACS. The reading time of each CXR was extracted from the PACS log record. For the participating radiologists, we preset the PACS to hide the AI results during September and November 2021 (AI-unaided period) and to show them automatically during October and December 2021 (AI-aided period) (Fig. 1). During the AI-unaided period, the AI results, including the secondary capture image attached to the original CXR and the abnormality score column on the worklist, were not shown on the PACS, and the participating radiologists were blinded to them. During the AI-aided period, the results were available and could be freely utilized by the radiologists. Only CXRs of patients more than 18 years old were included for analysis, because the software is approved for adult CXRs. We excluded reading-time outliers longer than 51 s based on the outlier detection method described under Statistical analysis. Such outliers could arise from various conditions, for example delayed interpretation of a CXR after opening it because of an unexpected interruption by other work12.
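The reading-time definition and the outlier exclusion above can be sketched in a few lines. This is a minimal illustration, assuming ISO 8601 timestamps; the actual PACS log format and any helper names here are hypothetical.

```python
# Illustrative only: reading time from PACS log timestamps, plus the 51 s
# outlier cutoff described in the text. The log format is an assumption.
from datetime import datetime

OUTLIER_CUTOFF_S = 51  # upper fence from the 1.5 x IQR rule (Statistical analysis)

def reading_time_seconds(opened_at: str, transcribed_at: str) -> float:
    """Seconds from opening a CXR to transcribing it (ISO 8601 timestamps)."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(transcribed_at, fmt) - datetime.strptime(opened_at, fmt)
    return delta.total_seconds()

def keep_for_analysis(seconds: float) -> bool:
    """Keep a reading time only if it does not exceed the outlier cutoff."""
    return seconds <= OUTLIER_CUTOFF_S

t = reading_time_seconds("2021-09-01T09:00:00", "2021-09-01T09:00:18")
print(t, keep_for_analysis(t))  # 18.0 True
```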

For the included CXRs, patient age, sex, and whether the CXR was taken in an inpatient or outpatient setting were reviewed using electronic medical records. The location of patients at the time of the CXR, including the emergency room, general ward, and intensive care unit, was also reviewed. The presence of a previous comparable CXR was analyzed as a possible factor affecting reading times. For the AI results, the abnormality score was analyzed both as a continuous variable, using the score itself, and as a categorical variable, by applying a cutoff value of 15%. This cutoff was chosen because our hospital has employed an operating point of 15% for determining the presence of lesions, in accordance with the vendor’s guidelines12. When the abnormality score was above the 15% operating point, the AI software marked the lesion location with a contour map, abnormality score, and abbreviation for each lesion on the image20. Therefore, the presence of each lesion, including atelectasis, cardiomegaly, consolidation, fibrosis, nodule, pleural effusion, pneumoperitoneum, and pneumothorax, was evaluated both by using each abnormality score as a continuous variable and by applying the operating point. In addition, the highest score was used as the total abnormality score of each CXR and used to determine whether the CXR included any abnormality.

Statistical analysis

Statistical analyses were performed with R (version 4.1.3; R Foundation for Statistical Computing, Vienna, Austria; packages lme4 and lmerTest). We used the 1.5 × IQR method to exclude CXRs with outlying reading times. This conventional method defines outliers using the first quartile (Q1, 6 s in our study) and the third quartile (Q3, 24 s); the cutoff value was Q3 + 1.5 × (Q3 − Q1) = 24 + 1.5 × (24 − 6) = 51 s. The chi-square test and two-sample t-test were used to compare the total number of included CXRs and patient ages between the AI-unaided and AI-aided periods. A linear mixed model was used to compare reading times, with radiologists and patients treated as random effects. Reading times in seconds were compared between the AI-unaided and AI-aided periods according to patient characteristics (sex, age, location, and presence of a previous comparable CXR). Reading times were also compared according to the presence of lesions detected by AI (any one of the eight abnormalities: atelectasis, cardiomegaly, consolidation, fibrosis, nodule, pleural effusion, pneumoperitoneum, and pneumothorax) at the 15% operating point, and again with the abnormality score treated as a continuous variable. The variables, AI availability, and their interactions were included as fixed effects in the linear mixed model. p-values less than 0.05 were considered statistically significant.
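The outlier cutoff described above can be reproduced as a minimal calculation (the authors' analysis used R; this is a plain Python sketch using the quartile values reported in the study):

```python
# 1.5 x IQR upper fence, using the first and third quartiles reported
# in the study (6 s and 24 s).
q1, q3 = 6.0, 24.0
iqr = q3 - q1                 # interquartile range: 18 s
cutoff = q3 + 1.5 * iqr       # upper fence of the conventional 1.5 x IQR rule
print(cutoff)  # 51.0
```

Reading times above this fence were treated as outliers and excluded before model fitting.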

Reporting summary

Further information on research design is available in the Nature Research Reporting Summary linked to this article.
