Tennant JN, Shankar V, Dirschl DR. Reliability and validity of a mobile phone for radiographic assessment of ankle injuries: a randomized inter- and intraobserver agreement study. Foot Ankle Int 2013;34:228-33. [PMID: 23413062] [DOI: 10.1177/1071100712466849]
Abstract
BACKGROUND
Current mobile phone technology may allow orthopaedic surgeons to make clinical decisions using radiographs viewed on a small mobile device screen. The purpose of this study was to examine the reliability and validity of interpreting ankle fracture images viewed on a mobile device and a computer monitor, with a hypothesis that the agreement in clinical decision making between the mobile device and computer monitor would be high.
METHODS
A randomized interobserver and intraobserver reliability study was conducted in which 16 mortise and lateral ankle images representing a severity spectrum of malleolar ankle, plafond, and extra-articular tibial fractures were shown to volunteer orthopaedic surgeons on both an Apple fourth-generation iPod Touch and a 23-inch liquid crystal display (LCD) computer monitor. Participants answered a multiple-choice questionnaire for each image regarding diagnosis, severity, need for higher level imaging, need for acute inpatient versus outpatient management, and plan of treatment. Inter- and intraobserver reliability was assessed by kappa (κ), multirater kappa statistics, and intraclass correlation coefficient (ICC).
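The kappa statistic used here measures agreement between two raters corrected for chance. As an illustration only (the study's actual analysis and data are not reproduced), a minimal sketch of Cohen's kappa for two raters' categorical ratings might look like this; the function name and example labels are hypothetical:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    over the same items (e.g. fracture diagnoses per radiograph)."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items on which the raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, assuming the raters are independent.
    counts_a = Counter(ratings_a)
    counts_b = Counter(ratings_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: agreement on 3 of 4 images gives kappa = 0.5 here.
a = ["fracture", "fracture", "no fracture", "no fracture"]
b = ["fracture", "no fracture", "no fracture", "no fracture"]
print(cohens_kappa(a, b))  # 0.5
```

Values of kappa at or above 0.8, the threshold the study uses for "excellent" agreement, indicate near-complete agreement beyond chance.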
RESULTS
Ninety-three orthopaedic surgeon volunteers completed the study. Excellent intraobserver agreement (κ ≥ 0.8) was found for all variables measured, including diagnosis (median κ = 0.84), need for computed tomography scan (κ = 0.86), need for reduction (κ = 0.82), treatment setting (κ = 0.82), and treatment type (κ = 0.87). Interobserver agreement was consistent between the mobile device and the computer screen. Interobserver agreement for the severity assessment had a slightly higher ICC for the mobile device than for the computer monitor (ICC = 0.83 vs 0.79). At the completion of the study, 67% of participants (62/93) reported being "completely" or "very" comfortable using a mobile device as the primary viewing device for new emergency room, inpatient, or transfer request consults.
CONCLUSIONS
Strong reliability for radiographic assessment of ankle injuries existed between a 23-inch computer monitor and a handheld mobile device. Further study is warranted to validate the technology for application to other anatomic locations and imaging modalities.
LEVEL OF EVIDENCE
Level II, diagnostic study.