Sommers M, Beacham B, Baker R, Fargo J. Intra- and inter-rater reliability of digital image analysis for skin color measurement. Skin Res Technol 2013;19:484-91. [PMID: 23551208] [DOI: 10.1111/srt.12072]
Abstract
BACKGROUND
We determined the intra- and inter-rater reliability of digital image color analysis data between an expert and a novice analyst.
METHODS
Following training, the expert and novice independently analyzed 210 randomly ordered images. Both analysts used Adobe® Photoshop lasso or color sampler tools, depending on the type of image file. After color correction with PictoColor® inCamera software, they recorded L*a*b* (L* = light/dark; a* = red/green; b* = yellow/blue) color values for all skin sites. We computed intra-rater and inter-rater agreement within anatomical region, color value (L*, a*, b*), and technique (lasso, color sampler) using a series of one-way intra-class correlation coefficients (ICCs).
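The one-way random-effects ICC described above can be computed from a one-way ANOVA decomposition of the ratings. A minimal sketch in Python (NumPy), assuming ratings are arranged as an images-by-raters array; the function name and layout are illustrative, not the authors' actual analysis code:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC, i.e. ICC(1,1).

    ratings: array-like of shape (n_targets, k_raters),
             e.g. L* values for n images scored by k analysts.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    target_means = ratings.mean(axis=1)
    # One-way ANOVA: between-targets and within-target sums of squares
    ss_between = k * np.sum((target_means - grand_mean) ** 2)
    ss_within = np.sum((ratings - target_means[:, None]) ** 2)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    # ICC(1,1) = (MSB - MSW) / (MSB + (k - 1) * MSW)
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

With two analysts (k = 2), perfect agreement yields an ICC of 1.0, and values near the study's reported thresholds (e.g. ≥ 0.99 for the lasso technique) indicate near-identical measurements across raters.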
RESULTS
Results of ICCs for intra-rater agreement showed high levels of internal consistency reliability within each rater for the lasso technique (ICC ≥ 0.99) and a somewhat lower, yet acceptable, level of agreement for the color sampler technique (ICC = 0.91 for the expert, ICC = 0.81 for the novice). Skin L*, skin b*, and labia L* values reached the highest level of agreement (ICC ≥ 0.92), while skin a*, labia b*, and vaginal wall b* were the lowest (ICC ≥ 0.64).
CONCLUSION
With training and the use of a detailed, standard protocol, data from novice analysts can achieve high levels of agreement with data from expert analysts.