1
Kim JK, Tawk K, Kim JM, Shahbaz H, Lipton JA, Haidar YM, Tjoa T, Abouzari M. Online ratings and narrative comments of American Head and Neck Society surgeons. Head Neck 2024. [PMID: 38488221] [DOI: 10.1002/hed.27743]
Abstract
BACKGROUND We analyzed online rating scores and comments of head and neck surgeons to understand the factors that contribute to higher ratings. METHODS Numerical ratings and comments for American Head and Neck Society physicians were extracted from Healthgrades, Vitals, RateMDs, and Yelp, and narrative comments were categorized by content. Physician practice location, education, and residency training were also compiled. RESULTS Patient ratings were significantly higher with supportive staff and an affable physician demeanor but dropped significantly with longer wait times and difficulty scheduling appointments or follow-ups. Physician education and postgraduate training did not significantly affect ratings. CONCLUSION Online ratings and comments correlated with modifiable factors in clinical practice and may be informative in understanding patient needs.
Affiliation(s)
- Joshua K Kim
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- School of Medicine, Duke University, Durham, North Carolina, USA
- Karen Tawk
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- Jonathan M Kim
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- Hady Shahbaz
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- Joshua A Lipton
- Department of Computer Science, University of California Irvine, Irvine, California, USA
- Yarah M Haidar
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- Tjoson Tjoa
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
- Mehdi Abouzari
- Department of Otolaryngology - Head and Neck Surgery, University of California Irvine, Irvine, California, USA
2
Purvis TE, Holzman S, Hess DK, Levin SC, Maher DP. Online Ratings of Pain Physicians in a Regional Population: What Matters? Pain Med 2021; 21:1743-1748. [PMID: 32626891] [DOI: 10.1093/pm/pnaa173]
Affiliation(s)
- Samuel Holzman
- Division of Infectious Diseases, Department of Internal Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Demere Kasper Hess
- Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Steven C Levin
- Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Dermot P Maher
- Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
3
Mulgund P, Sharman R, Anand P, Shekhar S, Karadi P. Data Quality Issues With Physician-Rating Websites: Systematic Review. J Med Internet Res 2020; 22:e15916. [PMID: 32986000] [PMCID: PMC7551103] [DOI: 10.2196/15916]
Abstract
BACKGROUND In recent years, online physician-rating websites have become prominent and exert considerable influence on patients' decisions. However, the quality of these decisions depends on the quality of the data that these systems collect. Thus, there is a need to examine the various data quality issues with physician-rating websites. OBJECTIVE This study's objective was to identify and categorize the data quality issues afflicting physician-rating websites by reviewing the literature on online patient-reported physician ratings and reviews. METHODS We performed a systematic literature search in ACM Digital Library, EBSCO, Springer, PubMed, and Google Scholar. The search was limited to quantitative, qualitative, and mixed-method papers published in English from 2001 to 2020. RESULTS A total of 423 articles were screened. From these, 49 papers describing 18 unique data quality issues afflicting physician-rating websites were included. Using a data quality framework, we classified these issues into four categories: intrinsic, contextual, representational, and accessible. Among the papers, 53% (26/49) reported intrinsic data quality errors, 61% (30/49) highlighted contextual data quality issues, 8% (4/49) discussed representational data quality issues, and 27% (13/49) emphasized accessibility data quality issues. More than half the papers discussed multiple categories of data quality issues. CONCLUSIONS The results from this review demonstrate the presence of a range of data quality issues. While intrinsic and contextual factors have been well researched, accessibility and representational issues warrant more attention from researchers as well as practitioners. In particular, representational factors, such as the impact of inline advertisements and the positioning of positive reviews on the first few pages, are usually deliberate and result from the business model of physician-rating websites. The impact of these factors on data quality has not been addressed adequately and requires further investigation.
Affiliation(s)
- Pavankumar Mulgund
- School of Management, State University of New York Buffalo, Buffalo, NY, United States
- Raj Sharman
- School of Management, State University of New York Buffalo, Buffalo, NY, United States
- Priya Anand
- Institute of Computational and Data Sciences, State University of New York Buffalo, Buffalo, NY, United States
- Shashank Shekhar
- School of Management, State University of New York Buffalo, Buffalo, NY, United States
- Priya Karadi
- Institute of Computational and Data Sciences, State University of New York Buffalo, Buffalo, NY, United States
4
Shervington L, Wimalasundera N, Delany C. Paediatric clinicians' experiences of parental online health information seeking: A qualitative study. J Paediatr Child Health 2020; 56:710-715. [PMID: 31849144] [DOI: 10.1111/jpc.14706]
Abstract
AIM The aim of this research was to explore clinicians' experiences of parents' online health information seeking (OHIS) behaviour about selective dorsal rhizotomy for the management of cerebral palsy. METHODS Using qualitative methodology, clinicians likely to have had experience with parents requesting selective dorsal rhizotomy were invited to participate in semi-structured interviews. Interviews with 13 clinicians were recorded, transcribed and inductive content analysis was used to identify, code and organise the data into themes. RESULTS Participants highlighted how parental OHIS was changing clinical communication. Negative effects included a shift in clinicians' attention from giving advice and guidance to spending time discussing online findings, justifying how this information applies to a particular child and managing parents' judgments about clinical views. Positive effects included more collaboration and sharing of ideas. These results are presented in three main themes: (i) the informed parent; (ii) the clinicians' role; and (iii) a new clinical dynamic. CONCLUSION This research reinforces the notion that OHIS is changing the communication dynamic and clinicians' and parents' roles within the clinical encounter. Of significance was the number of challenges clinicians are facing as a result of online information, including managing parental understanding of non-evidenced information and responding to negative feedback about their practice. This research suggests a need for educational support and ongoing professional development for clinicians to assist them to adjust to new goals and expectations of clinical interactions with 'informed' parents.
Affiliation(s)
- Lily Shervington
- The Department of Paediatrics, University of Melbourne, Melbourne, Victoria, Australia
- Neil Wimalasundera
- The Department of Paediatric Rehabilitation, The Royal Children's Hospital Melbourne, Melbourne, Victoria, Australia
- Clare Delany
- The Department of Medical Education, Melbourne Medical School, The University of Melbourne, Melbourne, Victoria, Australia; Children's Bioethics Centre, The Royal Children's Hospital Melbourne, Melbourne, Victoria, Australia
5
LaPinska M, Kleppe K, Webb L, Stewart TG, Olson M. Robotic-assisted and laparoscopic hernia repair: real-world evidence from the Americas Hernia Society Quality Collaborative (AHSQC). Surg Endosc 2020; 35:1331-1341. [PMID: 32236756] [DOI: 10.1007/s00464-020-07511-w]
Abstract
BACKGROUND Ventral hernia repair (VHR) is a commonly performed procedure and is especially prevalent in patients who have undergone previous open abdominal surgery: up to 28% of patients who have undergone laparotomy will develop a ventral hernia. There is increasing interest in robotic-assisted VHR (RVHR) as a minimally invasive approach to VHR not requiring myofascial release, and in RVHR outcomes relative to those of laparoscopic VHR (LVHR). We hypothesized that real-world evidence from the Americas Hernia Society Quality Collaborative (AHSQC) database would indicate comparable clinical outcomes for RVHR and LVHR approaches not employing myofascial release. METHODS We performed a retrospective, comparative analysis of prospectively collected data describing laparoscopic and robotic-assisted elective ventral hernia repair procedures reported in the multi-institutional AHSQC database. A one-to-one propensity score matching algorithm identified comparable groups of patients to adjust for the potential selection bias that could result from surgeon choice of repair approach. RESULTS Matched data describe preoperative characteristics and perioperative outcomes in 615 patients in each group. The following significant differences were observed among the 11 pre-specified outcomes. Operative time tended to be longer for the RVHR group than for the LVHR group (p < 0.001). Length of stay differed between the two groups; while both groups had a median length of stay of 0, stays tended to be longer in the LVHR group (p < 0.001). The rate of conversion to laparotomy was lower for the RVHR group: <1% vs. 2% (p = 0.007). Through 30 days, there were fewer patient-clinic visits in the RVHR group (p = 0.038). CONCLUSION RVHR and LVHR perioperative results compare favorably with each other in most measures. Differences favored RVHR in terms of shorter length of stay, fewer conversions to laparotomy, and fewer postoperative clinic visits; differences favored LVHR in terms of shorter operative times.
Affiliation(s)
- Melissa LaPinska
- University Health Systems, University of Tennessee Medical Center, 1934 Alcoa Highway, Suite D-285, Knoxville, TN, 37920, USA; Department of Surgery, University of Tennessee Graduate School of Medicine, Knoxville, TN, USA
- Kyle Kleppe
- University Health Systems, University of Tennessee Medical Center, 1934 Alcoa Highway, Suite D-285, Knoxville, TN, 37920, USA; Department of Surgery, University of Tennessee Graduate School of Medicine, Knoxville, TN, USA
- Lars Webb
- University Health Systems, University of Tennessee Medical Center, 1934 Alcoa Highway, Suite D-285, Knoxville, TN, 37920, USA; Department of Surgery, University of Tennessee Graduate School of Medicine, Knoxville, TN, USA
- Thomas G Stewart
- Department of Biostatistics, Vanderbilt University Medical Center, Nashville, TN, USA
- Molly Olson
- Department of Healthcare Policy and Research, Weill Cornell Medicine, New York, NY, USA
6
Data Accuracy and Predictors of High Ratings of Colon and Rectal Surgeons on an Online Physician Rating Website. Dis Colon Rectum 2020; 63:226-232. [PMID: 31914115] [DOI: 10.1097/dcr.0000000000001539]
Abstract
BACKGROUND Online physician rating Web sites are used by over half of consumers to select doctors. No studies have examined physician rating Web sites for colon and rectal surgeons. OBJECTIVE The purpose of this study was to evaluate the accuracy and rating patterns of colon and rectal surgeons on the largest physician rating Web site. DESIGN Physician characteristics and ratings were collected from a randomly selected sample of 500 from 3043 Healthgrades "colon and rectal surgery specialists." Board certifications were verified with the American Board of Surgery and American Board of Colon and Rectal Surgery Web sites. SETTINGS Data acquisition was completed on July 18, 2018. PATIENTS Patients were not directly studied. MAIN OUTCOME MEASURES The primary outcome was to assess the accuracy of Healthgrades in reporting American Board of Surgery and American Board of Colon and Rectal Surgery certification. The secondary outcome was to identify factors associated with high star ratings. RESULTS A total of 48 (9.6%) of the 500 sampled were incorrectly identified as practicing US surgeons and excluded from subsequent analysis. Healthgrades showed 80.1% agreement with verified board certifications for American Board of Surgery and 85.4% for American Board of Colon and Rectal Surgery. The mean star rating was 4.2 of 5.0 (SD = 0.9), and 77 (21.6%) had 5-star ratings. In a multivariable logistic model (p < 0.001), 5-star rating was associated with 1 to 9 years (OR = 2.76; p = 0.04) or >40 years in practice (OR = 3.35; p = 0.04) and fewer reviews (OR = 0.88; p < 0.001). There were no significant associations with surgeon sex, age, geographic region, or board certification. LIMITATIONS Data were limited to a single physician rating Web site. CONCLUSIONS In the modern age of healthcare consumerism, physician rating Web sites should be used with caution given inaccuracies. 
More accurate online resources are needed to inform patient decisions in the selection of specialized colon and rectal surgical care. See Video Abstract at http://links.lww.com/DCR/B91.
7
Rotman LE, Alford EN, Shank CD, Dalgo C, Stetler WR. Is There an Association Between Physician Review Websites and Press Ganey Survey Results in a Neurosurgical Outpatient Clinic? World Neurosurg 2019; 132:e891-e899. [PMID: 31382063] [DOI: 10.1016/j.wneu.2019.07.193]
Abstract
OBJECTIVE Recent studies suggest a poor association between physician review websites and the validated metrics used by the Centers for Medicare and Medicaid Services. The purpose of this study was to evaluate the association between online and outpatient Press Ganey (PG) measures of patient satisfaction in a neurosurgical department. METHODS We obtained PG survey results from one large academic institution's outpatient neurosurgery clinic. Popular physician review websites were searched for each of the faculty captured in the PG data. Average physician rating and percent Top Box scores were calculated for each physician. PG data were separated into new and established clinic visits for subset analysis. Spearman's rank correlation coefficients were calculated to determine associations. RESULTS Twelve neurosurgeons were included. Established patients demonstrated greater PG scores as compared with new patients, with an average physician rating increase of 0.55 and an average Top Box increase of 12.5%. Online physician ratings were found to demonstrate strong agreement with PG scores for the entire PG population, new patient subset, and established patient subset (ρ = 0.77-0.79, P < 0.05). Online Top Box scores demonstrated moderate agreement with overall PG Top Box scores (ρ = 0.59, P = 0.042), moderate agreement with the new patient population Top Box scores (ρ = 0.56, P = 0.059), and weak agreement with established patient population Top Box scores (ρ = 0.38, P = 0.217). CONCLUSIONS Our findings demonstrated a strong agreement between PG ratings and online physician ratings and a poorer correlation when comparing PG Top Box scores with online physician Top Box scores, particularly in the established patient population.
Affiliation(s)
- Lauren E Rotman
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Elizabeth N Alford
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Christopher D Shank
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Caitlin Dalgo
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- William R Stetler
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
8
Liu C, Uffenheimer M, Nasseri Y, Cohen J, Ellenhorn J. "But His Yelp Reviews Are Awful!": Analysis of General Surgeons' Yelp Reviews. J Med Internet Res 2019; 21:e11646. [PMID: 31038463] [PMCID: PMC6658237] [DOI: 10.2196/11646]
Abstract
BACKGROUND Patients use Web-based platforms to review general surgeons. However, little is known about the free-form text and structured content of the reviews or how they relate to the physicians’ characteristics or their practices. OBJECTIVE This observational study aimed to analyze the Web-based reviews of general surgeons on the west side of Los Angeles. METHODS Demographics, practice characteristics, and Web-based presence were recorded. We evaluated the frequency and types of Yelp reviews and assigned negative remarks to 5 categories. Tabulated results were evaluated using independent t tests, one-way analysis of variance, and Pearson correlation analysis to determine associations between the numbers of total and negative reviews and practice structure and physician characteristics. RESULTS Of the 146 general surgeons, 51 (35%) had at least 1 review and 29 (20%) had at least 1 negative review. There were 806 total reviews: 679 (84.2%) positive and 127 (15.8%) negative. The negative reviews contained a total of 376 negative remarks, categorized into physician demeanor (124/376, 32.9%), clinical outcomes (81/376, 22%), office or staff (83/376, 22%), scheduling (44/376, 12%), and billing (44/376, 12%). Surgeons with a professional website had significantly more reviews than those without (P=.003). Surgeons in private practice had significantly more reviews (P=.002) and more negative reviews (P=.03) than surgeons who were institution employed. A strong, direct correlation was found between a surgeon’s number of reviews and number of negative reviews (P<.001). CONCLUSIONS As the most common category of complaints concerned physician demeanor, surgeons may optimize their Web-based reputation by improving their bedside manner. A surgeon’s Web presence, private practice, and total number of reviews are significantly associated with both positive and negative reviews.
Affiliation(s)
- Cynthia Liu
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Meka Uffenheimer
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Yosef Nasseri
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Jason Cohen
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Joshua Ellenhorn
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
9
Evans RW. Negative Online Patient Reviews in Headache Medicine. Headache 2018; 58:1435-1441. [DOI: 10.1111/head.13419]
10
Abstract
With growing pressure to formulate easily interpreted quality metrics, potential pitfalls exist that can deleteriously affect patient outcomes. This article defines what quality means in hernia surgery, how it is measured, who measures it, and how it is reported. The key governmental organizations responsible are highlighted. Although striving for high quality seems relatively straightforward, accounting for all variables is a challenge. Most definitions of quality are based on products and derived from minimum standards; the transition to basing quality on health care delivery is ongoing, challenging, and incredibly important for the future of patients.
Affiliation(s)
- Michael J Rosen
- Lerner College of Medicine, Cleveland Clinic Foundation, 9500 Euclid Avenue, A-100, Cleveland, OH 44195, USA.