The following is a summary of “Crowdsourced assessment of surgical skills: A systematic review,” published by Olsen et al. in the November 2022 issue of The American Journal of Surgery.

Because a procedure’s success strongly influences the patient’s outcome, the development of surgical skill is crucial. Crowdsourced assessment distributes evaluation tasks to many untrained members of the general public, an approach increasingly applied in medicine. For this study, researchers sought to examine how closely crowd workers’ judgments of surgical skill correlate with those of experienced surgeons.

A systematic literature search, covering all records from database inception onward, was conducted on April 14, 2021. Two reviewers screened all articles against the inclusion criteria and assessed study quality using the Medical Education Research Study Quality Instrument (MERSQI) and the Newcastle-Ottawa Scale-Education (NOS-E). General study information was extracted from each article.

The search identified 250 candidate studies, of which 32 were included. Overall, there appeared to be a moderate to very strong correlation between crowd workers and experts (Cronbach’s alpha 0.72-0.95, Pearson’s r 0.7-0.95, Spearman’s rho 0.7-0.89, linear regression 0.45-0.89). However, six studies found either a questionable correlation or none at all between crowd workers and experts.
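For readers unfamiliar with these agreement statistics, the following is a minimal illustrative sketch of how crowd-versus-expert agreement could be computed. The ratings below are invented for illustration and are not data from the review; the function and panel setup are assumptions, not the authors’ method.

```python
import numpy as np
from scipy import stats

# Hypothetical example only: 10 surgical performances rated on a
# 1-5 global rating scale by an expert panel and by a crowd panel.
# These numbers are invented, not data from the review.
expert = np.array([4.2, 3.1, 4.8, 2.5, 3.9, 4.5, 2.8, 3.3, 4.0, 3.6])
crowd = np.array([4.0, 3.4, 4.6, 2.9, 3.7, 4.4, 3.0, 3.5, 4.1, 3.4])

# Pearson's r measures linear agreement between the two score sets.
r, _ = stats.pearsonr(expert, crowd)

# Spearman's rho measures agreement in how performances are ranked.
rho, _ = stats.spearmanr(expert, crowd)

def cronbach_alpha(ratings):
    """Internal consistency across raters.

    ratings: array of shape (n_performances, n_raters).
    """
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Treat the expert and crowd panels as two "raters" of each performance.
alpha = cronbach_alpha(np.column_stack([expert, crowd]))

print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}, alpha = {alpha:.2f}")
```

Values near 1.0 on any of these measures indicate that crowd ratings closely track expert ratings, which is the sense in which the review describes the pooled correlations as moderate to very strong.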

Crowdsourced assessment could provide accurate, timely, cost-effective, and objective feedback across many surgical specialties and procedure types, in dry-lab, simulation, and live operative settings.

Reference: americanjournalofsurgery.com/article/S0002-9610(22)00453-6/fulltext
