Document Type

Article

Publication Date

1-2-2017

Publication Title

BMC Medical Education

Abstract

Background: This study explored the use of virtual patient generated data by investigating the association between students’ unprofessional patient summary statements, which they entered during an on-line virtual patient case, and detection of their future unprofessional behavior.

Method: At the USUHS, students complete a number of virtual patient encounters, each including a patient summary, to meet the clerkship requirements of Internal Medicine, Family Medicine, and Pediatrics. We reviewed the summary statements of 343 students who graduated in 2012 and 2013. Each statement was rated on four features: Unprofessional, Professional, Equivocal (could be construed as unprofessional), and Unanswered (the student did not enter a statement). We also combined Unprofessional and Equivocal into a new category indicating a statement that received either rating. We then examined the associations between students’ scores on these categories (i.e., whether a student received a particular rating or not) and the Expertise and Professionalism scores derived from a post-graduate year one (PGY-1) program director (PD) evaluation form. The PD form contained 58 Likert-scale items designed to measure the two constructs (Expertise and Professionalism).

Results: The inter-rater reliability of statement coding was high (Cohen’s Kappa = .97). Receiving an Unprofessional or Equivocal rating was significantly correlated with a lower Expertise score (r = −.19, P < .05) as well as a lower Professionalism score (r = −.17, P < .05) during PGY-1.

Conclusion: Most schools rely on incident reports and review of routine student evaluations to identify professionalism lapses. Unprofessionalism reflected in students’ virtual patient entries may provide additional markers foreshadowing subsequent unprofessional behavior.

DOI

10.1186/s12909-016-0840-9
