ORIGINAL REPORTS
Intraoperative Feedback: A Video-Based Analysis of Faculty and Resident Perceptions
Priya H. Dedhia, MD, PhD,a Meredith Barrett, MD,a Graham Ives, MD,a Christopher P. Magas, MBS,a Oliver A. Varban, MD,a Sandra L. Wong, MD,b and Gurjit Sandhu, PhDa
a Department of Surgery, Michigan Medicine, Ann Arbor, Michigan; and b Department of Surgery, Dartmouth-Hitchcock, Lebanon, New Hampshire
OBJECTIVE: Residents and faculty identify intraoperative feedback as a critical component of surgical education. Studies have demonstrated that residents perceive lower quality and frequency of intraoperative feedback compared to faculty. These differences in perception may be due to dissimilar identification of feedback. The purpose of this study was to determine if residents and faculty differently identify intraoperative interactions as feedback.
DESIGN: Residents and faculty viewed a segment of a laparoscopic cholecystectomy video and then timestamped the video where they perceived moments of intraoperative feedback. Validated surveys on timing, amount, specificity, and satisfaction with operative feedback were administered.
SETTING: Viewing of the video and survey administration was conducted at the University of Michigan.
PARTICIPANTS: A total of 23 of 41 residents (56%) and 29 of 33 faculty (88%) participated in this study.
RESULTS: Survey analysis demonstrated that residents perceived operative feedback to occur with less immediacy, specificity, and frequency compared to faculty. During the 10-minute video, residents and faculty identified feedback on average 21 and 29 times, respectively (p = 0.13). Ten-second interval analysis demonstrated 7 statistically significant intervals (p < 0.05) where residents identified feedback less frequently than faculty. Analysis of these 7 intervals revealed that faculty were more likely to identify interactions, especially nonverbal ones, as feedback. Review of free-text comments confirmed these findings and suggested that residents may be more receptive to feedback at the conclusion of the case.
CONCLUSIONS: Using video review, we show that residents and faculty identify different intraoperative interactions as feedback. This disparity in identification of feedback may limit resident satisfaction and effective intraoperative learning. Timing and labeling of feedback, continued use of video review, and structured teaching models may overcome these differences and improve surgical education. (J Surg Ed 76:906–915. © 2019 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)
KEY WORDS: Intraoperative, Feedback, Video, Laparoscopic cholecystectomy, Technical skills, Resident learners
COMPETENCIES: Patient Care, Interpersonal and Communication Skills
ABBREVIATIONS: BID, Briefing, Intraoperative teaching, and Debriefing; PGY, Postgraduate year
INTRODUCTION
Recently, surgical resident education has encountered constant pressure to adapt to decreased autonomy, increased medico-legal responsibilities, and the need to master a rising number of surgical techniques and nonoperative skills.1–3 Because of these constraints, increased emphasis has been placed on making operative experiences as meaningful as possible.4–6 The current paradigm of operative education is based on “discovery learning,” which focuses on accumulation of experience and self-directed learning.7 However, implementation of faculty-supported “guided discovery learning” results in more efficient and accurate learning than use of pure (unguided) discovery learning.8 As such, a central feature of guided learning includes incorporation of interactive feedback into daily practice patterns.7,9–11

Funding/Support: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
Correspondence: Inquiries to Gurjit Sandhu, PhD, Department of Surgery, Section of General Surgery, 2207 Taubman Center, 1500 E. Medical Center Drive, SPC 5346, Ann Arbor, MI 48109; fax: 734-763-5615; e-mail: gurjit@med.umich.edu
906 Journal of Surgical Education · 1931-7204/$30.00 © 2019 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. https://doi.org/10.1016/j.jsurg.2019.02.003
Effective feedback uses the knowledge and proficiency level of the learner to provide a clear understanding of the goal and how to make progress towards that goal.10,11 Moreover, immediate informative feedback is essential for optimal improvement of performance.12,13 Inadequate feedback results in minimal improvement even for highly motivated learners.12,14 Providing adequate feedback requires recognition that feedback has occurred,15 as well as faculty competence in delivering feedback and resident ability to receive and respond to feedback.14 Ineffective feedback results from learner misinterpretation of feedback as well as an unintended response, or lack of response, to feedback.11 As such, understanding resident and faculty perceptions is critical to effective delivery and receipt of feedback.
Survey-based studies have identified differences in resident and faculty perceptions of timing, amount, specificity, and effectiveness of feedback.16–19 However, survey-based studies lack the ability to describe the type of feedback accurately and also fail to identify nonverbal forms of feedback. Video review has the potential to minimize recall bias and eliminate variation by allowing all viewers to review identical video footage. Furthermore, video review can supplement survey-based studies by adding new levels of granularity after studying interactions that are deemed important by participants. To date, video review has frequently been used to supplement self-directed learning and coaching, to evaluate technical performance or understanding of operative procedures, and to assess patient interactions.20–31 To our knowledge, this is the first study to use video review to examine differences in perceptions of feedback. We designed this study to confirm the findings of previous survey-based studies on perceptions of perioperative feedback between faculty and residents. We also sought to supplement survey data by (1) determining if residents and faculty would differently identify moments of operative feedback during video review of a laparoscopic cholecystectomy and (2) characterizing any differently perceived moments.
MATERIAL AND METHODS
Study Design
A schematic of the study design is shown in Figure 1. This study was deemed exempt by the Institutional Review Board at the University of Michigan (IRB no. HUM00084551). This study involved video review of a laparoscopic cholecystectomy performed by a postgraduate year (PGY) 2 surgical resident and a faculty surgeon. Video of a junior resident and a tenured faculty surgeon was chosen to maximize possible feedback events. Laparoscopic cholecystectomy was selected for several reasons. With over 700,000 cases performed annually, laparoscopic cholecystectomy is one of the most common cases performed in the United States.32 Furthermore, its minimally invasive nature permits external and intraabdominal viewing.33 Also, laparoscopic cholecystectomy has frequently been studied as a model procedure for skills assessment and education.34–37 Videos of laparoscopic surgery were recorded using a video capture device connected to the laparoscope (Storz AIDA, DVD SCB 202040-20 Endoscopy Image and Video Capture). The external footage was captured using an Apple iPad Mini 2 (Apple, Cupertino, California). External and laparoscopic videos were synchronized and combined using iMovie 10 (Apple, Cupertino, California). A continuous 10-minute video was extracted that captured 5 minutes before and 5 minutes after identification of the critical view of safety, which involves complete dissection of the cystic duct and cystic artery prior to division.33,38 Identification of the critical view of safety is known to minimize bile duct injury during laparoscopic cholecystectomy. We chose 5 minutes prior to and after acquisition of the critical view to provide a rich opportunity for feedback.
Next, participants independently reviewed the 10-minute video excerpt in iMovie 10 and were instructed to identify each moment where a new piece of feedback was initiated by timestamping the video (Fig. 1). Immediately after watching the video, participants completed an online modified version of a previously developed survey tool.19 This tool was piloted with a group of 3 residents and 4 faculty for readability and comprehension prior to administration (Supplemental Tables 1-2). The first part of the survey asked respondents to provide details about the types of feedback observed in the video. In the second section of the survey, participants were asked to convey their opinions on their current perioperative feedback experiences at our institution using a 5-point Likert scale and to provide responses to open-ended questions.
Participants
Forty-one residents and thirty-three faculty from the general surgery residency program at the University of Michigan Hospital and Health Systems were recruited via email invitation to review videos. Participation was voluntary and consent was obtained. Residents and faculty participated during nonoverlapping 1.5-month periods.
Data Collected
Data were collected electronically from October to December 2015. Survey data were linked to video data using a unique identifier. Timestamps were exported from iMovie into Final Cut Pro X 10.2.2 (Apple, Cupertino, California) to view and analyze markers. Results were anonymized prior to coding and analysis. Resident and faculty timestamping of feedback during the entire 10-minute video was compared. The video was then reviewed in 10-second intervals to identify the distribution of feedback events; 10-second intervals were selected to capture granular yet meaningful interactions. The number of timestamped events during each 10-second interval was compared in order to further assess specific moments of discrepancy between residents and faculty. Survey responses were collected and analyzed. Free-text comments were also reviewed for content.
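The interval analysis above amounts to counting, for each reviewer, how many timestamps fall into each fixed 10-second bin of the 10-minute excerpt. The following sketch illustrates that step; it is a hypothetical reconstruction (the function name and the assumption that timestamps are expressed in seconds from the start of the excerpt are ours), not the authors' actual analysis code.

```python
def bin_timestamps(timestamps, video_seconds=600, interval_seconds=10):
    """Count feedback timestamps falling in each fixed-width interval.

    timestamps: iterable of times (in seconds) at which one reviewer
    marked a new moment of feedback within the video excerpt.
    Returns a list of counts, one per interval (60 intervals for a
    10-minute video at 10-second width).
    """
    n_intervals = video_seconds // interval_seconds
    counts = [0] * n_intervals
    for t in timestamps:
        # Clamp a timestamp at exactly the video end into the last interval.
        index = min(int(t // interval_seconds), n_intervals - 1)
        counts[index] += 1
    return counts

# One reviewer's marks at 4 s, 12 s, 15 s, and 599 s:
counts = bin_timestamps([4.0, 12.0, 15.0, 599.0])
print(counts[0], counts[1], counts[59])  # → 1 2 1
```

Each reviewer then contributes one 60-element count vector, and resident and faculty vectors can be compared interval by interval.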
Statistical Analysis
Comparisons of video timestamping and survey responses were made using Student t tests. Statistical analyses were performed using SPSS (IBM, Armonk, New York) and Microsoft Excel (Microsoft, Redmond, Washington). A p value of less than 0.05 was considered statistically significant. Subgroup analysis was not conducted because of the small sample size in each cohort.

FIGURE 1. Overview of study design. Residents and faculty viewed a video excerpt of external and intra-abdominal laparoscopic cholecystectomy footage and indicated each new moment of feedback by video timestamping. Surveys were also completed by participants. Video timestamping and survey results were subjected to quantitative and qualitative analysis.
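For readers who want to see the comparison outside SPSS, the two-sample Student t statistic (pooled variance) can be computed directly. This is a generic illustration with made-up numbers, not the study data; in practice the p value would be read from the t distribution with n1 + n2 - 2 degrees of freedom.

```python
import math
from statistics import mean, variance

def student_t(sample_a, sample_b):
    """Two-sample Student t statistic with pooled variance.

    Returns (t, degrees_of_freedom). statistics.variance() uses the
    sample (n - 1 denominator) variance, matching the pooled formula.
    """
    n1, n2 = len(sample_a), len(sample_b)
    pooled_var = ((n1 - 1) * variance(sample_a)
                  + (n2 - 1) * variance(sample_b)) / (n1 + n2 - 2)
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(
        pooled_var * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Hypothetical per-reviewer feedback counts for two small groups:
t, df = student_t([18, 25, 20, 21], [28, 35, 27, 30])
print(round(t, 2), df)
```

A large negative t here would indicate the first group marked fewer feedback events than the second, which is the direction of the resident-faculty difference reported below.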
RESULTS
Study Participants
The overall participation rate was 70% (52 of 74), which consisted of 56% of residents (23 of 41) and 88% of faculty members (29 of 33). The average PGY level of the residents was 3.96 ± 2.2 years (mean ± SD). Junior residents (PGY1-2) represented 30.4% of respondents. Resident and faculty demographics are shown in Table 1.
Survey Findings
Our survey results demonstrated that residents and faculty generally agreed on the importance of the timing and the subject areas of feedback. The only exception was feedback on knowledge of anatomy, which was identified as slightly less important by residents (p = 0.01; Fig. 2). Consistent with previous studies, residents indicated that operative feedback occurred with less immediacy, specificity, and frequency than faculty indicated (Fig. 2).16–19 This difference was most notable for specificity of feedback (p < 0.0005). Residents and faculty demonstrated no significant difference in overall satisfaction with operative feedback (p = 0.12). Interestingly, faculty believed that residents solicited feedback less frequently than residents themselves indicated (p = 0.021). For all subject areas of feedback queried, residents perceived feedback less frequently than faculty, and this was most prominent for knowledge of anatomy (p < 0.0005). A total of 69.0% of faculty indicated that their ability to give feedback was rarely influenced by lack of instruction on delivery of feedback. A total of 55.2% of faculty indicated that concern for offending the resident rarely influenced their ability to give feedback. A total of 82.8% of faculty indicated that concern for resident evaluations rarely prevented them from giving feedback. Finally, 55.2% of faculty indicated that time constraints impeded their ability to deliver feedback half or more of the time.
Quantitative Analysis of Video Review
Residents and faculty were asked to review a 10-minute segment of video footage of a laparoscopic cholecystectomy and identify each moment of feedback. After review of the video segment, residents on average identified fewer moments of feedback (21.1 ± 13.6) compared to faculty (29.3 ± 21.5); however, this difference was not statistically significant (mean ± SD; p = 0.13, Fig. 3A). When the video segment was analyzed in 10-second intervals to identify granular yet meaningful interactions, faculty identified feedback more frequently than residents at 7 discrete intervals (p < 0.05, Fig. 3B).
Characterization of Differently Identified Intervals and Free Text Responses
The 7 differently perceived intervals identified on quantitative timestamp analysis were assessed for potential patterns, and 3 key patterns were noted (Table 2). First, all differently identified intervals lacked explicit mention of feedback by faculty or residents. Second, faculty were always more likely to identify an event as feedback compared to residents. Third, 3 of the 7 differently identified intervals included nonverbal interactions, such as pointing to the monitor. These data indicate that none of the 7 interactions were overtly labeled as feedback by the attending or resident being videotaped, thus necessitating subjective identification of an event as feedback by resident and faculty video reviewers. Furthermore, faculty were more likely than residents to identify any interaction as feedback, including nonverbal interactions.
Free-text comments by residents and faculty supported these patterns (Table 3). In addition, free-text comments suggested that residents may be more receptive to feedback at the conclusion of the case (Table 3). Together, these data suggest that differences in perception of intraoperative feedback between residents and faculty may in part be due to differences in identification of an intraoperative event as feedback.
DISCUSSION
In this study, we implemented surveys to demonstrate that resident and faculty perceptions of operative feedback are different. We then applied video review of a laparoscopic cholecystectomy to show that these differences in perceptions of feedback may be partly due to disparate identification of intraoperative interactions as feedback. Overall, our key findings indicate that residents are less likely than faculty to identify a particular constellation of intraoperative interactions as feedback. As such, our work is the first to demonstrate that video review can be used to study perceptions and can supplement survey findings.

TABLE 1. Demographics of Residents and Faculty Who Participated in the Study

Residents     n (%)       Faculty               n (%)
Total         23          Total                 29
Male          17 (73.9)   Male                  19 (65.5)
Female        6 (26.1)    Female                10 (34.5)
PGY1          4 (17.4)    Assistant professor   12 (41.4)
PGY2          3 (13.0)    Associate professor   9 (31.0)
PGY3          3 (13.0)    Professor             8 (27.6)
ADT year 1    3 (13.0)
ADT year 2    3 (13.0)
PGY4          3 (13.0)
PGY5          4 (17.4)

ADT, academic development time (a dedicated year of research).

FIGURE 2. Significant survey results. Comparison of resident and faculty survey responses relating to feedback. A five-point Likert scale was used for all questions, with possible responses listed at the top of each table. Mean results are displayed (yellow circles represent faculty responses; blue squares represent resident responses). Bars represent 95% confidence intervals. *p < 0.05 as determined by t test.
In agreement with Jensen et al., our survey findings demonstrated that residents perceived operative feedback as being less frequent, specific, and timely compared to faculty.19 Interestingly, however, resident participants in our study demonstrated no statistical difference in overall satisfaction with operative feedback compared to faculty. This may be because residents place increased weight on a small number of effective feedback interactions that contribute to overall skill development; however, further studies will be needed to characterize this finding.
In 12% (7 out of 60) of the 10-second intervals, residents were less likely to identify feedback, resulting in potentially missed opportunities for learning and improvement in response to feedback. These observations indicate that one's position, either as resident or faculty, influences how feedback is observed and experienced. The varying responses between the 2 groups, despite review of the same operative excerpt, suggest diverging perceptions of feedback. The interactions that were preferentially identified as feedback by faculty were not overtly labeled as feedback in the video and included nonverbal interactions. These findings suggest that residents are less likely to interpret an intraoperative interaction as feedback. Comments from residents also indicate that intraoperative interactions are more likely to be interpreted by residents as intraoperative teaching rather than feedback (Table 3). Feedback in clinical education has been defined as information on a trainee's observed performance in comparison with a standard, given in order to improve the trainee's performance.39 Importantly, feedback is learner-centered in that it is personalized and oriented to the trainee's goals.14,40,41 Teaching, on the other hand, is educator-centered, and content is structured as a one-way transfer of knowledge and skills.42

FIGURE 3. Resident and faculty identified feedback events on video review. (A) The average number of feedback events identified per participant during the 10-minute video segment is shown for residents and faculty. Error bars indicate standard deviation. Mann-Whitney U statistical analysis yielded p = 0.13. (B) Average number of feedback events per participant for each 10-second interval during the 10-minute video segment is indicated for residents and faculty. *p < 0.05 as determined by Student t test.

TABLE 2. Analysis of Events Differentially Identified As Feedback

10-s interval | Residents That Identified Feedback Events, n (%) | Faculty That Identified Feedback Events, n (%)
10-20 | 8 (34.8) | 22 (75.9)
    Faculty: Ah see, you're getting there man. Spread there a little bit. Yeah, this is so great. Look at that. Just keep working and working and eventually it all comes out.
100-110 | 7 (30.4) | 20 (69.0)
    Faculty: Mhmm. Mhmm. Mhmm. Yeah.
110-120 | 8 (34.8) | 22 (75.9)
    Faculty: Yup. I like that. My intuition tells me there's a little spot there that will give.
150-160 | 5 (21.7) | 19 (65.5)
    Faculty: Do that one more time. You can see it's a little nothing and it's going right into the gallbladder. Just let that sit for a minute. Get this stuff over here down. [Faculty points to monitor]
300-310 | 0 (0) | 6 (20.7)
    Faculty: We'll take those little things crossing there. I don't know if they're little vessels or lymphatics. [Faculty points to monitor]
380-390 | 1 (4.3) | 12 (41.4)
    Faculty: We can put one clip here, one clip here and cut between those. Want to do that?
    Resident: I agree.
    Faculty: Clip please. Clip applier please. Okay. [Faculty points to monitor]
550-560 | 9 (39.1) | 25 (86.2)
    Faculty: Mhmm. The whole thing is just being so beautiful man! Where do you think the artery is?

TABLE 3. Characterization of Free Text Comments

Faculty are more likely than residents to identify an event as feedback
Resident: 1. As I've gotten more senior, I've noticed that only certain attendings make it a point to give directed and specific feedback. The overwhelming majority will give zero feedback. It seems there's more comfort with giving feedback to junior residents because it's more elementary and general skills. Perhaps there should be some education regarding how to actually gauge the performance of senior residents in order to provide high quality feedback.
2. Generally, intraoperative teaching is done a lot while specific, resident-focused feedback is less common. I most appreciate attendings who point out specific actions that I am either doing well or things that I am doing incorrectly. I least appreciate those who take the instrument away without explaining exactly what I am doing wrong.
Faculty: 1. Typically, I will deliver feedback as the case progresses after asking them to perform each task. This is verbal to begin with and if still unclear, I will show them how to perform the task. This seems to be quite effective.
2. Ideally feedback is given continuously throughout the procedure. In emergent situations, feedback that disrupts performance may have to be deferred until after the case is finished.

Nonverbal interactions may be identified as feedback by faculty
Resident: Residents did not provide relevant comments.
Faculty: 1. There was verbal feedback, which was almost non-stop. It was positive, corrective, general and sometimes very specific. In addition there was a lot of hands on feedback, which involved correcting the angle of the camera, direction of retraction, angle of instrument etc.
2. Feedback was continuous for all steps of the procedure, very specific with regard to what the resident should do at the specific time. Also feedback with regard to how to divide specific structures. Some feedback was not verbal, i.e. attending moved the resident's hand to provide more retraction on the infundibulum for better visualization for dissection of the critical view of safety.

Residents may be more receptive to feedback at conclusion of case
Resident: 1. I wish we received more feedback postoperatively when I have time to really digest what I'm being told. This feedback can even be just one area to focus on so that next time I do a case with that attending, I'll be sure to keep that in mind.
2. I wish I could get more feedback on improving efficiency. I appreciate the intra-op feedback but would like more big-picture strengths/weakness feedback after cases so I know where to focus and how to improve for the following cases.
Faculty: 1. I provide ongoing feedback throughout the operation. I try to debrief after an operation but most often forget.
In this study, we intentionally withheld definitions of teaching or feedback. Residents may have been more likely to identify intraoperative interactions as teaching rather than feedback because intraoperative interactions may not be aligned to the goals of the trainee. For example, one resident commented, "I get relatively little feedback, and it usually focuses on minute or less-important aspects of the procedure." Additional studies are necessary to delineate why residents are less likely to identify certain interactions as feedback. However, identifying trainee goals and labeling interactions as feedback may facilitate reception of feedback by the trainee.
Interestingly, faculty comments suggested that nonverbal feedback was an integral component of intraoperative feedback. In contrast, no residents commented on nonverbal feedback, suggesting that residents may not recognize nonverbal feedback. Our results differed from previous survey-based findings, which demonstrated no difference between faculty and resident perceptions of nonverbal feedback.16 This difference may be due in part to recall bias, which is inherent to survey-based findings. Finally, multiple resident comments indicate that residents may not be primed to receive feedback during the procedure, but may be more receptive to feedback immediately postoperatively. An interaction that occurs immediately postoperatively may facilitate a bidirectional and thus personalized conversation, thereby allowing the trainee to identify the interaction as feedback.
Together, these data suggest that changes in delivery and reception of feedback may enhance intraoperative learning. Specifically, labeling verbal and nonverbal feedback as it is delivered, as well as reviewing feedback immediately postoperatively, may surmount challenges to intraoperative surgical education by optimizing reception of feedback and intraoperative learning.
One educational framework that intentionally integrates feedback into surgical practice is the Briefing, Intraoperative teaching, and Debriefing (BID) model.4 This model implements a preoperative briefing in which specific learner objectives are established; intraoperative feedback, which focuses on those objectives; and a debriefing at the end of the case, which provides recommendations for future improvement.4 The BID model accounts for the importance of specificity, learner involvement, and goal orientation in operative feedback.4 Application of the BID model has resulted in improved resident perception of nonverbal feedback in addition to the frequency, clarity, and effectiveness of operative feedback. These findings suggest that the BID model may prime both the teacher and learner to administer and receive feedback.12 Our results, which show varying perceptions of specificity, learner involvement, goal orientation, and nonverbal feedback by residents and faculty, support use of the BID model, which may improve the intraoperative learning environment for the resident learner. However, free-text comments from faculty in our survey revealed that the immediately postoperative debrief was felt to be unnecessary because preoperative briefing and intraoperative feedback were sufficient. Further studies are necessary to determine if the postoperative debrief results in improved alignment of expectations of feedback by both residents and faculty. Certainly, it will be interesting to apply video review after implementation of the BID model to determine how residents and faculty may differentially perceive the postoperative debrief.
This study has several limitations. Participation from a single institution may limit generalizability. The resident and faculty portrayed in the video were members of the General Surgery program at the University of Michigan and were likely to be familiar to respondents; thus, prior interactions with either individual may have altered timestamping of feedback by study participants. In addition, the study was performed over the course of 3 months, and respondents who participated near the study conclusion may have been influenced by the opinions of early participants. Furthermore, we analyzed the video segment in 10-second increments in order to permit detailed examination of single interactions; further studies with a greater number of participants will be required to perform alternative interval division analysis. Moreover, due to our limited sample size, we were also unable to perform subgroup analysis. Finally, as with all studies with voluntary participation, results may reflect selection bias. Faculty participation of 88% suggests this bias is less likely to play a role in faculty results than in resident results, where participation was 56%.
CONCLUSIONS
In conclusion, we confirmed that surgical residents and faculty members have differing perceptions of operative feedback. Here, we used a novel video review methodology to characterize perceptions of feedback during the course of an operation. This method is widely applicable to other learning environments, such as medical education or patient education, where elucidation of differences of perception is necessary.
ACKNOWLEDGMENTS
The authors would like to thank Ali Jones for her assistance with construction of the website on which the survey was administered. We also thank Lauren Wancata, M.D. and Niki Matusko, B.S. for critical review of the manuscript. Niki Matusko also provided statistical support.
Ethical approval: This study was deemed exempt by the Institutional Review Board at University of Michigan on April 8, 2015 (HUM00084551).
REFERENCES
1. Teman NR, Gauger PG, Mullan PB, Tarpley JL, Minter
RM. Entrustment of general surgery residents in the
operating room: factors contributing to provision of res-
ident autonomy. J Am Coll Surg. 2014;219:778–787.
2. Malangoni MA, Biester TW, Jones AT, Klingensmith
ME, Lewis FR. Jr. Operative experience of surgery
residents: trends and challenges. J Surg Educ.
2013;70:783–788.
3. Champagne BJ. Effective teaching and feedback
strategies in the OR and beyond. Clin Colon Rectal
Surg. 2013;26:244–249.
4. Bell Jr. RH. Why Johnny cannot operate. Surgery.
2009;146:533–542.
5. Chung RS. How much time do surgical residents
need to learn operative surgery? Am J Surg.
2005;190:351–353.
6. Snyder RA, Tarpley MJ, Tarpley JL, et al. Teaching in
the operating room: results of a national survey. J
Surg Educ. 2012;69:643–649.
7. Roberts NK, Williams RG, Kim MJ, et al. The brief-
ing, intraoperative teaching, debriefing model for
teaching in the operating room. J Am Coll Surg.
2009;208:299–303.
8. Mayer RE. Should there be a three-strikes rule
against pure discovery learning? Am Psychol.
2004;59:14–19.
9. Accreditation Council for Graduate Medical Education.
Program requirements for graduate medical education
in surgery. Available at: http://www.acgme.org/acgme-
web/portals/0/pfassets/programrequirements/440_gen-
eral_surgery_07012014.pdf. Accessed December 28,
2015.
10. Ende J. Feedback in clinical medical education.
JAMA. 1983;250:777–781.
11. Black P, William D. Developing the theory of forma-
tive assessment. Educ Asses Eval Acc. 2009;21:
5–31.
12. Ericsson KA, Krampe RT, Tesch-Romer C. The role
of deliberate practice in the acquisition of expert
performance. Psychol Rev. 1993;100:363–406.
13. Hupp JR. Strengthening feedback in surgical educa-
tion. J Oral Maxillofac Surg. 2017;75:229–231.
14. Hewson MG, Little ML. Giving feedback: verification
of recommended techniques. J Gen Intern Med.
1998;13:111–116.
15. Hoffman RL, Petrosky JA, Eskander MF, et al. Feed-
back fundamentals in surgical education: tips for
success. Bull Am Coll Surg. 2015;100:35–39.
16. Butvidas LD, Anderson CI, Balogh D, et al. Dispar-
ities between resident and attending surgeon per-
ceptions of intraoperative teaching. Am J Surg.
2011;201:385–389.
17. Liberman AS, Liberman M, Steinert Y, et al. Surgery
residents and attending surgeons have different per-
ceptions of feedback. Med Teach. 2005;27:470–472.
18. Rose JS, Waibel BH, Schenarts PJ. Disparity between
resident and faculty surgeons’ perceptions of preop-
erative preparation, intraoperative teaching, and
postoperative feedback. J Surg Educ. 2011;68:459–
464.
19. Jensen AR, Wright AS, Kim S, et al. Educational feed-
back in the operating room: a gap between resident
and faculty perceptions. Am J Surg. 2012;204:248–
255.
20. Vaughn CJ, Kim E, O’Sullivan P, et al. Peer video
review and feedback improve performance in basic
surgical skills. Am J Surg. 2016;211:355–360.
21. Scott DJ, Rege RV, Bergen PC, et al. Measuring oper-
ative performance after laparoscopic skills training:
edited videotape versus direct observation. J Lapa-
roendosc Adv Surg Tech A. 2000;10:183–190.
22. Beard JD, Jolly BC, Newble DI, Thomas WE, Don-
nelly J, Southgate LJ. Assessing the technical skills of
surgical trainees. Br J Surg. 2005;92:778–782.
914 Journal of Surgical Education  Volume 76 /Number 4  July/August 2019
23. Zevin B, Bonrath EM, Aggarwal R, et al. Develop-
ment, feasibility, validity, and reliability of a scale
for objective assessment of operative performance
in laparoscopic gastric bypass surgery. J Am Coll
Surg. 2013;216:955–965.e8.
24. Abdelsattar JM, Pandian TK, Finnesgard EJ, et al. Do
you see what I see? How we use video as an adjunct
to general surgery resident education. J Surg Educ.
2015;72:e145–e150.
25. Trelease RB. From chalkboard, slides, and paper to
e-learning: How computing technologies have trans-
formed anatomical sciences education. Anat Sci
Educ. 2016. Epub.
26. Makary MA. The power of video recording: taking
quality to the next level. JAMA. 2013;309:1591–
1592.
27. Birkmeyer JD, et al. Surgical skill and complication
rates after bariatric surgery. N Engl J Med. 2013;369:
1434–1442.
28. Herrera-Almario GE, et al. The effect of video review
of resident laparoscopic surgical skills measured by
self- and external assessment. Am J Surg.
2016;211:315–320.
29. Paskins Z, McHugh G, Hassell AB. Getting under
the skin of the primary care consultation using
video stimulated recall: a systematic review. BMC
Med Res Methodol. 2014;14:101.
30. Hu YY, Peyre SE, Arriaga AF, et al. Postgame analy-
sis: using video-based coaching for continuous pro-
fessional development. J Am Coll Surg. 2012;
214:115–124.
31. Alken A, Tan E, Luursema JM, Fluit C, van Goor H.
Coaching during a trauma surgery team training:
perceptions versus structured observations. Am J
Surg. 2015;209:163–169.
32. Stokes CS, Krawczyk M, Lammert F. Gallstones:
environment, lifestyle and genes. Dig Dis. 2011;
29:191–201.
33. Strasberg SM, Hertl M, Soper NJ. An analysis of the
problem of biliary injury during laparoscopic chole-
cystectomy. J Am Coll Surg. 1995;180:101–125.
34. Smith SG, Torkington J, Brown TJ, Taffinder NJ,
Darzi A. Motion analysis. Surg Endosc. 2002;16:
640–645.
35. Vassiliou MC, Feldman LS, Andrew CG, et al. A global
assessment tool for evaluation of intraoperative lapa-
roscopic skills. Am J Surg. 2005;190:107–113.
36. Chang L, Hogle NJ, Moore BB, et al. Reliable assess-
ment of laparoscopic performance in the operating
room using videotape analysis. Surg Innov.
2007;14:122–126.
37. Kramp KH, van Det MJ, Hoff C, Lamme B, Veeger
NJ, Pierie JP. Validity and reliability of global opera-
tive assessment of laparoscopic skills (GOALS) in
novice trainees performing a laparoscopic cholecys-
tectomy. J Surg Educ. 2015;72:351–358.
38. Strasberg SM, Brunt LM. Rationale and use of the
critical view of safety in laparoscopic cholecystec-
tomy. J Am Coll Surg. 2010;211:132–138.
39. van de Ridder JM, Stokking KM, McGaghie WC,
et al. What is feedback in clinical education? Med
Educ. 2008;42:189–197.
40. Bienstock JL, Katz NT, Cox SM, et al. To the point:
medical education reviews—providing feedback.
Am J Obstet Gynecol. 2007;196:508–513.
41. Kornegay JG, Kraut A, Manthey D, et al. Feedback in
medical education: a critical appraisal. AEM Educ
Train. 2017;1:98–109.
42. Dreyfus SE. The five-stage model of adult skill acqui-
sition. Bull Sci Technol Soc. 2004;24:177–181.
SUPPLEMENTARY INFORMATION
Supplementary material associated with this article can
be found in the online version at doi:10.1016/j.jsurg.2019.02.003.
Journal of Surgical Education  Volume 76 /Number 4  July/August 2019 915

Analisis de los videos y percepiones de los docentes

  • 1.
    ORIGINAL REPORTS Intraoperative Feedback:A Video-Based Analysis of Faculty and Resident Perceptions Priya H. Dedhia, MD, PhD,a Meredith Barrett, MD,a Graham Ives, MD,a Christopher P. Magas, MBS,a Oliver A. Varban, MD,a Sandra L. Wong, MD,b and Gurjit Sandhu, PhDa a Department of Surgery, Michigan Medicine, Ann Arbor, Michigan; and b Department of Surgery, Dartmouth-Hitch- cock, Lebanon, New Hampshire OBJECTIVE: Residents and faculty identify intraoperative feedback as a critical component of surgical education. Studies have demonstrated that residents perceive lower quality and frequency of intraoperative feedback com- pared to faculty. These differences in perception may be due to dissimilar identification of feedback. The purpose of this study was to determine if residents and faculty dif- ferently identify intraoperative interactions as feedback. DESIGN: Residents and faculty viewed a segment of a laparoscopic cholecystectomy video and then time- stamped the video where they perceived moments of intraoperative feedback. Validated surveys on timing, amount, specificity, and satisfaction with operative feed- back were administered. SETTING: Viewing of the video and survey administra- tion was conducted at the University of Michigan. PARTICIPANTS: A total of 23 of 41 residents (56%) and 29 of 33 faculty (88%) participated in this study. RESULTS: Survey analysis demonstrated that residents perceived operative feedback to occur with less immedi- acy, specificity, and frequency compared to faculty. Dur- ing the 10-minute video, residents and faculty identified feedback 21 and 29 times, respectively (p = 0.13). Ten- second interval analysis demonstrated 7 statistically sig- nificant intervals (p < 0.05) where residents identified feedback less frequently than faculty. Analysis of these 7 intervals revealed that faculty were more likely to iden- tify interactions, especially nonverbal ones, as feedback. 
Review of free-text comments confirmed these findings and suggested that residents may be more receptive to feedback at the conclusion of the case. CONCLUSIONS: Using video review, we show that resi- dents and faculty identify different intraoperative inter- actions as feedback. This disparity in identification of feedback may limit resident satisfaction and effective intraoperative learning. Timing and labeling of feedback, continued use of video review, and structured teaching models may overcome these differences and improve surgical education. ( J Surg Ed 76:906À915. Ó 2019 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.) KEY WORDS: Intraoperative, Feedback, Video, Laparo- scopic cholecystectomy, Technical skills, Resident learners COMPETENCIES: Patient Care, Interpersonal and Communication Skills ABBREVIATIONS: BID, Briefing, Intraoperative teach- ing, and Debriefing; PGY, Postgraduate year INTRODUCTION Recently, surgical resident education has encountered constant pressure to adapt to decreased autonomy, increased medico-legal responsibilities, and the need to master a rising number of surgical techniques and nonop- erative skills.1À3 Because of these constraints, increased emphasis has been placed on making operative experien- ces as meaningful as possible.4À6 The current paradigm of operative education is based on “discovery learning,” which focuses on accumulation of experience and self- directed learning.7 However, implementation of faculty supported “guided discovery learning” results in more efficient and accurate learning than use of pure Funding/Support: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. Correspondence: Inquiries to Gurjit Sandhu, PhD, Department of Surgery, Sec- tion of General Surgery, 2207 Taubman Center, 1500 E. Medical Center Drive, SPC 5346, Ann Arbor, MI 48109; fax: 734-763-5615; e-mail: [email protected]. 
edu 906 Journal of Surgical Education 1931-7204/$30.00© 2019 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. https://doi.org/10.1016/j.jsurg.2019.02.003
  • 2.
    (unguided) discovery learning.8 Assuch, a central feature of guided learning includes incorporation of interactive feedback into daily practice patterns.7,9À11 Effective feedback uses the knowledge and profi- ciency level of the learner to provide a clear understand- ing of the goal and how to make progress towards that goal.10,11 Moreover, immediate informative feedback is essential for optimal improvement of performance.12,13 Inadequate feedback results in minimal improvement even for highly motivated learners.12,14 Providing ade- quate feedback requires recognition that feedback has occurred15 as well as faculty competence in delivering feedback and resident ability to receive and respond to feedback.14 Ineffective feedback results from learner misinterpretation of feedback as well as unintended or lack of response to feedback.11 As such, understanding resident and faculty perceptions is critical to effective delivery and receipt of feedback. Survey-based studies have identified differences in res- ident and faculty perceptions of timing, amount, speci- ficity, and effectiveness of feedback.16À19 However, survey-based studies lack the ability to describe the type of feedback accurately and also fail to identify nonverbal forms of feedback. Video review has the potential to minimize recall bias and eliminate variation by allowing all viewers to review identical video footage. Further- more, video review can supplement survey-based stud- ies by adding new levels of granularity after studying interactions that are deemed important by participants. To date, video review has frequently been used to sup- plement self-directed learning, coaching, evaluate tech- nical performance or understanding of operative procedures, and assess patient interactions.20À31 To our knowledge, we designed the first study to use video review to study differences in perceptions of feedback. 
We designed this study to confirm the findings of previ- ous survey-based studies on perceptions of perioperative feedback between faculty and residents. We also sought to supplement survey data by (1) determining if resi- dents and faculty would differently identify moments of operative feedback during video review of a laparo- scopic cholecystectomy and (2) characterizing any dif- ferently perceived moments. MATERIAL AND METHODS Study Design A schematic of the study design is shown in Figure 1. This study was deemed exempt by the Institutional Review Board at the University of Michigan (IRB no. HUM00084551). This study involved video review of a laparoscopic cholecystectomy performed by a postgrad- uate year (PGY) 2 surgical resident and a faculty surgeon. Video of a junior resident and a tenured faculty surgeon were chosen to maximize possible feedback events. Laparoscopic cholecystectomy was selected for several reasons. With over 700,000 cases performed annually, laparoscopic cholecystectomy is one of the most common cases performed in the United States.32 Furthermore, its minimally invasive nature permits exter- nal and intraabdominal viewing.33 Also, laparoscopic cholecystectomy has frequently been studied as a model procedure for skills assessment and education.34À37 Vid- eos of laparoscopic surgery were recorded using a video capture device connected to the laparoscope (Storz AIDA, DVD SCB 202040-20 Endoscopy Image and Video Capture). The external footage was captured using an Apple iPad Mini 2 (Apple, Cupertino, California). Exter- nal and laparoscopic videos were synchronized and combined using iMovie 10 (Apple, Cupertino, Califor- nia). 
A continuous 10-minute video was extracted that captured 5 minutes before and 5 minutes after identifica- tion of the critical view of safety, which involves com- plete dissection of the cystic duct and cystic artery prior to division.33,38 Identification of the critical view of safety is known to minimize bile duct injury during lapa- roscopic cholecystectomy. We chose 5 minutes prior to and after acquisition of the critical view to provide a rich opportunity for feedback. Next, participants independently reviewed the 10-min- ute video excerpt in iMovie 10 and were instructed to identify each moment where a new piece of feedback was initiated by timestamping the video (Fig. 1). Immediately after watching the video, participants completed an online modified version of a previously developed survey tool.19 This tool was piloted to a group of 3 residents and 4 fac- ulty for readability and comprehension prior to administra- tion (Supplemental Tables 1-2). The first part of the survey asked respondents to provide details about the types of feedback observed in the video. In the second section of the survey, participants were asked to convey their opin- ions on their current perioperative feedback experiences at our institution using a 5-point Likert scale and provide responses to open-ended questions. Participants Forty-one residents and thirty-three faculty from the gen- eral surgery residency program at the University of Michi- gan Hospital and Health Systems were recruited via email invitation to review videos. Participation was voluntary and consent was obtained. Residents and faculty partici- pated over the course of 1.5 nonoverlapping months. Data Collected Data were collected electronically from October to December 2015. Survey data were linked to video data Journal of Surgical Education Volume 76 /Number 4 July/August 2019 907
  • 3.
    using a uniqueidentifier. Timestamps were exported from iMovie into Final Cut Pro X 10.2.2 (Apple, Cuper- tino, California) to view and analyze markers. Results were anonymized prior to coding and analysis. Resident and faculty timestamping of feedback during the entire 10-minute video were compared. The video was reviewed in 10-second intervals in order to identify the distribution of feedback events. The number of time- stamped events during each 10-second interval of the video was compared in order to further assess specific moments of discrepancy between residents and faculty. Ten-second intervals were also selected to identify granular yet meaningful interaction. Survey responses were collected and analyzed. Free text comments were also reviewed for content. Statistical Analysis Comparisons of video timestamping and survey responses were made using Student t tests. Statistical analyses were performed using SPSS (IBM, Armonk, New York) and Microsoft Excel (Microsoft, Redmond, Washington). A p value of less than 0.05 was considered FIGURE 1. Overview of study design. Residents and faculty viewed a video excerpt of external and intra-abdominal laparoscopic cholecystectomy footage and indicated each new moment of feedback by video timestamping. Surveys were also completed by participants. Video timestamping and survey results were subjected to quantitative and qualitative analysis. 908 Journal of Surgical Education Volume 76 /Number 4 July/August 2019
  • 4.
    statistically significant. Subgroupanalysis was not con- ducted because of the small sample size in each cohort. RESULTS Study Participants The overall participation rate was 70% (52 of 74), which consisted of 56% of residents (23 of 41) and 88% of fac- ulty members (29 of 33). The average PGY level of the residents was 3.96 § 2.2 years (mean § SD). Junior resi- dents (PGY1-2) represented 30.4% of respondents. Resi- dent and faculty demographics are shown in Table 1. Survey Findings Our survey results demonstrated that residents and fac- ulty generally agreed on the importance of the timing and the subject areas of feedback. The only exception was feedback on knowledge of anatomy, which was identified as slightly less important by residents (p = 0.01; Fig. 2). Consistent with previous studies, resi- dents indicated that operative feedback occurred with decreased immediacy, specificity, and frequency than faculty (Fig. 2).16À19 This difference was most notable for specificity of feedback, p 0.0005. Residents and faculty demonstrated no significant difference regarding overall satisfaction with operative feedback, p = 0.12. Interestingly, faculty believed that residents solicited feedback less frequently than residents themselves indi- cated, p = 0.021. For all subject areas of feedback que- ried, residents perceived feedback less frequently than faculty, but this was most prominent for knowledge of anatomy, p 0.0005. 69.0% of faculty indicated that their ability to give feedback was rarely influenced by lack of instruction on delivery of feedback. A total of 55.2% of faculty indicated that concern for offending the resident rarely influenced their ability to give feedback. A total of 82.8% of faculty indicated that concern for resident evaluations rarely prevented their ability to give feedback. Finally, 55.2% of faculty indicated that time constraints impeded the ability to deliver feedback half or more of the time. 
Quantitative Analysis of Video Review Residents and faculty were asked to review a 10-minute segment of video footage of a laparoscopic cholecystec- tomy and identify each moment of feedback. After review of the video segment, residents on average identi- fied fewer moments of feedback (21.1 § 13.6) compared to faculty (29.3 § 21.5); however, this was not statisti- cally significant (mean § SD; p = 0.13, Fig. 3A). When analyzing the video segment in 10-second intervals to identify granular yet meaningful interactions, faculty identified feedback more frequently than residents at 7 discrete intervals (p 0.05, Fig. 3B). Characterization of Differently Identified Intervals and Free Text Responses The 7 differently perceived intervals identified on quanti- tative timestamp analysis were assessed for potential pat- terns, and 3 key patterns were noted (Table 2). First, all differently identified intervals lacked explicit mention of feedback by faculty or residents. Second, faculty were always more likely to identify an event as feedback com- pared to residents. Third, 3 of the 7 differently identified intervals included nonverbal interactions, such as point- ing to the monitor. These data indicate that none of the 7 interactions were overtly labeled as feedback by the attending or resident being videotaped, thus necessitating subjective identification of an event as feedback by resi- dent and faculty video reviewers. Furthermore, faculty are more likely to identify any interaction as feedback compared to residents including nonverbal interactions. Free-text comments by residents and faculty sup- ported these patterns (Table 3). In addition, free text comments suggested that residents may be more recep- tive to feedback at the conclusion of the case (Table 3). Together, these data suggest that differences in percep- tion of intraoperative feedback between residents and faculty may in part be due to differences in identification of an intraoperative event as feedback. 
DISCUSSION In this study, we implemented surveys to demonstrate that resident and faculty perceptions of operative feedback are different. We then applied video review of a laparoscopic cholecystectomy to show that these differences in percep- tions of feedback may be partly due to disparate identifica- tion of intraoperative interactions as feedback. Overall, our key findings indicate that residents are less likely to identify TABLE 1. Demographics of Residents and Faculty Who Partici- pated in the Study Residents n (%) Faculty n (%) Total 23 Total 29 Male 17 (73.9) Male 19 (65.5) Female 6 (26.1) Female 10 (34.5) PGY1 4 (17.4) Assistant professor 12 (41.4) PGY2 3 (13.0) Associate professor 9 (31.0) PGY3 3 (13.0) Professor 8 (27.6) ADT year 1 3 (13.0) ADT year 2 3 (13.0) PGY4 3 (13.0) PGY5 4 (17.4) ADT, academic development time is a dedicated year of research. Journal of Surgical Education Volume 76 /Number 4 July/August 2019 909
  • 5.
    FIGURE 2. SignificantSurvey Results. Comparison of resident and faculty survey responses relating to feedback. A five point Likert scaling system was used for all questions asked, with possible responses listed at the top of each table. Mean results are displayed (yellow circles represent faculty responses; blue squares represent resident responses). Bars represent 95% confidence intervals. *p 0.05 as determined by t test. 910 Journal of Surgical Education Volume 76 /Number 4 July/August 2019
  • 6.
    a particular constellationof intraoperative interactions as feedback compared to faculty. As such, our work is the first to demonstrate that video review can be used to study perceptions and can supplement survey findings. In agreement with Jensen et al., our survey findings demonstrated that residents perceived operative feed- back as being less frequent, specific, and timely com- pared to faculty.19 Interestingly, however, resident participants in our study demonstrated no statistical dif- ference in overall satisfaction with operative feedback compared to faculty. This may be due to the fact that res- idents may place increased weight on a small number of effective feedback interactions that contribute to overall skill development; however, further studies will be needed to characterize this finding. In 12% (7 out of 60) of the 10-second intervals, resi- dents were less likely to identify feedback resulting in potentially missed opportunities for learning and improvement in response to feedback. These observa- tions indicate that one’s position, either as resident or faculty, influences how feedback is observed and experi- enced. The varying responses between the 2 groups, despite review of the same operative excerpt, suggest diverging perceptions of feedback. The interactions that were preferentially identified as feedback by faculty were not overtly labeled as feedback in the video and FIGURE 3. Resident and faculty identified feedback events on video review. (A) The average number of feedback events identified per participant during the 10-minute video segment is shown for residents and faculty. Error bars indicate standard deviation. Mann-Whitney U statistical analysis yielded p = 0.13. (B) Average number of feedback events per participant for each 10-second interval during the 10-minute video segment are indicated for residents and faculty. *p 0.05 as determined by Student t test. Journal of Surgical Education Volume 76 /Number 4 July/August 2019 911
  • 7.
    TABLE 2. Analysisof Events Differentially Identified As Feedback 10s time interval Residents That Identified Feedback Events, n (%) Faculty That Identified Feedback Events, n (%) Transcript 10-20 8 (34.8) 22 (75.9) Faculty: Ah see, you’re getting there man. Spread there a little bit. Yeah, this is so great. Look at that. Just keep working and working and eventually it all comes out. 100-110 7 (30.4) 20 (69.0) Faculty: Mhmm. Mhmm. Mhmm. Yeah. 110-120 8 (34.8) 22 (75.9) Faculty: Yup. I like that. My intuition tells me there’s a little spot there that will give. 150-160 5 (21.7) 19 (65.5) Faculty: Do that one more time. You can see it’s a little nothing and it’s going right into the gallbladder. Just let that sit for a minute. Get this stuff over here down. Faculty points to monitor 300-310 0 (0) 6 (20.7) Faculty: We’ll take those little things crossing there. I don’t know if they’re little vessels or lymphatics. Faculty points to monitor 380-390 1 (4.3) 12 (41.4) Faculty: We can put one clip here, one clip here and cut between those. Want to do that? Resident: I agree. Faculty: Clip please. Clip applier please. Okay. Faculty points to monitor 550-560 9 (39.1) 25 (86.2) Faculty: Mhmm. The whole thing is just being so beautiful man! Where do you think the artery is? TABLE 3. Characterization of Free Text Comments Faculty are More Likely Than Residents to Identify an Event As Feedback Resident 1. As I've gotten more senior, I've noticed that only certain attendings make it a point to give directed and specific feedback. The overwhelming majority will give zero feedback. It seems there's more comfort with giving feedback to junior residents because it's more elementary and general skills. Perhaps there should be some education regarding how to actually gauge the performance of senior residents in order to provide high quality feedback. 2. Generally, intraoperative teaching is done a lot while specific, resident-focused feedback is less common. 
I most appreciate attendings who point out specific actions that I am either doing well or things that I am doing incorrectly. I least appreciate those who take the instrument away without explaining exactly what I am doing wrong. Faculty 1. Typically, I will deliver feedback as the case progresses after asking them to perform each task. This is verbal to begin with and if still unclear, I will show them how to perform the task. This seems to be quite effective. 2. Ideally feedback is given continuously throughout the procedure. In emergent situations, feedback that disrupts per- formance may have to be deferred until after the case is finished. Nonverbal Interactions May Be Identified As Feedback by Faculty Resident Residents did not provide relevant comments. Faculty 1. There was verbal feedback, which was almost non-stop. It was positive, corrective, general and sometimes very specific. In addition there was a lot of hands on feedback, which involved correcting the angle of the camera, direction of retraction, angle of instrument etc. 2. Feedback was continuous for all steps of the procedure, very specific with regard to what the resident should do at the specific time. Also feedback with regard to how to divide specific structures. Some feedback was not verbal, i.e. attending moved the resident hand to provide more retraction on the infundibulum for better visualization for dissection of the critical view of safety. Residents May Be More Receptive to Feedback at Conclusion of Case Resident 1. I wish we received more feedback postoperatively when I have time to really digest what I'm being told. This feedback can even be just one area to focus on so that next time I do a case with that attending, I'll be sure to keep that in mind. 2. I wish I could get more feedback on improving efficiency. 
I appreciate the intra-op feedback but would like more big- picture strengths/weakness feedback after cases so I know where to focus and how to improve for the following cases. Faculty 1. I provide ongoing feedback throughout the operation. I try to debrief after an operation but most often forget. 912 Journal of Surgical Education Volume 76 /Number 4 July/August 2019
  • 8.
    included nonverbal interactions.These findings suggest that residents are less likely to interpret an intraopera- tive interaction as feedback. Comments from residents also indicate that intraoperative interactions are more likely to be interpreted by residents as intraoperative teaching rather than feedback (Table 3). Feedback in clinical education has been defined as information on a trainee’s observed performance in comparison with a standard in order to improve the trainee’s perfor- mance.39 Importantly, feedback is learner-centered in that it is personalized and oriented to the trainee’s goals.14,40,41 Teaching, on the other hand, is educator- centered, and content is structured as a one-way transfer of knowledge and skills.42 In this study we intentionally withheld definitions of teaching or feedback. Residents may have been more likely to identify intraoperative interactions as teaching rather than feedback because intraoperative interactions may not be aligned to the goals of the trainee. For example, one resident com- mented, “I get relatively little feedback, and it usually focuses on minute or less-important aspects of the procedure.” Additional studies are necessary to delineate why residents are less likely to identify certain interac- tions as feedback. However, identifying trainee goals and labeling interactions as feedback may facilitate reception of feedback by the trainee. Interestingly, faculty comments suggested that non- verbal feedback was an integral component to intraoper- ative feedback. In contrast, however, no residents commented on nonverbal feedback signifying that resi- dents may not recognize nonverbal feedback. Our results differed from previous survey-based findings, which demonstrated no difference between faculty and resident perceptions of nonverbal feedback.16 This dif- ference is in part due to recall bias, which is inherent to survey-based findings. 
Finally, multiple resident com- ments specify that residents may not be primed to receive feedback during the procedure, but may be more receptive to feedback immediately postopera- tively. An interaction that occurs immediately postopera- tively may facilitate bidirectional and thus personalized conversation, thereby allowing the trainee to identify the interaction as feedback. Together, these data suggest that changes in delivery and reception of feedback may enhance intraoperative learning. Our data suggest that labeling of feedback during the deliv- ery of verbal and nonverbal feedback as well as review of feedback immediately postoperatively may surmount chal- lenges to intraoperative surgical education by optimizing reception of feedback and intraoperative learning. One educational framework that intentionally integra- tes feedback into surgical practice is the Briefing, Intrao- perative teaching, and Debriefing (BID) model.4 This model implements a preoperative briefing in which specific learner objectives are established; intraoperative feedback, which focuses on those objectives; and a debriefing at the end of the case, which provides recom- mendations for future improvement.4 The BID model accounts for the importance of specificity, learner involvement, and goal orientation in operative feed- back.4 Application of the BID model has resulted in improved resident perception of nonverbal feedback in addition to the frequency, clarity, and effectiveness of operative feedback. These findings suggest that the BID model may prime both the teacher and learner to admin- ister and receive feedback.12 Our results, which show varying perceptions of specificity, learner involvement, goal orientation, and nonverbal feedback by residents and faculty, support use of the BID model, which may improve the intraoperative learning environment for the resident learner. 
However, free text comments from fac- ulty in our survey revealed that the immediately postop- erative debrief was felt to be unnecessary because preoperative briefing and intraoperative feedback were sufficient. Further studies are necessary to determine if the postoperative debrief results in improved alignment of expectations of feedback by both residents and fac- ulty. Certainly, it will be interesting to apply video review after implementation of the BID model to deter- mine how residents and faculty may differentially per- ceive the postoperative debrief. This study has several limitations. Participation from a single institution may limit generalizability. The resident and faculty portrayed in the video were members of the General Surgery program at the University of Michigan and were likely to be familiar to respondents. Thus, prior interactions with either individual may have altered timestamping of feedback by study participants. In addi- tion, the study was performed over the course of 3 months. Respondents that participated near the study conclusion, may have been influenced by the opinions of early participants. Furthermore, we analyzed the video segment in 10-second increments in order to per- mit detailed examination of single interactions. Further studies with a greater number of participants will be required to perform alternative interval division analysis. Moreover, due to our limited sample size we were also unable to perform subgroup analysis. Finally, as with all studies with voluntary participation, results may reflect selection bias. Faculty participation of 88% suggests this bias is less likely to play a role in faculty results than resi- dent results where participation is 56%. CONCLUSIONS In conclusion we confirmed that surgical residents and faculty members have differing perceptions on operative Journal of Surgical Education Volume 76 /Number 4 July/August 2019 913
feedback. Here, we used a novel video review methodology to characterize perceptions of feedback during the course of an operation. This method is widely applicable to other learning environments, such as medical education or patient education, where elucidation of differences in perception is necessary.

ACKNOWLEDGMENTS

The authors would like to thank Ali Jones for her assistance with construction of the website on which the survey was administered. We also thank Lauren Wancata, MD, and Niki Matusko, BS, for critical review of the manuscript. Niki Matusko also provided statistical support.

Ethical approval: This study was deemed exempt by the Institutional Review Board at University of Michigan on April 8, 2015 (HUM00084551).

REFERENCES

1. Teman NR, Gauger PG, Mullan PB, Tarpley JL, Minter RM. Entrustment of general surgery residents in the operating room: factors contributing to provision of resident autonomy. J Am Coll Surg. 2014;219:778–787.

2. Malangoni MA, Biester TW, Jones AT, Klingensmith ME, Lewis FR Jr. Operative experience of surgery residents: trends and challenges. J Surg Educ. 2013;70:783–788.

3. Champagne BJ. Effective teaching and feedback strategies in the OR and beyond. Clin Colon Rectal Surg. 2013;26:244–249.

4. Bell RH Jr. Why Johnny cannot operate. Surgery. 2009;146:533–542.

5. Chung RS. How much time do surgical residents need to learn operative surgery? Am J Surg. 2005;190:351–353.

6. Snyder RA, Tarpley MJ, Tarpley JL, et al. Teaching in the operating room: results of a national survey. J Surg Educ. 2012;69:643–649.

7. Roberts NK, Williams RG, Kim MJ, et al. The briefing, intraoperative teaching, debriefing model for teaching in the operating room. J Am Coll Surg. 2009;208:299–303.

8. Mayer RE. Should there be a three-strikes rule against pure discovery learning? Am Psychol. 2004;59:14–19.

9. Accreditation Council for Graduate Medical Education. Program requirements for graduate medical education in surgery.
Available at: http://www.acgme.org/acgmeweb/portals/0/pfassets/programrequirements/440_general_surgery_07012014.pdf. Accessed December 28, 2015.

10. Ende J. Feedback in clinical medical education. JAMA. 1983;250:777–781.

11. Black P, William D. Developing the theory of formative assessment. Educ Asses Eval Acc. 2009;21:5–31.

12. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406.

13. Hupp JR. Strengthening feedback in surgical education. J Oral Maxillofac Surg. 2017;75:229–231.

14. Hewson MG, Little ML. Giving feedback: verification of recommended techniques. J Gen Intern Med. 1998;13:111–116.

15. Hoffman RL, Petrosky JA, Eskander MF, et al. Feedback fundamentals in surgical education: tips for success. Bull Am Coll Surg. 2015;100:35–39.

16. Butvidas LD, Anderson CI, Balogh D, et al. Disparities between resident and attending surgeon perceptions of intraoperative teaching. Am J Surg. 2011;201:385–389.

17. Liberman AS, Liberman M, Steinert Y, et al. Surgery residents and attending surgeons have different perceptions of feedback. Med Teach. 2005;27:470–472.

18. Rose JS, Waibel BH, Schenarts PJ. Disparity between resident and faculty surgeons' perceptions of preoperative preparation, intraoperative teaching, and postoperative feedback. J Surg Educ. 2011;68:459–464.

19. Jensen AR, Wright AS, Kim S, et al. Educational feedback in the operating room: a gap between resident and faculty perceptions. Am J Surg. 2012;204:248–255.

20. Vaughn CJ, Kim E, O'Sullivan P, et al. Peer video review and feedback improve performance in basic surgical skills. Am J Surg. 2016;211:355–360.

21. Scott DJ, Rege RV, Bergen PC, et al. Measuring operative performance after laparoscopic skills training: edited videotape versus direct observation. J Laparoendosc Adv Surg Tech A. 2000;10:183–190.

22. Beard JD, Jolly BC, Newble DI, Thomas WE, Donnelly J, Southgate LJ. Assessing the technical skills of surgical trainees. Br J Surg. 2005;92:778–782.
23. Zevin B, Bonrath EM, Aggarwal R, et al. Development, feasibility, validity, and reliability of a scale for objective assessment of operative performance in laparoscopic gastric bypass surgery. J Am Coll Surg. 2013;216:955–965.e8.

24. Abdelsattar JM, Pandian TK, Finnesgard EJ, et al. Do you see what I see? How we use video as an adjunct to general surgery resident education. J Surg Educ. 2015;72:e145–e150.

25. Trelease RB. From chalkboard, slides, and paper to e-learning: how computing technologies have transformed anatomical sciences education. Anat Sci Educ. 2016. Epub.

26. Makary MA. The power of video recording: taking quality to the next level. JAMA. 2013;309:1591–1592.

27. Birkmeyer JD. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369:1434–1442.

28. Herrera-Almario GE. The effect of video review of resident laparoscopic surgical skills measured by self- and external assessment. Am J Surg. 2016;211:315–320.

29. Paskins Z, McHugh G, Hassell AB. Getting under the skin of the primary care consultation using video stimulated recall: a systematic review. BMC Med Res Methodol. 2014;14:101.

30. Hu YY, Peyre SE, Arriaga AF, et al. Postgame analysis: using video-based coaching for continuous professional development. J Am Coll Surg. 2012;214:115–124.

31. Alken A, Tan E, Luursema JM, Fluit C, van Goor H. Coaching during a trauma surgery team training: perceptions versus structured observations. Am J Surg. 2015;209:163–169.

32. Stokes CS, Krawczyk M, Lammert F. Gallstones: environment, lifestyle and genes. Dig Dis. 2011;29:191–201.

33. Strasberg SM, Hertl M, Soper NJ. An analysis of the problem of biliary injury during laparoscopic cholecystectomy. J Am Coll Surg. 1995;180:101–125.

34. Smith SG, Torkington J, Brown TJ, Taffinder NJ, Darzi A. Motion analysis. Surg Endosc. 2002;16:640–645.

35. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190:107–113.

36. Chang L, Hogle NJ, Moore BB, et al. Reliable assessment of laparoscopic performance in the operating room using videotape analysis. Surg Innov. 2007;14:122–126.

37. Kramp KH, van Det MJ, Hoff C, Lamme B, Veeger NJ, Pierie JP. Validity and reliability of global operative assessment of laparoscopic skills (GOALS) in novice trainees performing a laparoscopic cholecystectomy. J Surg Educ. 2015;72:351–358.

38. Strasberg SM, Brunt LM. Rationale and use of the critical view of safety in laparoscopic cholecystectomy. J Am Coll Surg. 2010;211:132–138.

39. van de Ridder JM, Stokking KM, McGaghie WC, et al. What is feedback in clinical education? Med Educ. 2008;42:189–197.

40. Bienstock JL, Katz NT, Cox SM, et al. To the point: medical education reviews—providing feedback. Am J Obstet Gynecol. 2007;196:508–513.

41. Kornegay JG, Kraut A, Manthey D, et al. Feedback in medical education: a critical appraisal. AEM Educ Train. 2017;1:98–109.

42. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc. 2004;24:177–181.

SUPPLEMENTARY INFORMATION

Supplementary material associated with this article can be found in the online version at doi:10.1016/j.jsurg.2019.02.003.