Article Category: Research Article
Online Publication Date: 05 Oct 2022

Standardized Patient Encounters and Facilitated Debrief Impact Teaching Pedagogy and Programmatic Improvements

Page Range: 162 – 173
DOI: 10.4085/1947-380X-21-087

Context

Standardized patients (SPs) are assessment measures used within athletic training education to provide consistent assessment results during professional education. However, it remains unclear how educators use SP encounters and facilitated debrief to improve teaching pedagogy and the overall curriculum.

Objective

To understand how athletic training educators use SP encounters and facilitated debrief to inform teaching pedagogy and programmatic improvements.

Design

Consensual qualitative research.

Setting

Individual interviews.

Patients or Other Participants

Thirteen educators (12 female, 1 male; age = 36 ± 4.79 years; teaching experience = 10 ± 5.83 years).

Data Collection and Analysis

Data were collected via semistructured interviews, which were recorded and transcribed verbatim. Using a consensual qualitative research design, data were independently analyzed by a 3-person team, who independently coded the data and compared ideas until consensus was reached. Trustworthiness was established through member checks, multi-analyst triangulation, and external review.

Results

Three themes emerged regarding how SP encounters and facilitated debrief inform teaching pedagogy and curriculum development: (1) mechanisms for programmatic improvement, (2) pattern identification in student performance and behavior, and (3) aids in transition to practice. Participants described mechanisms for programmatic improvement through changes occurring in didactic courses, driving decisions for clinical education, and identifying needs for personal development, including both faculty and preceptors. Participants found the ability to identify patterns in student performance and behaviors, both individually and globally, particularly useful for informing curricular and course improvements. Aiding the transition to autonomous clinical practice was also identified as important in preparing students for patient care.

Conclusion(s)

Standardized patient encounters and facilitated debrief are being used by athletic training educators to inform classroom teaching while also serving as data points in making decisions regarding programmatic improvements. As SP and facilitated debrief use increases, educators need to reflect on how these encounters can inform teaching pedagogy and drive programmatic improvement.

KEY POINTS

  • As the use of standardized patients and facilitated debrief increases in athletic training education, educators should consider how student performance during these encounters can inform teaching pedagogy and programmatic improvement.

  • Educators have found observing students' performance during standardized patient encounters beneficial for identifying patterns in student performance and behavior, both in individual students and globally as a group.

  • Standardized patients are beneficial in aiding in students' transition to practice.

INTRODUCTION

Athletic training educators are challenged to use assessment methods capable of driving continued program improvement (Standard 4).1 Purposeful assessment drives instruction and affects learning, allowing assessment, pedagogy, and curriculum to work together. This is no simple feat, as assessment measures within a programmatic assessment plan must include assessment of the quality of instruction, quality of clinical education, student learning, and overall programmatic effectiveness (Standard 5).1 While programmatic assessment plans vary, key components include good outcome measures and data to use as benchmarks for validating program effectiveness or guiding programmatic improvements.2 Within athletic training, standardized patients (SPs) have been documented as beneficial assessment tools for improvements in students' clinical and interpersonal skills,3–6 patient-centered care,7,8 and confidence.6,9,10

Generating rich assessment data can be accomplished using authentic assessments, which require students to use prior knowledge, recent learning, and relevant skills to solve realistic, complex problems.11 Standardized patients and facilitated debrief are examples of authentic assessments, as they replicate real-world challenges and standards of professional performance. Standardized patients provide opportunities for both formative and summative assessment.6 As a formative assessment, teaching encounters provide opportunities for students (typically in small groups) to engage a patient in a nonthreatening environment with an instructor present.5,6 Teaching SPs allow the use of the time-in-time-out method, where learners may stop the patient encounter to ask for assistance from the instructor or patient.5,6 As a summative assessment, evaluation encounters are used to evaluate learning throughout a course, such as at the end of a learning module. Completed individually, evaluation encounters are useful in assessing the student's ability to provide holistic patient-centered care for a certain clinical task,5,6 such as a shoulder evaluation or implementing therapeutic interventions. Both teaching and evaluation encounters conclude with a facilitated debrief, often conducted within 24 to 48 hours after the simulated encounter.

Facilitated debrief is a guided discussion that provides students and educators with an opportunity to explore students' perceptions, attitudes, skill application, and decision making, all critical for reflective learning.12 The debrief, often completed as a small group discussion with all students present, allows students to reflect on their actions or inactions during a patient encounter, while also developing strategies that can be applied in subsequent patient encounters.13 The debrief after an SP encounter allows faculty to understand the thought processes behind the actions and clinical decisions students make during the encounter.14 Both SP encounters and facilitated debrief serve as sources of assessment data within the programmatic assessment plan; however, no research documents their use in athletic training. Therefore, the purpose of this investigation was to explore how athletic training educators use SP encounters/simulations and facilitated debrief to inform teaching pedagogy and curriculum development.

METHODS

Design

This study used a qualitative research design performed and analyzed in a consensual qualitative research (CQR) format (Figure 1). Consensual qualitative research was developed through the integration of phenomenology, grounded theory, and comprehensive process analysis.15 Combining these approaches allows for an increased emphasis on multiple researchers gaining consensus throughout the analysis process to identify the best representation of the data.15 The use of open-ended questions via semistructured interviews guided data collection. Three researchers took part in the CQR data analysis process, where each individual reviewed the data independently and then came together to reach consensus on the meaning of the data.15 This approach used multiple perspectives to enable the research team to capture some of the complexities of the data, as different members perceived the data differently.15 The consensus process is important for establishing trustworthiness, as it allows multiple perspectives to be combined during data analysis to represent the data thoroughly and richly. Thus, if multiple researchers have independently analyzed the data and agree on the interpretation, there is confidence that similar individuals would also agree on that interpretation.15 The Standards for Reporting Qualitative Research checklist was used as a means of quality assurance of the investigation and reporting (Table 1).

Figure 1. Research procedure flow chart.

Citation: Athletic Training Education Journal 17, 2; 10.4085/1947-380X-21-087

Table 1. Standards for Reporting Qualitative Research Checklist

Participants

Participants were purposefully recruited for their known experience with SP encounters in professional education in an athletic training program. For inclusion in this study, participants had to have administered at least 8 SP encounters within a professional education course or program and to have completed at least 1 year of professional practice as a certified athletic trainer at the time of the interview. From the known population (identified through completion of formal training in using SPs), random sampling was completed to identify potential participants. Snowball sampling was also used during data collection, where participants informed the researchers of other eligible educators who met the inclusion criteria. Thirteen athletic training educators (12 female, 1 male; mean age 36.92 ± 4.78 years) volunteered to participate. Participants included program directors (n = 3), clinical education coordinators (n = 5), and faculty (n = 5) and encompassed professional master's and bachelor's programs. Participant demographics are presented in Table 2. Each participant was assigned a pseudonym.

Table 2. Participant Demographics

Instrumentation

The interview guide was developed by the primary researchers based on the purpose of the study. The aim of the interview guide was to determine how SPs and facilitated debrief impacted teaching pedagogy and programmatic improvements. Following CQR, the interview guide was developed based on previous research.8–10,16–18 The interview guide was reviewed for content and clarity by the other members of the research team, as well as 2 educators trained in qualitative research design. The primary researcher performed pilot testing of the interview guide with 2 educators who did not meet the inclusion criteria, and modifications were made to develop the final interview guide (Table 3).

Table 3. Final Interview Guide

The semistructured interview guide included 11 open-ended questions. The first 6 questions pertained to how participants use SPs/simulation within their athletic training curriculum (eg, method of use, specific courses, or content area). The remaining 5 questions addressed how information gleaned from SP encounters and facilitated debrief informed participants' teaching pedagogy and impacted programmatic improvements. Subset questions were used as a prompting tool throughout each section of the interview guide. Keeping the interviews semistructured allowed the data collection to remain robust, so that participants could elaborate where appropriate.

Data Collection

Participants who met the criteria were contacted via email to obtain consent for participation and their preferred method of interview. Once informed consent was obtained, interviews were scheduled. According to participant availability, interviews were conducted via telephone or in person at the Athletic Training Educators Conference. Semistructured interviews of approximately 45 minutes were used to collect data. Interviews were conducted by the researchers until data saturation occurred. To maintain anonymity, each participant was given a pseudonym. To initiate the interview, the researcher described the purpose of the study and reviewed confidentiality concerns to address all participant questions and obtain informed consent. No time constraints were placed on the interviews.

Procedures

Institutional review board approval was obtained before data collection. Participants who responded to the recruitment email were screened for inclusion criteria, and an interview was scheduled. Once consent was obtained, the primary researcher performed the interviews via telephone or in person, which lasted 30 to 45 minutes. All interviews were audio recorded and transcribed verbatim by a professional transcriptionist. The transcripts were reviewed by the research team and cleaned for accuracy.

Data Analysis

All members of the research team were trained in qualitative analysis and CQR. Data were analyzed via the CQR method, which consists of independent review by each member of the research team, followed by group consensus to develop a preliminary and final codebook.15 Each member of the research team had experience and expertise in using SPs and in using qualitative research methodologies. The consensus process allows for multiple-analyst triangulation, and the variety of viewpoints helps decrease researcher bias while developing an understanding of the meaning of the data.15 Frequency counting allows a depiction of the representativeness of the data by determining how often each category was applied across the sample.15 Data saturation was confirmed.

Trustworthiness

Several strategies were implemented to establish the trustworthiness of this study. Transcripts were sent to participants to ensure accuracy of the interview as a method of member check.19 As described by CQR, member checking not only allows the participant to check for accuracy but also to provide further comments on how well the data represent the experience described.15 Member checks were completed, with all participants indicating that the transcripts were accurate representations of their interviews. Additionally, an external auditor reviewed the transcripts and codebook to gain further consensus on the codebook. This ensured accuracy of data representation, while allowing for multiple perspectives to investigate the meaning of the data.

RESULTS

Three themes emerged regarding how information gleaned from student performance during SP encounters and debriefing facilitates purposeful change in instruction and learning: (1) mechanisms for programmatic improvement, (2) pattern identification in student performance and behavior for curricular improvement, and (3) aids in transition to practice. The mechanisms for programmatic improvement were divided into 3 categories: changes occurring in didactic courses, driving decisions for clinical education, and identifying needs for personal development, including both faculty and preceptors. The ability to identify patterns in student performance and behaviors was divided into 2 categories: patterns occurring globally across students and patterns observed in individual students. Aiding in the transition to autonomous clinical practice was divided into 3 categories: safe environment, authenticity, and remediation. See Figure 2 for a representation of the data.

Figure 2. Conceptual framework of qualitative data.


Mechanisms for Programmatic Improvement

Mechanisms for programmatic improvement included comments regarding how data were used for programmatic improvements. Comments were divided into 3 subcategories: changes occurring in didactic courses, driving decisions for clinical education, and identifying personal development needs. Data are presented by each subcategory.

Changes Occurring in Didactic Courses

Many participants referenced how elusive the process was of understanding what they could do to better guide students' learning. Despite best efforts in planning, many participants observed skill application errors from their students during SP encounters that were hard to understand, given that they felt they had taught the material in a way that should have prevented those errors. This observed disconnect, combined with the facilitated debrief discussion, prompted self-reflection on their pedagogy and classroom training. Participants described information learned from SP encounters and student debrief sessions as one of the most impactful means of understanding their students' ability to process information. This process provided a lens into how students think, allowing the educator to better determine how to supplement or course correct that thinking. As Colby stated:

One can learn about the student's clinical skills and abilities. . . especially if they have some misconceptions, I feel like that comes out pretty quickly. When you are doing [SPs], they can see that they don't know something, or they misunderstand. I try to get them to discuss their train of thought so I can understand maybe if they're misunderstanding a term or have misinformation versus a lack of confidence.

Participants also highlighted that the SP and debrief process was helpful in identifying gaps in students' knowledge and analyzing where those gaps may come from. Marley commented:

Okay, where is that gap coming from? Are they not using it [knowledge or skill] in their clinical experience? Were they not super comfortable when they left class? Maybe we need to build something into our classes that refreshes that skill.

Similarly, the use of SPs and debrief provided clarity on how to better scaffold course learning opportunities. Participants described watching SP encounters that did not go well as impactful wake-up calls for their teaching. They were reminded that the knowledge and skill they possess is novel to their students. These encounters and debriefs reminded them to reflect on whether they were skipping steps or omitting instruction needed for the outcomes they hoped to achieve. As Taylor shared:

I don't know why they didn't get this. We've went over it several times. . . when students continued to miss things we covered in class multiple times, I recognized that my own teaching interventions may need to change, be adapted, or be broadened during class to help students increase their mental flexibility in how they apply their skills to various situations.

Whether using SPs for teaching or assessment, participants reported attaining a better understanding of a student's readiness for skill application and realizing where they had failed to prepare students for expected outcomes, which prompted future course changes. As Sawyer shared:

We recognize communication, transition to care, and that handoff was rough. When we set them up for that we are going to give them more specific things they should be doing. We just said, “Oh, just go in there and have a conversation with them.” It was a mess. Now we literally have a form of how to have that conversation. Here are some good introductory questions. Here's lines of thinking and behavior that you need to get at, and this is how to do that.

Participants noted several course-level changes that resulted directly from observing students' participation in SP encounters and debrief. The standardized nature of the SP process helped participants see what students globally could not do. If all students were struggling, participants recognized that their educational strategies needed to be reviewed. Blake stated:

We have definitely added more documentation experience after receiving their SP documentation. We also have instituted more phone communication. Practices of phone communication, conversation with the parents, conversation with team physician. Those are skills that we would role play it in class before they get to the SP encounter. We go through scripts and talk about what are the elements you need to present with. Students were not able to communicate by phone very well.

Driving Decisions for Clinical Education

Participants described the SP and debrief process as helpful in placing students in increasingly complex clinical education experiences leading to autonomous patient- and client-care experiences, while also illuminating students' educational needs that could be bolstered through intentional clinical experience placements. Many participants indicated they could determine which students needed exposure to specific patient populations or pathologies and would therefore place those students in clinical experiences that could provide a higher volume of such populations or pathologies. As Peyton explained:

With the SP encounters, I'm able to see a full eval top to bottom, and I'm able to see their strengths and weaknesses, and it has really helped me. I know some clinical placements, they can have more of a focus on documentation and patient care, where some are going to be more rehab intensive, or some are going to be evaluation intensive. I am able to see really what their weaknesses and strengths are in a very controlled setting. That helps me place them in a place they need to work on that.

Participants identified the partnership between didactic and clinical experience educators as important as they collaborate to advance student development. Some participants use SP and debrief data to build student profiles that are shared with preceptors. The encounter and debrief also informed didactic teaching that can better close the loop between didactic and clinical education. Elliott remarked:

SP encounters identified student behaviors that can be used in preceptor development discussions, especially in situations that we see common errors and identify that the students are not getting adequate practice opportunities during clinical education.

Similarly, combining the information learned from SP and debrief with outcomes data derived from clinical education patient encounter logs allowed participants to determine what SP encounters would best supplement patient care. As Elliott continued:

Using students' patient encounter logs, we are able to see what our students are doing and see what our students are not doing [in clinical education]. We try to design our SPs based on what they are not seeing much of in clinical education, allowing students to demonstrate and work on those skills.

SP encounters have also been a meaningful way to supplement clinical education through simulation. Marley stated:

Previously we've used them a lot in a general medical, behavioral health, both of those types of courses to supplement things that students don't always see alone in clinical education, less in an evaluative way and more a teaching strategy in terms of supplementing the things that they don't get a lot of experience with.

Identifying Personal Development Needs

Participants perceived the use of SPs and debrief as an important mechanism for self-reflection on their teaching. Participants reported that SPs and debrief increased their satisfaction in teaching, as this instructional strategy provided a space for students' skill application followed by a guided discussion highlighting thought processes and decision making. These encounters and debriefs helped participants understand what students needed to do better and what they, as course instructors, needed to do better. As Blake shared:

It's given me a new point of reference of how my teaching style and how I feel invigorated in the classroom, really makes the learning more fun. Objective Structured Clinical Examinations (OSCEs) and SPs were an extension of that. It was a lot more problem-based, a lot more active learning.

Participants found themselves reflecting on their teaching practice to better understand whether they felt foundational knowledge was paced appropriately, whether they had provided enough explanation or skill practice, and whether the students were adequately prepared for the SP encounter from a course planning perspective. As Taylor described:

Did I time the SPs correctly? Were the teaching points and activities in the classroom accurately teaching the correct knowledge, skills, behaviors, and attitudes? Educators have the opportunity to identify students' patterns that can make them better educators. If we use those patterns in our didactic classrooms, we can attempt to develop students ready to navigate those situations better in clinical practice.

Similarly, SPs and debrief were helpful in identifying where individual-level improvements in both faculty and preceptor expertise were needed. Blake commented:

The gaps in our faculty and preceptor expertise were revealed. Whether or not they are teaching up-to-date information, that's been something, or the students are relying on their preceptors, and the information or techniques they are advocating are no longer the best evidence. Those are the things we have identified in the debriefing where so and so said and so and so said, and this is where I got this from. Okay, so how do we deal with that?

The SP and debrief also exposed discrepancies between clinical practice and didactic teaching. These realizations created opportunities for conversations within the program to bridge identified gaps through preceptor training and discussions. As Blake described:

We've had differences of opinion on what's right. I think that, in some cases, it's philosophical. The modality controversy is whether or not to ice. How do we talk about that, and what should the clinician tell the patient? Ice, don't ice, how long, and why to ice. We have to come back to let's all get together and discuss. What are we going to tell the students? You can all have your own interpretation, but what does the evidence say? That one's still not been resolved because of the different competing belief in recovery ice versus injury, etc.

Pattern Identification in Student Performance and Behavior for Curricular Improvement

The second theme, pattern identification in student performance and behaviors for curricular improvement, included comments regarding patterns participants identified through SP encounters and facilitated debrief. Two subcategories emerged, including global pattern recognition and individual pattern recognition. Data are presented by each subcategory.

Global Pattern Recognition

Broad and global themes, such as medical documentation, communication, holistic health care, and cultural competency, were commonly mentioned as areas in which students were not achieving outcomes set by their respective programs. By observing student performance during SP encounters and listening to student debrief, participants recognized the need to resequence course content by moving some knowledge and skills earlier in the course sequence and holding off on others to ensure students had the appropriate foundational knowledge. Blake stated:

I began having students conduct history taking and patient communication experiences early in the curriculum so that they could develop cultural competence, they could develop cultural awareness, that they would be able to make sense of their own personal biases of patients. I would want them to be confronted with that early. Then we can really respond to why this is a factor and provide patient care.

Participants reported using information learned from SPs and debrief as a positive source of data for improving students' knowledge and skill attainment, specifically related to confidence, communication, the evaluation process and differential diagnosis, critical thinking, decision making, and students' ability to self-reflect. Understanding these patterns helped participants make course changes or confirmed that their students were meeting learning objectives.

Confidence

Participants identified the need to teach beyond clinical skill understanding and application. Through the SP encounter and debrief, participants learned their students' confidence levels and their ability to navigate the experience. This understanding of students' confidence helped participants identify misconceptions in perceived confidence levels that could be mitigated through professional discussions and positive reinforcement. As Alex recalled:

Sometimes it's interesting to look at how they perceive how we see them versus how they see their own self. Maybe totally different. Where we see them as strong, sometimes they don't see themselves in that area that well.

Participants acknowledged the challenge in building confidence in students and found the use of SPs, especially the debrief process, was critical in fostering students' confidence.

Communication

Participants identified the usefulness of SP encounters and debrief for practicing communication skills with patients, as well as for dialoguing about the challenges students face with communication. Many identified this as a foundational skill needed early in a student's education. As Carter explained:

I feel like it's made me reflect on the whole communication piece that I feel like that needs to be emphasized earlier on in our program. That's the part I might be evaluating on is their ability to communicate. Not necessarily on did you get the right differential and initial diagnosis.

Furthermore, the SP encounter and debrief were helpful in establishing communication that prioritizes patient-centered care. Riley commented that her program has incorporated several teaching SPs into their curriculum, and that has encouraged incorporation of the communication side of evaluation into their didactic course, where historically communication was only assessed in clinical education. This is another example of how SP encounters and debrief have been beneficial in closing the educational loop.

Critical Thinking

Participants described their longstanding goal to teach students to think critically regarding their clinical practice. Standardized patient encounters followed by a debrief provided an impactful way to prompt critical thinking. The debrief allowed participants' students a space to objectively analyze and evaluate how the SP encounter went. The group analysis in the debrief helped the students form judgments on their previous actions and their future actions. As Marley described:

The students were discussing different strategies they had used and different things that had happened. The debrief was a really good place for them to get it all out. They ended up having a really interesting discussion without any prompting from me.

Participants also felt they could explore advanced steps in managing patient care to challenge students' ability to handle the next steps in patient care. For example, Sawyer described:

We just did one [SP] where we gave an ankle case and depending on what they thought they needed we gave them a phone number for a parent, the phone number for a physician, and they knew they could've called 911. I guess, in a way, we kind of got at their thinking about how urgent the situation was, and did I need to refer this, or do I need to call mom right now or later or what? In some of the higher-level encounters where there are multiple things to manage, I think we get their critical thinking.

Decision Making

Participants perceived that SP encounters and debrief supplemented the development of clinical decision-making skills in their students. Participants felt strongly that SPs and debrief were a critical component of developing the learner's role professionally and clinically. As Marley stated:

Students have a hard time making decisions. They're not sure of themselves. They are startled by things they haven't seen before. I think one thing I like about SPs, it gives us an opportunity to explore things they don't always get to see in clinicals and make decisions or be in an instance where they are the clinician.

Participants also recognized that SP encounters provide an experience students perceive as stressful, mirroring how they feel in real-life patient care. Many times in clinical practice, preceptor supervision lightens the cognitive load on the student. Participants found SP encounters and debrief effective in promoting the linkage of didactic content to clinical problems that require flexible application of knowledge and skills. As Marley continued:

They [students] are protected by a preceptor, which is good. That's what we want. That's the goal, but the preceptor sometimes takes the brunt of that emotional experience or that experience in general. I think SPs give us a place where we can put them in scenarios where, “You're the only clinician in the room. I need you to make a decision.”

Student Self-Reflection

Participants considered student self-reflection a particularly important skill to develop so that early career professionals are prepared to self-reflect during future clinical practice. Blake articulated:

My attempt [in reference to the debrief process] at providing these realistic encounters was to provide students with an opportunity to fail, provide them to be presented with a very stressful environment where it's safe to fail, then also the intentionality of reflecting on what you did right. We always begin with, “What did you do right? How did you feel coming out? What could you have done better? What are some of the things you probably could work on? Then what are you going to take away and learn from this?”

Debrief was described by participants to be a critical component to develop self-reflection and clinical reasoning. The method prioritized student-centered learning and provided an opportunity for cognitive reframing, exploring various perspectives, and solution generation for the students. As Charlie shared:

In their experience, quality debrief is essential for self-reflection. A formal debrief framework, such as the Debrief Diamond,20 provides a structure that keeps the instructor focused on guiding the student through the self-reflection process. Without this, the debrief process is sabotaged with stories of the instructor saying, “When I was an athletic trainer.”

Participants described how debrief prioritized centering the experience of the student, essential to allowing time and space to self-reflect, allowing the students to verbalize, process, and discuss their clinical experience.

Individual Pattern Identification

Participants reported that SP encounters and debrief were important for identifying patterns within individual students as well.

Development of Soft Skills

Participants noted improvements in students' ability to interact effectively with their patients during SP encounters. In reflecting on interactions through debrief, students learned from each other how communication was established and methods that could be useful for them in the future. As Taylor explained:

I also think about, how do I better foster soft skills in the students? What does their communication skills, bedside manner, note taking, and attention to the patient look like when they are in the SP encounter? These are important skills but not necessarily the ones that we look at when we are teaching in the classroom.

This has prompted participants to increase their classroom attention to soft skills as students are developing their clinical skills.

Assessment

Quality assessment supports the academic program's ability to determine student patterns over time (both global and individual patterns) that can be used for program modification or student remediation. Participants reported that their respective programs' use of SP encounters and debrief was useful for long-term assessment. As Taylor continued:

SP encounters can help you identify what your students are or are not retaining. Continuously examining course sequences, course structures, and skill acquisition can help identify retention of information and skill application.

Some participants are using SP encounters as part of their programmatic benchmark assessment to identify adequate clinical skill performance to allow progression from one programmatic level to another. Blake reported how his program uses benchmark data:

We do use it as a remediation identifier as well. We have benchmarks [that] students have to score in the sophomore year. Students had to score in the 60% of all OSCEs (that include SPs), and if they didn't, then they have to remediate that domain, and then they have to perform another remediation OSCE. The junior year it is 65%. At the senior (year), it is 70%. Those will go up in the graduate level. It has definitely enabled us to identify who needs remediation.

An additional way participants reported using SP encounter and debrief data is as a pretest/posttest, demonstrating learning over time between semesters that include progressive instruction and skill development interventions.

Aids in Transition to Autonomous Practice

The third theme that emerged included how SP encounters and debrief facilitated professional socialization and transition to autonomous clinical practice. This theme is divided into 3 subcategories: safe learning environment, authenticity, and remediation.

Several participants spoke of the pressure to prepare students for autonomous, unsupervised clinical practice immediately upon graduation. Participants felt SP encounters and debrief were key to preparing for transition to practice. Standardized patients provided a mechanism for the most authentic patient care experiences possible in a classroom setting. Participants also referenced the positive feedback they have received from preceptors, alumni, and employers since implementing SPs and debrief into their courses and curricula. As Blake recalled:

The deliberate effort that it takes to go through that process has so many more tangible benefits on the opposite side than having students come in and just going through a checklist. I've seen the outcomes from the way that our students report back to me as alumni, that within their first year, within their first week, within their first emergency clinical encounter, they don't freeze. They feel ready. They feel comfortable. Our coordinating supervisors for the hospital systems where they've tended to have gotten hired have said, “Your students just get it. They are ready.” Our students that have gone on to preprofessional programs in other disciplines, pre-PT, pre-PA. They also remark, “I am just so much better at communicating with patients, integrating with patients, having that dialogue.” Those are skills that you must practice.

Safe Learning Environment

Participants spoke often of putting students in safe learning environments (ie, not patients with real injuries) in which they can practice patient care skills. Several participants identified the need for a safe learning environment and the opportunity to fail to demonstrate clinical readiness. As Blake commented:

I'm glad to see that more programs are using [SPs] and simulation. I realize the new standards say simulation can be a component of clinical education or clinical experiences, but without simulation and failure, we are going to have AT students and young professionals who will be left alone in their settings at their high schools, and they'll likely have to experience a significant amount of experience that was unnecessary dissonance. They should have been better prepared before they take that independent position where they are by themselves in a high school or in a college or in a work site setting. They should have been exposed. That's why we do 38 [SPs across the curriculum].

Authenticity

Participants recognized that student buy-in regarding the realism of the SP encounter begins in the classroom on the first day of class. Generating this engagement with students was critical to establishing the realism of the encounters. Furthermore, developing a realistic SP case and learning space was critical for the learning opportunities to feel real. As Taylor stated:

I've learned to set the tone for realism during my didactic courses by stating that the students need to transition from seeing their patient in front of them rather than their classmates. They need to practice treating patients, not practice treating their classmates. Repetition will reflect in your clinical practice.

Many participants perceived SP encounters and debriefing as pivotal in providing realistic scenarios for students that focus on learning a process rather than independent skillsets. The SP encounters and debrief helped students understand what is or is not needed during their patient evaluation. Blake remarked:

I don't know why other programs don't use them. I worry about the state of the profession if students aren't getting realistic patient encounter experiences in which they fail. I'm concerned about just the rote memorization of how I do an evaluation. The students don't develop a clinician mindset until they realize, “Oh, well, I don't really need to do this test.”

The process of completing or ending a patient encounter is another aspect of SP encounters participants believed added realism to the learning experience. No longer can students complete an evaluation and stop when they feel like they have reached their diagnosis. Participants realized that they need to also teach students how to deliver a diagnosis to the patient and end the exchange. As Marley shared, “We found ourselves prompting the students, ‘What do you do when you get to the end of an eval?' You can't just walk away from your patient.”

Remediation

Participants shared how information learned from SP encounters and debrief was beneficial in identifying areas where remediation was needed and how that remediation aided the transition to autonomous clinical practice. Of particular importance, participants described identifying when remediation was needed globally across multiple students and when it was needed individually to support a single student's development. As Taylor shared:

Maybe I note that they [students] all do a particular mannerism during the encounter. I may demonstrate that and ask them for feedback about how I just interacted with the patient, then walk them through reflection to see how many noticed you do this in your SP encounter.

Similarly, Colby described areas where remediation occurred on a global level with all students:

I think it gives me a good idea if there's something that I missed, especially if multiple students are missing or misinterpreting something I said or if they all miss the same thing. If they all forget to ask about contraindications to an intervention, then I obviously did not do a good job of discussing the importance or finding contraindications. I think it helps me to go back and address things with the class as a whole.

Remediation on the individual level was also important to assist students in the transition to autonomous clinical practice, as Jordan noted:

It has helped me see what students need to work on, not just [with an injury or pathology] but also the evaluation process. Often, students get so used to role playing and doing things with each other, they forget to actually do that on a real patient. In role play, your partner can give you the answer. An SP does not do that for you.

DISCUSSION

Professional athletic training programs are required to use assessment results as a mechanism for continuous quality improvement of the program.1 The interpretation and use of assessment results are the final components of the assessment loop2 and indicate what specific improvements are needed programmatically. Within athletic training, the documented use of SPs to improve students' knowledge and clinical skills,8–10,16–18,21,22 as well as the inclusion of simulation and SPs as a means for programs to meet clinical education components of accreditation standards,1 validates the importance of using SPs within the programmatic assessment plan.

Mechanisms for Programmatic Improvements

Nearly all participants reported using information learned from SP encounters and facilitated debrief as an important means of assessing quality of instruction. The goals of simulation-based learning have been identified as bridging gaps between theory and practice, safely practicing skills without harm to a patient, and assisting faculty in identifying student competency.23 However, these educational activities are often not included within the assessment plan to provide data regarding the quality of instruction. The most common sources of evaluation data for teaching quality include students, peers, supervisors, and teachers themselves.24 For many programs, however, student ratings of course instruction have long been used as the primary, and often sole, means of evaluating faculty teaching.25 These ratings usually contain both open-ended and closed-ended questions, where students are encouraged to provide written comments about the content and instruction.26 After surveys are completed, the responses are summarized across instructors, departments, and colleges and are viewed as evidence of teaching effectiveness.26 These evaluations provide diagnostic information for faculty on specific aspects of their teaching to help improve their performance.24

While nearly all previously documented methods of evaluating quality of instruction provide quantitative data,23–26 our participants expressed how they value the qualitative data that SP encounters and facilitated debrief provide for making programmatic improvements, which in turn improve quality of instruction. No previous research documents how these encounters and facilitated debrief are used within the assessment plan. The data gleaned from SP encounters and facilitated debrief allowed faculty to make connections to instruction and didactic courses and to work with preceptors to enhance clinical education, while also facilitating reflection to self-identify areas for improvement.

Pattern Identification in Student Performance and Behavior

The assessment of student performance is critical to both didactic and clinical education. Unfortunately, clinical education cannot provide students exposure to all patient encounters they will face during patient care; thus, those experiences must be supplemented. Within our study, participants were able to glean patterns for both individual student development and global curricular and course improvements from SP encounters and facilitated debrief. Similarly, Kilbourne et al27 found that students reported that many of the skills or procedures they had watched their preceptor perform were ones they had never performed themselves. This finding further validates the need for simulated learning experiences to adequately prepare students for clinical practice.

Participants noted that teaching (formative) and evaluation (summative) SPs with facilitated debrief helped identify patterns in student performance and behaviors. Formative assessments provide students with useful feedback that helps them become better clinicians and learners, whereas summative assessments provide little or no feedback to students other than their outcome on an assessment, such as a score or a grade.28 As noted by our participants, formative assessment is valuable for providing students with critical feedback regarding their performance or behaviors but also for faculty in planning or modifying learning activities. For students, formative assessment is used to facilitate self-awareness about specific clinical skills, actions, or behaviors, with the intent to reinforce good performance or redirect and correct specific deficiencies.29 Our participants reported using SP encounters and facilitated debrief as both formative and summative assessment data, identifying patterns individually within students and globally across the programs to make course-level or programmatic improvements.

Aid in Transition to Practice

Our participants expressed using data from SPs and facilitated debrief as a means for fostering transition to autonomous clinical practice throughout the professional program. As with other health care disciplines, no matter the level of autonomy provided by a preceptor, full autonomy does not occur during athletic training clinical education, as all experiences must be supervised.27 However, clinical education provides the foundation necessary for students to gain an understanding of the expectations they would face when they transition to clinical practice.30 Previous researchers3032 noted the pivotal role clinical education and preceptors play in the transition to practice of students, by allowing some degree of independence in making clinical decisions as a student while still being supervised.

Our participants recognized the importance that simulation and SPs also aid in transition to practice, by providing authentic real-time patient encounters in a safe environment with patient encounters students are not often afforded during clinical education. Specifically, health care administration and conflict management are aspects of clinical practice that students are often shielded from in patient care.27 As noted by participants, regular communication with both students and preceptors allowed faculty to develop simulations to provide opportunities for students to engage in encounters missed during clinical education (or use simulation to remediate negative patient encounters) to be better prepared for clinical practice.

LIMITATIONS AND FUTURE RESEARCH

Results of this study are generalizable to athletic training educators and programs using SPs. It should be noted that our participants were not novice users of SPs. Each of our participants was from a known sample of individuals actively using SPs for teaching and assessment, all of whom had completed formal training in implementing SPs and facilitating debrief. We also recognize that most participants represented a professional program that had fully integrated SPs into multiple courses and into the program's overall assessment plan.

Future research should include a longitudinal follow-up with participants to understand how and what programmatic changes occur within sustained long-term use of SPs for teaching and assessment. Additionally, further research should collect preceptor feedback (to combine with what is observed from educators during SP encounters and debrief) to develop holistic global and individual areas of improvement at the student and programmatic level. Lastly, follow-up with recent graduates (1 to 3 years of clinical practice) who were exposed to SPs during professional education to assess students' perceptions of development of clinical practice skills and transition to practice as autonomous practitioners should be explored.

CONCLUSIONS

Program assessment should yield data that are useful for driving programmatic change. Standardized patient encounters are being used by athletic training educators not only to inform classroom teaching but also to serve as data points for data-informed decisions regarding programmatic improvements. Athletic training educators who have integrated SPs within their curriculum and courses reported that data gleaned from SP encounters and debrief were valuable both individually and globally for making data-informed decisions that drive course-level and program-level change. At the individual level, this information provided specific feedback on individual student learning needs. At the program level, data were important for identifying where course-level or overall curricular changes were needed. As athletic training educators move forward, it is imperative to consider SP encounters and debrief as an assessment measure in the program's assessment plan to facilitate continuous quality improvement.

Copyright: © National Athletic Trainers' Association
Figure 1. Research procedure flow chart.


Figure 2. Conceptual framework of qualitative data.


Contributor Notes

Dr Frye is currently the Program Director for Athletic Training in the Department of Health Professions at James Madison University. Please address correspondence to Jamie L. Frye, PhD, Department of Health Professions, James Madison University, MSC 4315, Health and Behavioral Studies Building, Harrisonburg, VA 28807. fryejl@jmu.edu.