Preceptor Understanding, Comfort, and Use Related to Evidence-Based Practice Competencies
Context
The Fifth Edition of the National Athletic Trainers' Association Athletic Training Education Competencies includes the significant addition of competencies covering evidence-based practice (EBP). While the concept of EBP is not new, the terminology in the Competencies may be new to clinical practitioners who did not receive the same educational experiences.
Objective
To explore preceptors' understanding, comfort, and use related to the EBP competencies. Specifically, we explored the efficacy of an educational intervention designed to increase preceptors' understanding of, comfort with, and use of the EBP competencies.
Design
Participants were assigned to an experimental or control group, and a pretest-posttest design was used to measure understanding, comfort, and use. The experimental group completed an educational intervention, consisting of presentations, student-led article reviews, and a student-led project, designed to increase understanding, comfort, and use of the EBP competencies.
Setting
Large state-affiliated Midwest university.
Patients or Other Participants
Nineteen preceptors from the collegiate and high school settings (12 men, 7 women; average age = 32 ± 8.3 years; average experience = 8.1 ± 8.8 years).
Main Outcome Measures
A survey instrument (EBP Preceptor Survey) was designed and tested for reliability (α = .979). All participants completed the EBP Preceptor Survey before and after the intervention. Repeated-measures analysis of covariance was used to detect pretest-to-posttest differences at the P ≤ .01 significance level.
Results
After the intervention, the experimental preceptors showed statistically significant increases in understanding for 4 of the 14 competencies, comfort for 9 of the 14 competencies, and use for 1 of the 14 competencies.
Conclusions
A focused education intervention may increase understanding and comfort but might not increase use of EBP concepts.
INTRODUCTION
Evidence-based practice (EBP) is an integral component of modern health professional practice and education. Sackett and colleagues1 defined EBP as “the conscientious, explicit, and judicious use of current evidence in making decisions about the care of individuals.” The Institute of Medicine stated in a 2003 report that EBP is an essential competency of health care professionals.2 Hertel3 noted that, when operating in an environment of accountability, clinicians need research training that extends beyond design and statistics into areas such as clinical epidemiology and the interpretation of clinical significance.
Many studies have identified barriers to EBP implementation that include, but are not limited to, clinician time,4–8 clinician autonomy to change practice,4,9,10 clinician knowledge of EBP concepts,4,9,10 and clinician access to resources.7,8,10 An inherent didactic-to-clinic gap exists within many clinically based education programs, and this gap creates a barrier to EBP implementation. Many suggestions have been made to overcome these barriers, including faculty development opportunities,11,12 research focused on clinical outcomes,5 and communication between clinicians and faculty.13 Clinicians have noted that access to processed EBP information and workshops focused on EBP outcomes, along with repetition and exposure, can help overcome barriers.14
The Fifth Edition of the National Athletic Trainers' Association Athletic Training Educational Competencies15 (Table 1) includes competencies covering EBP concepts that had not previously been included. While the average clinician may practice EBP to varying degrees, it is unclear whether practicing clinicians have had formal education in the concepts of EBP as they are now presented to all students in athletic training programs. Manspeaker and Van Lunen16 hypothesized that only a small proportion of our profession has had formal education in EBP concepts. Yew and Reid6 demonstrated that medical residents instructed in EBP skills did not regularly practice those skills. Hankemeier et al17 demonstrated that clinicians had the lowest level of EBP knowledge when compared with students and educators. Preceptors are charged with modeling behaviors that students will follow. Hankemeier and Van Lunen18 reported that preceptor modeling of evidence-based clinical decision making was the best way to encourage students to use EBP, and Laurent and Weidner19 have shown that students prefer mentors who model behavior. This raises the question of which concepts are being reinforced by the preceptors who supervise students. One could argue that a preceptor who is unfamiliar with a concept cannot properly reinforce it.

Multiple authors have explored a variety of variables associated with implementation of EBP, such as knowledge of concepts,4,10,17,20,21 attitudes and beliefs about the concepts,4,10,14,21 comfort with EBP concepts,20 use of EBP concepts,14,17,21 and confidence and comfort in the application of EBP concepts.17,20 Knowledge of EBP concepts has been measured using 2 methods: multiple-choice quiz-like questions17,20,21 and Likert scale–rating questions about background training and education in EBP concepts.4,10 Confidence and comfort have been measured using Likert scale questions in the context of comfort level with the ability to implement EBP concepts20 and confidence to answer questions about EBP concepts.17 Use of EBP has been measured using checklists and ranking of items as well as open-ended questions about intended and future use of EBP. To date, the only study21 identified that used a premeasure and a postmeasure around an education intervention involved students and did not include a control group for comparison.
Adult learners are motivated by many factors, including the immediate relevance of material to their work, the reason for learning something new, and the use of experience as a basis for learning.22 Our education intervention used self-directed and student-led activities to teach and reinforce the EBP concepts and to encourage participation; we assumed that combining these 2 types of activities would enhance the experience for both the preceptor and the student. The purpose of this project was to determine whether an education intervention can increase preceptors' understanding of EBP concepts, comfort in defining EBP concepts, and use of EBP concepts. We hypothesized that the educational intervention would increase preceptors' understanding, comfort, and use of EBP concepts.
METHODS
Participants
A purposeful sample of convenience drawn from one large state-affiliated Midwest university was used for the project. The preceptors (n = 10) working with second-year students were assigned to the experimental group, which allowed in-class student-led assignments to be used as part of the educational intervention. The preceptors (n = 9) not working with second-year students were assigned to the control group. All students were enrolled in a 3-year program; thus, the control group worked with first-year and third-year students. All preceptors worked in the collegiate or high school setting. Institutional review board approval was obtained from the host institution, and informed consent was obtained from all participants.
Instrument
The EBP Preceptor Survey was created to assess the effectiveness of the educational intervention. An online survey was used for data collection. The 14 EBP competencies,15 listed in Table 1, were presented verbatim, and each participant was asked the following questions: “How well do you understand the concept(s) in the competency?”; “How comfortable are you defining the concept(s) in the competency to the student you supervise?”; and “How likely are you to use the concept(s) in your clinical practice?” Our project used the term understanding, in the context of “How well do you understand the concepts of each competency?”, as a substitute for knowledge. We used the term comfort in the context of comfort with the ability to define the EBP concepts to a student, and the term use in the context of “How likely are you to use the concepts of this EBP competency in your clinical practice?” Each question was rated on a scale of 0 to 10 (0 = not understood / not at all comfortable / not at all likely to use and 10 = very well understood / very comfortable / very likely to use). We wanted a scale that would allow for variability and thus detectable differences. Likert23 stated that scales should be 5 to 7 points, and DeVellis24 noted that the number of questions asked and the number of response choices affect scale reliability. In practice, researchers often assign the number of points according to personal taste and past convention.25 We felt that the 0-to-10 scale would allow for specificity and variability. The survey instrument was distributed to 4 athletic training educators familiar with the EBP competencies for comment and review to establish face validity, and minor modifications were made based upon their expert feedback. A Cronbach α analysis was conducted to determine the reliability of the collected data (α = .979).
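For readers who wish to check a reliability coefficient of this kind, Cronbach α can be computed directly from a respondents-by-items matrix of ratings. The following is a minimal sketch in Python; the function name, matrix layout, and randomly generated example data are illustrative assumptions and are not drawn from the study's files or software.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability for a respondents x items rating matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                              # number of survey items
    item_variances = x.var(axis=0, ddof=1)      # variance of each item across respondents
    total_variance = x.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustration only: 19 respondents rating 42 items (14 competencies x 3 questions)
# on a 0-10 scale; real data would come from the survey export.
ratings = np.random.randint(0, 11, size=(19, 42))
print(round(cronbach_alpha(ratings), 3))
```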
Intervention
An educational intervention was created to increase understanding, comfort, and use of the EBP concepts. All components of the intervention were reviewed by a panel of 4 athletic training educators familiar with the EBP content and educational delivery methods, and several modifications were made, including wording changes and the inclusion of different articles. Three components were created using 2 strategies that were hypothesized to foster adult learning: (1) adult learners prefer to learn at their own pace, and (2) student-led activities would encourage participation.
The first component was a series of 3 PowerPoint presentations, delivered via email, that defined the various concepts and terminology in the EBP competencies. Real-world examples were included to reinforce the concepts. The content was divided into 3 presentations (Overview and Terminology, Research Design, and Clinical Outcome Assessment) to limit user fatigue and to encourage full viewing within a practical time frame (less than 30 minutes) for an active clinician.
The second component was a student-led activity involving a weekly journal article review between the student and his or her preceptor. Students were directed to each article and required to make copies for their preceptor and schedule a meeting to discuss the article. All articles chosen were available to the students free online. Students were given a weekly journal article assignment form to use during the discussion that asked “What is the main purpose of the article?”; “What methods were used to develop the article?”; “What are the conclusions?”; and “What is the take-home message / how will you implement this into your practice?” Students were required to submit the form after reviewing it with their preceptor and obtaining the preceptor's signature. Responses from this form were not collected for this project; the form was merely a check to ensure that students were involving their preceptors in the article review. Five peer-reviewed, published articles were selected to reinforce the EBP concepts.26–30 Two were chosen for providing background on EBP,26,27 while the remaining 3 were chosen for their practical application to clinical practice and had a minimum Physiotherapy Evidence Database (PEDro) score of 8/10.28–30
The final component of the intervention was a student-led EBP assignment that required preceptor involvement and was adapted from a project developed by a colleague (J. Popp, unpublished data, 2012). Preceptors were involved in each step of the assignment and reviewed materials as the assignment progressed. Students were directed to identify a patient under their care and an appropriate patient-centered outcome (PCO) measure. Each student was instructed to use the PCO measure with the patient for a minimum of 3 weeks; produce a report of the findings; discuss the PCO, its measurement properties, and how the treatment plan was adjusted based upon the PCO results; and reflect upon the challenges posed by use of the PCO. Because PCOs are an important component of EBP but are not widely used by athletic trainers, we felt that involving the preceptor in this assignment would reinforce EBP concepts.
Design
A premeasure and postmeasure of the dependent variables (EBP Preceptor Survey) was used to assess the effectiveness of the educational intervention. The between-subjects variable was group (experimental and control), while the within-subjects variable was time (pre and post). All participants were sent a URL for the EBP Preceptor Survey via email, and follow-up emails were sent 1 week later to encourage a response. Table 2 presents the timeline of the study. The experimental group received a PowerPoint presentation via email once a week for the first 3 weeks of the project. Student-led article reviews were conducted over a 5-week period, and the student-led EBP project was then conducted over the following 4 weeks. Posttest EBP Preceptor Surveys were distributed via email, and a 1-week follow-up was sent to encourage participation.

Analysis
Data were captured via Qualtrics (Qualtrics LLC, Provo, UT), cleaned and coded in Microsoft Excel 2012 (Microsoft Corp, Redmond, WA), and then imported into SPSS Statistics 20 Predictive Analytics Software (SPSS Inc, Chicago, IL) for analysis. Analysis included descriptive statistics and a repeated-measures analysis of covariance (RMANCOVA) with 1 between-subjects variable (group: experimental and control), 1 within-subjects variable (time: pretest and posttest), and 42 dependent variables (understanding, comfort, and use for each of the 14 competencies). Upon inspection of the sample demographics, the average preceptor years of experience appeared to be unequal between groups; to control for this observed difference, preceptor years of experience was included as a covariate in each analysis. Statistical significance was established using a conservative probability level of P ≤ .01 to limit Type I error. An analysis of variance found no significant differences between the groups' pretest scores at this level.
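As an illustration only, an analysis of this general form can be approximated with open-source tools. The sketch below fits a mixed-effects model with a group-by-time interaction and years of experience as a covariate for a single dependent variable; the file name and column names are hypothetical assumptions, and this is not the SPSS procedure used in the published analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per preceptor per time point, with columns
# preceptor_id, group ("experimental"/"control"), time ("pre"/"post"),
# years_experience (covariate), and score (e.g., the comfort rating for one competency).
df = pd.read_csv("ebp_scores_long.csv")

# A random intercept per preceptor captures the repeated-measures structure; the
# group x time interaction tests whether the experimental group changed more than
# the control group from pretest to posttest, adjusting for years of experience.
model = smf.mixedlm("score ~ group * time + years_experience",
                    data=df, groups=df["preceptor_id"])
result = model.fit()
print(result.summary())
```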
RESULTS
The sample of 19 participants is described in Table 3. Originally, 10 participants were assigned to the experimental group and 10 were assigned to the control group. However, 1 control group participant did not complete the project. Seven females and 12 males participated in the project. No analyses were conducted to address gender differences.

The pretest and posttest descriptive statistics for the RMANCOVA are presented in Tables 4 through 6 for the experimental and control groups. With 14 EBP competencies and 3 variables within each (understanding, comfort, and use), there were 42 possible variables. As illustrated in Table 7, the RMANCOVA revealed significant differences (P ≤ .01) between the experimental and control groups for 14 (33%) of the 42 variables.




Of the significant increases between groups, 9 of the 14 (64%) were for comfort, 4 (29%) for understanding, and 1 (7%) for use. Only 1 competency (No. 5: develop a relevant clinical question using a predefined question format) had significant increases for all 3 variables (understanding, comfort, and use).
Effect sizes (partial η2) were calculated and ranged from small (.10) to large (.50) across the 42 variables. The largest effect sizes were found for the comfort variables and the smallest for the use variables. For the 14 significant differences, effect sizes ranged from a medium level of .348 to a high level of .599. The understanding variables ranged from a low effect size of .143 (competency No. 14) to a moderate effect size of .410 (competency No. 10). The comfort variables ranged from a low effect size of .063 (competency No. 13) to a high effect size of .599 (competency No. 5). The use variables ranged from a nonexistent effect size of .000 (competency No. 11) to a moderate effect size of .359 (competency No. 5).
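For context, partial η2 is conventionally derived from the sums of squares produced by the (co)variance analysis; the standard textbook definition is shown below and is not an excerpt from the study's output.

```latex
\eta_p^2 = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}
```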
DISCUSSION
As a concept, EBP is central to modern medicine, and the terminology within its concepts is being taught to the current generation of athletic training students. While most preceptors have been using the concepts in their daily practice by balancing current research, their own experience, and patient feedback, many have not received formal education on the topic. This study sought to increase preceptors' understanding, comfort, and use of EBP. Our findings suggest that an education intervention may increase understanding and comfort but not use.
Overall, the findings indicate an increase in preceptors' understanding of and comfort with EBP, with 33% of the 42 variables demonstrating significant increases after the education intervention and no significant differences within the control group. This suggests that the education intervention had the desired overall effect of increasing scores. Five competencies (Nos. 4, 6, 11, 13, and 14) showed no significant increases in any of the 3 variables.
A closer examination of the experimental group's increased scores, however, indicates that the statistically significant increases were in understanding and comfort. Our findings are similar to those reported by Manspeaker et al,21 who examined differences in EBP among students after an education intervention. While the studies are similar, Manspeaker et al21 did not use a control group for comparison and studied students rather than preceptors, so a direct comparison of results is not possible.
Consistent with prior studies on EBP, how likely a preceptor was to use the EBP concepts was not influenced by the educational intervention. Although the designs differ (preceptors versus students), Manspeaker et al21 reported that students were more confident in the use of EBP principles after their intervention; however, it is important to note that their finding for use is not in the same context as our project, which used the question stem “How likely are you to use the concepts in your clinical practice?” as opposed to the “confidence in use” and “intended future use” examined by Manspeaker and colleagues. The finding that use of the EBP concepts did not increase is troubling. With the general push within the athletic training profession to increase the use of EBP, more needs to be done to overcome barriers to use and implementation for preceptors and students alike. Welch and colleagues31 recently reported that strategies to overcome barriers and increase use of EBP in clinical practice include resources, processed information, workshops, peer discussion and mentorship, and repetition and exposure. Similar to the findings of Welch and colleagues,31 anecdotal feedback indicated that our participants were interested in learning more via in-service meetings, handouts, podcasts, and small group discussions.
While the control group did not demonstrate statistically significant differences from pretest to posttest, their mean scores were higher at pretest than at posttest for all of the understanding and comfort competencies and for many of the use competencies. This may have been due to the small sample size or may have another source. The difference may reflect an experience bias, in that the control group comprised older and more experienced participants, or a learning effect associated with test-retest methods. The Dreyfus and Dreyfus32 model of skill acquisition states that novice learners move from unconsciously incompetent to consciously incompetent, which may explain why the less experienced experimental group had lower pretest scores as the reality of the actual competencies was driven home. Anecdotal reports indicated a certain humbling effect when respondents saw the actual wording of the EBP competencies and felt challenged to understand the concepts, define them comfortably, and use them.
A review of effect sizes allowed for the quantification of both significant and nonsignificant differences. We chose to examine effect sizes for the nonsignificant data because of our small sample size; unlike P values, effect sizes are not driven by sample size and provide an estimate of the magnitude of difference. The comfort variables had the highest effect sizes and the use variables the lowest, which is consistent with the RMANCOVA significance findings. A review of the effect sizes for the nonsignificant differences, however, suggests that the intervention had a low to moderate effect on the understanding and comfort variables and a low effect on the use variable. The effect size review further reinforces our observation that the intervention can have a positive effect upon understanding and comfort but not upon use of the EBP competencies.
Limitations and Suggestions
The small, purposeful, nonrandom sample of convenience limits the generalizability of the findings; with a larger, randomized sample the findings might differ. This sample was used because of the nature of the student-led activities and the focused effort of using students to help educate the preceptors. A multicenter approach using similar methods at multiple institutions should be considered to gather a larger sample from different clinical practice perspectives. The self-reported nature of the data could be a limitation, as one must assume that all participants were honest. The difference in preceptors' years of experience, although controlled for during the statistical analysis, may have influenced the results and warrants further study. The multiple strategies used in this study (self-directed and student-led activities) may not have complemented one another as intended, and further study of each strategy is warranted. Educators and clinicians should experiment with the various strategies we used, and with those found in the literature, to increase their own understanding, comfort, and use of EBP.
CONCLUSIONS
Evidence-based practice implementation in the clinical setting is crucial for many reasons: to ensure that students' newly learned EBP content is reinforced, to ensure that clinicians are using the best possible practices, and to ensure that the profession as a whole addresses the accountability inherent in the ever-changing health care world. Strategies to encourage practicing clinicians to use EBP concepts should be studied. Barriers to the use of EBP concepts have been identified in the literature, and various strategies are being used to address them. A focused effort by all invested parties to encourage a culture of EBP will, in time, overcome the current lack of use; we cannot simply wait for the next generation of students to matriculate into the practicing profession in hopes of increasing EBP use and application. Using current students to help educate preceptors is one of many possible approaches that should be considered. New Board of Certification recertification requirements include a focus on EBP content, which is another step in the right direction. Future research should explore additional methods for educating preceptors.