Article Category: Research Article
Online Publication Date: 01 Jul 2015

The Effects of an Electronic Audience Response System on Athletic Training Student Knowledge and Interactivity

MET, ATC and EdD, ATC
Page Range: 212–218
DOI: 10.4085/1003212

Context

Electronic audience response systems (ARSs) are a technological teaching tool currently being used with widespread success across various disciplines of higher education. Research support for their application in athletic training education, however, remains sparse.

Objective

The aim of this study was to examine whether use of an ARS in a basic athletic training course improved student knowledge acquisition and interactivity.

Design

Preintervention and postintervention surveys.

Setting

Commission on Accreditation of Athletic Training Education–accredited athletic training program.

Patients or Other Participants

Sixty-nine undergraduate students enrolled in one of 2 sections of an introductory athletic training course.

Main Outcome Measure(s)

A mixed-measures analysis of variance (ANOVA) was conducted to examine differences in knowledge acquisition based upon group membership (control versus experimental) and the effect of instruction.

Results

An interaction was discovered for the effect of instruction and use of the ARS (F1,59 = 5.89, P = .018, η2p = .091), indicating that the acquisition of knowledge in the experimental group (7.97 ± 1.49) was greater than in the control group (7.24 ± 1.75). A mixed-measures ANOVA found differences in classroom interactivity based upon group membership. There was a main effect for interactivity (F1,59 = 5.40, P = .024, η2p = .084), indicating that interactive participation increased among students from 7.16 ± 1.23 on the pretest to 7.56 ± 1.08 on the posttest; however, there was no interaction between interactivity and group membership, indicating that both the control and experimental groups increased interactivity at the same rate.

Conclusions

Audience response system technology improved student knowledge when used in an introductory athletic training course. Additional research should investigate active learning tools to determine what most strongly affects students' interactivity.

INTRODUCTION

Today's students have a preference for digital literacy, experiential learning, interactivity, and immediacy.1,2 In response to these preferences, higher education is shifting classroom teaching from traditional lecture-based methods to more learner-centered, active learning environments.3,4 The emphasis on technology use in the classroom has been at the forefront of this shift in teaching methods.1 As a result, the development of educational technology teaching tools that support active learning classroom environments has grown exponentially over the past decade.5 Researchers3,6–8 have shown that an actively engaged student will absorb and retain more classroom content and report higher satisfaction with the course. In addition, using a variety of teaching and learning methodologies enhances learning for students with different learning styles.9,10

Electronic audience response systems (ARSs) are a technological teaching tool being used with widespread success within various disciplines of higher education.11–13 Audience response systems are referred to by an assortment of names, including student response systems, personal response stations, interactive voting systems, electronic voting systems, and, most commonly, “clickers.”11 Regardless of the nomenclature, all systems typically consist of a receiver attached to the instructor's computer, individual handheld wireless response devices, and the accompanying software program that runs the application and collects the responses. Audience response system technology allows students to respond instantly to an instructor-generated question via the response device keypad. The instructor has the option of displaying the aggregate results to the class and/or collecting the results for further analysis. Most systems can collect responses either anonymously or in an individually identifiable format.12

Audience response systems have been used in a wide variety of healthcare education disciplines, including nursing, medicine, pharmacy, psychology, and many others.11,14,15 Among these disciplines, ARSs have been used in a variety of course types, from large introductory courses to smaller discussion courses. In many studies,16–18 students reported that ARSs were useful in increasing engagement in class lectures. For example, in an experimental undergraduate psychology lecture-based course, Bartsch and Murphy19 randomly assigned student participants to receive a 10-minute lecture either with or without an ARS. The instructors found that students who used an ARS demonstrated a higher level of engagement in the lecture. In another study, Berry1 incorporated an ARS within a baccalaureate didactic pediatric nursing education course to assess whole-class engagement in lecture. This course was taught simultaneously to 2 groups of nursing students; one group received the lecture over Interactive Television, which included students in a synchronous session at a distant site, and the second group received the traditional lecture in a classroom. Despite the obvious challenges of increasing whole-class engagement for this course, after the use of an ARS, students reported increased engagement, higher satisfaction, and an overall positive attitude toward the ARS within the course.1 A variety of other researchers8,11,12,17 support increases in student knowledge acquisition when ARSs are used in class compared with traditional lecture classrooms. One study in medical education20 examined students across 2 sections of a course. The control group (section 1) received a standard didactic lecture, while the experimental group (section 2) received the identical lecture material with an ARS integrated into the delivery. Using postlecture quizzes as an assessment, students who used an ARS within the lecture had significantly higher learning (P = .02) and long-term retention (P = .001) scores on the day of the lecture and 3 months later.20 Additional positive aspects of ARS use include increased learning,12,16,21 interactivity,3,10,19,22 attendance,23,24 and enjoyment.3,21,22

Most healthcare professionals are challenged to provide didactic knowledge and experiences that apply to clinical encounters. This unique challenge creates a demand for students to master classroom knowledge in order to make effective transitions to clinical decision-making and reasoning.15 Most research concerning the use and effectiveness of ARSs in the education of students in health professions is limited to didactic lecture courses. However, in order to provide students with the competencies necessary for clinical proficiency, most professional programs, including athletic training programs, are structured to include skills-based laboratory courses with hands-on learning in addition to the didactic lecture courses. In a search of the athletic training literature, no original studies were found demonstrating the effects of an ARS in either didactic or laboratory classes within athletic training education. Only one study25 described pedagogical methods of using ARSs as a Board of Certification test preparation strategy to increase athletic training student motivation and accountability. Therefore, the purpose of this study was to determine (1) whether there is an increase in student acquisition of knowledge in a basic athletic training course when using ARSs during classroom lectures/discussions and (2) whether there is an increase in individual student interactivity when using ARSs during classroom lectures/discussions. Overall, the aim of this study was to determine whether the use of ARSs is an appropriate instructional modality for introductory athletic training courses in terms of improving student acquisition of knowledge and interactivity.

METHODS

Study Design

We used preintervention and postintervention evaluations of students' knowledge and degree of interactivity via a researcher-developed survey. Athletic training students were enrolled in an introductory athletic training course at a Commission on Accreditation of Athletic Training Education (CAATE)–accredited public institution during the time of participation. This introductory athletic training course was selected for the following reasons: (1) it is an introductory course that does not require previous knowledge of athletic training; (2) it has 2 sections offered by the same co-instructors; (3) the different sections are offered on the same days of the week and both are morning classes; (4) the course is a typical athletic training course that teaches both theory and application of the theory and skills learned in the course; and (5) the course has a natural break in course objectives after unit 1. The theory portion was taught in a traditional lecture/discussion format. The skill application was taught via a laboratory format. The course participants met for 2 hours twice a week; the first hour of each day was theory and the second hour was laboratory based. The ARS was used only during the theory portion of the course. The main objectives of the first unit of this course related to basic emergency-related athletic training skills (eg, vital sign assessment and cardiopulmonary resuscitation [CPR]). University institutional review board approval was obtained before data collection began.

Participants

Undergraduate athletic training students registered in an introductory athletic training course were asked to participate in this study. All participants (N = 69; control n = 35; experimental n = 34) had declared athletic training as their major but had not yet been admitted into the athletic training program. Participants were allowed to select which section they enrolled in based on their personal schedules. Each participant was asked to complete a self-reported demographic survey at the beginning of the study. Based on self-report questionnaire responses, the control group was composed of 21 females and 14 males, and the experimental group was made up of 19 females and 15 males. The Table provides additional selected demographic information on the participants in the 2 groups. The 2 groups combined consisted of 57 (82.6%) Caucasian, 3 (4.3%) Asian American, 4 (5.8%) African American, 2 (2.9%) Latino, 2 (2.9%) Native American, and 1 (1.4%) Pacific Islander students. Within this sample, 44 (63.8%) were freshmen, 15 (21.7%) were sophomores, and 10 (14.5%) were juniors.

Table.  Participant Demographic Information Within Each Group


Instrument

We developed and used the Knowledge and Interactivity Survey (KIS) to assess basic athletic training knowledge and degree of individual interactivity for this study. The KIS consisted of 10 questions (multiple choice and fill in the blank) assessing knowledge learned within the course and 10 statements examining an individual's perception of his/her degree of interactivity with the course. The 10 knowledge-based questions were generated from the content of the required textbook for the course. Knowledge questions in the KIS instrument were examined for face validity by a panel of experts (n = 8); no modifications were deemed necessary based upon feedback. Reliability of the knowledge questions in the KIS instrument was determined with a pre-post design and independent-samples t tests in a sample of undergraduate athletic training students (n = 20) who had previously completed the course used in the study. We identified no significant differences between presurvey and postsurvey scores (t18 = −1.372, P = .187). Additionally, a paired samples correlation revealed a high correlation between presurvey and postsurvey scores (r = 0.93), indicating that the instrument is reliable.
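The reliability check described above was conducted in SPSS. As a minimal sketch only, the Python code below shows one way to reproduce a pre/post comparison and a paired correlation; the simulated scores, variable names, and the use of a paired t test are assumptions for illustration, not the study's exact configuration.

    # Minimal sketch of a test-retest reliability check on hypothetical
    # pilot knowledge scores (not the study data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pre = rng.integers(5, 11, size=20).astype(float)    # presurvey scores (0-10 scale assumed)
    post = pre + rng.normal(0.2, 0.5, size=20)          # postsurvey scores for the same pilot students

    # Check for a systematic shift between the two administrations
    t_stat, p_val = stats.ttest_rel(pre, post)

    # Test-retest consistency (paired samples correlation)
    r, r_p = stats.pearsonr(pre, post)

    print(f"t = {t_stat:.3f}, P = {p_val:.3f}; r = {r:.2f}")

A nonsignificant t statistic together with a high pre/post correlation is the pattern the authors interpret as evidence of instrument reliability.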

The interactivity questions were adopted from a previous study26 conducted to establish reliability and were validated by a panel of experts (n = 8). Cronbach α, calculated for all of the items in this pilot study, was 0.86, suggesting that the instrument is highly reliable. Interactivity was assessed using a Likert scale measuring responses to 10 questions ranking the degree to which the participant felt he or she interacted in the class. The scale included 9 ordered choices, ranging from 1 (strongly disagree) to 9 (strongly agree). The KIS instrument in its entirety may be found in the Figure.
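For readers who want to reproduce an internal-consistency coefficient like the reported Cronbach α, a minimal sketch follows, assuming the pingouin library is available; the 9-point responses and item names are simulated, not the pilot data.

    # Cronbach alpha for 10 Likert-type interactivity items on simulated data.
    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(1)
    # rows = respondents, columns = the 10 interactivity items (1-9 Likert)
    items = pd.DataFrame(
        rng.integers(1, 10, size=(20, 10)),
        columns=[f"item_{i}" for i in range(11, 21)],  # survey questions 11-20
    )

    alpha, ci = pg.cronbach_alpha(data=items)
    print(f"Cronbach alpha = {alpha:.2f}, 95% CI = {ci}")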

Figure. Knowledge and Interactivity Survey (KIS) instrument. Abbreviations: AED, automated external defibrillator; CPR, cardiopulmonary resuscitation.


Procedures

For this study, we selected one section of the introductory athletic training course to serve as the control group and the second section to serve as the experimental group. The only modification to the experimental group was the inclusion of the ARS technology (Turning Technologies, Youngstown, OH) during the lecture/discussion portion of the course.

On the first day of class, all participants were informed of the study, given a chance to ask questions, and asked to complete a consent form if willing to participate. Each participant was given a 2-digit number to place on the presurvey and postsurvey KIS instrument to track responses. Participants in both the control and experimental groups filled out the KIS instrument at the end of the first week of the course (approximately 4 hours into the course). This provided the baseline (pre-) measurement for all participants' knowledge of concepts related to the specific course objectives in the first half (unit 1) of the course and each individual's self-assessed degree of interactivity related to the course.

PowerPoint presentations were used in both sections of the course to deliver course content to participants and were identical in content. However, each PowerPoint presentation for the experimental group contained 5 to 8 additional slides with ARS questions presented at a pace of approximately 1 question every 20 minutes. Pacing and placement of questions using the ARS were critically considered. Previous researchers15,27 have suggested that ARSs may negatively affect cognitive gains and a student's interactivity in class if questions are posed too frequently or are not presented at an appropriate cognitive level.

At the conclusion of the study in unit 1 (8 weeks into the 16-week course), both sections were given a paper copy of the KIS instrument to complete during class. Participants were informed this was the posttest for the study. Upon completion participants were thanked for their involvement in the study. Results were analyzed between the control and experimental sections as well as within the 2 sections for patterns.

Data Analysis

The data were imported from the presurvey and postsurvey KIS instruments into IBM SPSS Statistical Package for Windows (Version 20; SPSS Inc, Chicago, IL) for statistical analysis. Following the advice of Tabachnick and Fidell,28 we screened the data for accuracy, missing data, normality, and outliers. No issues were identified with inaccurate data entry or recording, nonnormality, or outliers. One participant from the control group was removed from the data set as a result of excessive missing data. To quantify the effects of the ARS, the difference between responses to the presurvey and postsurvey KIS instrument was calculated, and responses were grouped into 2 sets of pre/post pairs (knowledge and interactivity) based upon the variable addressed in each survey question. Survey questions 1 through 10 addressed knowledge, while survey questions 11 through 20 addressed interactivity. Paired samples t tests examined the effect of the ARS on the variables. Repeated-measures analysis of variance (ANOVA) tests examined the differences between the control group and the experimental group in terms of the knowledge and interactivity variables.
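The analyses were run in SPSS. As a rough illustration of the same mixed-measures (split-plot) model, with one within-subject factor (pre/post) and one between-subjects factor (group), the sketch below uses the pingouin library on simulated long-format data; the column names, sample values, and package choice are assumptions, not the study's code.

    # Sketch of a mixed-measures ANOVA: time (pre/post) x group (control/experimental)
    # on simulated knowledge scores.
    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(2)
    n_control, n_exp = 34, 34  # after removal of one control participant

    def simulate(group, n, pre_mean, post_mean):
        return pd.DataFrame({
            "subject": [f"{group}_{i}" for i in range(n)] * 2,
            "group": group,
            "time": ["pre"] * n + ["post"] * n,
            "knowledge": np.r_[rng.normal(pre_mean, 1.6, n), rng.normal(post_mean, 1.6, n)],
        })

    long_df = pd.concat([
        simulate("control", n_control, 6.9, 7.2),
        simulate("experimental", n_exp, 6.9, 8.0),
    ], ignore_index=True)

    aov = pg.mixed_anova(data=long_df, dv="knowledge", within="time",
                         subject="subject", between="group", effsize="np2")
    print(aov[["Source", "F", "p-unc", "np2"]])

The Interaction row of the output corresponds to the time-by-group effect reported in the Results (F1,59 = 5.89 for knowledge).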

The dependent variables (knowledge and interactivity) were examined with a Pearson product-moment correlation coefficient29 to determine whether they were sufficiently correlated to conduct a multivariate ANOVA. The dependent variables were weakly correlated at r = 0.340, P = .02, so the decision was made to conduct all further analyses with mixed-measures ANOVAs.28
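The screening step described above, checking whether the two dependent variables are correlated strongly enough to justify a multivariate analysis, can be illustrated as follows; the simulated gain scores and the 0.6 cutoff are assumptions for the example only.

    # Correlate the two outcome measures; fall back to separate mixed
    # ANOVAs if the correlation is weak. Data are simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    knowledge_gain = rng.normal(0.6, 1.5, 68)
    interactivity_gain = 0.3 * knowledge_gain + rng.normal(0.3, 1.0, 68)

    r, p = stats.pearsonr(knowledge_gain, interactivity_gain)
    use_manova = abs(r) >= 0.6          # illustrative cutoff, an assumption
    print(f"r = {r:.3f}, P = {p:.3f}; MANOVA justified: {use_manova}")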

RESULTS

Effect on Knowledge

Box's Test of Equality of Covariance Matrices found no significant difference between the experimental and control groups (F3,928881 = 0.332, P = .803), indicating that the assumption of homogeneity of covariance matrices was not violated. A mixed-measures ANOVA revealed an interaction for the effect of instruction and use of the ARS (F1,59 = 5.89, P = .018, η2p = .091), indicating that the acquisition of knowledge in the experimental group (7.97 ± 1.49) was greater than in the control group (7.24 ± 1.75). This finding supports using an ARS within an introductory athletic training course to increase student knowledge acquisition.

Effect on Interactivity

Box's Test of Equality of Covariance Matrices found a significant difference between the experimental and control groups (F3,928881 = 5.65, P = .001), indicating that the assumption of homogeneity of covariance matrices had been violated. Because the covariance matrices of the dependent variables could not be assumed to be equal across groups, the mixed-measures ANOVA was interpreted using a Greenhouse-Geisser correction.
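Box's test was run in SPSS, which reports an F approximation. For readers working outside SPSS, a rough equivalent check of homogeneity of covariance matrices is sketched below using pingouin's box_m helper, which reports a chi-square approximation instead; the data frame and column names are simulated assumptions.

    # Box's M check of equal covariance matrices across groups on simulated
    # pre/post interactivity scores. Note: pingouin reports a chi-square
    # approximation, so it is not directly comparable to SPSS's F statistic.
    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(4)
    df = pd.DataFrame({
        "group": ["control"] * 34 + ["experimental"] * 34,
        "pre_interactivity": rng.normal(7.1, 1.2, 68),
        "post_interactivity": rng.normal(7.6, 1.1, 68),
    })

    box = pg.box_m(df, dvs=["pre_interactivity", "post_interactivity"], group="group")
    print(box)  # chi-square statistic, df, and P value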

There was a main effect for interactivity (F1,59 = 5.40, P = .024, η2p = .084), indicating that individual interactivity in the course increased among participants from presurvey (7.16 ± 1.23) to postsurvey (7.56 ± 1.08); however, there was no overall interaction between interactivity and group membership.

DISCUSSION

Effects on Knowledge

The participants demonstrated a statistically significant increase in knowledge acquisition within an introductory athletic training course covering assessment of vital signs and recognition and management of cardiac and breathing emergencies with the use of an ARS. Furthermore, a difference between the control and experimental groups was demonstrated when examining acquisition of knowledge. When an ARS was used within a section of an introductory athletic training course, participants demonstrated higher knowledge acquisition than did the control group. The interaction suggests that the use of ARSs makes a small, but statistically significant, contribution to knowledge acquisition among participants who use them when compared with a course that does not use ARSs. Previous authors1,3,12,16,22 who examined knowledge acquisition through the use of ARSs during lectures also reported positive findings. Researchers30 incorporated an ARS into an undergraduate emergency health course over a 4-week unit of the semester. This course was interprofessional in design, as it included students from paramedicine/nursing, occupational therapy, physiotherapy, and health sciences. Among this group, 77% reported an increase in knowledge acquisition in the ARS unit.30 These results are consistent with our findings.

Instructors' evaluations of ARSs as a formative learning assessment tool reveal that the devices are an effective way to discover which material students do and do not understand.31 Therefore, when a student responds to a multiple-choice or true-false question presented with an ARS during a lecture, the instructor can efficiently evaluate the material presented. This enables the instructor to make immediate modifications to lessons to address student challenges. Additionally, ARSs benefit students by providing a method by which to gauge understanding of course material. Compared with hearing their classmates' verbal responses to questions posed in lectures, answering questions through an ARS more accurately allows students to assess their level of mastery of the material.32 Identification of deficiencies can give students direction in what material they need to spend more time reviewing after class.

The uses of an ARS as they relate to teaching, learning, and assessment are widespread and applicable in a variety of situations. One factor for faculty to consider is whether the ARS questions are graded or simply formative with no grades attached. In the present study, we used a formative ARS design, in which participants and the instructor were able to evaluate students' comprehension of assigned readings and presented materials. All participants, both in the control and experimental groups, were assigned readings before each of the lessons. However, the completion of these readings, along with the participants' achieved level of understanding of the materials, was not evaluated. It could be hypothesized that if the participants had known they would be graded on ARS questions, they might have spent more time reading and understanding content before class. Some researchers33 suggested that graded ARS questions ensure class attention and effort in preparation. However, other researchers34 evaluating the impact of ARSs found that students rated the system less positively when it was used for a graded rather than a formative purpose. Additionally, Cain and Robinson11 concluded that to ensure the genuineness of the feedback process, formative questions should not be graded. Further research in athletic training education may warrant an exploration into knowledge acquisition over time with graded ARS exercises.

Effects on Interactivity

Both the control and experimental groups indicated statistically significant increases in classroom interactivity. However, both groups increased at the same rate, which did not support our hypothesis that differences would exist between the groups when using ARS technology. Researchers10,16,26 have long demonstrated the positive effects of ARSs on participant interactivity, emotion, and satisfaction in lecture-based courses across a wide variety of educational disciplines. In one study10 of a large introductory psychology lecture course, researchers examined the impact of an ARS on student interactivity and satisfaction across 2 classes. One class was presented in a traditional lecture format with allowance for questions through informal hand raising, while the other class used an ARS for formal review questions throughout the lecture. Using a Likert scale evaluation, students reported their impressions of the class, including their perception of their individual degree of interactivity in the class and with the instructor. Findings demonstrated significantly higher interactivity scores, more positive emotion, and greater satisfaction with the course among students in the ARS class.10

We suspect that the structure of this athletic training course may have had an impact on interactivity measures within this study. Many athletic training courses are designed to integrate lecture material with hands-on skill practice. The objectives covered in this study included hands-on practice of CPR techniques, heart and lung auscultation, and vital sign assessment in a laboratory session after the associated lecture. Participation in these hands-on group interactions may have been the cause of an overall positive effect on the students' classroom interactivity in both the control and experimental groups, as demonstrated by the positive gains in both groups. Therefore, if there were any differences in interactivity between the groups related to the ARS in lectures, they may have been overshadowed by the group interaction in the laboratory component of the course that followed. Further studies in athletic training education may warrant exploration of ARS in terms of individual interactivity in the laboratory component of a course.

DeBourgh15 found the use of ARSs within a baccalaureate nursing education program effective in promoting students' acquisition and application of advanced reasoning skills in addition to increasing students' interactivity within the course. Audience response system questions were carefully designed to facilitate discussion within the course; often the questions were purposefully vague and designed to stimulate engaging debate. Beatty35 described this questioning technique as successful in sensitizing students to the clinical integration of concepts. The goal of this type of questioning is not to promote memorization of factual knowledge but instead to demonstrate critical thinking strategies. Athletic trainers are among the health profession practitioners who require well-developed critical thinking abilities. However, as Knight36 explains, information and new skills must be absorbed and practiced before they can be converted into performance knowledge. Educators can facilitate the acquisition of critical thinking skills through carefully designed methods such as the ARS questions DeBourgh15 described. In athletic training education courses, in order to maximize the interactivity and effectiveness of the ARS, the instructor is challenged with assessing the level to which students have absorbed factual knowledge and with identifying appropriate opportunities to transition to critical thinking. Therefore, careful consideration should be given to the type of ARS questions (factual knowledge versus discussion promoting) in relation to the overall goals of the lesson.

CONCLUSIONS

To our knowledge, this study is the first to explore student knowledge acquisition and interactivity with the use of an ARS in an introductory athletic training course. As students' learning preferences continue to change and as technology becomes more integrated into all aspects of our lives, educators must continue to explore effective modes of teaching the next generation of athletic trainers. The evidence gathered supports the success of ARS technology in improving student knowledge when used in an introductory athletic training course. However, additional studies in athletic training education should be conducted to investigate multiple forms of active learning strategies, including ARS technology within didactic and laboratory courses, to determine what affects students' learning the most. Pedagogical, technical, and logistical issues should be addressed in order to achieve successful implementation in an educational environment. Further, the athletic training educator must carefully consider these issues as well as the design of the course (lecture, clinical skills, or laboratory based) in order to achieve the desired outcomes of ARS technology.

REFERENCES

1. Berry J. Technology support in nursing education: clickers in the classroom. Nurs Educ Perspect. 2009;30(5):295–298.
2. Howe N, Strauss W. Millennials Rising: The Next Great Generation. New York, NY: Vantage; 2009:235–238.
3. Hoffman C, Goodwin S. A clicker for your thoughts: technology for active learning. New Lib World. 2006;107(9/10):422–433.
4. MacArthur JR, Jones LL. A review of literature reports of clickers applicable to college chemistry classrooms. Chem Educ Res Pract. 2008;9(3):187–195.
5. Anderson R, Anderson R, Davis K, Linnell N, Prince C, Razmov V. Supporting active learning and example based instruction with classroom technology. ACM SIGCSE Bull. 2007;39(1):69–73.
6. Martyn M. Clickers in the classroom: an active learning approach. Educ Q. 2007;30(2):71–74.
7. Kenwright K. Clickers in the classroom. TechTrends. 2009;53(1):74–77.
8. Kay RH, LeSage A. Examining the benefits and challenges of using audience response systems: a review of the literature. Comput Educ. 2009;53(3):819–827.
9. Stowell JR, Oldham T, Bennett D. Using student response systems (“clickers”) to combat conformity and shyness. Teaching Psychol. 2010;37(2):135–140.
10. Stowell JR, Nelson JM. Benefits of electronic audience response systems on student participation, learning, and emotion. Teaching Psychol. 2007;34(4):253–258.
11. Cain J, Robinson E. A primer on audience response systems: current applications and future considerations. Am J Pharm Educ. 2008;72(4):77.
12. Patterson B, Kilpatrick J, Woebkenberg E. Evidence for teaching practice: the impact of clickers in a large classroom environment. Nurs Educ Today. 2010;30(7):603–607.
13. Abrahamson L. A brief history of networked classrooms: effects, cases, pedagogy, and implications. In: Audience Response Systems in Higher Education: Applications and Cases. Melbourne, Australia: Information Science Publishing; 2006:1–25.
14. Premkumar K, Coupal C. Rules of engagement—12 tips for successful use of “clickers” in the classroom. Med Teacher. 2008;30(2):146–149.
15. DeBourgh GA. Use of classroom “clickers” to promote acquisition of advanced reasoning skills. Nurs Educ Pract. 2008;8(2):76–87.
16. Meedzan N, Fisher K. Clickers in nursing education: an active learning tool in the classroom. Online J Nurs Inf. 2009;13(2):1–19.
17. Lantz ME. The use of ‘clickers’ in the classroom: teaching innovation or merely an amusing novelty? Comput Hum Behav. 2010;26(4):556–561.
18. Duncan D. Clickers: a new teaching aid with exceptional promise. Astron Educ Rev. 2007;5(1):70–88.
19. Bartsch RA, Murphy W. Examining the effects of an electronic classroom response system on student engagement and performance. J Educ Comput Res. 2011;44(1):25–33.
20. Rubio EI, Bassignani MJ, White MA, Brant WE. Effect of an audience response system on resident learning and retention of lecture material. Am J Roentgenol. 2008;190(6):W319–W322.
21. Cain J, Black EP, Rohr J. An audience response system strategy to improve student motivation, attention, and feedback. Am J Pharm Educ. 2009;73(2):21.
22. Bachman L, Bachman C. A study of classroom response system clickers: increasing student engagement and performance in a large undergraduate lecture class on architectural research. J Int Learning Res. 2011;22(1):5–21.
23. Skiba DJ. Emerging technologies center: got large lecture hall classes? Use clickers. Nurs Educ Perspect. 2006;27(5):278–280.
24. Hall RH, Collier HL, Thomas ML, Hilgers MG. A student response system for increasing engagement, motivation, and learning in high enrollment lectures. In: Association for Information Systems (AMCIS) 2005 Proceedings. Omaha, NE: AMCIS; 2005:255.
25. Potteiger K, Lundren A. Using an audience response system to prepare athletic training students for the Board of Certification exam. Athl Train Educ J. 2012;7(4):198–204.
26. Siau K, Sheng H, Nah FH. Use of a classroom response system to enhance classroom interactivity. IEEE Trans Educ. 2006;49(3):398–403.
27. Boyd WM. Repeating questions in prose learning. J Educ Psychol. 1973;64(1):31.
28. Tabachnick BG, Fidell LS. Using Multivariate Statistics. 5th ed. New York, NY: Routledge; 2007.
29. Miles JNV, Banyard P. Understanding and Using Statistics in Psychology: A Practical Introduction. 4th ed. London, UK: Sage; 2007:210.
30. Williams B, Boyle M. The use of interactive wireless keypads for interprofessional learning experiences by undergraduate emergency health students. Int J Educ Dev ICT. 2008;4(1). http://ijedict.dec.uwi.edu/viewarticle.php?id=405. Accessed January 1, 2015.
31. Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004;36(7):496–504.
32. Nayak L, Erinjeri JP. Audience response systems in medical student education benefit learners and presenters. Acad Radiol. 2008;15(3):383–389.
33. Willoughby SD, Gustafson E. Technology talks: clickers and grading incentive in the large lecture hall. Am J Phys. 2009;77(2):180–183.
34. Graham CR, Tripp TR, Seawright L, Joeckel G. Empowering or compelling reluctant participators using audience response systems. Active Learning Higher Educ. 2007;8(3):233–258.
35. Beatty ID. Transforming student learning with classroom communication systems. Cornell University Library Web site. http://arxiv.org/abs/physics/0508129. Accessed January 1, 2015.
36. Knight K. Hyposkillia and critical thinking: what's the connection [editorial]? Athl Train Educ J. 2008;3(3):79–81.
Copyright: © National Athletic Trainers' Association


Contributor Notes

Dr Tivener is currently a Clinical Instructor in the Sports Medicine and Athletic Training Department at Missouri State University. Please address all correspondence to Kristin Ann Tivener, MET, ATC, Department of Sports Medicine and Athletic Training, Missouri State University, 901 South National Avenue, PROF 160, Springfield, MO 65897. KTivener@MissouriState.edu.