Objective: To provide certified athletic trainers, physicians, and other health care and fitness professionals with recommendations based on current evidence regarding the prevention of noncontact and indirect-contact anterior cruciate ligament (ACL) injuries in athletes and physically active individuals.
Background: Preventing ACL injuries during sport and physical activity may dramatically decrease medical costs and long-term disability. Implementing ACL injury-prevention training programs may improve an individual's neuromuscular control and lower extremity biomechanics and thereby reduce the risk of injury. Recent evidence indicates that ACL injuries may be prevented through the use of multicomponent neuromuscular-training programs.
Recommendations: Multicomponent injury-prevention training programs are recommended for reducing noncontact and indirect-contact ACL injuries and strongly recommended for reducing noncontact and indirect-contact knee injuries during physical activity. These programs are advocated for improving balance, lower extremity biomechanics, muscle activation, functional performance, strength, and power, as well as decreasing landing impact forces. A multicomponent injury-prevention training program should, at minimum, provide feedback on movement technique in at least 3 of the following exercise categories: strength, plyometrics, agility, balance, and flexibility. Further guidance on training dosage, intensity, and implementation recommendations is offered in this statement.
Context: An estimated 40 million school-aged children (age range = 5−18 years) participate annually in sports in the United States, generating approximately 4 million sport-related injuries and requiring 2.6 million emergency department visits at a cost of nearly $2 billion.
Objective: To determine the effects of a school-based neuromuscular training (NMT) program on sport-related injury incidence across 3 sports at the high school and middle school levels, focusing particularly on knee and ankle injuries.
Design: Randomized controlled clinical trial.
Setting: A total of 5 middle schools and 4 high schools in a single-county public school district.
Patients or Other Participants: A total of 474 girls (222 middle school, 252 high school; age = 14.0 ± 1.7 years, height = 161.0 ± 8.1 cm, mass = 55.4 ± 12.2 kg) were cluster randomized to an NMT (CORE; n = 259 athletes) or sham (SHAM; n = 215 athletes) intervention group by team within each sport (basketball, soccer, and volleyball).
Intervention(s): The CORE intervention consisted of exercises focused on the trunk and lower extremity, whereas the SHAM protocol consisted of resisted running using elastic bands. Each intervention was implemented at the start of the season and continued until the last competition.
Main Outcome Measure(s): An athletic trainer evaluated athletes weekly for sport-related injuries. The coach recorded each athlete-exposure (AE), which was defined as 1 athlete participating in 1 coach-directed session (game or practice). Injury rates were calculated overall, by sport, and by competition level. We also calculated rates of specific knee and ankle injuries. A mixed-model approach was used to account for multiple injuries per athlete.
Results: Overall, the CORE group reported 107 injuries (rate = 5.34 injuries/1000 AEs), and the SHAM group reported 134 injuries (rate = 8.54 injuries/1000 AEs; F1,578 = 18.65, P < .001). Basketball (rate = 4.99 injuries/1000 AEs) and volleyball (rate = 5.74 injuries/1000 AEs) athletes in the CORE group demonstrated lower injury incidences than basketball (rate = 7.72 injuries/1000 AEs) and volleyball (rate = 11.63 injuries/1000 AEs; F1,275 = 9.46, P = .002 and F1,149 = 11.36, P = .001, respectively) athletes in the SHAM group. The CORE intervention appeared to have a greater protective effect on knee injuries at the middle school level (knee-injury incidence rate = 4.16 injuries/1000 AEs) than the SHAM intervention (knee-injury incidence rate = 7.04 injuries/1000 AEs; F1,261 = 5.36, P = .02). We did not observe differences between groups for ankle injuries (F1,578 = 1.02, P = .31).
Conclusions: Participation in an NMT intervention program resulted in a reduced injury incidence relative to participation in a SHAM intervention. This protective benefit of NMT was demonstrated at both the high school and middle school levels.
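As a quick illustration of the exposure-based rate used throughout these results, the sketch below computes injuries per 1000 athlete-exposures; the exposure count in the example is a hypothetical placeholder, not a value reported in the study.

# Illustrative sketch: injury incidence per 1000 athlete-exposures (AEs).
# The AE total below is a hypothetical placeholder, not study data.

def injury_rate_per_1000_ae(injuries: int, athlete_exposures: int) -> float:
    """Return injuries per 1000 athlete-exposures."""
    return injuries / athlete_exposures * 1000

# Hypothetical example: 107 injuries over 20,000 recorded AEs
print(round(injury_rate_per_1000_ae(107, 20_000), 2))  # ~5.35 injuries/1000 AEs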
Context: The Functional Movement Screen (FMS) is a tool used to assess the quality of human movement. Previous FMS researchers reported a difference between the comprehensive and individual FMS test scores of injured and uninjured participants.
Objective: To evaluate the accuracy of the FMS for predicting injury in National Collegiate Athletic Association Division II athletes and to evaluate how an injury definition may affect the prognostic values.
Design: Cross-sectional study.
Setting: University preparticipation examinations.
Patients or Other Participants: A total of 257 collegiate athletes (men = 176, women = 81) between the ages of 18 and 24 years.
Main Outcome Measure(s): The athletes were prospectively screened with the FMS and monitored for subsequent injury. The ability of the FMS to accurately predict musculoskeletal injuries, overall injuries, and severe injuries was determined.
Results: The receiver operating characteristic curve provided the FMS cut score of ≤15 for the study sample. The areas under the curve were 0.53, 0.56, and 0.53 for musculoskeletal injury, overall injury, and severe injury, respectively. Sensitivity was 0.63 (0.62, 0.61, 0.65), whereas specificity was below 0.50 (0.49, 0.49, 0.45) for all 3 injury definitions of musculoskeletal injury, overall injury, and severe injury, respectively. Relative risk was 1.25 for musculoskeletal injuries, 1.24 for overall injuries, and 1.45 for severe injuries.
Conclusions: The overall prognostic accuracy of the FMS offered a slightly better than 50/50 chance of correctly classifying those most at risk for injury. As such, the FMS did not provide discriminatory prediction of musculoskeletal injury, overall injury, or severe injury in National Collegiate Athletic Association Division II athletes. Using the identified optimal cut score produced inadequate validity, regardless of the injury definition. We recommend using the FMS to assess movement quality rather than as a standalone injury-prediction tool until additional research suggests otherwise. Clinicians screening for injury risk should consider multiple risk factors identified in the literature.
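For readers unfamiliar with the screening statistics reported above, the sketch below computes sensitivity, specificity, and relative risk from a generic 2 x 2 classification table; all counts are hypothetical and are not data from this study.

# Illustrative sketch of screening statistics (sensitivity, specificity, relative risk)
# from a generic 2x2 table. The counts below are hypothetical placeholders.

def screening_stats(true_pos, false_neg, false_pos, true_neg):
    sensitivity = true_pos / (true_pos + false_neg)    # injured athletes flagged by the cut score
    specificity = true_neg / (true_neg + false_pos)    # uninjured athletes not flagged
    risk_flagged = true_pos / (true_pos + false_pos)   # injury risk when FMS score <= cut score
    risk_unflagged = false_neg / (false_neg + true_neg)  # injury risk when FMS score > cut score
    relative_risk = risk_flagged / risk_unflagged
    return sensitivity, specificity, relative_risk

# Hypothetical counts: injured/flagged, injured/not flagged, uninjured/flagged, uninjured/not flagged
print(screening_stats(50, 30, 90, 87))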
Context: The fourth edition of the Preparticipation Physical Evaluation recommends functional testing for the musculoskeletal portion of the examination; however, normative data across sex and grade level are limited. Establishing normative data can provide clinicians reference points with which to compare their patients, potentially aiding in the development of future injury-risk assessments and injury-mitigation programs.
Objective: To establish normative functional performance and limb-symmetry data for high school-aged male and female athletes in the United States.
Design: Cross-sectional study.
Setting: Athletic training facilities and gymnasiums across the United States.
Patients or Other Participants: A total of 3951 male and female athletes who participated on high school-sponsored basketball, football, lacrosse, or soccer teams enrolled in this nationwide study.
Main Outcome Measure(s): Functional performance testing consisted of 3 evaluations. Ankle-joint range of motion, balance, and lower extremity muscular power and landing control were assessed via the weight-bearing ankle-dorsiflexion–lunge, single-legged anterior-reach, and anterior single-legged hop-for-distance (SLHOP) tests, respectively. We used 2-way analyses of variance and χ2 analyses to examine the effects of sex and grade level on ankle-dorsiflexion–lunge, single-legged anterior-reach, and SLHOP test performance and symmetry.
Results: The SLHOP performance differed between sexes (males = 187.8% ± 33.1% of limb length, females = 157.5% ± 27.8% of limb length; t = 30.3, P < .001). A Cohen d value of 0.97 indicated a large effect of sex on SLHOP performance. We observed differences for SLHOP and ankle-dorsiflexion–lunge performance among grade levels, but these differences were not clinically meaningful.
Conclusions: We demonstrated differences in normative data for lower extremity functional performance during preparticipation physical evaluations across sex and grade levels. The results of this study will allow clinicians to compare sex- and grade-specific functional performances and implement approaches for preventing musculoskeletal injuries in high school-aged athletes.
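The reported effect size can be reproduced approximately from the group means and standard deviations above; the sketch below uses an equal-weight pooled standard deviation, which is an assumption because the exact group sizes are not used in the calculation.

# Illustrative sketch: Cohen d for the sex difference in SLHOP performance, using the
# reported group means and SDs. Equal-weight pooling is an assumption, so the result
# only approximates the reported d = 0.97.
import math

def cohens_d(mean1, sd1, mean2, sd2):
    pooled_sd = math.sqrt((sd1**2 + sd2**2) / 2)  # simple equal-weight pooled SD (assumption)
    return (mean1 - mean2) / pooled_sd

print(round(cohens_d(187.8, 33.1, 157.5, 27.8), 2))  # ~0.99 (abstract reports 0.97)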
Context: Weather-based activity modification in athletics is an important way to minimize heat illnesses. However, many commonly used heat-safety guidelines include a uniform set of heat-stress thresholds that do not account for geographic differences in acclimatization.
Objective: To determine if heat-related fatalities among American football players occurred on days with unusually stressful weather conditions based on the local climate and to assess the need for regional heat-safety guidelines.
Design: Cross-sectional study.
Setting: Data from incidents of fatal exertional heat stroke (EHS) in American football players were obtained from the National Center for Catastrophic Sport Injury Research and the Korey Stringer Institute.
Patients or Other Participants: Sixty-one American football players at all levels of competition with fatal EHSs from 1980 to 2014.
Main Outcome Measure(s): We used the wet bulb globe temperature (WBGT) and a z-score WBGT standardized to local climate conditions from 1991 to 2010 to assess the absolute and relative magnitudes of heat stress, respectively.
Results: We observed a poleward decrease in exposure WBGTs during fatal EHSs. In milder climates, 80% of cases occurred at above-average WBGTs, and 50% occurred at WBGTs greater than 1 standard deviation from the long-term mean; however, in hotter climates, half of the cases occurred at near average or below average WBGTs.
Conclusions: The combination of lower exposure WBGTs and frequent extreme climatic values in milder climates during fatal EHSs indicates the need for regional activity-modification guidelines with lower, climatically appropriate weather-based thresholds. Established activity-modification guidelines, such as those from the American College of Sports Medicine, work well in the hotter climates, such as the southern United States, where hot and humid weather conditions are common.
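The relative (z-score) WBGT used above standardizes an exposure value against the long-term local climatology; the sketch below illustrates the calculation with hypothetical climate values, not station data from the study.

# Illustrative sketch of a standardized (z-score) WBGT expressing heat stress relative to
# the local climate. The climatological mean and SD below are hypothetical placeholders.

def wbgt_z_score(event_wbgt_c, local_mean_wbgt_c, local_sd_wbgt_c):
    """Standardize an exposure WBGT against the long-term local climatology."""
    return (event_wbgt_c - local_mean_wbgt_c) / local_sd_wbgt_c

# Hypothetical example: a 29 degC exposure where the 1991-2010 local mean is 26 degC (SD = 2 degC)
print(wbgt_z_score(29.0, 26.0, 2.0))  # 1.5 standard deviations above the local mean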
Context: The female athlete triad (Triad) and relative energy deficiency in sport (RED-S) specify the consequences of energy imbalance. Athletic trainers (ATs) are positioned to identify athletes who are fueling themselves inadequately and experiencing related health and performance consequences.
Objective: To assess the knowledge of collegiate ATs about the Triad and RED-S and to examine variability in related screening and referral behaviors among National Collegiate Athletic Association divisions.
Design: Cross-sectional study.
Setting: Collegiate athletic training departments.
Patients or Other Participants: Head ATs at National Collegiate Athletic Association member institutions (n = 285, response rate = 33%).
Main Outcome Measure(s): An electronic survey was administered. The number of Triad components that were correctly identified and screening and referral behaviors related to Triad components were measured.
Results: Nearly all respondents (98.61% [n = 281]) had heard of the Triad; a smaller proportion (32.98% [n = 94]) had heard of RED-S. On average, respondents correctly identified 2 components of the Triad. We observed differences by sex, with women correctly identifying more components than men (U = 12.426, P = .003). More than half (59.93% [n = 163]) indicated that athletes at their institutions were screened for eating disorders. Nearly three-quarters (70.55% [n = 115]) of respondents indicated that all female athletes at their institutions were screened annually for menstrual dysfunction. More comprehensive referral behaviors for athletes identified as experiencing menstrual dysfunction or a bone injury (eg, athlete referred to a nutritionist, dietitian, or counselor) occurred at Division I institutions than at Division II and III institutions.
Conclusions: Continuing education for ATs about the Triad and RED-S may encourage a more comprehensive approach to referral and screening after a diagnosis of menstrual dysfunction or bone-stress injury. Using institutional opportunities, such as preparticipation screening, for identifying components of the Triad or RED-S and specifying protocols for referring athletes who screen positive for 1 of these components should also be explored.
Context: Increased pitch volume and altered glenohumeral (GH) and hip range of motion (ROM) and strength contribute to injury risk in baseball pitchers. Although these factors affect one another, whether they are related is unknown.
Objective: To examine relationships among cumulative and seasonal pitch volume, ROM, and strength of the GH and hip joints in youth baseball pitchers.
Design: Cross-sectional study.
Setting: Baseball practice facilities.
Patients or Other Participants: A total of 28 healthy baseball pitchers (age = 13.9 ± 2.9 years).
Main Outcome Measure(s): A demographic and pitching questionnaire was used to quantify pitch volume. Glenohumeral internal-rotation (IR) and external-rotation (ER) ROM and strength of the throwing arm; total arc of motion (IR + ER ROM); and bilateral hip IR, ER, and total arc of motion ROM and strength in IR, ER, and abduction were measured. A goniometer was used to assess ROM; a handheld dynamometer, to assess strength. Frequency analyses and bivariate correlations (age covariate) described data and identified relationships.
Results: Correlations between years of competitive play and increased strength in lead-leg hip IR (r = 0.52, P = .02) and abduction (r = 0.48, P = .04) and stance-leg hip IR (r = 0.45, P = .05) were fair to good. The number of months played in the last year had a fair correlation with decreased GH IR strength (r = −0.39, P = .04) and increased stance-leg hip IR strength (r = 0.44, P = .05). Limited pitch time had a fair correlation with increased GH ER ROM (r = 0.40, P = .04) and an excellent correlation with increased lead-leg hip IR ROM (r = 0.79, P < .001). Increased innings pitched per game had a fair to good correlation with decreased GH IR strength (r = −0.41, P = .04) and stance-leg hip ER ROM (r = −0.53, P = .03). More pitches per game had a fair to good correlation with increased GH ER ROM (r = 0.44, P = .05) and decreased stance-leg hip ER ROM (r = −0.62, P = .008).
Conclusions: The significant relationships identified in this study suggest the need to further examine youth and adolescent cumulative and seasonal pitch guidelines.
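One way the age-adjusted ("age covariate") correlations described above could be computed is as a partial correlation on residuals; the sketch below uses hypothetical placeholder data, and the variable pairing is only an example rather than the study's measurements.

# Illustrative sketch of an age-adjusted (partial) correlation. All arrays below are
# hypothetical placeholder data, not measurements from the study.
import numpy as np

def partial_correlation(x, y, covariate):
    """Correlate x and y after removing the linear effect of the covariate from each."""
    x, y, covariate = map(np.asarray, (x, y, covariate))
    resid_x = x - np.polyval(np.polyfit(covariate, x, 1), covariate)
    resid_y = y - np.polyval(np.polyfit(covariate, y, 1), covariate)
    return np.corrcoef(resid_x, resid_y)[0, 1]

# Hypothetical example: pitches per game vs stance-leg hip ER ROM, controlling for age
pitches = [45, 60, 30, 75, 50, 65, 40, 55]
hip_er_rom = [38, 34, 41, 30, 36, 32, 39, 35]
age = [12, 14, 11, 16, 13, 15, 12, 14]
print(round(partial_correlation(pitches, hip_er_rom, age), 2))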
Context: Ankle sprains are one of the most common injuries in the physically active population. Previous researchers have shown that supporting the ankle with taping or bracing is effective in preventing ankle sprains. However, no authors have compared the effects of self-adherent tape and lace-up ankle braces on ankle range of motion (ROM) and dynamic balance in collegiate football players.
Objective: To examine the effectiveness of self-adherent tape and lace-up ankle braces in reducing ankle ROM and improving dynamic balance before and after a typical collegiate football practice.
Design: Crossover study.
Setting: Collegiate athletic training room.
Patients or Other Participants: Twenty-nine National Collegiate Athletic Association Division I football athletes (age = 19.2 ± 1.14 years, height = 187.52 ± 20.54 cm, mass = 106.44 ± 20.54 kg).
Intervention(s): Each participant wore each prophylactic ankle support during a single practice, with self-adherent tape on 1 leg and a lace-up ankle brace on the other. Range of motion and dynamic balance were assessed 3 times for each leg throughout the testing session (baseline, prepractice, postpractice).
Main Outcome Measure(s): Ankle ROM for inversion, eversion, dorsiflexion, and plantar flexion was measured at baseline, immediately after donning the brace or tape, and immediately after a collegiate practice. The Y-Balance Test was used to assess dynamic balance at these same time points.
Results: Both interventions were effective in reducing ROM in all directions compared with baseline; however, dynamic balance did not differ between the tape and brace conditions.
Conclusions: Both the self-adherent tape and lace-up ankle brace provided equal ROM restriction before and after exercise, with no change in dynamic balance.
Context: Forming a professional identity is a process by which an individual achieves an awareness of his or her own self-concept in the context of the profession. Identity in relation to an individual's profession includes the ability to articulate one's role as a professional and one's professional philosophy. Professional identity has been studied extensively in other fields, but no professional identity scales have yet been validated within the athletic training profession.
Objective: To validate the Professional Identity and Values Scale (PIVS) among an athletic trainer population.
Design: Cross-sectional study.
Setting: Web-based questionnaire.
Patients or Other Participants: Athletic trainers employed in National Collegiate Athletic Association Division I, II, or III or National Association of Intercollegiate Athletics colleges or universities (n = 299, 56.5% female, 43.5% male). The average age of the participants was 33.6 ± 8.3 years, and they had 10.3 ± 7.6 years of experience.
Main Outcome Measure(s): Participants were asked to complete a demographic questionnaire and the 32-item PIVS. The variables included demographics and the PIVS (Professional Orientation and Values subscale [18 items] and Professional Development subscale [14 items]).
Results: Exploratory factor analysis reduced the survey from 32 to 20 items and revealed 6 factors. Three factors emerged from the Professional Development subscale and emphasized professional insecurities during the early career stages, the importance of mentors during the intermediate stages, and self-confidence and awareness during the later stages of professional development. An additional 3 factors emerged from the Professional Orientation and Values subscale: (1) patient care and advocacy, (2) professional engagement and collaboration, and (3) personal wellness and values. A Cronbach α of 0.80 indicated good internal consistency.
Conclusions: A modified PIVS is a valid and reliable measure of professional identity among athletic trainers employed in the collegiate setting.
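For context on the internal-consistency figure above, the sketch below shows a standard Cronbach alpha calculation for a respondents-by-items score matrix; the response matrix is hypothetical and is not PIVS data.

# Illustrative sketch of Cronbach alpha for a respondents-by-items matrix of scores.
# The responses below are hypothetical placeholder data.
import numpy as np

def cronbach_alpha(item_scores):
    items = np.asarray(item_scores, dtype=float)
    n_items = items.shape[1]
    sum_item_variances = items.var(axis=0, ddof=1).sum()  # variance of each item, summed
    total_score_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - sum_item_variances / total_score_variance)

# Hypothetical 5-respondent, 4-item example on a 1-5 Likert scale
responses = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 5, 4]]
print(round(cronbach_alpha(responses), 2))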
Context: Recent employment data from collegiate athletic training settings have demonstrated departure trends among men and women. These trends have been hypothesized to be related to work-life balance. However, work-life balance is only 1 aspect of a myriad of factors. Due to the complex nature of the work-life interface, a multilevel examination is needed to better understand the precipitators of departure.
Objective: To quantitatively examine factors that may influence collegiate athletic trainers' (ATs') job satisfaction and career intentions via a multilevel examination of the work-life interface.
Design: Cross-sectional study.
Setting: Web-based questionnaire.
Patients or Other Participants: Athletic trainers employed in National Collegiate Athletic Association Division I, II, or III or National Association of Intercollegiate Athletics colleges or universities (N = 299: 56.5% female, 43.5% male). The average age of participants was 33.6 ± 8.3 years, and their average experience was 10.3 ± 7.6 years.
Data Collection and Analysis: Participants responded to an online questionnaire consisting of demographic questions, 9 Likert-scale surveys, and open-ended questions. Job-satisfaction scores (JSSs) and intention-to-leave scores (ITLSs) served as the dependent variables, and factors from the individual, organizational, and sociocultural levels were the independent variables. Hierarchical regression analysis was used to determine the predictive ability of these factors.
Results: No sex differences in ITLS or JSS were found in our sample. The independent variables explained 68.5% of the variance in JSS and 28.8% of the variance in ITLS. Adding factor levels increased the percentage of explained variance in both scores.
Conclusions: A combination of individual-, organizational-, and sociocultural-level factors best predicted JSS and ITLS among collegiate ATs.
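The incremental explained variance described above is the usual output of a hierarchical (blockwise) regression, in which predictor blocks are entered in stages and the change in R^2 is examined; the sketch below illustrates the approach with simulated data, and all predictor names are hypothetical rather than the study's survey constructs.

# Illustrative sketch of a hierarchical (blockwise) regression: enter predictor blocks in
# stages and compare explained variance (R^2). All data and names are hypothetical.
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    return 1 - residuals.var() / y.var()

rng = np.random.default_rng(0)
n = 120
individual = rng.normal(size=(n, 2))      # hypothetical individual-level factors
organizational = rng.normal(size=(n, 2))  # hypothetical organizational-level factors
job_satisfaction = individual @ [0.5, 0.3] + organizational @ [0.6, 0.2] + rng.normal(size=n)

block1 = individual                                  # step 1: individual factors only
block2 = np.hstack([individual, organizational])     # step 2: add organizational factors
print(round(r_squared(block1, job_satisfaction), 2),
      round(r_squared(block2, job_satisfaction), 2))  # R^2 increases as blocks are added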
Context: An organizational climate is largely based on an employee's perceptions of the working conditions in which he or she engages regularly. A multifaceted concept, the organizational climate is often formed by perceptions of employee welfare, rewards, and support. Achieving work-life balance is also a part of the climate.
Objective: To learn collegiate athletic trainers' perceptions of organizational climate and specifically how it may pertain to their work-life balance.
Design: Phenomenologic study.
Setting: Collegiate practice setting.
Patients or Other Participants: Thirty athletic trainers working in the collegiate athletics setting took part in 1-on-1 phone interviews. The participants were 30.5 (interquartile range [IQR] = 7.75) years old and had been certified for 7 (IQR = 5) years and at their current position for 4 (IQR = 3) years.
Data Collection and Analysis: Participants completed a phone interview that followed a semistructured framework. All transcribed interviews were analyzed using a phenomenologic approach. Researcher triangulation, expert review, and data saturation were used to establish credibility.
Results: Athletic trainers working in the collegiate athletics setting who had positive perceptions of their work-life balance described their organizational climate as family friendly. Our participants' supervisors allowed for autonomy related to work scheduling, which provided opportunities for work-life balance. These athletic trainers believed that they worked in a climate that was collegial, which was helpful for work-life balance. In addition, the importance of placing family first was part of the climate.
Conclusions: The perceptions of our participants revealed a climate of family friendliness, supervisor support, and collegiality among staff members, which facilitated the positive climate for work-life balance. The mindset embraced the importance of family and recognized that work did not always have to supersede personal priorities.
Objective: To describe the concepts of measurement reliability and minimal important change.
Background: All measurements have some magnitude of error. Because clinical practice involves measurement, clinicians need to understand measurement reliability. The reliability of an instrument is integral in determining if a change in patient status is meaningful.
Description: Measurement reliability is the extent to which a test result is consistent and free of error. Three perspectives of reliability (relative reliability, systematic bias, and absolute reliability) are often reported. However, absolute reliability statistics, such as the minimal detectable difference, are most relevant to clinicians because they provide an expected error estimate. The minimal important difference is the smallest change in a treatment outcome that the patient would identify as important.
Recommendations: Clinicians should use absolute reliability characteristics, preferably the minimal detectable difference, to determine the extent of error around a patient's measurement. The minimal detectable difference, coupled with an appropriately estimated minimal important difference, can assist the practitioner in identifying clinically meaningful changes in patients.
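As one common way (an assumption for illustration, not a formula prescribed by this report) to estimate the minimal detectable difference from test-retest reliability, the sketch below derives the standard error of measurement from an intraclass correlation coefficient and applies a 95% confidence multiplier; the ICC and SD values are hypothetical.

# Illustrative sketch: standard error of measurement (SEM) and 95% minimal detectable
# change (MDC95) from test-retest reliability. ICC and SD values are hypothetical.
import math

def standard_error_of_measurement(sd, icc):
    """SEM from the between-subjects SD and a test-retest reliability coefficient (ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sd, icc):
    """Smallest change exceeding measurement error with 95% confidence."""
    return 1.96 * math.sqrt(2) * standard_error_of_measurement(sd, icc)

# Hypothetical example: outcome score with SD = 8 points and ICC = 0.90
print(round(mdc95(8.0, 0.90), 1))  # ~7.0 points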