Context: Concussions affect a large number of US athletes each year. Returning an athlete to activity once self-reported symptoms have resolved can be problematic if unrecognized neurocognitive and balance deficits persist. Pairing cognitive and motor tasks or cognitive and quiet-stance tasks may allow clinicians to detect and monitor these changes postconcussion.

Objective: To prospectively examine adolescent athletes' gait and quiet-stance performance while concurrently completing a cognitive task acutely after concussion and after symptom resolution.

Design: Case-control study.

Setting: Sport concussion clinic.

Patients or Other Participants: Thirty-seven athletes (age = 16.2 ± 3.1 years; 54% female) were diagnosed with a concussion, and their performance was compared with that of a group of 44 uninjured control participants (age = 15.0 ± 2.0 years; 57% female).

Intervention: Participants diagnosed with a concussion completed a symptom inventory and single- and dual-task gait and quiet-stance evaluations within 21 days of injury and then again after symptom resolution. Gait and postural-control measurements were quantified using an inertial sensor system and analyzed using multivariate analyses of covariance.

Main Outcome Measure(s): Post-Concussion Symptom Scale, single-task and dual-task gait measures, quiet-stance measures, and cognitive task performance.

Results: At the initial postinjury examination, stride length was shorter in the concussion group than in the control group during both single-task gait (1.16 ± 0.14 m versus 1.25 ± 0.13 m, P = .003) and dual-task gait (1.02 ± 0.13 m versus 1.10 ± 0.13 m, P = .011). After symptom resolution, no single-task gait differences were found, but the concussion group demonstrated slower gait velocity (0.78 ± 0.15 m/s versus 0.92 ± 0.14 m/s, P = .005), lower cadence (92.5 ± 12.2 steps/min versus 99.3 ± 7.8 steps/min, P < .001), and a shorter stride length (0.99 ± 0.15 m versus 1.10 ± 0.13 m, P = .003) during dual-task gait than the control group. No between-groups differences were detected during quiet stance at either time point.

Conclusions: Acutely after concussion, single-task and dual-task stride-length alterations were present among youth athletes compared with a control group. Although single-task gait alterations were not detected after symptom resolution, dual-task gait differences persisted, suggesting that dual-task gait alterations may persist longer after concussion than single-task gait or objective quiet-stance alterations. Dual-task gait assessments may, therefore, be a useful component in monitoring concussion recovery after symptom resolution.
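The dual-task group means reported above are internally consistent with the standard spatiotemporal relation gait velocity ≈ cadence × stride length / 120 (cadence in steps/min, stride length in m, two steps per stride), agreeing to within rounding. The minimal Python check below simply applies that relation to the published means; it is an editorial illustration, not part of the authors' analysis.

```python
# Sketch: check that mean velocity ~= cadence * stride_length / 120
# (cadence in steps/min, stride length in m -> velocity in m/s).
# Values are the dual-task group means quoted in the Results.
groups = {
    "concussion": {"cadence": 92.5, "stride_length": 0.99, "velocity": 0.78},
    "control":    {"cadence": 99.3, "stride_length": 1.10, "velocity": 0.92},
}

for name, g in groups.items():
    predicted = g["cadence"] * g["stride_length"] / 120.0  # m/s
    print(f"{name}: reported {g['velocity']:.2f} m/s, predicted {predicted:.2f} m/s")
```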
Context: Dynamic balance during functional movement may provide important clinical information after concussion. The Sport Concussion Assessment Tool, version 3 (SCAT3), includes a timed tandem-gait test (heel-to-toe walking) administered with a pass-fail scoring system. Minimal evidence supports inclusion of the tandem-gait test in the SCAT3, especially in high school athletes.

Objective: To determine (1) the percentage of healthy high school athletes who passed (best trial ≤14 seconds) the tandem-gait test at baseline, (2) the association between sex and test performance (pass versus fail), and (3) the relationships among sex, age, height, and tandem-gait test score.

Design: Cross-sectional study.

Setting: High school sports medicine center.

Patients or Other Participants: Two hundred athletes from 4 high schools (age = 15.8 ± 1.2 years, height = 170.3 ± 10.3 cm, weight = 64.8 ± 14.5 kg).

Main Outcome Measure(s): Healthy participants completed 4 trials of the SCAT3 tandem-gait test and a demographic questionnaire. Outcome measures were passing rate at baseline on the tandem-gait test and tandem-gait test score (time).

Results: Overall, 24.5% (49/200) of participants passed the test. Sex and performance were associated (χ2 = 15.15, P < .001), with a passing rate of 38.6% (32/83) for males and 14.5% (17/117) for females. The regression model including the predictor variables of sex and height, with tandem-gait test score (time) as the outcome variable, was significant (R2 = 0.20, P < .01).

Conclusions: Our findings suggest that the tandem-gait test had a high false-positive rate in high school athletes. Given that more than 75% of healthy participants failed the tandem-gait test, the 14-second cutoff appears to have limited clinical utility in the adolescent population. Functional movement deficits after concussion need to be accounted for, but the 14-second cutoff for the SCAT3 tandem-gait test does not appear to be an ideal way to assess these deficits in high school athletes.
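The sex-by-performance association can be reproduced from the pass/fail counts given in the Results (32/83 males and 17/117 females passing) with a standard 2 × 2 chi-square test; the short Python sketch below does the calculation from those published counts and recovers χ2 ≈ 15.15. It is offered only as a worked illustration of the statistic the authors report.

```python
import numpy as np

# 2 x 2 table of pass/fail counts by sex, taken from the abstract:
# males 32 pass / 51 fail, females 17 pass / 100 fail.
observed = np.array([[32, 51],
                     [17, 100]], dtype=float)

row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / observed.sum()

chi_square = ((observed - expected) ** 2 / expected).sum()
print(f"chi-square = {chi_square:.2f}")  # ~15.15, matching the abstract
```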
Context: Several tasks have been used to examine landing biomechanics for evaluation and rehabilitation, especially as related to anterior cruciate ligament injuries. However, comparing results among studies in which different tasks were used can be difficult, and it is unclear which task may be most appropriate.

Objective: To compare lower extremity biomechanics across 5 commonly used landing tasks.

Design: Descriptive laboratory study.

Setting: University-operated US Air Force Special Operations Forces human performance research laboratory.

Patients or Other Participants: A total of 65 US Air Force Special Tactics Operators (age = 27.7 ± 5.0 years, height = 176.5 ± 5.7 cm, mass = 83.1 ± 9.1 kg).

Intervention(s): Kinematic and kinetic analysis of double- and single-legged drop landing, double- and single-legged stop jump, and forward jump to single-legged landing.

Main Outcome Measure(s): Hip-, knee-, and ankle-joint kinematics; knee-joint forces and moments; and ground reaction forces (GRFs) were the dependent measures. We used repeated-measures analyses of variance or Friedman tests, as appropriate, to assess within-subject differences across tasks.

Results: Peak vertical GRF and peak knee-flexion angle were different among all tasks (P < .001). Single-legged landings generated higher vertical GRF (χ2 = 244.68, P < .001) and lower peak knee-flexion values (F4,64 = 209.33, P < .001) except for forward jump to single-legged landing, which had the second highest peak vertical GRF and the lowest peak knee-flexion value. The single-legged drop landing generated the highest vertical (χ2 = 244.68, P < .001) and posterior (χ2 = 164.46, P < .001) GRFs. Peak knee-valgus moment was higher during the double-legged drop landing (χ2 = 239.63, P < .001) but similar for all others.

Conclusions: Different landing tasks elicited different biomechanical responses; no single task was best for assessing a wide range of biomechanical variables related to anterior cruciate ligament injuries. Therefore, depending on the goals of the study, using multiple assessment tasks should be considered.
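The abstract reports Friedman χ2 statistics for within-subject comparisons across the 5 landing tasks. As a minimal sketch of that kind of test, the Python snippet below uses scipy with entirely hypothetical peak-GRF values (the study data are not reproduced here); it shows only the shape of the analysis.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
n_subjects = 65  # matching the study's sample size

# Hypothetical peak vertical GRF (body weights) for each of the 5 tasks;
# these numbers are invented purely to illustrate the test.
task_means = {"DL drop": 2.0, "SL drop": 3.5, "DL stop jump": 2.2,
              "SL stop jump": 3.0, "Forward jump SL": 3.3}
data = {task: mean + rng.normal(0, 0.3, n_subjects)
        for task, mean in task_means.items()}

stat, p = friedmanchisquare(*data.values())
print(f"Friedman chi-square = {stat:.2f}, P = {p:.3g}")
```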
Context: Environmental sustainability efforts are becoming a critical concern in health care. However, little is known regarding how athletic trainers feel about the environment or what can be done to reduce the environmental impact of the practice of athletic training.

Objective: To examine athletic trainers' attitudes toward and perceptions of factors related to environmental sustainability.

Design: Sequential, mixed methods using a survey, focus groups, and personal interviews.

Setting: Field study.

Patients or Other Participants: Four hundred forty-two individuals completed the survey. Sixteen participated in the qualitative portion.

Main Outcome Measure(s): Quantitative results from the Athletic Training Environmental Impact Survey included data from a 5-point Likert scale (1 = lowest rating and 5 = highest rating). Descriptive statistics and 1-way analyses of variance were used to describe perceptions and determine differences in mean opinion, National Athletic Trainers' Association district, and use of green techniques. Qualitative data were transcribed verbatim and analyzed inductively.

Results: The mean score for opinion of the environment was 3.30 ± 0.52. A difference was found between opinion and National Athletic Trainers' Association district (F9,429 = 2.43, P = .01). A Bonferroni post hoc analysis identified this difference (P = .03) between members of District 2 (Delaware, New Jersey, New York, Pennsylvania) and District 9 (Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, Tennessee). An inductive analysis resulted in 3 emergent themes: (1) barriers to using green techniques, (2) motivators for using green techniques, and (3) solutions to overcoming the barriers.

Conclusions: The information gleaned from participants in the qualitative portion of the study can be useful for clinicians wishing to implement basic conservation efforts in their practice settings and may guide future sustainability projects. Overall, participants reported a positive opinion of environmental sustainability topics related to athletic training. However, many barriers to practicing green techniques were identified.
Context: Environmental sustainability is a critical concern in health care. Similar to other professions, the practice of athletic training necessitates the use of a large quantity of natural and manufactured resources.

Objective: To examine the perceptions of the waste produced by the practice of athletic training and the green practices currently used by athletic trainers (ATs) to combat this waste.

Design: Mixed-methods study.

Setting: Field setting.

Patients or Other Participants: A total of 442 ATs completed the study. Sixteen individuals participated in the qualitative portion.

Main Outcome Measure(s): Data from sections 2 and 3 of the Athletic Training Environmental Impact Survey were analyzed. Focus groups and individual interviews were used to determine participants' views of waste and the efforts used to combat waste. Descriptive statistics were used to examine types of waste. Independent t tests, χ2 tests, and 1-way analyses of variance were calculated to identify any differences between the knowledge and use of green techniques. Interviews and focus groups were transcribed verbatim and analyzed inductively.

Results: Participants reported moderate knowledge of green techniques (3.18 ± 0.53 on a 5-point Likert scale). Fifty-eight percent (n = 260) of survey participants perceived that a substantial amount of waste was produced by the practice of athletic training. Ninety-two percent (n = 408) admitted they thought about the waste produced in their daily practice. The types of waste reported most frequently were plastics (n = 111, 29%), water (n = 88, 23%), and paper for administrative use (n = 81, 21%). Fifty-two percent (n = 234) agreed this waste directly affected the environment. The qualitative aspect of the study reinforced recognition of the large amount of waste produced by the practice of athletic training. Types of conservation practices used by ATs were also explored.

Conclusions: Participants reported concern regarding the waste produced by athletic training. The amount of waste varies depending on practice size and setting. Future researchers should use direct measures to determine the amount of waste created by the practice of athletic training.
Context: Organizational factors have been identified as barriers to finding work-life balance (WLB) in athletic training. Despite the existence of organizational policies to address WLB, little is known about athletic trainers' (ATs') awareness of these policies that could assist them.

Objective: To better understand the perceptions of ATs regarding the workplace practices available to them, which may help them achieve WLB.

Design: Phenomenologic study.

Setting: Collegiate practice setting.

Patients or Other Participants: Twenty-one ATs (women = 10, men = 11) employed at the collegiate level (National Collegiate Athletic Association Division I = 12, Division II = 5, Division III = 4) volunteered for our study. The average age of the participants was 33 ± 9 years. Saturation of the data was met at n = 21.

Data Collection and Analysis: Participants completed an in-depth, 1-on-1 phone interview, which was then transcribed verbatim. Data were analyzed using a phenomenologic approach. Credibility was determined by member checks, peer review, and researcher triangulation.

Results: Our analyses revealed that participants (1) had a limited awareness of formal policies that were offered within their university or collegiate infrastructure; (2) used informal policies to manage their personal, family, and work obligations; and (3) thought that more formal policies, such as adherence to adequate staffing patterns and work schedules, could help establish WLB within collegiate athletic training settings.

Conclusions: Informal workplace policies were more commonly used by our participants and were viewed as a means of creating a supportive atmosphere. Administrators and supervisors should consider creating or endorsing more formal policies specific to the demands of an AT in the collegiate setting to help with WLB.
Context: Professional horse racing is considered a high-risk sport, yet the last analysis of fall and injury incidence in this sport in Ireland was completed between 1999 and 2006.

Objective: To provide an updated analysis of the fall and injury incidence in professional flat and jump horse racing in Ireland from 2011 through 2015, compare it with the previous analysis, and detail the specific types and locations of injuries.

Design: Descriptive epidemiology study.

Setting: A medical doctor recorded all injuries that occurred at every official flat and jump race meeting for the 2011 through 2015 seasons using standardized injury-report forms.

Main Outcome Measure(s): Injury and fall rates and their 95% confidence intervals (CIs) were reported for flat and jump racing. Incidence rate ratios and 95% CIs were calculated between flat and jump racing, between the 1999–2006 analysis and the current results, and between 2011 and 2015. The distribution of injuries for type and location of injury was reported.

Results: Compared with flat racing, jump racing had significantly more falls per 1000 rides (49.5 versus 3.8), injuries per 1000 rides (10.1 versus 1.4), and injuries per 1000 meetings (776.0 versus 94.1). However, the rate of injuries per 1000 falls was significantly higher in flat racing (352.8 versus 203.8). An increase in injuries per 1000 falls between 2011 and 2015 was found in flat racing (P = .005). Since the previous analysis, a significant increase in injuries per 1000 rides and falls was noted in jump racing. Soft tissue injuries were predominant in flat and jump racing (61.54% and 68.80%, respectively), with fractures the second most common injury (15.38% and 18.06%, respectively). Concussions were more prevalent from flat-racing falls (incidence rate ratio = 0.30; 95% CI = 0.15, 0.61). The lower limb was the most frequent location of injury (32.89%) in flat racing; however, in jump racing, upper limb injuries (34.97%) were predominant.

Conclusions: An update on professional flat- and jump-racing fall and injury epidemiology is provided. Further research to identify risk factors for injury, design and investigate the feasibility of injury-prevention strategies, and document their effects on fall and injury incidence is required.
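The incidence rate ratios and 95% CIs reported here follow the standard epidemiologic form: a ratio of two event rates with a Wald interval on the log scale. Below is a minimal, self-contained Python sketch of that calculation; the event counts and exposure totals are hypothetical placeholders (the abstract reports only the resulting rates), so the output illustrates the method rather than reproducing the study's numbers.

```python
import math

def rate_per_1000(events: int, exposure: int) -> float:
    """Events per 1000 units of exposure (e.g., rides or falls)."""
    return 1000.0 * events / exposure

def incidence_rate_ratio(e1: int, n1: int, e2: int, n2: int):
    """IRR of group 1 versus group 2 with a 95% Wald CI on the log scale."""
    irr = (e1 / n1) / (e2 / n2)
    se_log = math.sqrt(1 / e1 + 1 / e2)  # approximate SE of log(IRR)
    lo = irr * math.exp(-1.96 * se_log)
    hi = irr * math.exp(1.96 * se_log)
    return irr, lo, hi

# Hypothetical counts for illustration only (not the study data):
# jump racing: 495 falls in 10 000 rides; flat racing: 38 falls in 10 000 rides.
irr, lo, hi = incidence_rate_ratio(495, 10_000, 38, 10_000)
print(f"fall IRR (jump vs flat) = {irr:.1f} (95% CI = {lo:.1f}, {hi:.1f})")
```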
Context: Soccer is the most popular junior sport in the world. In junior sports, injury analysis and injury-prevention measures for players, especially those under 12 years of age, are urgently needed.

Objective: To prospectively study the incidence, sites, types, and mechanisms of injuries in elementary school-aged junior soccer players during games and practices.

Design: Descriptive epidemiology study.

Setting: Elementary school-aged junior soccer teams in Nagoya, Japan.

Patients or Other Participants: Eighty-nine players in 5 community-based club teams of junior soccer (U-12, age range = 11–12 years; U-11, age range = 10–11 years; U-10, age ≤10 years).

Main Outcome Measure(s): Data on all game and practice injuries for the 2013–2014 season were collected using an injury report form. Injury rates were calculated according to injury site, type, and mechanism.

Results: The overall injury rate was 2.59/1000 athlete-hours (AHs). The game injury rate (GIR; 6.43/1000 AHs) was higher than the practice injury rate (PIR; 1.49/1000 AHs; P < .05). The most commonly injured anatomical area during both games and practices was the lower limb (62.5%, 4.02/1000 AHs versus 38.5%, 0.57/1000 AHs, respectively). Contusions (27.6%, n = 8) were the most frequent type of injury overall. Most game injuries resulted from body contact (43.8%, 2.81/1000 AHs), whereas most practice injuries resulted from other types of contact (53.8%, 0.83/1000 AHs).

Conclusions: The GIRs were higher than the PIRs in Japanese junior soccer players. A lower overall PIR suggested that players in the U-12 age group practiced under appropriate conditions. However, the higher GIR in this age category needs to be decreased.
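The injury rates here are expressed per 1000 athlete-hours (AHs): the injury count divided by the total exposure time accumulated across all players, scaled by 1000. The short Python sketch below shows that bookkeeping; the counts are invented and chosen only so the output lands near the reported GIR of 6.43/1000 AHs, purely for illustration.

```python
def injury_rate_per_1000_ah(injuries: int, athlete_hours: float) -> float:
    """Injuries per 1000 athlete-hours (AHs) of exposure."""
    return 1000.0 * injuries / athlete_hours

# Hypothetical illustration only (not the study's raw data):
# 18 game injuries over 2800 athlete-hours of game exposure.
print(f"GIR = {injury_rate_per_1000_ah(18, 2800):.2f} injuries/1000 AHs")
```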
Context: Seventy-seven percent of musculoskeletal injuries sustained by United States Army Special Forces Operators are preventable. Identification of predictive characteristics will promote the development of screening methods to augment injury-prevention programs.

Objective: To determine physical and performance characteristics that predict musculoskeletal injuries.

Setting: Clinical laboratory.

Patients or Other Participants: A total of 95 Operators (age = 32.7 ± 5.1 years, height = 179.8 ± 6.9 cm, mass = 89.9 ± 12.7 kg).

Main Outcome Measure(s): Laboratory testing consisted of body composition, aerobic and anaerobic capacity, upper and lower body strength and flexibility, balance, and biomechanical evaluation. Injury data were captured for 12 months after laboratory testing. Injury frequencies, cross-tabulations, and relative risks (RRs) were calculated to evaluate the relationships between physical characteristics and injury proportions. Between-groups differences (injured versus uninjured) were assessed using appropriate t tests or Mann-Whitney U tests.

Results: Less shoulder-retraction strength (RR = 1.741 [95% confidence interval = 1.003, 3.021]), less knee-extension strength (RR = 2.029 [95% confidence interval = 1.011, 4.075]), and a smaller trunk extension : flexion ratio (RR = 0.533 [95% confidence interval = 0.341, 0.831]) were significant risk factors for injury. Group comparisons showed less trunk strength (extension: P = .036, flexion: P = .048) and smaller right vertical ground reaction forces during landing (P = .025) in injured Operators. Knee strength, aerobic capacity, and body mass index were less in the subgroup of spine-injured versus uninjured Operators (P values = .013–.036).

Conclusions: Knee-extension and shoulder-retraction strength were risk factors for musculoskeletal injury in Operators. Less trunk-flexion and -extension strength, higher body mass index, lower aerobic capacity, and increased ground reaction forces during landing were characteristics that may also contribute to musculoskeletal injury. Having 2 or more risk factors resulted in a greater injury proportion (χ2 = 13.512, P = .015); however, more research is needed. Athletic trainers working in the military or similar high-demand settings can use these data to augment screening and injury-prevention protocols.
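The RRs and 95% CIs reported here take the usual form: a ratio of injury proportions between Operators on either side of a characteristic's cutoff, with a log-scale Wald interval. A compact Python sketch of that 2 × 2 calculation follows; the cell counts are hypothetical, since the abstract does not report the underlying tables.

```python
import math

def relative_risk(a: int, b: int, c: int, d: int):
    """RR for exposed (a injured / b uninjured) vs unexposed (c / d),
    with a 95% Wald CI computed on the log scale."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 20 of 40 weaker Operators injured vs 15 of 55 stronger.
rr, lo, hi = relative_risk(20, 20, 15, 40)
print(f"RR = {rr:.3f} (95% CI = {lo:.3f}, {hi:.3f})")
```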
Context: Wet bulb globe temperature (WBGT) is the gold standard for assessing environmental heat stress during physical activity. Many manufacturers of commercially available instruments fail to report WBGT accuracy.

Objective: To determine the accuracy of several commercially available WBGT monitors compared with a standardized reference device.

Design: Observational study.

Setting: Field test.

Patients or Other Participants: Six commercially available WBGT devices.

Main Outcome Measure(s): Data were recorded for 3 sessions (1 in the morning and 2 in the afternoon) at 2-minute intervals for at least 2 hours. Mean absolute error (MAE), root mean square error (RMSE), mean bias error (MBE), and the Pearson correlation coefficient (r) were calculated to determine instrument performance compared with the reference unit.

Results: The QUESTemp° 34 (MAE = 0.24°C, RMSE = 0.44°C, MBE = –0.64%) and Extech HT30 Heat Stress Wet Bulb Globe Temperature Meter (Extech; MAE = 0.61°C, RMSE = 0.79°C, MBE = 0.44%) demonstrated the least error in relation to the reference standard, whereas the General WBGT8778 Heat Index Checker (General; MAE = 1.18°C, RMSE = 1.34°C, MBE = 4.25%) performed the poorest. The QUESTemp° 34 and Kestrel 4400 Heat Stress Tracker units provided conservative measurements that slightly overestimated the WBGT provided by the reference unit. Finally, instruments using the psychrometric wet bulb temperature (General, REED Heat Index WBGT Meter, and WBGT-103 Heat Stroke Checker) tended to underestimate the WBGT, and the resulting values more frequently fell into WBGT-based activity categories with fewer restrictions as defined by the American College of Sports Medicine.

Conclusions: The QUESTemp° 34, followed by the Extech, had the smallest error compared with the reference unit. Moreover, the QUESTemp° 34, Extech, and Kestrel units appeared to offer conservative yet accurate assessments of the WBGT, potentially minimizing the risk of allowing physical activity to continue in stressful heat environments. Instruments using the psychrometric wet bulb temperature tended to underestimate WBGT under low wind-speed conditions. Accurate WBGT interpretations are important to enable clinicians to guide activities in hot and humid weather conditions.
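MAE, RMSE, and MBE are simple functions of the paired device and reference readings (outdoor WBGT itself is the weighted sum 0.7 × wet bulb + 0.2 × black globe + 0.1 × dry bulb). The Python sketch below computes the three error metrics from a pair of hypothetical reading series; it mirrors the metric definitions only, not the study's data handling, and expresses MBE as a percentage of the mean reference value as an assumption about the reporting convention.

```python
import numpy as np

# Hypothetical paired readings (deg C); not the study's data.
reference = np.array([28.1, 29.4, 30.2, 31.0, 30.6, 29.8])
device    = np.array([28.4, 29.2, 30.7, 31.3, 30.9, 29.6])

error = device - reference
mae  = np.mean(np.abs(error))                         # mean absolute error, deg C
rmse = np.sqrt(np.mean(error ** 2))                   # root mean square error, deg C
mbe_pct = 100 * np.mean(error) / np.mean(reference)   # mean bias error, %

print(f"MAE = {mae:.2f} C, RMSE = {rmse:.2f} C, MBE = {mbe_pct:.2f}%")
```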
Reference/Citation: Harmon KG, Zigman M, Drezner JA. The effectiveness of screening history, physical exam, and ECG to detect potentially lethal cardiac disorders in athletes: a systematic review/meta-analysis. J Electrocardiol. 2015;48(3):329–338.

Clinical Question: Which screening method should be considered best practice to detect potentially lethal cardiac disorders during the preparticipation physical examination (PE) of athletes?

Data Sources: The authors completed a comprehensive literature search of MEDLINE, CINAHL, Cochrane Library, Embase, Physiotherapy Evidence Database (PEDro), and SPORTDiscus from January 1996 to November 2014. The following key words were used individually and in combination: ECG, athlete, screening, pre-participation, history, and physical. A manual review of reference lists and key journals was performed to identify additional studies. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed for this review.

Study Selection: Studies selected for this analysis involved (1) outcomes of cardiovascular screening in athletes using the history, PE, and electrocardiogram (ECG); (2) history questions and PE based on the American Heart Association recommendations and guidelines; and (3) ECGs interpreted following modern standards. The exclusion criteria were (1) articles not in English, (2) conference abstracts, and (3) clinical commentary articles.

Data Extraction: Study quality was assessed on a 7-point scale for risk of bias; a score of 7 indicated the highest quality. Articles with potential bias were excluded. Data included number and sex of participants, number of true- and false-positives and negatives, type of ECG criteria used, number of cardiac abnormalities, and specific cardiac conditions. The sensitivity, specificity, false-positive rate, and positive predictive value of each screening tool were calculated and summarized using a bivariate random-effects meta-analysis model.

Main Results: Fifteen articles reporting on 47 137 athletes were fully reviewed. The overall quality of the 15 articles ranged from 5 to 7 on the 7-item assessment scale (ie, participant selection criteria, representative sample, prospective data with at least 1 positive finding, modern ECG criteria used for screening, cardiovascular screening history and PE per American Heart Association guidelines, individual test outcomes reported, and abnormal screening findings evaluated by appropriate diagnostic testing). The athletes (66% males and 34% females) were ethnically and racially diverse, were from several countries, and ranged in age from 5 to 39 years. The sensitivity and specificity of the screening methods were, respectively, ECG, 94% and 93%; history, 20% and 94%; and PE, 9% and 97%. The overall false-positive rate for ECG (6%) was less than that for history (8%) or PE (10%). The positive likelihood ratios of each screening method were 14.8 for ECG, 3.22 for history, and 2.93 for PE. The negative likelihood ratios were 0.055 for ECG, 0.85 for history, and 0.93 for PE. A total of 160 potentially lethal cardiovascular conditions were detected, for a rate of 0.3%, or 1 in 294 patients. The most common conditions were Wolff-Parkinson-White syndrome (n = 67, 42%), long QT syndrome (n = 18, 11%), hypertrophic cardiomyopathy (n = 18, 11%), dilated cardiomyopathy (n = 11, 7%), coronary artery disease or myocardial ischemia (n = 9, 6%), and arrhythmogenic right ventricular cardiomyopathy (n = 4, 3%).

Conclusions: The most effective strategy to screen athletes for cardiovascular disease was ECG. This test was 5 times more sensitive than history and 10 times more sensitive than PE, and it had a higher positive likelihood ratio, lower negative likelihood ratio, and lower false-positive rate than history or PE. The 12-lead ECG interpreted using modern criteria should be considered the best practice in screening athletes for cardiovascular disease, and the use of history and PE alone as screening tools should be reevaluated.
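The likelihood ratios quoted above derive from sensitivity and specificity: LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. The sketch below applies those definitions to the pooled values reported in the Main Results; because the review pooled studies with a bivariate random-effects model, the published ratios (e.g., 14.8 for ECG) differ slightly from this naive plug-in calculation.

```python
# Sensitivity/specificity pairs quoted in the Main Results.
methods = {"ECG": (0.94, 0.93), "history": (0.20, 0.94), "PE": (0.09, 0.97)}

for name, (sens, spec) in methods.items():
    lr_pos = sens / (1 - spec)          # positive likelihood ratio
    lr_neg = (1 - sens) / spec          # negative likelihood ratio
    print(f"{name}: LR+ ~= {lr_pos:.2f}, LR- ~= {lr_neg:.3f}")
```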
Reference/Citation: Donnell-Fink LA, Klara K, Collins JE, et al. Effectiveness of knee injury and anterior cruciate ligament tear prevention programs: a meta-analysis. PLoS One. 2015;10(12):e0144063.

Clinical Question: Is neuromuscular and proprioceptive training effective in preventing knee and anterior cruciate ligament (ACL) injuries?

Data Sources: The authors searched the CINAHL, Cochrane Central Register of Controlled Trials, MEDLINE/EMBASE, PubMed, and Web of Science databases from 1996 through December 2014 and limited the results to peer-reviewed manuscripts published in English. Search terms for all databases were knee injury OR knee injuries; OR anterior cruciate ligament injury OR anterior cruciate ligament injuries; OR ACL injury OR ACL injuries; OR lower limb injury OR lower limb injuries AND prevention.

Study Selection: Inclusion criteria were (1) English language, (2) published from 1996 through 2014, (3) the intervention used neuromuscular or proprioceptive training to prevent knee or ACL injuries, (4) human participants, and (5) the incidence of knee or ACL injury was provided.

Data Extraction: For the articles that met the inclusion criteria, 2 authors worked independently using the Jadad scale to extract the first author, year of publication, title, sport type, participant sex, participant age, country in which the study was conducted, number of participants in the control and intervention groups, intervention characteristics or components, and knee or ACL injury outcome.

Main Results: A total of 24 studies with 1093 participants were included in this review. Intervention efficacy was determined from weighted incidence rate ratios. After the intervention of neuromuscular and proprioceptive training exercises, the incidence rate ratio (comparing the frequency of injury occurrence in the intervention and control populations over a specific time frame) was calculated at 0.731 (95% confidence interval = 0.614, 0.871) for knee injury and at 0.493 (95% confidence interval = 0.285, 0.854) for ACL injury. This indicated a link between neuromuscular and proprioceptive training programs and injury reduction. No significant correlation was present between more components added to training and a greater decrease in injury to either the knee or ACL.

Conclusions: Neuromuscular and proprioceptive training appeared to decrease the incidence of injury to the knee and specifically the ACL. However, no evidence suggested that a specific group of exercises was better than others.
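The pooled estimates above come from weighted incidence rate ratios. As a simplified illustration of how such a pooled ratio and CI arise, the Python sketch below performs fixed-effect inverse-variance pooling of log-IRRs from a few hypothetical studies; it is a stand-in for, not a reproduction of, the meta-analytic model used in the review.

```python
import math

# Hypothetical per-study data: (injuries, exposure) for intervention and control.
studies = [
    ((12, 10_000), (20, 10_000)),
    (( 8,  6_000), (11,  6_000)),
    ((15, 12_000), (19, 11_000)),
]

log_irrs, weights = [], []
for (ei, ni), (ec, nc) in studies:
    irr = (ei / ni) / (ec / nc)
    var = 1 / ei + 1 / ec            # approximate variance of log(IRR)
    log_irrs.append(math.log(irr))
    weights.append(1 / var)

pooled_log = sum(w * x for w, x in zip(weights, log_irrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled IRR = {math.exp(pooled_log):.3f} "
      f"(95% CI = {math.exp(pooled_log - 1.96*se):.3f}, "
      f"{math.exp(pooled_log + 1.96*se):.3f})")
```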