Context: After a concussion or mild traumatic brain injury (mTBI), patients often suffer from light sensitivity, or photophobia, which contributes to decreased quality of life post-mTBI. Although sunglasses may provide some relief from photophobia, they are not practical indoors or in low light. A light-mitigation strategy that can easily be used indoors as needed would optimize relief. We have found that many photophobic patients experience relief using colored glasses.
Objective: To provide the athletic trainer with a means and method to assess whether an athlete is suffering from photophobia after concussion and to determine if colored glasses provide relief.
Design: Cross-sectional study.
Setting: Rehabilitation clinic.
Patients or Other Participants: Fifty-one patients being treated after concussion.
Intervention(s): We assessed postconcussion patients for visual symptoms, including photophobia and photosensitivity. Off-the-shelf glasses were used to determine whether specific colors provided relief from photophobia. Screening was performed using a penlight and multiple pairs of colored glasses.
Main Outcome Measure(s): Self-reported mitigation of photophobia symptoms and the specific color frequency that reduced symptoms in each individual.
Results: Of the 39 patients studied who had visual symptoms, 76% complained of photophobia. Using glasses of 1 or more colors, symptoms were relieved in 85% of the patients reporting photophobia. The colors that provided the most relief were blue, green, red, and purple. No adverse events were reported.
Conclusions: An empirical assessment of frequency-specific photophobia is easy to perform: a traditional penlight is used to elicit photophobia, and the colored glasses are then tested for optimal relief. Frequency-specific photophobia can be reduced with a light-mitigation strategy that includes colored glasses, sunglasses, hats, and light avoidance. This, we believe, helps to improve the patient's quality of life and may aid in the recovery process. More work is needed to identify the best colors and methods for mitigating frequency-specific photophobia.

Context: Recurrence rates for ankle sprains are high. Therefore, preventive measures such as ankle bracing during sports are recommended.
Objective: To systematically evaluate the perceived ease of use, quality, comfort, stability, and hindrance of and the overall satisfaction with 3 contemporary brace types in 3 types of sports.
Design: Randomized comparative user survey.
Setting: Recreational sports: soccer, volleyball, and running.
Patients or Other Participants: Young adult recreational athletes (29 soccer players, 26 volleyball players, and 31 runners).
Intervention(s): Compression brace (CB), lace-up brace (LB), and semirigid brace (SB).
Main Outcome Measure(s): Ratings of perceived ease of use, quality, comfort, stability, and hindrance of and overall satisfaction with the brace types during sports on a 5-point Likert scale. The secondary outcome measure was participants' willingness to buy the tested brace.
Results: Overall, the 3 brace types received high mean scores for ease of use and quality. Soccer players preferred the CB over both alternatives, given its higher scores for comfort (CB = 4.0, LB = 3.5, SB = 2.8), hindrance (CB = 3.7, LB = 2.9, SB = 2.8), and overall satisfaction (CB = 3.6, LB = 3.0, SB = 2.5) and the greatest willingness to buy this brace. Volleyball players preferred the LB over both alternatives, given its higher scores for stability (LB = 4.2, CB = 3.2, SB = 3.3) and overall satisfaction (LB = 3.8, CB = 3.0, SB = 3.0) and the greatest willingness to buy this brace. Runners preferred the CB over both alternatives, given its better score for hindrance (CB = 3.6, LB = 2.8, SB = 2.9) and the greatest willingness to buy this brace.
Conclusions: All 3 ankle-brace types scored high on perceived ease of use and quality. Soccer players, volleyball players, and runners differed in their subjective evaluations of the brace types' comfort, stability, hindrance, and overall satisfaction and in their willingness to buy each brace. Soccer players and runners preferred the CB, whereas volleyball players preferred the LB.

Context: The extent to which lower extremity lean mass (LELM) relative to total body mass influences one's ability to maintain safe landing biomechanics during prolonged exercise, when injury incidence increases, is unknown.
Objectives: To examine the influence of LELM on (1) pre-exercise lower extremity biomechanics and (2) changes in biomechanics during an intermittent exercise protocol (IEP) and (3) to determine whether these relationships differ by sex. We hypothesized that less LELM would predict higher-risk baseline biomechanics and greater changes toward higher-risk biomechanics during the IEP.
Design: Cohort study.
Setting: Controlled laboratory.
Patients or Other Participants: A total of 59 athletes (30 men: age = 20.3 ± 2.0 years, height = 1.79 ± 0.05 m, mass = 75.2 ± 7.2 kg; 29 women: age = 20.6 ± 2.3 years, height = 1.67 ± 0.08 m, mass = 61.8 ± 9.0 kg) participated.
Intervention(s): Before completing an individualized 90-minute IEP designed to mimic a soccer match, participants underwent dual-energy x-ray absorptiometry testing for LELM.
Main Outcome Measure(s): Three-dimensional lower extremity biomechanics were measured during drop-jump landings before the IEP and every 15 minutes thereafter. A previously reported principal components analysis reduced 40 biomechanical variables to 11 factors. Hierarchical linear modeling analysis then determined the extent to which sex and LELM predicted the baseline score and the change in each factor over time.
Results: Lower extremity lean mass did not influence baseline biomechanics or the changes over time. Sex influenced the biomechanical factor representing knee loading at baseline (P = .04) and the changes in the anterior cruciate ligament–loading factor over time (P = .03). The LELM had an additional influence only in women who possessed less LELM (P = .03 and .02, respectively).
Conclusions: Lower extremity lean mass influenced knee loading during landing in women but not in men. The effect appeared to be stronger in women with less LELM. Continually decreasing knee loading over time may reflect a strategy chosen to avoid injury. A minimal threshold of LELM may be needed to safely perform landing maneuvers, especially during prolonged exercise, when the injury risk increases.

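The hierarchical linear modeling step lends itself to a compact statement. A minimal sketch of a two-level growth model of the kind described, in which time-varying factor scores are predicted by athlete-level characteristics (the notation and exact specification are illustrative assumptions, not the authors' published model):

Level 1 (within athlete): $Y_{ti} = \pi_{0i} + \pi_{1i}\,\text{Time}_{ti} + e_{ti}$

Level 2 (between athletes): $\pi_{0i} = \beta_{00} + \beta_{01}\,\text{Sex}_i + \beta_{02}\,\text{LELM}_i + r_{0i}$ and $\pi_{1i} = \beta_{10} + \beta_{11}\,\text{Sex}_i + \beta_{12}\,\text{LELM}_i + r_{1i}$

Here $Y_{ti}$ is one of the 11 biomechanical factor scores for athlete $i$ at time $t$; $\pi_{0i}$ captures the baseline score and $\pi_{1i}$ its change across the IEP, each modeled as a function of sex and LELM.
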
Context: Cold-water immersion (CWI; 10°C) can effectively reduce body core temperature even if a hyperthermic human is wearing a full American football uniform (PADS) during treatment. Temperate-water immersion (TWI; 21°C) may be an effective alternative to CWI if resources for the latter (eg, ice) are unavailable.
Objective: To measure rectal temperature (Trec) cooling rates, thermal sensation, and Environmental Symptoms Questionnaire (ESQ) scores of participants wearing PADS or shorts, undergarments, and socks (NOpads) before, during, and after TWI.
Design: Crossover study.
Setting: Laboratory.
Patients or Other Participants: Thirteen physically active, unacclimatized men (age = 22 ± 2 years, height = 182.3 ± 5.2 cm, mass = 82.5 ± 13.4 kg, body fat = 10% ± 4%, body surface area = 2.04 ± 0.16 m²).
Intervention(s): Participants exercised in the heat (40°C, 50% relative humidity) on 2 days while wearing PADS until Trec reached 39.5°C. Participants then underwent TWI while wearing either NOpads or PADS until Trec reached 38°C. Thermal sensation and ESQ responses were collected at various times before and after exercise.
Main Outcome Measure(s): Temperate-water immersion duration (minutes), Trec cooling rates (°C/min), thermal sensation, and ESQ scores.
Results: Participants had similar exercise times (NOpads = 38.1 ± 8.1 minutes, PADS = 38.1 ± 8.5 minutes), hypohydration levels (NOpads = 1.1% ± 0.2%, PADS = 1.2% ± 0.2%), and thermal sensation ratings (NOpads = 7.1 ± 0.4, PADS = 7.3 ± 0.4) before TWI. Rectal temperature cooling rates were similar between conditions (NOpads = 0.12°C/min ± 0.05°C/min, PADS = 0.13°C/min ± 0.05°C/min; t12 = 0.82, P = .79). Thermal sensation and ESQ scores were similar between conditions over time.
Conclusions: Temperate-water immersion produced acceptable (ie, >0.08°C/min), though not ideal, cooling rates regardless of whether PADS or NOpads were worn. If a football uniform is difficult to remove or the patient is noncompliant, clinicians should begin water-immersion treatment with the athlete fully equipped. Clinicians should strive to use CWI to treat severe hyperthermia, but when CWI is not feasible, TWI should be the next treatment option because its cooling rate was higher than the rates of other common modalities (eg, ice packs, fanning).

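The cooling rates above are simply the temperature drop divided by immersion time, which allows a quick worked check using the abstract's own values:

$\text{cooling rate} = \dfrac{T_{\text{rec,start}} - T_{\text{rec,end}}}{t_{\text{immersion}}}$, so $t_{\text{immersion}} = \dfrac{39.5 - 38.0}{0.12} \approx 12.5$ minutes

That is, at the observed NOpads rate of 0.12°C/min, cooling from 39.5°C to 38.0°C implies roughly 12.5 minutes of TWI (about 11.5 minutes at the 0.13°C/min PADS rate).
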
Objective: To conduct a systematic review with meta-analysis assessing the effectiveness of conservative rehabilitation programs for improving health-related quality of life (HRQL) in individuals with chronic ankle instability (CAI).
Data Sources: PubMed, MEDLINE, CINAHL, and SPORTDiscus were searched from inception to January 2016.
Study Selection: Studies were included if the researchers examined the effects of a conservative rehabilitation protocol in individuals with CAI, used validated patient-reported outcomes (PROs) to quantify participant-perceived HRQL, and provided adequate data to calculate effect sizes (ESs) and 95% confidence intervals (CIs). Studies were excluded if the authors evaluated surgical interventions, prophylactic taping, or bracing applications or examined only the immediate effects of 1 treatment session.
Data Extraction: Two investigators independently assessed methodologic quality using the Physiotherapy Evidence Database (PEDro) Scale. Studies were considered low quality if fewer than 60% of the criteria were met. Level of evidence was assessed using the Strength of Recommendation Taxonomy. Preintervention and postintervention sample sizes, means, and standard deviations of PROs were extracted.
Data Synthesis: A total of 15 studies provided 24 participant groups that were included in the analysis. Seven studies were high quality; the median PEDro score was 50% (range = 10%−80%), and the median level of evidence was 2 (range = 1−2). The magnitudes of preintervention-to-postintervention PRO differences were examined using bias-corrected Hedges g ESs, and a random-effects meta-analysis was performed to synthesize PRO changes across all participant groups. Positive ES values indicated better PRO scores at postintervention than at preintervention. The α level was set at .05. Meta-analysis revealed a strong ES with a nonoverlapping 95% CI (ES = 1.20; CI = 0.80, 1.60; P < .001), indicating that HRQL improved after conservative rehabilitation.
Conclusions: Based on the quality of the evidence and the results of the meta-analysis, grade A evidence showed that conservative rehabilitation produces large improvements in HRQL for people with CAI.

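The bias-corrected Hedges g used in this synthesis has a standard textbook form; the following is that general formula, not an expression extracted from the review itself:

$g = J \cdot \dfrac{M_{\text{post}} - M_{\text{pre}}}{SD_{\text{pooled}}}, \qquad J = 1 - \dfrac{3}{4\,df - 1}$

where $J$ is the small-sample bias correction and $df$ depends on the design (eg, $n - 1$ for a single-group pre-post comparison). Under the conventional benchmarks (0.2 small, 0.5 moderate, 0.8 large), the pooled ES of 1.20 with a CI excluding zero is a large effect, consistent with the grade A conclusion.
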
Context: A change in reaction time is one of various clinical measures of neurocognitive function that can be monitored after concussion and has been reported to be among the most sensitive indicators of cognitive impairment.
Objective: To determine the timeline for clinically assessed simple reaction time to return to baseline after a concussion in high school athletes.
Design: Observational study.
Setting: Athletic training room.
Patients or Other Participants: Twenty-one high school-aged volunteers.
Intervention(s): Participants completed 8 trials of the ruler-drop test during each session. Along with baseline measures, a total of 6 additional test sessions were completed over the course of 4 weeks after a concussion (days 3, 7, 10, 14, 21, and 28).
Main Outcome Measure(s): The mean reaction times calculated for all participants from each of the 7 test sessions were analyzed to assess the change in reaction time over the 7 time intervals.
Results: After a concussion and compared with baseline, simple reaction time was, on average, 26 milliseconds slower at 48 to 72 hours postinjury (P < .001), almost 18 milliseconds slower on day 7 (P < .001), and about 9 milliseconds slower on day 10 (P < .001). Simple reaction time did not return to baseline levels until day 14 postinjury.
Conclusions: Clinically assessed simple reaction time appeared to return to baseline levels within a timeframe that mirrors other measures of cognitive performance (approximately 14 days).

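The abstract does not spell out how a ruler-drop catch distance becomes a reaction time, but the test conventionally relies on constant-acceleration kinematics; the numbers below are illustrative, not taken from the study:

$d = \tfrac{1}{2} g t^{2} \;\Rightarrow\; t = \sqrt{2d/g}$

With $g = 9.81\ \text{m/s}^2$, a catch distance of $d = 0.20$ m gives $t = \sqrt{0.40/9.81} \approx 0.202$ s, or about 202 milliseconds. Because $\mathrm{d}d/\mathrm{d}t = gt \approx 2$ m/s at that point, the reported 26-millisecond slowing corresponds to roughly 5 cm of additional ruler travel.
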
Context: Despite the growing popularity of ice hockey among female youth and interest in the biomechanics of head impacts in sport, the head impacts sustained by this population have yet to be characterized.
Objectives: To describe the number of, biomechanical characteristics of, and exposure to head impacts in female youth ice hockey players during competition and to investigate the influences of player and game characteristics on head impacts.
Design: Cohort study.
Methods: Twenty-seven female youth ice hockey players (mean age = 12.5 ± 0.52 years) wore instrumented ice hockey helmets during 66 ice hockey games over a 3-year period. Data specific to player, game, and biomechanical head impact characteristics were recorded. A multiple regression analysis identified the factors most associated with head impacts of greater frequency and severity.
Results: A total of 436 head impacts were sustained during 6924 minutes of active ice hockey participation (0.9 ± 0.6 impacts per player per game; range, 0–2.1). A higher body mass index (BMI) significantly predicted a higher number of head impacts sustained per game (P = .008). Linear acceleration of head impacts was greater in older players and those who played the forward position, had a greater BMI, and spent more time on the ice (P = .008), whereas greater rotational acceleration was present in older players who had a greater BMI and played the forward position (P = .008). During tournament games, increased ice time predicted increased severity of head impacts (P = .03).
Conclusions: This study reveals for the first time that head impacts occur in female youth ice hockey players, albeit at a lower rate and severity than in male youth ice hockey players, despite the lack of intentional body checking.

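One plausible reading of the 0.9 ± 0.6 impacts per player per game figure (an assumption about the computation, which the abstract does not state) is a mean of per-player rates rather than total impacts divided by total player-games:

$\bar{r} = \dfrac{1}{N} \sum_{i=1}^{N} \dfrac{\text{impacts}_i}{\text{games played}_i}$, with $N = 27$

Under this reading, the ± 0.6 would be the standard deviation of the individual rates and the range 0–2.1 would span the least- to most-impacted player.
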
Context: Ice hockey is a high-speed, full-contact sport with a high risk of head/face/neck (HFN) injuries. However, men's and women's ice hockey differ: checking is allowed only among men.
Objectives: To describe the epidemiology of HFN injuries in collegiate men's and women's ice hockey during the 2009−2010 through 2013−2014 academic years.
Design: Descriptive epidemiology study.
Setting: Ice hockey data from the National Collegiate Athletic Association (NCAA) Injury Surveillance Program during the 2009−2010 through 2013−2014 academic years.
Patients or Other Participants: Fifty-seven men's and 26 women's collegiate ice hockey programs from all NCAA divisions provided 106 and 51 team-seasons of data, respectively.
Main Outcome Measure(s): Injury rates per 1000 athlete-exposures and rate ratios with 95% confidence intervals (CIs).
Results: The NCAA Injury Surveillance Program reported 496 and 131 HFN injuries in men's and women's ice hockey, respectively. The HFN injury rate was higher in men than in women (1.75 versus 1.16 per 1000 athlete-exposures; incidence rate ratio = 1.51; 95% CI = 1.25, 1.84). The proportion of HFN injuries from checking was higher in men than in women for competitions (38.5% versus 13.6%; injury proportion ratio = 2.82; 95% CI = 1.64, 4.85) and practices (21.9% versus 2.3%; injury proportion ratio = 9.41; 95% CI = 1.31, 67.69). The most common HFN injury diagnosis was concussion; most concussions occurred in men's competitions from player contact while checking (25.9%). Player contact during general play comprised the largest proportion of concussions in men's practices (25.9%), women's competitions (25.0%), and women's practices (24.0%). While 166 lacerations were reported in men, none were reported in women. In men, most lacerations occurred from player contact during checking in competitions (41.8%) and player contact during general play in practices (15.0%).
Conclusions: A larger proportion of HFN injuries in ice hockey occurred during checking in men versus women. Concussion was the most common HFN injury and was most often due to player contact. Lacerations were reported only among men and were mostly due to checking. Injury-prevention programs should aim to reduce checking-related injuries.

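The rate and ratio measures above follow standard injury-epidemiology arithmetic, which can be checked against the abstract's own numbers:

$\text{rate} = \dfrac{\text{injuries}}{\text{athlete-exposures}} \times 1000, \qquad \text{IRR} = \dfrac{\text{rate}_{\text{men}}}{\text{rate}_{\text{women}}} = \dfrac{1.75}{1.16} \approx 1.51$

matching the reported incidence rate ratio of 1.51 (95% CI = 1.25, 1.84).
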
Context: Athletic training facilities have been described in terms of general design concepts and from operational perspectives. However, the size and scope of athletic training facilities, along with staffing at different levels of intercollegiate competition, have not been quantified.
Objective: To define the size and scope of athletic training facilities and staffing levels at various levels of intercollegiate competition and to determine whether differences existed in facilities (eg, number of facilities, size of facilities) and staffing (eg, full time, part time) based on the level of intercollegiate competition.
Design: Cross-sectional study.
Setting: Web-based survey.
Patients or Other Participants: Athletic trainers (ATs) who were knowledgeable about the size and scope of their athletic training programs.
Main Outcome Measure(s): Athletic training facility size in square footage; the ATs' overall facility satisfaction; athletic training facility component spaces, including satellite facilities, game-day facilities, offices, and storage areas; and staffing levels, including full-time ATs, part-time ATs, and undergraduate students.
Results: The survey was completed by 478 ATs (response rate = 38.7%) from all levels of competition. Sample means for facilities were 3124.7 ± 4425 ft² (290.3 ± 411 m²) for the central athletic training facility, 1013 ± 1521 ft² (94 ± 141 m²) for satellite athletic training facilities, 1272 ± 1334 ft² (118 ± 124 m²) for game-day athletic training facilities, 388 ± 575 ft² (36 ± 53 m²) for athletic training offices, and 424 ± 884 ft² (39 ± 82 m²) for storage space. Sample staffing means were 3.8 ± 2.5 full-time ATs, 1.6 ± 2.5 part-time ATs, 25 ± 17.6 athletic training students, and 6.8 ± 7.2 work-study students. Division I schools had greater resources in multiple categories (P < .001). Differences among other levels of competition were not as well defined. Expansion or renovation of facilities in recent years was common, and almost half of the ATs reported that upgrades had been approved for the near future.
Conclusions: This study provides benchmark descriptive data on athletic training staffing and facilities. The results (1) suggest that the ATs were satisfied with their facilities and (2) highlight the differences in resources among competition levels.

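The paired square-foot and square-meter figures above are related by the standard area conversion, which serves as a quick consistency check:

$1\ \text{ft}^2 \approx 0.0929\ \text{m}^2, \qquad 3124.7\ \text{ft}^2 \times 0.0929 \approx 290.3\ \text{m}^2$

matching the reported mean size of the central athletic training facility.
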