Context: It has been proposed that altered dynamic-control strategies during functional activity such as jump landings may partially explain recurrent instability in individuals with functional ankle instability (FAI).

Objective: To capture jump-landing time to stabilization (TTS) and ankle motion using a multisegment foot model among FAI, coper, and healthy control individuals.

Design: Cross-sectional study.

Setting: Laboratory.

Patients or Other Participants: Participants were 23 individuals with a history of at least 1 ankle sprain and at least 2 episodes of giving way in the past year (FAI), 23 individuals with a history of a single ankle sprain and no subsequent episodes of instability (copers), and 23 individuals with no history of ankle sprain or instability in their lifetime (controls). Participants were matched for age, height, and weight (age = 23.3 ± 3.8 years, height = 1.71 ± 0.09 m, weight = 69.0 ± 13.7 kg).

Intervention(s): Ten single-legged drop jumps were recorded using a 12-camera Vicon MX motion-capture system and a strain-gauge force plate.

Main Outcome Measures: Mediolateral (ML) and anteroposterior (AP) TTS in seconds, as well as forefoot and hindfoot sagittal- and frontal-plane angles at jump-landing initial contact and at the point of maximum vertical ground reaction force, were calculated.

Results: For the forefoot and hindfoot in the sagittal plane, group differences were present at initial contact (forefoot: P = .043, hindfoot: P = .004). At the hindfoot, individuals with FAI displayed more dorsiflexion than the control and coper groups. Time to stabilization differed among groups (AP TTS: P < .001; ML TTS: P = .040). Anteroposterior TTS was longer in the coper group than in the FAI or control groups, and ML TTS was longer in the FAI group than in the control group.

Conclusions: During jump landings, copers showed differences in sagittal-plane control, including less plantar flexion at initial contact and increased AP sway during stabilization, which may contribute to increased dynamic stability.
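The abstract reports ML and AP TTS in seconds but does not state which algorithm was used to compute them. One common family of definitions declares stabilization at the first instant after landing at which the force signal stays inside a tolerance band around its quiet-stance value for the remainder of the trial. A minimal sketch under that assumption (function name, band width, and the 1-second baseline window are all illustrative, not taken from the study):

```python
import statistics

def time_to_stabilization(signal, fs, band_sd=3.0):
    """Return TTS in seconds for one ground-reaction-force component
    sampled at fs Hz.  Stabilization is declared at the first sample
    after which the signal never leaves a band of +/- band_sd standard
    deviations (of the trial's final second) around that second's mean."""
    final = signal[-fs:]                         # last 1 s ~ quiet stance
    center = statistics.fmean(final)
    half_width = band_sd * statistics.pstdev(final)
    for i in range(len(signal)):
        if all(abs(x - center) <= half_width for x in signal[i:]):
            return i / fs
    return len(signal) / fs                      # never stabilized

# Synthetic example: a decaying landing transient, then quiet stance
grf_ml = [10.0, -8.0, 6.0, -4.0, 2.0, -1.0] + [0.0] * 100
tts = time_to_stabilization(grf_ml, fs=100)
```

Published TTS methods differ mainly in the choice of reference value and tolerance band, so the numeric results depend on those parameters.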
Context: Participants with chronic ankle instability (CAI) have been shown to have balance deficits related to decreased proprioception and neuromuscular control. Kinesiology tape (KT) has been proposed to have many benefits, including increased proprioception.

Objective: To determine whether KT can reduce the balance deficits associated with CAI.

Design: Cohort study.

Setting: Research laboratory.

Patients or Other Participants: Thirty participants with CAI were recruited for this study.

Intervention(s): Balance was assessed using the Balance Error Scoring System (BESS). Participants were pretested and then randomly assigned to either the control or KT group. The participants in the KT group had 4 strips applied to the foot and lower leg and were instructed to leave the tape on until they returned for testing. All participants returned 48 hours later for another BESS assessment. The tape was then removed, and all participants returned 72 hours later to complete the final BESS assessment.

Main Outcome Measure(s): Total BESS errors.

Results: Differences between the groups occurred at 48 hours post–application of the tape (mean difference = 4.7 ± 1.4 errors, P < .01; 95% confidence interval = 2.0, 7.5) and at 72 hours post–removal of the tape (mean difference = 2.3 ± 1.1 errors, P = .04; 95% confidence interval = 0.1, 4.6).

Conclusions: The KT improved balance after it had been applied for 48 hours when compared with the pretest and with the control group. One of the most clinically important findings is that balance improvements were retained even after the tape had been removed for 72 hours.
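The Results report each group difference as a mean ± standard error together with a 95% confidence interval. Under a normal approximation, the interval is simply the difference ± 1.96 × SE, which approximately reproduces the reported 48-hour interval:

```python
def ci95(diff, se, z=1.96):
    """Normal-approximation 95% confidence interval for a mean difference."""
    return (diff - z * se, diff + z * se)

# 48-hour BESS difference from the abstract: 4.7 +/- 1.4 errors
lo, hi = ci95(4.7, 1.4)
# lo ~ 1.96, hi ~ 7.44, close to the reported (2.0, 7.5); the small
# discrepancy suggests the authors used a t critical value rather than z.
```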
Context: When returning to physical activity, patients with a history of anterior cruciate ligament reconstruction (ACL-R) often experience limitations in knee-joint function that may be due to chronic impairments in quadriceps motor control. Assessment of knee-extension torque variability may demonstrate underlying impairments in quadriceps motor control in patients with a history of ACL-R.

Objective: To identify differences in maximal isometric knee-extension torque variability between knees that have undergone ACL-R and healthy knees and to determine the relationship between knee-extension torque variability and self-reported knee function in patients with a history of ACL-R.

Design: Descriptive laboratory study.

Setting: Laboratory.

Patients or Other Participants: A total of 53 individuals with primary, unilateral ACL-R (age = 23.4 ± 4.9 years, height = 1.7 ± 0.1 m, mass = 74.6 ± 14.8 kg) and 50 individuals with no history of substantial lower extremity injury or surgery who served as controls (age = 23.3 ± 4.4 years, height = 1.7 ± 0.1 m, mass = 67.4 ± 13.2 kg).

Main Outcome Measure(s): Torque variability, strength, and central activation ratio (CAR) were calculated from 3-second maximal knee-extension contraction trials (90° of flexion) with a superimposed electrical stimulus. All participants completed the International Knee Documentation Committee (IKDC) Subjective Knee Evaluation Form, and we determined the number of months after surgery. Group differences were assessed using independent-samples t tests. Correlation coefficients were calculated among torque variability, strength, CAR, months after surgery, and IKDC scores. Torque variability, strength, CAR, and months after surgery were regressed on IKDC scores using stepwise, multiple linear regression.

Results: Torque variability was greater and strength, CAR, and IKDC scores were lower in the ACL-R group than in the control group (P < .05). Torque variability and strength were correlated with IKDC scores (P < .05). Torque variability, strength, and CAR were correlated with each other (P < .05). Torque variability alone accounted for 14.3% of the variance in IKDC scores. The combination of torque variability and number of months after surgery accounted for 21% of the variance in IKDC scores. Strength and CAR were excluded from the regression model.

Conclusions: Knee-extension torque variability was moderately associated with IKDC scores in patients with a history of ACL-R. Torque variability combined with months after surgery predicted 21% of the variance in IKDC scores in these patients.
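The stepwise regression reported here is characterized by the incremental variance explained (R²) as predictors enter the model (14.3% for variability alone, 21% with months after surgery added). A sketch of that incremental-R² computation on illustrative synthetic data (the variable names and generated values are stand-ins, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

# Illustrative synthetic stand-ins for the study variables
rng = np.random.default_rng(0)
n = 53
variability = rng.normal(size=n)
months = rng.normal(size=n)
ikdc = -0.5 * variability + 0.3 * months + rng.normal(size=n)

r2_step1 = r_squared(variability[:, None], ikdc)
r2_step2 = r_squared(np.column_stack([variability, months]), ikdc)
# In-sample R^2 can only rise when a predictor is added; a stepwise
# procedure retains the predictor only if the rise is significant.
```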
Context: Burnout is an important psychological health concern for working professionals. Understanding how psychological stress and markers of workload contribute to athletic trainers' (ATs') perceptions of burnout is highly valuable. Both positive (social support) and negative social interactions should be considered when examining relationships among markers of ATs' health and well-being.

Objective: To examine the potential effects of social interactions on the relationships between (1) burnout and perceived stress and (2) burnout and workload incongruence in ATs.

Design: Cross-sectional study.

Setting: Participating ATs completed a computer-based survey during the fall sports season.

Patients or Other Participants: Responding participants were ATs randomly sampled from the National Athletic Trainers' Association membership (N = 154; men = 78, women = 76; age = 36.8 ± 9.5 years).

Main Outcome Measure(s): Participants completed self-report assessments (Perceived Stress Scale, Social Support Questionnaire, Positive and Negative Social Exchanges, Maslach Burnout Inventory–Human Services Survey) via a secure e-mail link. Workload incongruence was calculated by subtracting anticipated work hours from actual current work hours (6.0 ± 9.6 hours). We used hierarchical multiple regression analyses to examine hypothesized relationships among study variables.

Results: Social interactions did not affect the relationships between burnout and perceived stress or workload incongruence at the global or dimensional level. However, perceived stress (β = .47, P < .001), workload incongruence (β = .12, P < .05), and social support (β = −.25, P < .001) predicted global AT burnout. Negative social interactions trended toward significance (β = .12, P = .055).

Conclusions: Our findings suggest that stress perceptions and social support drive the dimensional AT burnout experience, whereas workload incongruence (emotional exhaustion) and negative social interactions (depersonalization) were linked to specific burnout dimensions. Social interactions and markers of stress and workload should be considered when seeking to understand ATs' experiences with burnout and to design workplace interventions.
Context: Understanding the beliefs about and use of evidence-based practice (EBP) among athletic trainers (ATs) will help to determine appropriate strategies to improve implementation.

Objective: To examine ATs' beliefs about and use of EBP.

Design: Cross-sectional study.

Setting: Online survey instrument.

Patients or Other Participants: A total of 467 ATs responded to the survey request, a response rate of 11.67%. A total of 385 (9.6%) completed the EBP Beliefs Scale, and 342 (8.5%) completed the EBP Implementation Scale.

Main Outcome Measure(s): The EBP Beliefs Scale and EBP Implementation Scale were administered. The surveys collected demographic information in addition to information about participants' beliefs regarding EBP and implementation of EBP in clinical practice.

Results: The ATs demonstrated a level of neither agree nor disagree (56.00 ± 7.86) on the EBP Beliefs Scale. Belief scores were higher among those ATs required to document for third-party reimbursement (P = .001), those with access to current research through professional journals other than the Journal of Athletic Training (P = .02), and those with a doctoral degree (P = .01). A low level of implementation (9.00 ± 11.38), representing the implementation of EBP approximately 0 times in the previous 8 weeks, was found on the EBP Implementation Scale. Implementation scores were higher among preceptors (P = .01), those required to document for third-party reimbursement (P < .001), those with access to current research through professional journals (P = .002), and those with a doctoral degree (P = .01).

Conclusions: Participants had a positive attitude toward EBP; however, they were not implementing EBP concepts when providing patient care. This suggests that additional information and EBP resources are needed so ATs can better implement EBP in practice. To provide the best patient care and to promote EBP within the profession, clinicians should make EBP a priority and advocate for EBP implementation.
Context: Women are 2 to 8 times more likely to sustain an anterior cruciate ligament (ACL) injury than men, and previous studies indicated an increased risk for injury during the preovulatory phase of the menstrual cycle (MC). However, investigations of risk rely on retrospective classification of MC phase, and no tools for this have been validated.

Objective: To evaluate the accuracy of an algorithm for retrospectively classifying MC phase at the time of a mock injury based on MC history and salivary progesterone (P4) concentration.

Design: Descriptive laboratory study.

Setting: Research laboratory.

Participants: Thirty-one healthy female collegiate athletes (age range, 18–24 years).

Main Outcome Measure(s): Participants provided serum or saliva (or both) samples at 8 visits over 1 complete MC. Self-reported MC information was obtained on a randomized date (1–45 days) after mock injury, which is the typical timeframe in which researchers have access to ACL-injured study participants. The MC phase was classified using the algorithm as applied in a stand-alone computational fashion and also by 4 clinical experts using the algorithm and additional subjective hormonal history information to help inform their decision. To assess algorithm accuracy, phase classifications were compared with the actual MC phase at the time of mock injury (ascertained using urinary luteinizing hormone tests and serial serum P4 samples). Clinical expert and computed classifications were compared using κ statistics.

Results: Fourteen participants (45%) experienced anovulatory cycles. The algorithm correctly classified MC phase for 23 participants (74%): 22 (76%) of 29 who were preovulatory/anovulatory and 1 (50%) of 2 who were postovulatory. Agreement between expert and algorithm classifications ranged from 80.6% (κ = 0.50) to 93% (κ = 0.83). Classifications based on a same-day saliva sample and an optimal P4 threshold were the same as those based on MC history alone (87.1% correct). Algorithm accuracy varied during the MC, but at no time were both sensitivity and specificity at acceptable levels.

Conclusions: These findings raise concerns about the accuracy of previous retrospective MC-phase classification systems, particularly in a population with a high occurrence of anovulatory cycles.
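Expert-versus-algorithm agreement is summarized here with κ, which corrects raw percentage agreement for the agreement expected by chance: κ = (p_obs − p_exp) / (1 − p_exp). A self-contained sketch with made-up counts (the study's actual classification tables are not given in the abstract):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of lists):
    table[i][j] = cases classified as i by rater A and j by rater B."""
    k = len(table)
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(k)) / n
    row_marg = [sum(row) / n for row in table]
    col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_exp = sum(r * c for r, c in zip(row_marg, col_marg))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical 2x2 expert-vs-algorithm phase classifications
kappa = cohens_kappa([[20, 5], [10, 15]])   # -> 0.4
```

With these counts raw agreement is 70%, but chance agreement is 50%, so κ = 0.4, illustrating why κ values in the abstract (0.50 to 0.83) are lower than the percentage-agreement figures (80.6% to 93%).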
Context: Recent injury-surveillance data for collegiate-level cross-country athletes are limited.

Objective: To describe the epidemiology of National Collegiate Athletic Association (NCAA) men's and women's cross-country injuries during the 2009–2010 through 2013–2014 academic years.

Design: Descriptive epidemiology study.

Setting: Aggregate injury and exposure data collected from 25 men's and 22 women's cross-country programs, providing 47 and 43 seasons of data, respectively.

Patients or Other Participants: Collegiate student-athletes participating in men's and women's cross-country during the 2009–2010 through 2013–2014 academic years.

Main Outcome Measure(s): Injury rates; injury rate ratios (RRs); injury proportions by body site, diagnosis, and apparatus; and injury proportion ratios were reported with 95% confidence intervals (CIs).

Results: The Injury Surveillance Program captured 216 injuries from men's cross-country and 260 injuries from women's cross-country, leading to injury rates of 4.66/1000 athlete-exposures (AEs) for men (95% CI = 4.04, 5.28) and 5.85/1000 AEs for women (95% CI = 5.14, 6.56). The injury rate in women's cross-country was 1.25 times that of men's cross-country (95% CI = 1.05, 1.50). Most injuries affected the lower extremity (men = 90.3%, women = 81.9%). The hip/groin-injury rate in women (0.65/1000 AEs) was higher than that in men (0.15/1000 AEs; RR = 4.32; 95% CI = 1.89, 9.85). The ankle-injury rate in men (0.60/1000 AEs) was higher than that in women (0.29/1000 AEs; RR = 2.07; 95% CI = 1.07, 3.99). Common diagnoses were strains (men = 19.9%, women = 20.4%) and inflammation (men = 18.1%, women = 23.8%). The majority of injuries were classified as overuse (men = 57.6%, women = 53.3%).

Conclusions: Consistent with prior research, injury distributions varied between male and female athletes, and the injury rate among females was higher. Understanding the epidemiology of these cross-country injuries may be important for developing appropriate preventive interventions.
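The headline numbers here follow from two standard surveillance formulas: an injury rate per 1000 athlete-exposures (injuries / AEs × 1000) and a rate ratio with a log-normal 95% CI, exp(ln RR ± 1.96 · √(1/a + 1/b)) for injury counts a and b. A sketch that approximately reproduces the women-versus-men comparison (the AE denominators below are back-calculated from the reported rates, so they are illustrative rather than taken from the paper):

```python
import math

def rate_per_1000(injuries, athlete_exposures):
    """Injury rate per 1000 athlete-exposures (AEs)."""
    return 1000 * injuries / athlete_exposures

def rate_ratio_ci(inj_a, ae_a, inj_b, ae_b, z=1.96):
    """Rate ratio (group a vs group b) with a log-normal 95% CI."""
    rr = (inj_a / ae_a) / (inj_b / ae_b)
    se_log = math.sqrt(1 / inj_a + 1 / inj_b)
    return rr, (rr * math.exp(-z * se_log), rr * math.exp(z * se_log))

# Back-calculated (illustrative) AE totals: 260/5.85*1000 and 216/4.66*1000
rr, (lo, hi) = rate_ratio_ci(260, 44444, 216, 46352)
# rr ~ 1.26, CI ~ (1.05, 1.50), consistent with the reported
# RR = 1.25 (95% CI = 1.05, 1.50) given rounding of the published rates.
```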
Context: Alterations to upper extremity physical characteristics of competitive swimmers (posture, range of motion [ROM], and subacromial-space distance) are commonly attributed to cumulative training load during a swimmer's competitive career. However, this accepted clinical belief has not been established in the literature. It is important to understand whether alterations in posture and associated physical characteristics occur as a result of sport training or factors other than swimming participation to better understand injury risk and possible interventions.

Objective: To compare posture, subacromial-space distance, and glenohumeral external-rotation, internal-rotation, and horizontal-adduction ROM between adolescent competitive swimmers and nonoverhead athletes.

Design: Cross-sectional study.

Setting: Local swimming pools and high school athletic training rooms.

Patients or Other Participants: Forty-four competitive adolescent swimmers and 31 nonoverhead athletes who were not currently experiencing any elbow, shoulder, neck, or back pain that limited their sport activity.

Intervention(s): Posture, subacromial-space distance, and glenohumeral ROM were measured using photography, diagnostic ultrasound, and a digital inclinometer, respectively.

Main Outcome Measure(s): Forward shoulder posture, forward head posture, normalized subacromial-space distance, internal-rotation ROM, and external-rotation ROM.

Results: No clinically significant differences existed between swimmers and nonoverhead athletes for posture, normalized subacromial-space distance, or external- or internal-rotation ROM. Swimmers presented with less horizontal-adduction ROM than nonoverhead athletes.

Conclusions: Factors other than swimming participation, such as school and technology use, play important roles in the adaptation of physical characteristics in adolescents. Adolescents, regardless of swimming participation, presented with postural deviations. It is important to consider factors other than swimming participation that contribute to alterations in physical characteristics to understand injury risk and injury-prevention strategies in competitive adolescent swimmers.
Context: Reported injury rates and services in sports injury surveillance may be influenced by the employment setting of the certified athletic trainers (ATs) reporting these data.

Objective: To determine whether injury rates and the average number of AT services per injury in high school football varied by AT employment status.

Design: Cross-sectional study.

Setting: We used data from the National Athletic Treatment, Injury and Outcomes Network and surveyed ATs about their employment setting.

Patients or Other Participants: Forty-four responding ATs (37.9% of all National Athletic Treatment, Injury and Outcomes Network participants) worked at high schools with football programs and were included in this study. Fourteen ATs were full-time employees of the high school, and 30 ATs were employed as outreach ATs (ie, full-time and part-time ATs from nearby clinics, hospitals, and graduate school programs).

Main Outcome Measure(s): We calculated injury rates per 1000 athlete-exposures and the average number of AT services per injury.

Results: Reported injury rates and services per injury were greater among full-time school employees compared with outreach ATs. However, injury rates did not differ when restricted to time-loss injuries only.

Conclusions: Our findings suggest that ATs who are full-time school employees may be able to identify and care for more patients with injuries.
Context: Analysis of injury and illness data collected at large international competitions provides the US Olympic Committee and the national governing bodies for each sport with information to best prepare for future competitions. Research in which authors have evaluated medical contacts to provide the expected level of medical care and sports medicine services at international competitions is limited.

Objective: To analyze the medical-contact data for athletes, staff, and coaches who participated in the 2011 Pan American Games in Guadalajara, Mexico, using unsupervised modeling techniques to identify underlying treatment patterns.

Design: Descriptive epidemiology study.

Setting: Pan American Games.

Patients or Other Participants: A total of 618 US athletes (337 males, 281 females) participated in the 2011 Pan American Games.

Main Outcome Measure(s): Medical data were recorded from the injury-evaluation and injury-treatment forms used by clinicians assigned to the central US Olympic Committee Sport Medicine Clinic and satellite locations during the operational 17-day period of the 2011 Pan American Games. We used principal components analysis and agglomerative clustering algorithms to identify and define grouped modalities. Lift statistics were calculated for within-cluster subgroups.

Results: Principal component analyses identified 3 components, accounting for 72.3% of the variability in the datasets. Plots of the principal components showed that individual contacts focused on 4 treatment clusters: massage, paired manipulation and mobilization, soft tissue therapy, and general medical.

Conclusions: Unsupervised modeling techniques were useful for visualizing complex treatment data and provided insights for improved treatment modeling in athletes. Given its ability to detect clinically relevant treatment pairings in large datasets, unsupervised modeling should be considered a feasible option for future analyses of medical-contact data from international competitions.
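The "72.3% of the variability" figure is the cumulative explained-variance ratio of the first 3 principal components. That step can be sketched with plain numpy (the study's actual pipeline and data are not given in the abstract; the random matrix below is purely illustrative):

```python
import numpy as np

def explained_variance_ratio(X):
    """Fraction of total variance captured by each principal component,
    in descending order, computed from the SVD of the centered data."""
    Xc = X - X.mean(axis=0)                 # center each variable
    s = np.linalg.svd(Xc, compute_uv=False) # singular values, descending
    var = s ** 2                            # proportional to PC variances
    return var / var.sum()

# Illustrative data: 100 contacts x 6 treatment-modality variables
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
X[:, 1] += 2 * X[:, 0]                      # induce some correlation
ratios = explained_variance_ratio(X)
top3 = ratios[:3].sum()   # analogous to the reported 72.3% for 3 PCs
```

A scatter plot of the data projected onto those leading components is what reveals the treatment clusters described in the Results.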
Context: College sport organizations and associations endorse concussion-management protocols and policies. To date, little information is available on concussion policies and practices at community college institutions.

Objective: To assess and describe current practices and policies regarding the assessment, management, and return-to-play criteria for sport-related concussion (SRC) among member institutions of the California Community College Athletic Association (CCCAA).

Design: Cross-sectional study.

Setting: Web-based survey.

Patients or Other Participants: A total of 55 head athletic trainers (ATs) at CCCAA institutions.

Main Outcome Measure(s): Data about policies, procedures, and practices regarding SRC were collected over a 3-week period in March 2012 and analyzed using descriptive statistics, the Fisher exact test, and the Spearman test.

Results: Almost half (47%) of ATs stated they had a policy for SRC assessment, management, and return to play at their institution. They reported being in compliance with baseline testing guidelines (25%), management guidelines (34.5%), and return-to-play guidelines (30%). Nearly 31% of ATs described having an SRC policy in place for academic accommodations. Conference attendance was positively correlated with institutional use of academic accommodations after SRC (r = 0.44, P = .01). The number of meetings ATs attended and their use of baseline testing were also positively correlated (r = 0.38, P = .01).

Conclusions: At the time of this study, nearly half of CCCAA institutions had concussion policies and 31% had academic-accommodation policies. However, only 18% of ATs at CCCAA institutions were in compliance with all of their concussion policies. Our findings demonstrate improvements in the management of SRCs by ATs at California community colleges compared with previous research but a need for better compliance with SRC policies.
Context: Many athletes fail to obtain the optimal levels of energy and nutrients to support health and performance. The constructs underlying the Theory of Planned Behavior (TPB) may help identify barriers to healthful eating that can be addressed in nutrition-education programs.

Objective: To use the TPB to examine factors regarding collegiate male and female student-athletes' intentions of eating a healthful diet.

Design: Cross-sectional study.

Setting: Online survey tool.

Patients or Other Participants: The survey was taken by 244 male and female National Collegiate Athletic Association Division II athletes, and data from 201 were analyzed. Mean age of the athletes was 20 ± 1.31 years (range, 18–24 years); most were white (86.1%) and female (78.6%).

Main Outcome Measure(s): We assessed the predictive strength of attitude, subjective norms, and perceived behavioral control on behavioral intentions. Regression analysis evaluated how the variables of the TPB were valued and how they predicted behavioral intentions.

Results: The combination of attitude, subjective norms, and perceived behavioral control accounted for 73.4% (R²) of the variance in behavioral intention (F = 180.82, P < .001). Attitude had the greatest influence on behavioral intentions (β = .534, P < .001).

Conclusions: Understanding both the intentions of collegiate athletes to eat healthfully and how highly they value nutrition is crucial for the development of effective nutrition education and counseling programs.
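The β = .534 reported for attitude is a standardized regression coefficient: the predictors and outcome are z-scored before fitting, so the coefficients are directly comparable and the largest one identifies the strongest predictor. A sketch on illustrative data (the generated variables merely stand in for attitude, subjective norms, and perceived behavioral control; they are not the study's data):

```python
import numpy as np

def standardized_betas(X, y):
    """OLS coefficients after z-scoring predictors and outcome,
    i.e. the standardized betas reported in regression results."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return beta

# Illustrative stand-ins for the 3 TPB constructs and intention
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
y = (0.6 * X[:, 0] + 0.2 * X[:, 1] + 0.1 * X[:, 2]
     + rng.normal(scale=0.5, size=200))
betas = standardized_betas(X, y)
# The first predictor carries the largest standardized beta, the
# analogue of attitude dominating the TPB model in the abstract.
```

Because both sides are centered by the z-scoring, no intercept column is needed in the fit.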
JAT eISSN: 1938-162X
JAT ISSN: 1062-6050
ATEJ ISSN: 1947-380X