Abstract
Context: Sodium replacement during prolonged exercise in the heat may be critically important to maintaining fluid and electrolyte balance and muscle contractility.
Objective: To examine the effectiveness of sodium-containing sports drinks in preventing hyponatremia and muscle cramping during prolonged exercise in the heat.
Design: Randomized crossover study.
Patients or Other Participants: Thirteen active men.
Intervention(s): Participants completed 4 trials of an exercise protocol in the heat (30°C) consisting of 3 hours of exercise (alternating 30 minutes of walking and cycling at heart rates of 130 and 140 beats per minute, respectively); standing calf raises (8 sets of 30 repetitions); and 45 minutes of steep, brisk walking (5.5 km·h−1 on a 12% grade). During exercise, participants consumed fluids to match body mass loss. A different drink was consumed for each trial: a carbohydrate-electrolyte drink containing 36.2 mmol/L sodium (HNa), a carbohydrate-electrolyte drink containing 19.9 mmol/L sodium (LNa), mineral water (W), and colored and flavored distilled water (PL).
Main Outcome Measure(s): Serum sodium, plasma osmolality, plasma volume changes, and muscle cramping frequency.
Results: During both the HNa and LNa trials, serum sodium remained relatively constant (serum sodium concentration at the end of the protocol was 137.3 mmol/L and 136.7 mmol/L, respectively). However, a clear decrease was observed in the W (134.5 ± 0.8 mmol/L) and PL (134.4 ± 0.8 mmol/L) trials compared with the HNa and LNa trials (P < .05). The same trends were observed for plasma osmolality (P < .05). Although the difference was not significant, plasma volume was preserved during the HNa and LNa trials, whereas a reduction of 2.5% was observed in the W and PL trials. None of the volunteers experienced cramping.
Conclusions: The data suggest that sodium intake during prolonged exercise in the heat plays a significant role in preventing sodium losses that may lead to hyponatremia when fluid intake matches sweat losses.
Abstract
Context: When assessing exercise hyperthermia outdoors, the validity of certain commonly used body temperature measuring devices has been questioned. A controlled laboratory environment is generally less influenced by environmental factors (eg, ambient temperature, solar radiation, wind) than an outdoor setting, so the validity of these devices may be more acceptable there.
Objective: To assess the validity and reliability of commonly used temperature devices compared with rectal temperature in individuals exercising in a controlled, high environmental temperature indoor setting and then resting in a cool environment.
Design: Time series study.
Setting: Laboratory environmental chamber (temperature = 36.4 ± 1.2°C [97.5 ± 2.16°F], relative humidity = 52%) and cool laboratory (temperature = approximately 23.3°C [74.0°F], relative humidity = 40%).
Patients or Other Participants: Fifteen men and 10 women.
Intervention(s): Rectal, gastrointestinal, forehead, oral, aural, temporal, and axillary temperatures were measured with commonly used temperature devices. Temperature was measured before and 20 minutes after entering the environmental chamber, every 30 minutes during a 90-minute treadmill walk in the heat, and every 20 minutes during a 60-minute rest in mild conditions. Device validity and reliability were assessed with various statistical measures to compare the measurements using each device with rectal temperature. A device was considered invalid if the mean bias (average difference between rectal and device temperatures) was more than ±0.27°C (±0.50°F).
Main Outcome Measure(s): Measured temperature from each device (mean and across time).
Results: The following devices provided invalid estimates of rectal temperature: forehead sticker (0.29°C [0.52°F]), oral temperature using an inexpensive device (−1.13°C [−2.03°F]), temporal temperature measured according to the instruction manual (−0.87°C [−1.56°F]), temporal temperature using a modified technique (−0.63°C [−1.13°F]), oral temperature using an expensive device (−0.86°C [−1.55°F]), aural temperature (−0.67°C [−1.20°F]), axillary temperature using an inexpensive device (−1.25°C [−2.24°F]), and axillary temperature using an expensive device (−0.94°C [−1.70°F]). The intestinal temperature device (mean bias of −0.02°C [−0.03°F]) was the only one considered valid. Devices measured in succession (intestinal, forehead, temporal, and aural) showed acceptable reliability (all had a mean bias = 0.09°C [0.16°F] and r ≥ 0.94).
Conclusions: Even during laboratory exercise in a controlled environment, devices used to measure forehead, temporal, oral, aural, and axillary body sites did not provide valid estimates of rectal temperature. Only intestinal temperature measurement met the validity criterion. Therefore, we recommend that rectal or intestinal temperature be used to assess hyperthermia in individuals exercising indoors in the heat.
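The validity criterion described above reduces to simple arithmetic: average the device-minus-rectal differences and check the result against the ±0.27°C (±0.50°F) threshold. A minimal sketch follows; the paired readings are hypothetical illustrations, not the study's data.

```python
def mean_bias(device_temps, rectal_temps):
    """Average difference between device and rectal readings (°C)."""
    diffs = [d - r for d, r in zip(device_temps, rectal_temps)]
    return sum(diffs) / len(diffs)

def is_valid(bias, threshold=0.27):
    """Per the criterion above, a device is valid when |mean bias| ≤ 0.27°C."""
    return abs(bias) <= threshold

# Hypothetical paired readings (°C) for one participant across time points
rectal = [38.1, 38.4, 38.9, 39.2]
intestinal = [38.0, 38.5, 38.9, 39.1]  # tracks rectal closely
oral = [37.0, 37.3, 37.7, 38.0]        # systematically low

print(is_valid(mean_bias(intestinal, rectal)))  # True: bias is small
print(is_valid(mean_bias(oral, rectal)))        # False: bias exceeds -0.27°C
```

With these illustrative numbers the intestinal bias is −0.025°C (valid) and the oral bias is −1.15°C (invalid), mirroring the pattern the study reports.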
Abstract
Context: Many researchers have investigated the effectiveness of different types of cold application, including cold whirlpools, ice packs, and chemical packs. However, few have investigated the effectiveness of different types of ice used in ice packs, even though ice is one of the most common forms of cold application.
Objective: To evaluate and compare the cooling effectiveness of ice packs made with cubed, crushed, and wetted ice on intramuscular and skin surface temperatures.
Design: Repeated-measures counterbalanced design.
Setting: Human performance research laboratory.
Patients or Other Participants: Twelve healthy participants (6 men, 6 women) with no history of musculoskeletal disease and no known preexisting inflammatory conditions or recent orthopaedic injuries to the lower extremities.
Intervention(s): Ice packs made with cubed, crushed, or wetted ice were applied to a standardized area on the posterior aspect of the right gastrocnemius for 20 minutes. Each participant was given separate ice pack treatments, with at least 4 days between treatment sessions.
Main Outcome Measure(s): Cutaneous and intramuscular (2 cm plus one-half skinfold measurement) temperatures of the right gastrocnemius were measured every 30 seconds during a 20-minute baseline period, a 20-minute treatment period, and a 120-minute recovery period.
Results: Differences were observed among all treatments. Compared with the crushed-ice treatment, the cubed-ice and wetted-ice treatments produced lower surface and intramuscular temperatures. Wetted ice produced the greatest overall temperature change during treatment and recovery, and crushed ice produced the smallest change.
Conclusions: As administered in our protocol, wetted ice was superior to cubed or crushed ice at reducing surface temperatures, whereas both cubed ice and wetted ice were superior to crushed ice at reducing intramuscular temperatures.
Abstract
Context: For athletes in disciplines with weight categories, it is important to assess body composition and weight fluctuations.
Objective: To evaluate the accuracy and reliability of a portable ultrasound device for measuring body fat percentage compared with fan-beam, dual-energy X-ray absorptiometry (DEXA).
Design: Cross-validation study.
Setting: Research laboratory.
Patients or Other Participants: A total of 93 athletes (24 women, 69 men), aged 23.5 ± 3.7 years, with body mass index = 24.0 ± 4.2 and body fat percentage via DEXA = 9.41 ± 8.1, participated. All participants were elite athletes selected from the Institut National des Sports et de l'Education Physique and practiced a variety of weight-category sports.
Main Outcome Measure(s): We measured body fat and body fat percentage using an ultrasound technique associated with anthropometric values and using the DEXA reference technique. Cross-validation between the ultrasound technique and DEXA was then performed.
Results: Ultrasound estimates of body fat percentage correlated closely with those of DEXA in both women (r = 0.97, standard error of the estimate = 1.79) and men (r = 0.98, standard error of the estimate = 0.96). The ultrasound technique in both sexes had a low total error (0.93). The 95% limit of agreement was −0.06 ± 1.2 for all athletes and did not show an overprediction or underprediction bias. We developed a new model to produce body fat estimates from ultrasound and anthropometric dimensions.
Conclusions: The limits of agreement for the ultrasound technique compared with DEXA measurements were very good. Consequently, the use of a portable ultrasound device produced accurate body fat and body fat percentage estimates relative to the fan-beam DEXA technique.
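The 95% limits of agreement cited above follow the standard Bland-Altman calculation: the mean of the between-method differences ± 1.96 times their standard deviation. A minimal sketch, using hypothetical paired body fat estimates rather than the study's data:

```python
import statistics

def limits_of_agreement(method_a, method_b):
    """Bland-Altman 95% limits: mean difference ± 1.96 × SD of differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical body fat (%) estimates for 5 athletes
ultrasound = [9.8, 12.1, 15.0, 8.4, 20.2]
dexa = [10.0, 12.5, 14.6, 8.9, 19.8]

low, high = limits_of_agreement(ultrasound, dexa)
```

If roughly 95% of individual differences fall inside (low, high) and the interval is centered near zero, the two methods agree without systematic over- or underprediction, which is the pattern the abstract reports.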
Abstract
Context: The body of knowledge concerning shoulder kinematics in patients with rotator cuff tears is increasing. However, the level of understanding regarding how pain and tear size affect these kinematic patterns is minimal.
Objective: To identify relationships between pain associated with a full-thickness rotator cuff tear, tear size, and scapulohumeral rhythm (SHR) and to determine whether pain and tear size serve as predictors of SHR.
Design: A test-retest design was used to quantify pain and SHR before and after a subacromial lidocaine injection. Correlation and multivariate analyses were used to identify relationships among pain, tear size, and SHR.
Setting: Orthopaedic biomechanics research laboratory.
Patients or Other Participants: Fifteen patients (age range, 40–75 years) with diagnosed full-thickness rotator cuff tears who were experiencing pain at the time of testing.
Intervention(s): Shoulder kinematic data were collected with an electromagnetic tracking system before and after the patient received a lidocaine injection. Pain was rated using a visual analog scale.
Main Outcome Measure(s): Three-dimensional scapular kinematics and glenohumeral elevation were assessed. Scapular kinematics included anterior-posterior tilt, medial-lateral tilt, and upward-downward rotation. A regression model was used to calculate SHR (scapular kinematics to glenohumeral elevation) for phases of humeral elevation and lowering.
Results: Linear relationships were identified between initial pain scores and SHR and between tear size and SHR, representing an increased reliance on scapular motion with increasing pain and tear size. Pain was identified as an independent predictor of SHR, whereas significant findings for the effect of tear size on SHR and for the interaction between pain and tear size were limited.
Conclusions: We noted an increased reliance on scapular contributions to overall humeral elevation with increasing levels of pain and rotator cuff tear size. Pain associated with a rotator cuff tear serves as a primary contributor to the kinematic patterns exhibited in patients with rotator cuff tears.
Abstract
Context: Shoulder injuries are common in athletes involved in overhead sports, and scapular dyskinesis is believed to be one causative factor in these injuries. Many authors assert that abnormal scapular motion, so-called dyskinesis, is related to shoulder injury, but evidence from 3-dimensional measurement studies regarding this relationship is mixed. Reliable and valid clinical methods for detecting scapular dyskinesis are lacking.
Objective: To determine the interrater reliability of a new test designed to detect abnormal scapular motion.
Design: Correlation design using ratings from multiple pairs of testers.
Setting: University athletic training facilities.
Patients or Other Participants: A sample of 142 athletes (from National Collegiate Athletic Association Divisions I and III) participating in sports requiring intense overhead arm use.
Intervention(s): Participants were videotaped from the posterior aspect while performing 5 repetitions of bilateral, weighted (1.4-kg [3-lb] or 2.3-kg [5-lb]) shoulder flexion and frontal-plane abduction. Videotapes from randomly chosen participants were subsequently viewed and independently rated for the presence of scapular dyskinesis by 6 raters (3 pairs), with each pair rating 30 different participants. Raters were trained to detect scapular dyskinesis using a self-instructional format with standardized operational definitions and videotaped examples of normal and abnormal motion.
Main Outcome Measure(s): Scapular dyskinesis was defined as the presence of either winging or dysrhythmia. Right and left sides were rated independently as normal, subtle, or obvious dyskinesis. We calculated percentage of agreement and weighted kappa (κw) coefficients to determine reliability.
Results: Percentage of agreement was between 75% and 82%, and κw ranged from 0.48 to 0.61.
Conclusions: The test for scapular dyskinesis showed satisfactory reliability for clinical use in a sample of overhead athletes known to be at increased risk for shoulder symptoms.
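The weighted kappa statistic used above rewards partial agreement on ordinal ratings (normal / subtle / obvious): disagreements are penalized in proportion to how far apart the two raters' categories are. A minimal linear-weighted implementation follows; the two rating sequences are invented for illustration, not taken from the study.

```python
from collections import Counter

def weighted_kappa(rater1, rater2, categories):
    """Linear-weighted kappa for two raters over ordered categories."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    obs = Counter(zip(rater1, rater2))   # joint counts
    p1, p2 = Counter(rater1), Counter(rater2)  # marginal counts
    # Linear disagreement weight: |i - j| / (k - 1)
    observed = sum(obs[(a, b)] / n * abs(idx[a] - idx[b]) / (k - 1)
                   for (a, b) in obs)
    expected = sum(p1[a] / n * p2[b] / n * abs(idx[a] - idx[b]) / (k - 1)
                   for a in p1 for b in p2)
    return 1 - observed / expected

cats = ["normal", "subtle", "obvious"]
# Hypothetical ratings of 6 shoulders by a pair of raters
r1 = ["normal", "normal", "subtle", "obvious", "subtle", "normal"]
r2 = ["normal", "subtle", "subtle", "obvious", "obvious", "normal"]

kw = weighted_kappa(r1, r2, cats)
```

Here the raters disagree twice, but only by one adjacent category each time, so κw stays well above zero; exact category agreement (unweighted percentage of agreement) would be 4/6.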
Abstract
Context: Although clinical methods for detecting scapular dyskinesis have been described, evidence supporting the validity of these methods is lacking.
Objective: To determine the validity of the scapular dyskinesis test, a visually based method of identifying abnormal scapular motion. A secondary purpose was to explore the relationship between scapular dyskinesis and shoulder symptoms.
Design: Validation study comparing 3-dimensional measures of scapular motion among participants clinically judged as having either normal motion or scapular dyskinesis.
Setting: University athletic training facilities.
Patients or Other Participants: A sample of 142 collegiate athletes (National Collegiate Athletic Association Division I and Division III) participating in sports requiring overhead use of the arm was rated, and 66 of these athletes underwent 3-dimensional testing.
Intervention(s): Volunteers were viewed by 2 raters while performing weighted shoulder flexion and abduction. The right and left sides were rated independently as normal, subtle dyskinesis, or obvious dyskinesis using the scapular dyskinesis test. Symptoms were assessed using the Penn Shoulder Score. Athletes judged as having either normal motion or obvious dyskinesis underwent 3-dimensional electromagnetic kinematic testing while performing the same movements.
Main Outcome Measure(s): The kinematic data from both groups were compared via multifactor analysis of variance with post hoc testing using the least significant difference procedure. The relationship between symptoms and scapular dyskinesis was evaluated by odds ratios.
Results: Differences were found between the normal and obvious dyskinesis groups. Participants with obvious dyskinesis showed less scapular upward rotation (P < .001), less clavicular elevation (P < .001), and greater clavicular protraction (P = .044). The presence of shoulder symptoms did not differ between the normal and obvious dyskinesis volunteers (odds ratio = 0.79, 95% confidence interval = 0.33, 1.89).
Conclusions: Shoulders visually judged as having dyskinesis showed distinct alterations in 3-dimensional scapular motion. However, the presence of scapular dyskinesis was not related to shoulder symptoms in athletes engaged in overhead sports.
Abstract
Context: Researchers have suggested that large landing forces, excessive quadriceps activity, and an erect posture during landing are risk factors for anterior cruciate ligament (ACL) injury. The influence of knee kinematics on these risk factors has been investigated extensively, but trunk positioning has received little attention.
Objective: To determine the effect of trunk flexion on landing forces and quadriceps activation during landing.
Design: Two (sex) × 2 (task) repeated-measures design.
Setting: Research laboratory.
Patients or Other Participants: Forty healthy, physically active volunteers (20 men, 20 women).
Intervention(s): Participants performed 2 drop-landing tasks. The first task represented the natural, or preferred, landing strategy. The second task was identical to the first except that participants flexed the trunk during landing.
Main Outcome Measure(s): We measured peak vertical and posterior ground reaction forces and mean quadriceps electromyographic amplitude during the loading phase of landing (ie, the interval from initial ground contact to peak knee flexion).
Results: Trunk flexion decreased the vertical ground reaction force (P < .001) and quadriceps electromyographic amplitude (P < .001). The effect of trunk flexion did not differ across sex for landing forces or quadriceps electromyographic activity.
Conclusions: We found that trunk flexion during landing reduced landing forces and quadriceps activity, thus potentially reducing the force imparted to the ACL. Research has indicated that trunk flexion during landing also increases knee and hip flexion, resulting in a less erect landing posture. In combination, these findings support emphasis on trunk flexion during landing as part of ACL injury-prevention programs.
Abstract
Context: Epidemiologic findings of higher incidences of hamstrings muscle strains during the latter stages of soccer match play have been attributed to fatigue.
Objective: To investigate the influence of soccer-specific fatigue on the peak eccentric torque of the knee flexor muscles.
Design: Descriptive laboratory study.
Setting: Controlled laboratory environment.
Patients or Other Participants: Ten male professional soccer players (age = 24.7 ± 4.4 years, mass = 77.1 ± 8.3 kg, V̇O2max = 63.0 ± 4.8 mL·kg−1·min−1).
Intervention(s): Participants completed an intermittent treadmill protocol replicating the activity profile of soccer match play, with a passive halftime interval. Before exercise and at 15-minute intervals, each player completed isokinetic dynamometer trials.
Main Outcome Measure(s): Peak eccentric knee flexor torque was quantified at isokinetic speeds of 180°·s−1, 300°·s−1, and 60°·s−1, with 5 repetitions at each speed.
Results: Peak eccentric knee flexor torque at the end of the game (T300eccH105 = 127 ± 25 N·m) and at the end of the passive halftime interval (T300eccH60 = 133 ± 32 N·m) was reduced relative to T300eccH00 (167 ± 35 N·m, P < .01) and T300eccH15 (161 ± 35 N·m, P = .02).
Conclusions: Eccentric hamstrings strength decreased as a function of time and after the halftime interval. This finding indicates a greater risk of injury at these specific times, especially during explosive movements, in accordance with epidemiologic observations. Incorporating eccentric knee flexor exercises into resistance training sessions that follow soccer-specific conditioning is warranted to try to reduce the incidence or recurrence of hamstrings strains.
Abstract
Context: Only a few scales measure confidence within sport, and these scales are insufficient to measure confidence after athletic injuries. Therefore, better measures are needed to determine the psychological readiness of injured athletes to return to sport participation.
Objective: To develop a scale that measures the psychological readiness of injured athletes to return to sport participation and to provide preliminary evidence of reliability and validity for the scale.
Design: The Delphi method was used to develop the Injury-Psychological Readiness to Return to Sport scale (I-PRRS). Two 1-way analyses of variance with repeated measures and 6 Pearson product moment correlations were computed to help validate the scale.
Setting: Athletic training clinics at 3 National Collegiate Athletic Association (NCAA) schools.
Patients or Other Participants: Four certified athletic trainers (ATs) and professors of Commission on Accreditation of Athletic Training Education-accredited athletic training programs and 3 NCAA Division III coaches made up the panel of experts that participated in the Delphi portion of the study to develop the I-PRRS. In the second part of the study, 22 injured athletes from 3 NCAA Division II and III schools, each of whom had missed a minimum of 1 week of practice, were surveyed along with their respective ATs. The injured athletes and ATs participated in the validation of the I-PRRS.
Main Outcome Measure(s): The injured athletes completed the Profile of Mood States (POMS) short form and the I-PRRS shortly after injury, before returning to the first practice, before returning to competition, and immediately after competition. The respective ATs completed the I-PRRS before and after competition. The I-PRRS is a 6-item scale that measures the psychological readiness of injured athletes to return to sport, and the POMS short form is a 30-item scale that measures mood states. The negative mood scores of the POMS were added and the positive mood scores subtracted to calculate a Total Mood Disturbance (TMD) score.
Results: The I-PRRS scores were negatively correlated with the TMD scores of the POMS short form at all 4 time intervals, demonstrating concurrent validity. The I-PRRS scores were lowest after injury, increased before practice, increased again before competition, and did not change after competition. The I-PRRS scores as completed by the athletes and their respective ATs were positively correlated both before and after competition, demonstrating external validity.
Conclusions: Preliminary evidence for the reliability and validity of the I-PRRS was demonstrated. The I-PRRS can be a beneficial tool for ATs to assess an athlete's psychological readiness to return to sport participation after injury.
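The TMD calculation described above is a simple signed sum over the POMS subscales: the five negative-mood subscale scores are added and the vigor (positive-mood) score is subtracted. A minimal sketch, with hypothetical subscale scores:

```python
def total_mood_disturbance(subscales):
    """POMS TMD: sum of negative-mood subscales minus the positive one."""
    negative = ["tension", "depression", "anger", "fatigue", "confusion"]
    positive = ["vigor"]
    return (sum(subscales[s] for s in negative)
            - sum(subscales[s] for s in positive))

# Hypothetical POMS short-form subscale scores shortly after injury
scores = {"tension": 12, "depression": 10, "anger": 8,
          "fatigue": 9, "confusion": 7, "vigor": 11}

tmd = total_mood_disturbance(scores)  # 46 - 11 = 35
```

Higher TMD indicates greater mood disturbance, which is why the abstract expects it to correlate negatively with I-PRRS readiness scores.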
Abstract
Context: Athletic training education program directors (ATEPDs) often manage their time among students, program administration, and patient care.
Objective: To assess the level of burnout in ATEPDs and to determine the relationship between burnout and various demographic characteristics of ATEPDs.
Design: Cross-sectional study.
Setting: Public and private colleges and universities nationwide.
Patients or Other Participants: Two hundred forty-nine ATEPDs of undergraduate athletic training education programs accredited by the Commission on Accreditation of Athletic Training Education.
Intervention(s): We administered the Maslach Burnout Inventory (MBI) to all participants. Another component of the survey requested demographic information about the ATEPDs.
Main Outcome Measure(s): The MBI consisted of 21 items assessing 3 characteristics of burnout: emotional exhaustion, depersonalization, and personal accomplishment. We used univariate, multivariate, and factorial analyses of variance with the α level set a priori at .05. We also calculated Pearson product moment correlation coefficients.
Results: Women had greater emotional exhaustion than men (20.67 ± 9.43 and 16.47 ± 9.64, respectively; P = .001). The difference between tenure-status groups for emotional exhaustion was significant (P = .014), with tenure-track ATEPDs scoring higher on emotional exhaustion than tenured ATEPDs. Pearson product moment correlation coefficients revealed weak negative relationships between emotional exhaustion and age (r = −0.263, P < .001), years of program director experience (r = −0.157, P = .013), and years at current job (r = −0.162, P = .010), indicating that as ATEPDs aged, gained more experience, and stayed in their current jobs, their emotional exhaustion scores decreased. There was also a weak negative relationship between age and depersonalization (r = −0.171, P = .007) and a weak positive relationship between years at current job and personal accomplishment (r = 0.197, P = .002).
Conclusions: We found that ATEPDs experienced moderate emotional exhaustion burnout and low depersonalization and personal accomplishment burnout, with women experiencing greater emotional exhaustion than men. Additionally, ATEPDs in tenure-track positions experienced greater emotional exhaustion than tenured ATEPDs. ATEPDs need to develop healthy coping strategies early in their directorships to manage the components of burnout.
Abstract
Context: The success of any academic program, including athletic training, depends upon attracting and keeping quality students. The nature of persistent students versus students who prematurely leave the athletic training major is not known. Understanding the profiles of athletic training students who persist or leave is important.
Objective: To (1) explore the relationships among anticipatory factors, academic integration, clinical integration, social integration, and motivation; (2) determine which of these variables discriminate between senior athletic training students and students who changed majors; and (3) identify which variable is the strongest predictor of persistence in athletic training education programs.
Design: Descriptive study using a qualitative and quantitative mixed-methods approach.
Setting: Thirteen athletic training education programs located in District 3 of the National Athletic Trainers' Association.
Patients or Other Participants: Ninety-four senior-level athletic training students and 31 college students who changed majors from athletic training to another degree option.
Data Collection: Data were collected with the Athletic Training Education Program Student Retention Questionnaire (ATEPSRQ).
Analysis: Data from the ATEPSRQ were analyzed via Pearson correlations, multivariate analysis of variance, univariate analysis of variance, and a stepwise discriminant analysis. Open-ended questions were transcribed and analyzed using open, axial, and selective coding procedures. Member checks and peer debriefing techniques ensured trustworthiness of the study.
Results: Pearson correlations identified moderate relationships between motivation and clinical integration (r = 0.515, P < .01) and between motivation and academic integration (r = 0.509, P < .01). Univariate analyses of variance showed that academic integration (F1,122 = 8.483, P < .004), clinical integration (F1,119 = 30.214, P < .001), and motivation (F1,121 = 68.887, P < .001) discriminated between seniors and major changers. Discriminant analysis indicated that motivation was the strongest predictor of persistence in athletic training education, accounting for 37.2% of the variance between groups. The theoretic model accurately classified 95.7% of the seniors and 53.8% of the major changers. A common theme emerging from the qualitative data was the presence of a strong peer-support group that surrounded many of the senior-level students.
Conclusions: Understanding student retention in athletic training is important for our profession. Results from this study suggest 3 key factors associated with student persistence in athletic training education programs: (1) student motivation, (2) clinical and academic integration, and (3) the presence of a peer-support system. Educators and program directors must create comprehensive recruitment and retention strategies that address the factors influencing students' decisions to stay in the athletic training profession.
Abstract
Context: As the Asian Ice Hockey League gradually expands and becomes more competitive, ice hockey-related injuries may increase. However, no reports have been published on ice hockey injuries in Japan, including the mechanism of injury and the daily supervision of the players during the regular season.
Objective: To prospectively study the incidence, types, and mechanisms of ice hockey injuries in an elite Japanese ice hockey team.
Design: Prospective observational cohort study.
Setting: An elite ice hockey team, Tokyo, Japan.
Patients or Other Participants: Ninety-four players during the 2002–2005 seasons.
Main Outcome Measure(s): Data were collected for 3 consecutive seasons using an injury reporting form.
Results: The overall game injury rate was 74.3 per 1000 player-game hours; the rate was 11.7 per 1000 player-game hours for injuries resulting in any time loss. The overall practice injury rate was 11.2 per 1000 player-practice hours; the rate was 1.1 per 1000 player-practice hours for injuries resulting in any time loss. Forwards had the highest rate of injury, followed by defensemen and then goalkeepers. Contusions were the most common injury, followed by strains, lacerations, and sprains.
Conclusions: Most injuries among Japanese ice hockey players occurred during games. Game or play intensity may influence the injury rate during games.
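Injury rates expressed per 1000 player-hours, as in the results above, normalize injury counts by total exposure so that teams with different schedules can be compared. A minimal sketch of the arithmetic; the season totals below are illustrative assumptions, not the study's raw counts.

```python
def injury_rate(injuries, exposure_hours, per=1000):
    """Injuries per `per` player-hours of exposure."""
    return injuries / exposure_hours * per

# Hypothetical season totals for one team
game_injuries = 52
player_game_hours = 700   # summed over all players and games

rate = injury_rate(game_injuries, player_game_hours)
print(round(rate, 1))  # injuries per 1000 player-game hours
```

Dividing by exposure rather than by number of players is what makes game and practice rates directly comparable, and is why the abstract can conclude that game intensity drives the higher game rate.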
Abstract
Objective: To critically assess original research addressing the effect of creatine supplementation on exercise heat tolerance and hydration status.
Data Sources: We searched the electronic databases PubMed, Scopus, Web of Science, SPORTDiscus, and Rehabilitation & Physical Medicine, without date limitations, for the following key words: creatine, exercise, thermoregulation, dehydration, hyperthermia, heat tolerance, exertional heat illnesses, and renal function. Our goal was to identify randomized clinical trials investigating the effect of creatine supplementation on hydration status and thermoregulation. Citations from related articles also were identified and retrieved.
Data Synthesis: Original research was reviewed using the Physiotherapy Evidence Database (PEDro) Scale. One author initially screened all articles. Fifteen of 95 articles examined the effects of creatine on thermoregulation or hydration status (or both). Two independent reviewers then reviewed these articles, and 10 studies were selected on the basis of inclusion and exclusion criteria. The PEDro scores for the 10 studies ranged from 7 to 10 points (maximum possible score = 10 points).
Conclusions: No evidence supports the concept that creatine supplementation either hinders the body's ability to dissipate heat or negatively affects the athlete's body fluid balance. Controlled experimental trials of athletes exercising in the heat showed no adverse effects from creatine supplementation at recommended dosages.
JAT eISSN: 1938-162X
JAT ISSN: 1062-6050
ATEJ ISSN: 1947-380X