Content Validation of the Athletic Training Milestones: A Report from the Association for Athletic Training Education Research Network
The Athletic Training Milestones were developed as a comprehensive framework to assess athletic trainers' knowledge, skill, and behavior acquisition across the continuum of athletic training practice. However, without established content validity, it is unclear whether the Athletic Training Milestones can be used effectively as a clinical evaluation and research tool to evaluate competence and performance across multiple users and sites. We conducted a highly conservative content validity index (CVI) assessment with data from 12 content experts. Our findings revealed an extremely high overall scale CVI of 0.99, and the CVI scores of the 28 individual subcompetency items assessed ranged from 0.83 to 1.00. For the athletic training profession to truly embrace competency-based evaluation and performance assessments, we need a highly valid and comprehensive instrument, such as the Athletic Training Milestones.
The Athletic Training Milestones1 were designed to provide a comprehensive framework for assessing competence in athletic training by measuring clinical practice behaviors, from novice to expert, across the continuum of athletic training practice. The Athletic Training Milestones were based on the Accreditation Council for Graduate Medical Education milestones framework and were adapted to include behaviors of contemporary athletic training practice.1 The Athletic Training Milestones consist of 6 general competency (patient care and procedural skills, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, systems-based practice) and 8 specialty competency (prevention and wellness, urgent and emergent care, primary care, orthopaedics, rehabilitation, behavioral health, pediatrics, performance enhancement) areas, with each competency measured on a scale ranging from critically deficient to expert practice (Figure).



The Athletic Training Milestones can be used for assessment of athletic training practice across the continuum of a career, beginning with professional knowledge and skill development in athletic training educational programs, continuing with measurement of the progression of specialized knowledge and skills in a focused area of practice, and culminating in assessment of expert or aspirational practice. The Athletic Training Milestones can be used for self- and peer assessment in athletic training education and residency programs, for employee or employer assessments, and as a measure of learning over time and development of contemporary expertise among clinicians. Furthermore, profession-wide adoption of the Athletic Training Milestones could elevate recognition of and respect for athletic training among peer health care professionals who use a similar framework to assess competence and growth (https://www.atmilestones.com). However, the content validity of the Athletic Training Milestones has not been established to date.
Content validity is an essential measure of how well an instrument or test represents the construct being measured.2 In the case of the Athletic Training Milestones, content validity establishes the relevance of each of the general competencies, subcompetencies, and associated milestone levels to actual athletic training practice. However, before the Athletic Training Milestones can be endorsed for widespread adoption as an effective evaluation tool to assess clinical progression of knowledge and skill acquisition, their content validity must be established. Therefore, our purpose was to determine the relevance and clarity of the Athletic Training Milestones. For this study, we specifically focused on the content validity index (CVI) of the 6 general competencies.
METHODS
Design
We used a formal CVI, as described by previous researchers,2–5 to determine the content validity of the Athletic Training Milestones. Content validity is a critical component of scale development for high-quality instruments, ensuring that a sufficient number of items appropriately assess the content of interest.7 Our intent in conducting a CVI of the Athletic Training Milestones was to assess whether this evaluative tool appropriately measures athletic trainers' (ATs') progression of independent knowledge, skill, and behavior acquisition from novice to proficient, with the aspirational goal of becoming an expert.3 This study was deemed exempt research by the A.T. Still University Institutional Review Board.
Participants
As highlighted by Grant and Davis,4 the validation process is heavily influenced by how the content experts are chosen and used. Content experts are often selected on the basis of their training, experience, and qualifications. More specifically, content experts may be selected because of their own research experience on the topic,3 clinical expertise,4 or expertise related to the conceptual framework of the instrument.2,4 Whereas some debate exists about the number of experts needed, earlier investigators have suggested that between 3 and 10 content experts are necessary to conduct a CVI2,3; including more than 10 content experts is considered highly conservative. Because it can be difficult to find content experts who meet all criteria—that is, research experience, clinical expertise, and theoretical expertise of the content included in the instrument—a range of representation across the expertise criteria should be sought among those asked to serve as content experts.9
To ensure we had enough content experts to conduct a highly conservative CVI and account for participant attrition for this study, we recruited a purposeful criterion sample of 31 health care professionals who primarily served as clinicians and educators and were known among our professional network for their experience in implementing the Athletic Training Milestones for personal-, student-, or peer-evaluation use. Of the 31 individuals recruited, 24 were willing to participate, 1 indicated unwillingness to participate, and 6 did not respond to the initial request.
Instrumentation
To conduct the CVI, a rating score document was developed using Excel (version 16.61; Microsoft Corp). Each general competency (patient care and procedural skills, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, systems-based practice) was set up as an individual tab in the document. Within each tab, the individual subcompetency items (eg, demonstrates humanism and cultural competency, responds to each patient's unique characteristics, needs, and goals) were listed, followed by a field to input the rating score and a field to insert comments if warranted. For scoring, we adopted the 4-point item-rating continuum advocated by Davis9: 1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, 4 = highly relevant (Table 1).
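To illustrate the structure of the rating score document, the following Python sketch shows how one content expert's ratings might be represented. It is a hypothetical rendering only (the study used a multi-tab Excel workbook), and the grouping of the example items under a particular competency tab, along with the sample comment, is illustrative rather than drawn from the instrument.

```python
# Hypothetical sketch of one content expert's rating document (the study itself
# used a multi-tab Excel workbook). Each general-competency tab holds its
# subcompetency items, each with a 4-point relevance score and an optional comment.
RATING_SCALE = {
    1: "not relevant",
    2: "somewhat relevant",
    3: "quite relevant",
    4: "highly relevant",
}

expert_ratings = {
    "Professionalism": [  # tab assignment of these example items is illustrative
        {"item": "Demonstrates humanism and cultural competency",
         "score": 4, "comment": ""},
        {"item": "Responds to each patient's unique characteristics, needs, and goals",
         "score": 3, "comment": "Progression between the upper levels could be clearer."},
    ],
    # ...one key per remaining general-competency tab
}
```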

Procedures
In February 2021, we sent a recruitment email to the criterion sample. The email contained a brief introduction and purpose of the study, a general overview of the CVI process, the estimated time to complete the CVI (approximately 2 hours), and a request to participate. Once individuals responded and indicated a willingness to participate, they were sent an email with 9 file attachments: the CVI process instructions, the CVI scoring template, the introduction and preface of the Athletic Training Milestones document (https://www.atmilestones.com), and the subcompetency milestones of each of the 6 general competencies separated into individual files. The email also included a link to a brief, web-based questionnaire used to capture general demographic information about each content expert who completed the CVI. Each person was given 4 weeks to complete the CVI, and we sent 2 reminder emails during the data collection period.
To begin the CVI process, the content experts were asked to familiarize themselves with the 9 documents and to review the introduction and preface of the Athletic Training Milestones. This review served as a refresher to ensure that all content experts were thoroughly familiar with how the Athletic Training Milestones document was formatted, the terminology used, and the descriptions of the different performance levels included. Once familiarized, content experts were asked to rate each general competency. For each subcompetency item within the general competency, individuals were instructed to read every criterion listed in each column, starting with the criteria in the critical deficiencies column and progressing rightward through the criteria in the level 5 column. Next, they were instructed to assign a rating score on the basis of whether the criteria listed for progression from 1 level to the next were relevant or whether critical knowledge or skills were missing from the criteria. Individuals were told to base their score only on the overall progression from the lowest level to the highest. Furthermore, if a score was <4, they were asked to use the comment box next to that score to provide more information regarding the score selection. This process continued until all 28 subcompetencies within the 6 general competency areas were scored. The final rating score document was emailed to the principal investigator (C.W.B.).
Data Analysis
We received completed rating score documents from 21 of the 24 individuals who agreed to participate. Before data analysis, all individual rating scores were deidentified and reviewed for appropriateness. For inclusion in the final data analysis, we used the following criteria: (1) a self-perceived rating of moderately or extremely familiar with the Athletic Training Milestones, (2) submission of rating scores for all 28 subcompetency items, and (3) adherence to the instruction to base scores only on the overall progression from the lowest level to the highest. Upon review, we found that the rating scores from 9 people did not meet all criteria and were excluded: 1 respondent self-reported being minimally familiar with the Athletic Training Milestones, 2 respondents did not submit rating scores for all 28 subcompetency items, and 6 respondents did not follow the instructions for scoring. Common reasons for not following the instructions included lowering a score because of grammatical concerns or other issues unrelated to clinical relevance. All final rating scores were merged into a single Microsoft Excel document for data analysis.
We conducted CVI analyses at the individual item, section, and overall scale levels. To compute the individual CVI (I-CVI) for each subcompetency item, we tallied the number of experts who gave that item a rating of 3 or 4 and then divided by the total number of experts (n = 12).2,6 When ≤5 experts are involved, the I-CVI must be 1.002,6; with >5 experts, an acceptable I-CVI must be above 0.78.3 To calculate the score for each of the 6 general competency sections, we summed the I-CVI scores of the items in that section and then divided by the number of items in the section.
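To make the I-CVI arithmetic concrete, the following Python sketch applies the calculation described above to placeholder ratings (not the study data): each item's I-CVI is the proportion of the 12 experts who rated it 3 or 4, and a section score is the mean of the I-CVIs of that section's items.

```python
# Illustrative I-CVI calculation with placeholder ratings (not the study's data).
# Each subcompetency item maps to the 12 experts' scores on the 4-point scale.
ratings = {
    "Subcompetency A": [4, 4, 3, 4, 4, 4, 3, 4, 4, 4, 4, 4],
    "Subcompetency B": [4, 3, 4, 4, 2, 4, 4, 4, 3, 4, 4, 4],
}

def i_cvi(scores):
    """Proportion of experts who rated the item 3 (quite relevant) or 4 (highly relevant)."""
    return sum(score >= 3 for score in scores) / len(scores)

item_cvi = {item: i_cvi(scores) for item, scores in ratings.items()}
# -> Subcompetency A: 12/12 = 1.00; Subcompetency B: 11/12 ≈ 0.92

# Section score: mean of the I-CVIs of the items within a general-competency section.
sections = {"Example general competency": ["Subcompetency A", "Subcompetency B"]}
section_cvi = {
    name: sum(item_cvi[item] for item in items) / len(items)
    for name, items in sections.items()
}
```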
The CVI of the overall scale (S-CVI) is often determined using 1 of 2 methods. The most straightforward approach is to calculate the average (S-CVI-Ave) from all individual-item I-CVI scores. In the past, authors3,4,9 have indicated that an S-CVI-Ave of ≥0.80 is acceptable; however, Waltz et al5 noted that the standard criterion for the S-CVI-Ave, also called the average congruence percentage, should be 0.90 to help account for chance agreement. Another, more conservative approach to calculating the S-CVI is to require universal agreement (S-CVI-UA) among experts. The S-CVI-UA therefore focuses on the proportion of items within an instrument that achieved an acceptable rating (ie, 3 or 4) by all content experts.2 This approach is considered overly stringent and is less likely to be used with >2 content experts given that universal agreement by all content experts becomes more difficult as their number increases.3 Furthermore, because it is a much more conservative approach for determining scale validity, experts indicate a minimum of 0.80 is acceptable for S-CVI-UA.5
To ensure transparency of all possible calculations for the S-CVI of the Athletic Training Milestones, we calculated both the S-CVI-Ave and the S-CVI-UA. We computed the S-CVI-Ave by summing the I-CVI scores for all 28 subcompetency items and then dividing by the total number of individual items. We used the 0.90 threshold identified by Waltz et al5 as our minimal acceptable limit for the S-CVI-Ave. We determined the S-CVI-UA by tallying the number of individual items that had an I-CVI score of 1.00 and then dividing by the total number of individual items.
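Continuing the hypothetical sketch above, the two scale-level indices can be computed as follows; again, the values are placeholders rather than the study's ratings.

```python
# Scale-level summaries, continuing the illustrative item_cvi dictionary above
# (the actual instrument contains 28 subcompetency items).
n_items = len(item_cvi)

# S-CVI-Ave: mean of all item-level I-CVI scores (minimum acceptable value of 0.90).
s_cvi_ave = sum(item_cvi.values()) / n_items

# S-CVI-UA: proportion of items with universal agreement, that is, an I-CVI of 1.00
# (minimum acceptable value of 0.80 for this more conservative index).
s_cvi_ua = sum(value == 1.0 for value in item_cvi.values()) / n_items

print(f"S-CVI-Ave = {s_cvi_ave:.2f}, S-CVI-UA = {s_cvi_ua:.2f}")
```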
RESULTS
The individual CVI scores from 11 ATs and 1 physician were included in the data analysis. Demographics of the 12 content experts are presented in Table 2, and the research, clinical, or theoretical expertise of each person is displayed in Table 3. The S-CVI-Ave of the entire Athletic Training Milestones tool was 0.99, and the S-CVI-UA was 0.86. The I-CVI scores for each individual item (range = 0.83–1.00) as well as for each subsection (range = 0.95–1.00) are shown in Table 4. These values indicate that all individual items and the 6 subsections of the Athletic Training Milestones are highly acceptable.



DISCUSSION
Our findings revealed that even when using an overly stringent approach, the Athletic Training Milestones meet the threshold of excellent content validity for both individual-item and scale scores. The high CVI scores for both the individual items and the overall scale confirm that the Athletic Training Milestones are a high-quality, valid tool to assess ATs' progression of independent knowledge, skill, and behavior acquisition.
A valid and comprehensive instrument is imperative if the profession of athletic training is to truly embrace competency-based educational evaluations and performance assessments.10 In athletic training practice, we regularly seek validated measures to evaluate and track patients' outcomes relative to the treatments applied. The same rigor is not only critical but should be required of the instruments we use to assess and track our own competence and performance as health care professionals. The Athletic Training Milestones provide added value to the athletic training profession in the form of a highly valid instrument for assessing competence and growth, both within athletic training education and across the continuum of professional practice. Widespread adoption and use of the Athletic Training Milestones as a clinical evaluation and research tool may provide opportunities to evaluate and compare competency across multiple sites and users.
Now that excellent content validity for the general competency milestones has been established, the Athletic Training Milestones should be used across the spectrum of athletic training practice and education to assess the competence and growth of ATs throughout their careers. For example, the Athletic Training Milestones could be used as an annual performance evaluation for clinically practicing ATs to identify areas for improvement to maintain competence or areas of strength that demonstrate contemporary expertise. Alternatively, the Athletic Training Milestones could be used across athletic training educational programs to evaluate student progression and growth. In addition, ATs should aim to identify factors that contribute to changes in milestone performance across practitioners to better understand how ATs and athletic training students can improve their clinical practice performance in the scale domains. Finally, once the subspecialty milestones have been developed, further research should be pursued to assess the content validity of those subspecialty areas.

Figure. Breakdown of the levels in the Athletic Training Milestones. Abbreviation: CAATE, Commission on Accreditation of Athletic Training Education. a Individuals can both progress and regress across the continuum of levels (eg, a student may progress beyond level 2, just as a credentialed professional may regress below level 3 for any given competency).