The Certification Exam Process Demystified
Many prospective and existing diplomates have asked about the revised process for the Certification examination and how the examination is constructed. As Chairperson of the Certification Examination Committee (CEC), I would like to brief our membership on the current state of the process.
The CEC meets approximately three times each year. At these meetings, potential cases, often as many as twenty, are reviewed as candidates for presentation in the new, computer-based format. However, since not every case vignette lends itself to the new format, each prospect is vetted by the committee for content (e.g., Medicine or Orthopedics), relevance (is it pertinent to the practice of podiatry or merely esoteric trivia?), level of difficulty (is it something a diplomate would be expected to know?), and feasibility (can the case be used in the computer-based format?). About ten of the reviewed cases are ultimately selected as candidate questions.
Once cases pass this initial screening, a number of them are distributed to the committee members, who begin working on the elements necessary to make them compatible with the new computer-based format. Each committee member is typically assigned one or two such cases to focus on. Right and wrong answers are recorded, and the case is worked through from beginning to end in a storyboard format, often with multiple outcomes; after all, a story can have different endings depending on how its elements unfold.
At each of our meetings, each committee member presents his or her storyboard along with the right and wrong answers developed for it. The committee as a whole then critiques the content, relevance, and degree of difficulty to ensure that the case would work in the current format. Approximately 75% of the ten cases that survived the initial evaluation make it through this additional scrutiny; that is, of the original pool of twenty candidate items, about seven reach this level.
Once seven to ten “good” questions are developed, they are entered into a web-based user interface that puts each question into the online format certification candidates would see at their local test centers. After this, a number of national venues are chosen to minimize the potential impact of regional variation, and volunteer diplomates from these areas are solicited for the subsequent field testing.
Three to five times per year, diplomates are asked to assist in the validation phase of case-question development. During this phase, diplomates use laptop computers on which the cases are presented in a manner consistent with what certification candidates will actually experience at Prometric test centers. Each diplomate works through the cases as if taking the Certification examination. The data collected from this process is analyzed to verify that a number of parameters are met. An obvious parameter of concern is how long the cases take to complete. The item-level performance statistics are evaluated to ensure that the questions perform in a manner consistent with our requirements, not only from a mechanical perspective (computer errors, image quality, etc.) but also with respect to the elements mentioned earlier (content, relevance, level of difficulty, feasibility, and reliability).
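The article does not specify which item-level statistics the CEC computes, but two measures standard in classical test theory give a feel for this kind of analysis: item difficulty (the proportion of examinees answering correctly) and the point-biserial discrimination (the correlation between getting the item right and overall score). The sketch below is purely illustrative, with made-up field-test data; the function names are not part of any CEC tool.

```python
# Hypothetical illustration of two classical test-theory item statistics;
# the article does not say which statistics the CEC actually uses.

def item_difficulty(responses):
    """Proportion of examinees answering the item correctly."""
    return sum(responses) / len(responses)

def point_biserial(item_responses, total_scores):
    """Correlation between item correctness (0/1) and total score.
    Higher values suggest the item separates strong from weak examinees."""
    n = len(item_responses)
    mean_item = sum(item_responses) / n
    mean_total = sum(total_scores) / n
    cov = sum((i - mean_item) * (t - mean_total)
              for i, t in zip(item_responses, total_scores)) / n
    sd_item = (sum((i - mean_item) ** 2 for i in item_responses) / n) ** 0.5
    sd_total = (sum((t - mean_total) ** 2 for t in total_scores) / n) ** 0.5
    return cov / (sd_item * sd_total)

# Invented data: six field-test volunteers, one item
item = [1, 1, 0, 1, 0, 1]      # correct/incorrect on this item
totals = [8, 7, 4, 9, 3, 6]    # total scores on the field test

print(round(item_difficulty(item), 2))        # 0.67
print(round(point_biserial(item, totals), 2)) # 0.89
```

An item with very high or very low difficulty, or a low (or negative) discrimination, would be a candidate for the reworking described below.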
On completion of field testing, the critical data from all of the field tests is compiled and scrutinized. At another committee meeting, this data is reviewed and questions “tweaked” when necessary so that they will perform better on an actual certification examination. The reworked questions are field tested once again, and the data is reviewed one last time. About 90% of the cases entering field testing pass this scrutiny and become part of the certification examination pool; thus, from the original group of twenty candidate items, only five to eight are eventually judged to be of sufficient quality for use in actual certification examinations.
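The attrition figures quoted above can be chained together as a simple yield calculation: roughly half of the twenty reviewed cases pass initial screening, about 75% of those survive the storyboard critique, and about 90% of those survive field testing. The sketch below just works through that arithmetic; the function and stage names are illustrative, not part of any CEC process.

```python
# Illustrative sketch of the case-attrition funnel using the approximate
# figures quoted in the article; not an actual CEC tool.

def funnel_yield(initial_pool, stages):
    """Apply each (name, survival_rate) stage and report surviving counts."""
    remaining = initial_pool
    history = [("initial pool", remaining)]
    for name, rate in stages:
        remaining = remaining * rate
        history.append((name, remaining))
    return history

stages = [
    ("initial screening", 10 / 20),  # ~20 reviewed, ~10 selected
    ("storyboard critique", 0.75),   # ~75% survive committee review
    ("field testing", 0.90),         # ~90% survive field-test scrutiny
]

for name, count in funnel_yield(20, stages):
    print(f"{name}: ~{count:.1f} cases")
```

Running this gives roughly 20 → 10 → 7.5 → 6.8 cases, consistent with the article's figure of five to eight items surviving from the original twenty.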
Considerable team effort is required to make a “good” question. The process is continually reviewed to make sure it is fair and consistent. This is critically important because, while each candidate is administered eight questions, those questions may differ from candidate to candidate. Each “version” of a question, even though similar to the others, may differ in outcome and/or treatment depending on the storyboard used to create it, and it is important that the difficulty level of the items administered to different candidates remains constant.
Following the “live” test administration, each question is again statistically validated, and the performance of actual candidates is compared to that of the control group of diplomates who participated in field testing it, to ensure that the question performed as expected.
Creating a good examination requires care, skill, and dedication. It is a great deal of work, but only through such efforts can a certifying examination that meets the standards of this Board be developed.