CBDCE develops the Certification Exam for Diabetes Care and Education Specialists (formerly the Certification Exam for Diabetes Educators) with the technical assistance of a testing agency. The two organizations work together to construct and validate the exam. The Exam consists of 200 multiple-choice questions administered over a four-hour period.
Practice or Job Analysis Survey
CBDCE periodically conducts a survey of the practice of diabetes care and education specialists (formerly known as diabetes educators) – often called a practice or job analysis. The study surveys these health professionals to determine the significance of specific tasks to a CDCES’s practice. The practice analysis findings are used to develop the exam content outline and to determine the percentage distribution of items across the outline. The subject matter and importance of each item on the exam therefore reflect data validated by this periodic study.
Certified Diabetes Care and Education Specialists who represent the multidisciplinary aspects of the profession serve on CBDCE’s Exam Committee. The Exam Committee drafts the exam’s multiple-choice items, which are then edited and validated by the testing agency and approved by the Committee for inclusion on the exam.
The Exam Committee and the testing agency review all exam items for subject matter, validity, difficulty, relevance, bias, and importance to current practice. All items are evaluated, classified, and revised by the Exam Committee and the testing agency for conformance to psychometric principles. Each item is pretested and must meet statistical criteria before being used as a scored item.
On the basis of a completed practice analysis, it is usually necessary to develop a new exam form to reflect the updated exam content outline and to review the minimum passing point/score.
A Passing Point Study is conducted by a panel of experts in the field. The methodology used to set the minimum passing score is the Angoff method. CBDCE’s most recent analysis was completed in 2018, with the exam content outline being implemented starting with July 1, 2019 exams.
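As a rough illustration of the Angoff method, each panelist estimates, for every item, the probability that a minimally competent candidate would answer it correctly; the raw cut score is then derived from the panelists' summed estimates. This is a minimal sketch, and the ratings below are hypothetical, not actual panel data:

```python
# Simplified illustration of the Angoff standard-setting method.
# Each inner list holds one panelist's per-item probability estimates
# (three items shown; values are hypothetical, for illustration only).
panelist_ratings = [
    [0.60, 0.75, 0.80],
    [0.55, 0.70, 0.85],
    [0.65, 0.80, 0.75],
]

def angoff_cut_score(ratings):
    """Average each panelist's summed item ratings to get a raw cut score."""
    per_panelist_sums = [sum(r) for r in ratings]
    return sum(per_panelist_sums) / len(per_panelist_sums)

print(angoff_cut_score(panelist_ratings))  # a raw cut score out of 3 items
```

In practice the panel rates every item on the form, and the resulting raw cut is later translated into the scaled passing score.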
The exam questions are developed and reviewed for relevancy, consistency, accuracy, and appropriateness by individuals with expertise in diabetes education and are then pretested on current exams.
Twenty-five of the 200 questions are new questions that have not been used on previous exams. Including these questions allows meaningful statistics to be collected about them, but they are not used in the determination of individual exam scores.
These questions are not identified and are scattered throughout the exam so that candidates will answer them with the same care as the questions that make up the scored portion of the exam. This methodology assures candidates that their scores are the result of sound measurement practices and that scored questions are reflective of current practice.
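As a simplified sketch of the kind of statistics collected for pretest questions, two common measures are item difficulty (the proportion of candidates answering correctly) and point-biserial discrimination (the correlation between an item score and the total score). The response data and function names below are hypothetical:

```python
# Illustrative item statistics of the kind computed for pretest questions.
from statistics import mean, pstdev

# Rows = candidates, columns = items; 1 = correct, 0 = incorrect.
# Hypothetical data for illustration only.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

def item_difficulty(item_index):
    """Proportion of candidates answering the item correctly (the p-value)."""
    return mean(row[item_index] for row in responses)

def point_biserial(item_index):
    """Correlation between the item score and the total test score."""
    item = [row[item_index] for row in responses]
    totals = [sum(row) for row in responses]
    mi, mt = mean(item), mean(totals)
    cov = mean((x - mi) * (y - mt) for x, y in zip(item, totals))
    return cov / (pstdev(item) * pstdev(totals))
```

An item that is far too easy, far too hard, or negatively discriminating would fail such statistical checks and not advance to scored use.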
About scaled scores
Scores are reported as raw scores and scaled scores. A raw score is the number of correctly answered questions; a scaled score is statistically derived from the raw score. The total score determines whether a candidate passes or fails; it is reported as a scaled score ranging from 0 to 99. The minimum scaled score needed to pass the examination has been set at 70.
Scaled scores are reported because different forms of the examination may vary in difficulty. New forms of the exam are introduced each year, replacing a certain number of questions in each content area. These changes may cause one form of the exam to be slightly easier or harder than another. To adjust for such differences in difficulty, a procedure called “equating” is used.
The goal of equating is to ensure fairness to all candidates. In the equating process, the minimum raw score (number of correctly answered questions) required to equal the scaled passing score of 70 is statistically adjusted (or equated).
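As a minimal sketch of the idea (actual equating procedures are more sophisticated than this), a raw-to-scaled mapping can anchor each form's raw cut score to the scaled passing score of 70, so the same scaled standard applies even when forms differ in difficulty. The raw cut values below are hypothetical; the 175 maximum reflects the 200 questions minus the 25 unscored pretest questions:

```python
# A minimal sketch of linear scaling: each form's raw cut score is
# anchored to the scaled passing score of 70, so a slightly harder
# form (lower raw cut) still passes at scaled 70.
# The raw cut values are hypothetical, for illustration only.
SCALED_PASS = 70
SCALED_MAX = 99

def scale_score(raw, raw_cut, raw_max=175):
    """Map a raw score to the 0-99 scale, anchoring raw_cut to 70."""
    if raw >= raw_cut:
        # Interpolate between the cut score and the maximum raw score.
        return SCALED_PASS + (raw - raw_cut) * (SCALED_MAX - SCALED_PASS) / (raw_max - raw_cut)
    # Interpolate between zero and the cut score.
    return raw * SCALED_PASS / raw_cut

# The same raw score on two forms of differing difficulty:
print(scale_score(120, raw_cut=122))  # harder form: below scaled 70, fails
print(scale_score(120, raw_cut=118))  # easier form: at or above scaled 70, passes
```

The point of the sketch is that the raw number of correct answers needed to reach scaled 70 shifts with the form's difficulty, while the scaled passing standard itself stays fixed.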