CLP, Inc. contracted with CASTLE Worldwide, Inc. to develop the certification examination. CASTLE Worldwide is a full-service testing company providing licensure, certification, and specialty examinations, including practical and simulation tests, for associations, state boards, government agencies, and corporations.
The development of a valid examination for the CLP certification process began with a clear and concise definition of the knowledge, skills, and abilities needed for competent job performance. Using interviews, surveys, observation, and group discussions, CASTLE worked with licensing professionals to delineate critical job components, which provide the knowledge and skill basis for the questions on the multiple-choice examination.
Steps in Developing the CLP Exam
There are nine (9) general steps in the development of the CLP exam.
Role Delineation Study
Development of Test Blueprint
Item Development and Validation
Plan the Field Testing Program
Examination Assembly
Examination Review and Revision
Passing Point Determination
Test Administration
Item Analysis and Equating

Role Delineation Study
A role delineation study determines the knowledge and skills that define a minimally proficient licensing professional. Linking the knowledge and skills defined in the role delineation study to examination content ensures that the examination is content valid. In psychometric terms, validation is the process by which a test developer documents that the competence inferred from an examination score is justified.
During the role delineation study process, a committee of experts defines the overall performance domains associated with minimum competence. These performance domains are further broken down into distinct tasks, knowledge, and skills required in licensing. The domains and tasks developed by the subject matter experts are then validated through a survey of licensing professionals. The participants in the survey review and rate the domains and tasks according to their importance, criticality, and relevance on the job.
Development of Test Blueprint
In the next step, the results from the role delineation survey will be used to develop the blueprint, or test specifications, for the examination. Information regarding the importance, criticality, and relevance of each domain and task will be translated directly into the number of items that should be included in the examination for each area. The blueprint guides the item development and examination assembly processes and ensures that the examination accounts for the relative importance of the required knowledge and skills.
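As an illustration of how survey results might be translated into a blueprint, the sketch below allocates a fixed number of examination items across domains in proportion to their mean survey ratings. The domain names, ratings, and rounding rule are hypothetical, not CLP's actual specifications.

```python
# Hypothetical blueprint allocation: domains weighted by mean survey rating.
def allocate_items(ratings, total_items):
    """Allocate exam items to domains in proportion to their mean
    importance/criticality ratings, using largest-remainder rounding."""
    total_weight = sum(ratings.values())
    raw = {d: total_items * w / total_weight for d, w in ratings.items()}
    counts = {d: int(r) for d, r in raw.items()}
    # Give any leftover items to the domains with the largest remainders.
    leftover = total_items - sum(counts.values())
    for d in sorted(raw, key=lambda d: raw[d] - counts[d], reverse=True)[:leftover]:
        counts[d] += 1
    return counts

# Invented domains and mean ratings on a 1-5 importance scale.
ratings = {"Intake & Application": 4.6, "Regulatory Review": 3.8,
           "Issuance & Renewal": 4.1, "Enforcement": 2.5}
print(allocate_items(ratings, 100))
```

The proportional weighting mirrors the idea in the text: a domain rated twice as important receives roughly twice as many items on the examination.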
Item Development and Validation
All examination items will be written by subject matter experts. Each item writer will be trained in writing, reviewing, and editing high-quality questions. Each item will be classified by content category, assigned a cognitive level, and validated according to its appropriateness to the minimally proficient licensing professional. After development, items will be reviewed to ensure they are psychometrically sound and grammatically correct.
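The classification scheme described above can be pictured as a simple item record. The field names and cognitive levels below are illustrative assumptions, not CASTLE's actual item-banking schema.

```python
from dataclasses import dataclass, field

# Illustrative item record; field names are assumptions for this sketch.
@dataclass
class ExamItem:
    stem: str                  # the question text
    options: list              # answer choices
    key: int                   # index of the correct option
    content_domain: str        # blueprint category the item is classified under
    cognitive_level: str       # e.g. "recall", "application", "analysis"
    references: list = field(default_factory=list)  # published sources for the key

    def is_documented(self):
        """An item is ready for review only if it cites at least one reference."""
        return len(self.references) > 0

item = ExamItem(stem="Which filing is required to renew a license?",
                options=["Form A", "Form B", "Form C", "Form D"],
                key=1, content_domain="Issuance & Renewal",
                cognitive_level="recall",
                references=["State Licensing Handbook, ch. 4"])
print(item.is_documented())
```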
Plan the Field Testing Program
Before items can be scored, they must be shown to demonstrate desirable statistical quality. Following the first administration of the examination, a careful statistical analysis of each item will be completed before scores are released, and items of questionable quality will be addressed in scoring. Afterward, those items will be revised so that future forms contain only items of desirable statistical quality.
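A statistical screen of this kind typically examines each item's difficulty (the proportion of examinees answering correctly) and its discrimination (the correlation between the item score and the total score). The sketch below computes both and flags weak items; the flagging thresholds are illustrative assumptions, not CLP's criteria.

```python
def item_statistics(responses):
    """Classical item analysis. responses: one row per examinee,
    one 0/1 column per item."""
    n = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]            # total score per examinee
    mean_t = sum(totals) / n
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n) ** 0.5
    stats = []
    for j in range(n_items):
        col = [row[j] for row in responses]
        p = sum(col) / n                                # difficulty (p-value)
        if sd_t == 0 or p in (0.0, 1.0):
            rpb = 0.0                                   # undefined; treat as zero
        else:
            # point-biserial: mean total of examinees answering correctly,
            # standardized against the overall score distribution
            mean_correct = sum(t for t, c in zip(totals, col) if c) / sum(col)
            rpb = (mean_correct - mean_t) / sd_t * (p / (1 - p)) ** 0.5
        stats.append({"difficulty": p, "discrimination": rpb,
                      "flag": p < 0.20 or p > 0.95 or rpb < 0.15})
    return stats

responses = [[1, 1], [1, 0], [0, 1], [0, 0]]  # 4 examinees, 2 items (invented)
for s in item_statistics(responses):
    print(s)
```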
Examination Assembly
Each version of the examination will be created by selecting the appropriate number of items from each content area, as specified in the test specifications and in accordance with the psychometric design of the examination. Each question will be reviewed carefully and validated by a panel of subject matter experts, and each item must be supported by published references. By this stage, item performance data from field testing will be available; based on the statistical item analyses, inappropriate items will be revised or omitted from the examination.
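The assembly step can be sketched as drawing the blueprint-specified number of items from each content area's pool of validated items. The pool contents and counts below are invented for illustration.

```python
import random

def assemble_form(pools, blueprint, seed=0):
    """Draw a form from validated item pools.
    pools: {domain: [item ids]}; blueprint: {domain: item count}."""
    rng = random.Random(seed)          # fixed seed makes the draw reproducible
    form = []
    for domain, count in blueprint.items():
        if len(pools[domain]) < count:
            raise ValueError(f"item pool for {domain!r} is too small")
        form.extend(rng.sample(pools[domain], count))  # no repeated items
    return form

# Invented pools of validated item identifiers.
pools = {"Regulatory Review": [f"RR{i}" for i in range(20)],
         "Enforcement": [f"EN{i}" for i in range(10)]}
form = assemble_form(pools, {"Regulatory Review": 5, "Enforcement": 3})
print(len(form))
```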
Examination Review and Revision
Following assembly, the draft examination will be reviewed again for technical accuracy by psychometric experts to ensure its psychometric integrity. Of particular concern will be the independence of items on the examination and the balance of the examination’s content. Each step in the test construction process will be carefully documented. Multiple reviews by content and psychometric experts and the use of stringent criteria strengthen the validity of the examination. Continuous evaluation of each examination’s reliability maintains the consistency of the examination to measure examinees’ skills accurately.
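For dichotomously scored examinations, one standard reliability index is Kuder-Richardson formula 20 (equivalent to Cronbach's alpha for 0/1 items). The source does not specify which coefficient is used, so the sketch below is illustrative only.

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability for 0/1 scored items.
    responses: one row per examinee, one column per item."""
    n = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    # Sum of item variances p*(1-p) across items.
    pq = sum((lambda p: p * (1 - p))(sum(row[j] for row in responses) / n)
             for j in range(n_items))
    return (n_items / (n_items - 1)) * (1 - pq / var_t)

# Two perfectly consistent items yield reliability 1.0 (invented data).
perfect = [[1, 1], [1, 1], [0, 0], [0, 0]]
print(kr20(perfect))
```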
Passing Point Determination
A valid certification examination must have a defensible passing score. The cutoff score that separates examinees who pass from examinees who fail must be based on the minimum competence required to protect the public from harm. Whatever procedure is selected, it must be criterion-referenced, ensuring that the passing point is linked to minimum proficiency in licensing and applied to all candidates without a quota.
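The source does not name the standard-setting procedure, but the modified Angoff method is one widely used criterion-referenced approach: each judge estimates, for every item, the probability that a minimally proficient practitioner would answer correctly, and the passing point is derived from those judgments. A minimal sketch:

```python
def angoff_cut_score(judge_ratings):
    """Modified Angoff passing point (illustrative).
    judge_ratings: one list per judge of per-item probabilities that a
    minimally proficient candidate answers the item correctly."""
    n_judges = len(judge_ratings)
    n_items = len(judge_ratings[0])
    # Average the judges' estimates per item, then sum across items.
    item_means = [sum(j[i] for j in judge_ratings) / n_judges
                  for i in range(n_items)]
    return sum(item_means)

# 2 judges rating a 3-item exam (invented numbers).
ratings = [[0.8, 0.6, 0.9], [0.7, 0.7, 0.8]]
print(angoff_cut_score(ratings))
```

Because the cut score is derived from judgments about minimal proficiency rather than from candidates' relative standing, every candidate who reaches it passes, with no quota.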
Test Administration
Test administration procedures must ensure consistent, comfortable testing conditions for all examinees. Because the examination is secure, procedures must address examinee admission into the room, display of information signs, security, time allocation, and other aspects of the administration. Testing facilities must meet guidelines that ensure security, proper room size, ventilation, rest room facilities, noise control, and accessibility for candidates.
Item Analysis and Equating
Following each test administration, item statistics will be reviewed to ensure quality and validity. Items with poor performance statistics will be evaluated by subject matter experts prior to scoring and tagged for review and revision. Reliability studies also will be conducted to ensure overall quality and precision, and equating studies will be performed whenever new versions of the examination are introduced to ensure equivalence in difficulty across all forms of the examination.
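Equating can be illustrated with its simplest case, mean equating under a random-groups design: because the groups taking the two forms are assumed equally able, any difference in average scores is attributed to form difficulty. Operational equating programs use stronger methods (linear or item response theory equating); the numbers below are invented.

```python
def mean_equate(new_form_score, ref_form_mean, new_form_mean):
    """Mean equating under a random-groups design (illustrative).
    The groups are assumed equally able, so the difference in form means
    is treated as a difference in form difficulty and added back."""
    return new_form_score + (ref_form_mean - new_form_mean)

# The reference form averaged 74.5 and the new form 73.0 in randomly
# equivalent groups, so the new form is treated as 1.5 points harder.
print(mean_equate(72, 74.5, 73.0))
```

After equating, a score of 72 on the harder new form is reported on the reference-form scale, so the passing point retains the same meaning across versions.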