II. DEVELOPMENT OF TEST ITEMS

    A. Process Issues


  • 1. Were Subject Matter Experts (SMEs) involved in item writing and review?
  • 2. If so, were the SMEs qualified?
  • 3. If so, did they represent the characteristics of the certified population?
  • 4. Has an item pool been developed?
  • 5. Is the item bank reviewed on a regular basis? If so, what is the nature of the group making the review?
  • 6. Have the items been written so as to clearly relate to the test specifications?
  • 7. Does the item pool include items testing various levels of knowledge (application and analysis) and not just the recall of facts?
  • 8. Have a sufficient number of items been developed for each element of the content specifications (2-4 times the number of items needed)?
  • 9. Are the items coded to reflect the content area of the specifications to which they correspond?
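Items 8 and 9 above amount to a concrete, checkable rule: every item carries a content-area code, and each area's pool should hold 2-4 times the number of items an exam form draws from it. A minimal sketch of that audit (the area names, counts, and function name here are illustrative assumptions, not part of any actual item bank):

```python
# Hypothetical audit of pool sufficiency per content area (items A.8-A.9).
# ITEMS_NEEDED maps each content-area code to the number of items one
# exam form draws from that area; all names and figures are illustrative.
ITEMS_NEEDED = {"safety": 10, "regulations": 15, "procedures": 25}

def pool_coverage(item_codes, items_needed, low=2, high=4):
    """Return areas whose item pool falls outside the low-high multiple
    of the per-form requirement, as {area: (have, minimum, maximum)}."""
    counts = {area: 0 for area in items_needed}
    for code in item_codes:
        if code in counts:
            counts[code] += 1
    flagged = {}
    for area, needed in items_needed.items():
        have = counts[area]
        if not (low * needed <= have <= high * needed):
            flagged[area] = (have, low * needed, high * needed)
    return flagged

# A pool of 30 "safety" items (3x) passes; 12 "regulations" items fall
# below the 2x floor of 30 and are flagged.
pool = ["safety"] * 30 + ["regulations"] * 12 + ["procedures"] * 60
print(pool_coverage(pool, ITEMS_NEEDED))
```

Keeping the content code on every item, as item 9 asks, is what makes a check like this possible at all: without the coding, pool sufficiency can only be judged for the bank as a whole, not per specification element.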

    B. Item Quality Issues

  • 1. Is each item referenced to available published material that confirms the correct response?
  • 2. Are the items worded clearly and concisely?
  • 3. Are items stated so that obtaining the correct answer does not rely on the response to another item (bad pair)? If not, are records of “bad pairs” maintained so that both members of such a pair will not appear on the form of an examination?
  • 4. Have clues or tricks been eliminated from the item?
  • 5. Is the stem free of superfluous information?
  • 6. Is the stem longer than any of the response options?
  • 7. Have negatively stated items been used sparingly?
  • 8. Are key words in the stem used consistently (e.g., underlined and/or capitalized: NOT, MOST, BEST)?
  • 9. Has the use of absolute terms (always, all, never) been avoided?
  • 10. Have double negatives been omitted from the stem?
  • 11. Has unfamiliar, figurative, literary or textbook language been avoided?
  • 12. Have words which may have different meanings to different persons (some, often) been eliminated?
  • 13. Are response options arranged in a reasonable order (systematic fashion)?
  • 14. Has the response option “all of the above” been used only sparingly (if used, it should NOT be correct every time)?
  • 15. If “none of the above” is used, is it the correct response option at least some of the time?
  • 16. Have words that would otherwise be repeated in every response option been placed in the stem instead?
  • 17. Are response options consistent with the structure of the stem (singular versus plural, male versus female)?
  • 18. Have overlapping response options been eliminated? For example, if option A provides the response “Less than 104” and option B is “Less than 110”, then if A is the correct response, B is also correct.
  • 19. Are all response options parallel in length, form and content?
  • 20. Are all response options plausible?
  • 21. Is there only ONE correct response option?
  • 22. Is the item grammatically correct?
  • 23. Have all words and phrases that may be considered offensive or harmful to any racial, ethnic or gender subgroup been eliminated?
  • 24. Have the items been revised and approved?
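Item B.3 above implies a simple record-keeping mechanism: once a “bad pair” is identified during review, no assembled exam form should ever carry both of its members. A minimal sketch of that check (item IDs, pair data, and the function name are illustrative assumptions):

```python
# Hypothetical "bad pair" screen for an assembled exam form (item B.3).
# A bad pair is two items where one reveals the answer to the other;
# the review process records such pairs, and form assembly checks them.
def find_bad_pairs_on_form(form_item_ids, bad_pairs):
    """Return every recorded bad pair whose two members both appear
    on the candidate form."""
    on_form = set(form_item_ids)
    return [pair for pair in bad_pairs
            if pair[0] in on_form and pair[1] in on_form]

bad_pairs = [(101, 207), (118, 342)]            # pairs flagged during review
form = [101, 150, 207, 300]                     # candidate exam form
print(find_bad_pairs_on_form(form, bad_pairs))  # form carries both 101 and 207
```

An empty result means the form is clean with respect to the recorded pairs; a non-empty result tells the assembler exactly which item to swap out before the form is finalized.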