"Use intrinsically true or at least plausible statements. Test-wise students recognize ridiculous statements as wrong. To see if your test has such statements, ask a friend who has never studied the subject to take the test. His or her score should be roughly what would be earned from guessing randomly on every item (25 percent for a four-option multiple-choice test). "

Tomorrow's Professor Msg.#1108 Writing Good Multiple-Choice Items

 

Folks:

The posting below, as the title indicates, gives some excellent advice on writing good multiple-choice questions. It is from Chapter 11, "Writing a Traditional Test," in the book Assessing Student Learning: A Common Sense Guide by Linda Suskie, published by Jossey-Bass, A Wiley Imprint, 989 Market Street, San Francisco, CA 94103-1741, www.josseybass.com. Copyright © 2009 by John Wiley & Sons, Inc. All rights reserved. Reprinted with permission.

Regards,

Rick Reis
reis@stanford.edu
UP NEXT:  Translating "Grad Student" Into English

                                                                                    Tomorrow's Teaching and Learning

                                                ------------------------------------ 1,291 words ---------------------------------

                                                                                          Writing Good Multiple-Choice Items

As with any other assessment, multiple-choice tests should yield fair and truthful information on what students have learned (Chapter Three). There are just two basic precepts to writing fair and truthful multiple-choice items.

• Remove all the barriers that will keep a knowledgeable student from answering the item correctly. Students who have truly learned the concept or skill that a particular item tests should choose the correct answer.
• Remove all clues that will help a less-than-knowledgeable student answer the item correctly. Students who truly haven't learned the concept or skill that a particular item tests should answer the item incorrectly.

The suggestions in Table 11.1 follow these two precepts. Exhibit 11.2 gives some examples of multiple-choice items that follow most of these suggestions and assess thinking skills as well as conceptual understanding.

Writing good multiple-choice items can be difficult. Test publishers write, try out, and discard many, many items for each one that ends up in a published test. Even the examples in Exhibit 11.2 don't completely follow the suggestions in Table 11.1. So don't expect to be able to follow all these suggestions all the time, and don't expect your test questions to work perfectly the first time you use them. Analyze the results (Chapter Sixteen), revise the test accordingly, and within just a few cycles you will have a truly good test.

Table 11.1 Tips for Writing Good Multiple-Choice Questions

General Tips

Keep each item as concise as possible. Short, straightforward items are usually easier to understand than complex statements. Avoid irrelevant material, digressions, and qualifying information unless you are specifically assessing the skill of identifying needed information. Don't repeat the same words over and over in the options; put them in the stem.

Define all terms carefully. If you ask, "Which of the following birds is largest?" make clear whether you mean largest in terms of wingspan or weight. What do you mean by "sometimes," "usually," or "regularly"?

Don't make the vocabulary unnecessarily difficult. Except for terms you are specifically assessing, keep the vocabulary simple, perhaps at the high school level. Otherwise you may unfairly penalize students who know the material but don't have a strong general vocabulary.

Watch out for "interlocking" items: items in which a student can discern the answer to one question from the content of another. Review carefully all items that share similar options. In a similar vein, don't ask students to use their answer to one question to answer another. If they get the first question wrong, they will automatically get the other question wrong as well, even if they understand the concept tested in the second question.

Writing a Good Stem

The stem should ask a complete question. The student shouldn't have to read the options to discern the question. To check this, ask yourself if students would be able to answer the question posed in the stem correctly if no options were provided.

Avoid "Which of the following" items. They require students to read every option and can penalize slow readers in a timed-testing situation.

Don't ask questions that can be answered from common knowledge. Someone who hasn't studied the material shouldn't be able to answer the questions correctly.

Avoid negative items. In a stressful testing situation, students can miss the word not or no. If you must have negative items, underline, capitalize, or boldface words like NOT or EXCEPT.

Avoid grammatical clues to the correct answer. Test-wise students know that grammatically incorrect options are wrong. Use expressions like "a/an," "is/are," or "cause(s)."

Writing Good Options

You needn't have the same number of options for every question. Four options are fine. A good fifth option is often hard to come up with, takes extra reading time, and reduces the chances of guessing the correct answer only from 25 to 20 percent. Some questions may have only three plausible options (for example, "Increases," "Decreases," and "Remains unchanged").
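As a quick illustration of that arithmetic, here is a minimal sketch (illustrative only, not from the chapter) that computes the chance of a blind guess being correct for three-, four-, and five-option items:

# Illustrative sketch (not from the chapter): the chance that a purely
# random guess lands on the correct answer, by number of options per item.
for k in (3, 4, 5):
    print(f"{k} options: {1 / k:.0%} chance of a correct blind guess")
# Prints:
# 3 options: 33% chance of a correct blind guess
# 4 options: 25% chance of a correct blind guess
# 5 options: 20% chance of a correct blind guess

Adding a fifth option lowers the guessing rate by only five percentage points, which is why the extra distracter is rarely worth the writing and reading time.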

Order responses logically. Order responses numerically if they are numbers and alphabetically if they are single words. This helps students who know the answer find it quickly. If the options have no intuitive order, insert the correct answer into the responses randomly.

Line up responses vertically rather than horizontally. It's much easier, and less confusing, to scan down a column than across a line to find the correct answer. If you are using a paper test and your options are so short that this seems to waste paper, arrange the test in two columns.

Make all options roughly the same length. Test-wise students know that the longest option is often the properly qualified, correct one. (For this reason, a relatively long option can make a good distracter!)

Avoid repeating words between the stem and the correct response. Test-wise students will pick up this clue. (On the other hand, verbal associations between the stem and a distracter can create an effective distracter.)

Avoid using "None of the above." A student may correctly recognize wrong answers without knowing the right answer. Use this option only when it is important that the student know what not to do. If you use "None of the above," use it in more than one question, both as a correct answer and as an incorrect answer.

Avoid using "All of the above." This option requires students to read every option, penalizing those in a timed testing situation who know the material but are slow readers. Students who recognize option A as correct and choose it without reading further are also penalized. "All of the above" also gives full credit for incomplete understanding; some students may recognize options A and B as correct and therefore correctly choose "All of the above" even though they don't recognize option C as correct.

Writing Good Distracters

The best distracters help diagnose where each student went wrong in his or her thinking. Identify each mental task that students need to do to answer a question correctly, and create a distracter for the answer students would arrive at if they completed that step incorrectly.

Use intrinsically true or at least plausible statements. Test-wise students recognize ridiculous statements as wrong. To see if your test has such statements, ask a friend who has never studied the subject to take the test. His or her score should be roughly what would be earned from guessing randomly on every item (25 percent for a four-option multiple-choice test).
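To put a number on "roughly," here is a minimal sketch (illustrative only, not from the chapter; the helper name chance_baseline is made up) that computes the expected percent score, and its typical spread, for a test-taker guessing blindly on every item:

import math

def chance_baseline(n_items, n_options=4):
    """Illustrative only: expected percent score and its standard deviation
    for someone guessing randomly on every item."""
    p = 1 / n_options                  # chance of guessing any one item correctly
    expected_pct = 100 * p             # expected percent score
    sd_pct = 100 * math.sqrt(n_items * p * (1 - p)) / n_items  # spread of the percent score
    return expected_pct, sd_pct

# Example: a 50-item test with four options per item
expected, sd = chance_baseline(50)
print(f"Blind guessing: about {expected:.0f} percent, give or take {sd:.0f} points")
# -> Blind guessing: about 25 percent, give or take 6 points

A never-studied friend who scores far above that range, say 40 or 50 percent on a four-option test, is probably ruling out implausible distracters rather than guessing.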

Exhibit 11.2. Multiple-Choice Questions on Assessment Concepts

Correct answers are marked with an asterisk (*).

1.    Which statement refers to measurement as opposed to evaluation?
A.*   Emily got 90 percent correct on the math test.
B.    Lin's test scores have increased satisfactorily this year.
C.    Justin's score of 20 on this test indicates that his study habits are ineffective.
D.    Keesha got straight A's in her history courses this year.

2.    Alyssa took a test on Tuesday after a big fight with her parents Monday night. She scored a 72. Her professor let her retake the same test on Thursday when things cooled off. She scored 75. The difference in her scores may be attributed to:
A.    chance or luck.
B.    lack of discrimination.
C.    lack of validity.
D.*   measurement error.

3.    People who score high on the Meyers Musical Aptitude Scale usually score low on the Briggs Biologists Aptitude Test. People who score low on the Meyers usually score high on the Briggs. Which of the figures below most likely represents the correlation between the two tests?
A.    .80
B.    .00
C.    -.10
D.*   -.60

4.    Choose the most likely correct answer to this nonsense question, based on what you know about informed guessing on tests. A drabble will coagulate under what circumstances?
A.    Only when pics increase
B.    Only when pics change color
C.    By drawing itself into a circle
D.*   Usually when pics increase, but occasionally when pics decrease

----------------------------------------------------------------------------------------------------
TOMORROW'S PROFESSOR MAILING LIST
Is sponsored by the STANFORD CENTER FOR TEACHING AND LEARNING
----------------------------------------------------------------------------------------------------