Life Story of a Multiple-Choice Question

By Jay Mathews

Washington Post Staff Writer

Tuesday, May 30, 2000

Every year, test-making companies churn out tens of thousands of questions to help schools assess how well America's school children are learning their lessons. Here's how one of those questions made its way from a science teacher's office to a student exam room.

The mission: Write a question for the Virginia Standards of Learning (SOL) high school exam in earth science that will show whether students understand "how geologic processes are evidenced in the physiographic provinces of Virginia..." (language taken from the state's curriculum standards).

Fall 1996: A science teacher who works at a school outside Virginia and also serves as an "item writer" for Harcourt Educational Measurement, the San Antonio-based company that produces the SOL tests for Virginia, types out this question on a computer:


Mud cracks found in the limestone road cuts along the Gate City, Virginia, bypass would show that before the sediments were buried the Gate City area was

A. a migration route for animals.

B. an active earthquake zone.

C. a wet area exposed to air.

D. a sand-covered beachfront.


The correct answer is C, the choice that indicates a student understands the conditions required to form mud cracks.

Each of the three wrong answers, known as "distractors," is designed to trap students whose knowledge of the subject is incomplete.

Answer A: Would be chosen by students who think that animals caused the earth to crack as they repeatedly followed the same trail. The conclusion is somewhat logical, but fails to take into account the regular pattern formed by mud cracks.

Answer B: Would be picked by students who don't realize that the fracturing caused by earthquakes occurs on a far larger scale than mud cracks.

Answer D: Would be chosen by students who believe that sand can form and retain cracks. They fail to realize that drying sand tends to fill in cracks.

Fall 1996: The question is reviewed by a copy editor, a senior content specialist and several other testing executives at Harcourt.

Winter 1997: The Virginia Department of Education's earth science review committee meets in the conference room of a Richmond hotel to consider the Gate City question and about 200 other proposed earth science items. The committee includes high school teachers, school district science specialists and college professors.

They approve the question with one change, substituting the word "shale" for "limestone."

May 1997: The question is included in a "field test" given to 4,262 high school students across Virginia. Here is how they answer it (figures are percentages):

Males: 6 percent--A, 25 percent--B, 55 percent--C, 11 percent--D

Females: 7 percent--A, 22 percent--B, 58 percent--C, 12 percent--D

Whites: 6 percent--A, 24 percent--B, 60 percent--C, 10 percent--D

Blacks: 7 percent--A, 23 percent--B, 52 percent--C, 17 percent--D

Hispanics: 15 percent--A, 28 percent--B, 50 percent--C, 7 percent--D

Summer 1997: The earth science review committee meets again in Richmond to consider the field test results. They note that the percentage of students who chose the correct answer fell between 35 percent and 85 percent, which, to testing experts, shows that the question is neither too hard nor too easy. The pattern of responses also suggests that the item is not biased by sex or race. The committee approves the question.

Spring 1998: The question is given to some 54,000 students taking the earth science SOL test, and this time the results count. The proportion of students who answer it correctly drops from 58 percent to 51 percent.

Fall 1999: The item is placed in the bank of used SOL questions that Virginia officials will release to the public.

Here are comments on the test item from three earth science teachers who weren't involved in developing it:

Bob Nicholson, T.C. Williams High School, Alexandria: "We talk [in our class] about reading the past even though we weren't there. And we talk about things like mud cracks and we talk about ripple marks and how they can indicate whether an area has a steady tide or a steady set of waves. I thought it was a good question."

Tricia Pease, Yorktown High School, Arlington: "It does not fairly cover the [curriculum standard] noted above, which is far more generalized. Taking a tiny sample question from a curriculum, it is hard to see that a student is achieving a particular [standard of learning]. It is far better to know that a student can apply concepts he/she has learned to make better sense of his world."

Michael D. Dyre, Potomac High School, Prince William: "Is the question valid as to its ability to test how geologic processes are evidenced in the physiographic provinces of Virginia? Yes, it is. A better question would be whether this is a good tool to help determine Earth Science mastery. The Virginia SOL test is designed to determine a level of learning by using very specific and structured questions. Wouldn't it be better to test the application of the standards using open-ended questions?"