Thursday, August 4, 2011

What Makes a Good Diagnostic Question?

I just read a great post by John Kleeman on the Questionmark Blog that I wanted to share with you, concerning what makes a good diagnostic question.

I think Kleeman's post provides further proof that assessments are a good thing, not an "evil" mandated by law. I spend a lot of my time defending assessments, and I will agree that an assessment is "evil" when we assess just for the sake of assessing. ANY assessment should be built with purpose and careful thought, and the interpretation of its results should be afforded that same purpose and thought.

Assessments should be used to help students identify their individual relative strengths and weaknesses, and to help a teacher improve their instruction or instructional materials. Please read the following post by John Kleeman:
What makes a good diagnostic question?

First, it should be almost impossible for someone to get the right answer for the wrong reason: A participant will only get the question right if he/she has the right idea about whatever the instructor wants them to be able to know, understand or do.

Second, wrong answers should be interpretable: If a participant chooses a particular wrong response, the instructor should be able to guess why the person has done so and what misconception he/she has.

So suggests Dr. Dylan Wiliam in his excellent new book, Embedded Formative Assessment (published by Solution Tree Press, and recommended). A common use for diagnostic questions is to find out whether participants have understood your instruction – telling you whether you can go on to another topic or need to spend more time on this one. And if participants get it wrong, you want to understand why they have done so in order to correct the misconception. Good diagnostic questions involve deep domain knowledge, as you have to understand why learners are likely to answer in a particular way.

One tactic for creating diagnostic questions is to look at answers that students give in open-ended questions and choose common ones as distractors in multiple choice questions.
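As a rough illustration of that tactic (my own sketch, not from Kleeman's post – the responses and answer key below are invented), here is how you might tally the most frequent wrong answers to an open-ended question so they can be considered as distractors:

# Hypothetical sketch: find the most common wrong answers to an open-ended
# question and treat them as candidate distractors for a multiple-choice item.
# The responses and the correct answer below are made up for illustration.
from collections import Counter

correct_answer = "5"  # e.g., the hypotenuse of a 3-4-5 right triangle

open_ended_responses = ["5", "7", "5", "25", "7", "12", "7", "25", "5", "1"]

# Keep only the wrong answers and count how often each one appears.
wrong_answers = [r for r in open_ended_responses if r != correct_answer]

# The most frequent wrong answers are the most promising distractors,
# because each one likely reflects a common, interpretable misconception.
for answer, count in Counter(wrong_answers).most_common(3):
    print(f"Candidate distractor: {answer} (given by {count} students)")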

Here is an example of a multiple response diagnostic question quoted in the book:

There are 64 possible answers to the question; the right answers are B and D. It’s pretty unlikely that someone who does not understand Pythagoras’ rule will get the question right, and if they get it wrong, there will be good clues as to why.
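To make the arithmetic concrete, here is a minimal sketch of my own, assuming the question offers six options labelled A to F that can each be selected or left unselected (which is what 64 = 2^6 implies), with B and D keyed as correct:

# Minimal sketch, assuming six options labelled A-F that can each be selected
# or left unselected, with B and D as the keyed (correct) choices.
from itertools import combinations

options = ["A", "B", "C", "D", "E", "F"]
key = {"B", "D"}

# Each option is either selected or not, so there are 2**6 = 64 possible answers.
print(2 ** len(options))  # 64

# Enumerate every possible selection and count how many score as correct:
# only the single response that matches the key exactly.
all_selections = [
    set(combo)
    for r in range(len(options) + 1)
    for combo in combinations(options, r)
]
print(len(all_selections))                         # 64
print(sum(1 for s in all_selections if s == key))  # 1

Because only one of the 64 possible response patterns scores as correct, a lucky guess is very unlikely, which is exactly the property a good diagnostic question needs.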

Questions like this can be hinge-points in instruction – they give you evidence as to what you do next. Do you need to continue instruction and practice on this topic, or can you move on to the next topic?
John Kleeman is the founder and current Chairman of Questionmark. He wrote the first version of the Questionmark assessment software system and founded Questionmark in 1988 to market, develop and support it. Kleeman has a degree from Trinity College, Cambridge, and is a Member of the British Computer Society and a Chartered Engineer. Having been involved in assessment software for more than 20 years, he has participated in several standards initiatives and was one of the original team who created IMS QTI. He was also the instigator and chairman of the panel that produced the British Standard BS 7988, which has since become ISO 23988.
