Wednesday, November 3, 2010

Diagnostic Tests That Measure Conceptual Understanding

I just read a very interesting post from Questionmark's founder John Kleeman about diagnostic tests that measure conceptual understanding and identify misconceptions. I hope you will read John's post below and use the information to improve your assessments.

I’ve just read a thought-provoking article on diagnostic tests, written by Simon Bates and Ross Galloway of the University of Edinburgh Physics Education Research Group and published by the UK Physical Sciences Centre (see the article at pages 10-20 here).

The authors are particularly concerned with diagnostic tests that measure conceptual understanding and identify misconceptions. Rather than testing facts, knowledge, or particular skills, their interest in diagnostic assessments is primarily in whether students understand key concepts in the physical sciences. If students don’t understand them, the instructors need to correct this in their teaching and feedback.

The article gives examples of diagnostic tests in use and offers good, detailed guidance on how to construct them, including which statistics indicate a well-functioning test. Following other authors in the Physics Education Research literature, they recommend a p-value (difficulty index) between 0.3 and 0.9, a discrimination index of 0.3 or better or a point-biserial correlation of 0.2 or better, and a reliability index of 0.7 or better.

They also explain how to write questions that probe why students misunderstand something as well as what they misunderstand, and they give an example (from the Lawson Classroom Test of Scientific Thinking) that they have used in their own teaching: a what-why question, which asks for a fact and also asks why that fact is the case.

Bates and Galloway report that the first, “what” part of the question is answered just as well by students entering university as by those who have completed their first year, but that students who have been at university for a year perform significantly better on the “why” part.

Getting to the root of learner misconceptions is a key challenge for all of us in learning and assessment, and I recommend this article as a good read.
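The item statistics quoted above can be sketched in code. This is a minimal illustration only, assuming a hypothetical 0/1 response matrix (one row per student, one column per item); the function name and the simple top-half/bottom-half split for the discrimination index are my own choices for the sketch, not taken from the article (many texts use top and bottom 27% groups instead).

```python
def item_statistics(responses, item):
    """Return (difficulty, discrimination, point-biserial) for one item.

    `responses` is a list of per-student lists of 0/1 item scores;
    `item` is the 0-based index of the item to analyse.
    """
    n = len(responses)
    totals = [sum(row) for row in responses]
    scores = [row[item] for row in responses]

    # Difficulty index (p-value): proportion of students answering correctly.
    p = sum(scores) / n

    # Discrimination index: difficulty among the top half of students
    # (ranked by total score) minus difficulty among the bottom half.
    order = sorted(range(n), key=lambda i: -totals[i])
    half = n // 2
    upper = [scores[i] for i in order[:half]]
    lower = [scores[i] for i in order[-half:]]
    discrimination = sum(upper) / half - sum(lower) / half

    # Point-biserial: correlation between item score and total test score.
    mean_total = sum(totals) / n
    sd_total = (sum((t - mean_total) ** 2 for t in totals) / n) ** 0.5
    if sd_total == 0 or p in (0.0, 1.0):
        r_pb = 0.0
    else:
        mean_if_correct = sum(t for t, s in zip(totals, scores) if s) / sum(scores)
        r_pb = (mean_if_correct - mean_total) / sd_total * (p / (1 - p)) ** 0.5

    return p, discrimination, r_pb
```

Under the article's recommended thresholds, an item would then be flagged for review if its difficulty falls outside 0.3-0.9, or if both its discrimination index is below 0.3 and its point-biserial is below 0.2.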