Hapgood

Mike Caulfield's latest web incarnation. Networked Learning, Open Education, and Online Digital Literacy


Can People Designing Multiple Choice Tests for MOOCs Please Study Designing Multiple Choice Tests?

David Kernohan has a post up titled How I got a “first” on a FutureLearn MOOC with one weird old trick… over at FOTA, and it does just what it says on the tin. In the post, David details how he was able to get an 87.4% on a FutureLearn test for a course module without studying any of the materials. The “old trick” referenced is not actually one old trick, but the bag of tricks most people pick up from taking multiple choice tests. You know, for example, that when “all of the above” appears, there’s a high chance it’s the correct option. You know that options that break the parallel structure of the others are usually wrong. You know that the longest, most qualified answer is usually right.

I’ve talked about this before, regarding a Coursera course my boss took, writing a scathing treatment of it in Sometimes Failure is Just Failure. In that case the malformed questions did not make it easy for a person with no knowledge to ace the test; rather, they made it difficult for someone with good knowledge to pass it.

These sorts of errors are so mind-blowingly frustrating because:

  1. You see them everywhere, in all these products.
  2. They are a really serious matter.
  3. They could be solved with a four-page checklist.

It’s really hard to write excellent multiple choice questions, questions that go beyond simple recall and comprehension and test higher-order, conceptual understanding. On the other hand, many decades of research have made writing good multiple choice questions relatively easy. You don’t even have to read the research — you just need to follow the rules distilled from the research.
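
To give a sense of how mechanical some of these rules are, here is a minimal sketch, in Python, of what a few checklist-style checks might look like if applied to draft questions. It is purely illustrative: the lint_question function, its particular rules, and the demo question are assumptions for the sake of example, not the checklist itself or anything FutureLearn uses.

```python
# Illustrative sketch only: a few checklist-style rules applied mechanically
# to a draft multiple choice question. The rules and format are assumptions.

def lint_question(stem, options):
    """Return a list of warnings for common multiple choice flaws."""
    warnings = []

    # Rule: "all of the above" is a well-known tell and is usually best avoided.
    if any("all of the above" in opt.lower() for opt in options):
        warnings.append('Avoid "all of the above" as an option.')

    # Rule: one option much longer than the rest tips off test-wise students.
    lengths = [len(opt) for opt in options]
    if max(lengths) > 2 * (sum(lengths) / len(lengths)):
        warnings.append("One option is much longer than the others.")

    # Rule: options should be parallel in form; as a crude proxy, check that
    # they are at least punctuated consistently.
    endings = {opt.strip().endswith(".") for opt in options}
    if len(endings) > 1:
        warnings.append("Options are not punctuated consistently.")

    return warnings


if __name__ == "__main__":
    demo = lint_question(
        "Which statement best describes formative assessment?",
        [
            "Assessment used to monitor learning and give ongoing feedback",
            "A final exam",
            "A grade",
            "All of the above",
        ],
    )
    for warning in demo:
        print(warning)
```

A real checklist covers far more than this, of course, and most of its rules call for human judgment rather than string matching; the point is simply that the rules are concrete enough to be checked one by one.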

I don’t know if the problem in this case was a FutureLearn designer or a subject matter expert they worked with. Perhaps one of the problems is that it’s unclear in the arrangement who exactly is responsible for test validity. But somewhere in the quality assurance chain there needs to be someone who can read the checklist and spend the amount of time necessary to reformat these questions. We can debate all day whether multiple choice is the best way to assess students, but making sure the multiple choice questions we do use are well designed is a no-brainer. Do like I do, and don’t trust your instincts. Print out the sheet and keep it by you while you write your questions. Check each question. Then get someone else to double check it for you.

When you’re rolling this out to 100 students, this sort of care is important. When you’re rolling it out to tens of thousands, this sort of care is essential.



2 responses to “Can People Designing Multiple Choice Tests for MOOCs Please Study Designing Multiple Choice Tests?”

  1. Thanks for this post. I have yet to see MCQs that follow these excellent guidelines you link to. I suspect that many MOOCs intentionally leave the MCQs easy to hitch up completion rates. Unsure it works, but I definitely completed a couple of MOOCs without learning much because their quizzes were so easy.

  2. […] some TED you’d like, Not higher ed you like Venture Cap. you like, And free crap you like, M.C.Q.s you like, Money too you’d like, Well, see how it flows! When investors are always hoping Your […]
