![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiFAzzgKgolr9iEhXEZVx26_YLWvGEtIvJnvnCk2d4D2tf1RpZVPSPcWQRgDywn6iQ8QlrsOxECljx187ffQa6IpY4fhfRy4Sjlrg6PrkOUgh_HH7U-sxw2dIItBnggFGAxoVtYnU1pAuw/s200/apple_newton.jpg)
Okay, at some point we all seem to feel the urge to truly test whether our users understand what we've been forcing them to read (or maybe even, if we're really advanced, interact with). So, like a Newtonian apple, we feel ourselves dragged towards the all-important and often all-consuming quiz. Sometimes it's a necessary evil, sometimes a key part of the learning, and all too often it's just there to justify why we put the package together.
The first thing is to determine whether your e-learning actually requires an assessment. If it's about doing a job or function, is there a better way than a quiz to achieve this? Okay, maybe you don't have the capability to put out a full task-trainer, but is this multiple-guess thing you're about to stick in place really the same as being able to do what you're asking? The simple question is really about your aims and what you intend to achieve. If you have approached the learning with 'by the end of this xxx you will be able to xxxx' (don't laugh, most designers still end up with some sort of objectives listed), the simple thing is to link your assessment to those objectives. So if you have an objective for the learning (and if not, why bother?), even if it's not stated, surely you should test it? Yes, pretty much, but that doesn't necessarily mean a simple Q&A that most five-year-olds could do; the problem with that approach is assessments that insult the intelligence of your average adult learner.
The Challenge: Linking assessment to objectives without patronising your patrons
Possibly the easiest way around this is to cleverly disguise your assessment by making it scenario-based. This is a fairly standard technique: instead of asking the user to pick the correct option from a list of possibles, we ask them to put themselves in the position of someone performing the task shown and ask, 'What should they do?'
This may not be full-on simulation, but it still meets our objectives and doesn't leave our user feeling pointless in the learning. It also allows us to test and retest the same objectives with different scenarios. It takes some practice, but it's a way to really improve your assessments. Another bonus is that it lets us bring in multimedia, usually a picture, video or audio, to enhance our quiz without it being just clip-art stuck on a boring question. I like to run scenarios across several questions like a running metaphor (hold on, shouldn't I try and bring the aforementioned apple back in here?). In the first instance I may have a worker doing something and test for the correct course of action; if I can, I may branch the user depending on their response, or I may show that they have done the wrong thing and then ask what their next response should be to correct the problem at this stage. Just like e-learning in general, this takes a lot of planning to do well, but the results can be great when you experiment.
The Solution: Scenario-based questions
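To make the branching idea concrete, here's a minimal sketch of how a branching scenario might be modelled. Everything in it (the ScenarioStep shape, the lifting scenario itself) is hypothetical and just for illustration; it isn't how any particular authoring tool stores its quizzes.

```ts
// A minimal sketch of a branching scenario quiz (hypothetical names,
// not tied to any particular authoring tool).

interface Option {
  text: string;
  next: string | null; // id of the step this answer branches to (null = end)
  correct: boolean;
}

interface ScenarioStep {
  id: string;
  prompt: string; // the situation the learner is placed in
  options: Option[];
}

// One objective tested twice: first the correct action, then the recovery
// when the learner has already done the wrong thing.
const steps: Record<string, ScenarioStep> = {
  start: {
    id: "start",
    prompt: "A worker is about to lift a heavy box alone. What should they do?",
    options: [
      { text: "Ask a colleague to help", next: null, correct: true },
      { text: "Lift it anyway", next: "recover", correct: false },
    ],
  },
  recover: {
    id: "recover",
    prompt: "They lifted it alone and felt a twinge. What now?",
    options: [
      { text: "Report it and stop lifting", next: null, correct: true },
      { text: "Carry on and hope for the best", next: null, correct: false },
    ],
  },
};

// Walk the scenario: each answer decides which step the learner sees next.
function nextStep(current: ScenarioStep, choice: number): ScenarioStep | null {
  const picked = current.options[choice];
  return picked.next ? steps[picked.next] : null;
}
```

The point of the structure is that the wrong answer doesn't dead-end into 'incorrect, try again'; it carries the learner into a follow-on question that tests the same objective from a different angle.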
So the question types all look good, but I end up choosing multiple choice; is this a bad thing? Yes and no. Multiple choice may be the solution to most apple-eating chimps' challenges, but it's a staple part of our diet too, and just because it is multiple choice doesn't mean it has to look like multiple choice! Users can select from pictures or even voices in a 'list' that doesn't look like one. One good example is to set out a map and get the user to choose a direction; sure, it's essentially the same as multiple choice, but it creates a theme, and we're back to that sort of running metaphor that can help to engage our users. I use some variety when writing questions; I like hotspots and mix-and-match (especially the drag-and-drop type), but I think there's still space for multiple choice: just write the questions better!
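The trick behind 'doesn't have to look like multiple choice' is that the question data and its presentation are separate things. A minimal sketch, with made-up names (McQuestion and the two render functions are mine, not from any tool):

```ts
// Sketch: one multiple-choice model, two presentations (names are illustrative).

interface Choice {
  label: string;
  correct: boolean;
}

interface McQuestion {
  prompt: string;
  choices: Choice[];
}

const question: McQuestion = {
  prompt: "You're at the crossroads on the site map. Which way to the fire exit?",
  choices: [
    { label: "North", correct: false },
    { label: "East", correct: true },
    { label: "South", correct: false },
    { label: "West", correct: false },
  ],
};

// Plain list: the classic radio-button look.
function renderAsList(q: McQuestion): string {
  return [q.prompt, ...q.choices.map((c, i) => `${i + 1}. ${c.label}`)].join("\n");
}

// Themed: exactly the same data, presented as compass points on a map.
function renderAsMapDirections(q: McQuestion): string {
  return `${q.prompt}\nClick a compass point: ${q.choices.map((c) => c.label).join(" | ")}`;
}
```

Same question, same scoring, but one version looks like a test and the other looks like the theme of your course.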
So what tool do I write my quiz in?
Yes, a big question and the eternal challenge for developers. If your LMS has an internal tool, don't necessarily write it off straight away and fall for the pretty box on the e-learning quiz thingy you've been dying to buy. If you're using Moodle or Totara, for example, the built-in quiz tool is pretty decent, especially with the ability (like Blogger!) to let you use HTML. But the real advantage of using these quiz tools is that the analysis goes way beyond what SCORM gives you. 'Surely not,' I hear them cry, 'SCORM is the standard!' SCORM is great for a basic level of launch-and-track and for getting some standard analysis from questions, but if you're into the deep analysis stuff you'll probably find more options in your LMS.
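To make 'launch and track' concrete, here's roughly what reporting a single answered question over the SCORM 1.2 JavaScript API looks like. A minimal sketch: the calls (LMSInitialize, LMSSetValue and friends) and the cmi.interactions elements are the standard SCORM 1.2 ones, but the question id and values are made up, and the API lookup is simplified.

```ts
// Sketch: reporting one quiz question via the SCORM 1.2 JavaScript API.
// Assumes the LMS exposes the standard `API` object on a parent window.

interface Scorm12Api {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
}

// SCORM content walks up the window hierarchy looking for the API object
// (simplified here for illustration).
function findApi(win: Window): Scorm12Api | null {
  let w: Window = win;
  while (!(w as any).API && w.parent && w.parent !== w) {
    w = w.parent;
  }
  return (w as any).API ?? null;
}

const api = findApi(window);
if (api) {
  api.LMSInitialize("");
  // Report a single answered question as a SCORM 1.2 interaction.
  api.LMSSetValue("cmi.interactions.0.id", "q1-lifting"); // made-up id
  api.LMSSetValue("cmi.interactions.0.type", "choice");
  api.LMSSetValue("cmi.interactions.0.student_response", "a");
  api.LMSSetValue("cmi.interactions.0.result", "correct");
  api.LMSSetValue("cmi.core.score.raw", "80");
  api.LMSCommit("");
  api.LMSFinish("");
}
```

Incidentally, in SCORM 1.2 the interactions data is write-only from the content side, which is a big part of why the deeper question-level analysis ends up living in the LMS rather than in the package.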
That said, I love a couple of the tools. Articulate Quizmaker is very good and much easier to use than most internal tools. It looks pretty, particularly if you design a decent skin (or have one designed for you), and it can really speed up the rate at which you produce quizzes (but please read the challenge above first). I like Captivate too when it comes to putting in complex branching quizzes, but it's probably for more advanced users. Captivate also has some advantages in software simulation, as essentially that's what it originated from. Most importantly, the tool matters far less than using it properly and, above all, getting the design right.
The last tip for your assessment is probably so obvious that I don't need to say it... but how many times has this happened, I wonder? Test it. Not yourself, you'll think it's great; get someone else to test your work, take a sample, and listen to their feedback. The first thing that will surprise you is how differently other people see it! You won't please everyone, but if you get the same criticism from several people then there's probably something worth changing in your assessment.
Finally, please click on the assessment link at the bottom to begin (and go beyond the t-shirt bit!):
http://youtu.be/SkNyCGkWcFE