Wednesday, January 25, 2012

ReviewLink: Online Review Tool for Lectora

by Joseph Suarez

An important step in eLearning course development is getting feedback and approval from key stakeholders such as subject matter experts and other team members. Sometimes, however, just getting a copy of a course to those stakeholders can be a challenge. Perhaps you have experienced one or more of the following:
  • Emailed a course that bounced back due to a file size limit 
  • Lost a USB drive distributing courses by hand 
  • Had to explain how to unzip a folder or find and open “index.html” 
  • Been asked to print out a course loaded with Flash animations and interactivity 
Then there is the issue of submitting and tracking feedback and change requests. Without a central location where everyone involved can see and respond to each other’s comments, extra work will be required to filter all the incoming responses.

To address these types of problems, Trivantis has included a web-based review service with Lectora X.6 called ReviewLink. Previous versions of Lectora have typically included some handy additional features, but nothing quite as big and useful as ReviewLink. Here’s how it works:

The course developer publishes a course built in Lectora X.6 to ReviewLink. This is now a publishing option, just like publishing to HTML or SCORM.

The ReviewLink publishing options allow the developer to add the email addresses of any intended course reviewers (regardless of whether they have Lectora or not). This list can be added to or changed later as well.

Since ReviewLink is entirely web-based (in the cloud), both the developer and invited reviewers log into the ReviewLink website to view the course, add comments with optional due dates, and respond to comments others have left.

The course developer can then address the comments, make any necessary changes to the course, and republish to ReviewLink for as many rounds of review as needed. Comments can also be marked as “fixed” or archived.

All comments are nicely listed in table form and can be filtered by status, content (course name), and reviewer. In addition, an email is automatically sent to invited reviewers if the course is updated.

After a couple weeks of use, both as a developer and reviewer, my personal experience with ReviewLink has been beyond satisfactory. Solving the course distribution problem alone makes it well worth the upgrade in my opinion. Plus, I hadn’t realized just how problematic organizing reviewer feedback was prior to using ReviewLink’s comment system.

Overall, ReviewLink is a great tool for Lectora published courses, with one drawback: the developer currently does not receive an email when a reviewer leaves a comment. This was disappointing given that all reviewers receive email alerts when the developer updates the course.

At the time of this writing, ReviewLink is considered a beta version, so some bug fixes and improvements are likely coming down the road. And, since ReviewLink exists in the cloud, those updates won’t require downloading and installing a patch.

Wednesday, January 18, 2012

Writing Distractors for Multiple Choice Questions

By Shelley A. Gable

Multiple choice questions – whether used throughout an eLearning lesson or in a knowledge assessment – are often frowned upon as unrealistic and limiting. But when written well, multiple choice questions can be quite robust.

It’s like eLearning itself. Poorly designed, text-dominated page-turners tend to be unpopular. Highly interactive eLearning lessons that present relevant content in realistic contexts tend to be well received and effective.

Let’s briefly look at two common multiple choice offenses to avoid...

One common offense is using a multiple choice question when you really ought to use a different type of interaction. For instance, if you need learners to identify which button to click to begin a procedure in a system, don’t write a multiple choice question where the options are names of buttons. Instead, use a hotspot question in which an image of the system screen displays and the learner must physically click the correct button in the image. Clearly, the hotspot question more closely simulates the visual recognition and physical navigation that one must perform on the job.

Another common offense is offering distractors that are obviously flawed. A good option surrounded by three really bad ones is often easily recognized, even by someone who may not understand why the correct answer is the best option. Simply writing “bad” statements as distractors misses an opportunity to show learners valid examples and non-examples of applying a skill.

So how does one write good distractors?

First of all, it’s worth pointing out that the best multiple choice questions are typically scenario-based. In other words, they present learners with a situation and ask them what they should do next. Earlier posts on this blog explain how scenarios can make quiz questions more relevant and how to write realistic scenarios.

So when it comes to writing distractors...

Ask SMEs for common mistakes. Subject matter experts (SMEs) who are close to the action and frequently work with your target audience are ideal here. Simply ask what they’ve seen newbies do in the scenario’s situation, or how they’ve heard colleagues say something incorrectly. And probe for specific examples. If you receive a response that simply reiterates pointers to keep in mind, you’ll still end up writing the distractors yourself. So, probe for examples that are specific, detailed, and concrete enough to write up as distractors immediately.

Writing distractors with an SME is really an ideal approach, because it ensures that your distractors are options that people might actually consider. And as an added bonus, SMEs tend to recall common mistakes quickly and easily, so you don’t spend as much time trying to create fiction from scratch.

Align distractors to cautions stated during training. If the training directed learners not to do certain things, write distractors that emulate those undesired behaviors. But, don’t lay it on too thick. For example, if the training included three things not to say while discussing employee compensation, don’t write a distractor that includes all three items – chances are, most people don’t make all three mistakes at once, making the incorrect answer unrealistically obvious. Instead, write a distractor that illustrates just one.

Omit best practices without making it sound bad. Suppose the training includes three things learners should say when closing a phone call with a customer. A well-written distractor might simply miss one of those items, but still sound polite. In other words, a distractor doesn’t have to be an obviously bad choice; it can simply be a less optimal (i.e., not intentionally terrible) choice because of an important component it lacks.

What other tips can you share?

Although instructional designers with plenty of experience may feel that the suggestions above are relatively basic, the truth is that most people have to work at developing this skill. And when it comes to writing multiple choice questions, writing plausible distractors is often the most challenging part. So, what other tips do you have for doing this well?

Wednesday, January 11, 2012

Gaming with the Nine Events of eLearning

By Shelley A. Gable

I recently joined a project that involves designing a series of eLearning games to teach sales skills. The intent is for learners to learn entirely by doing, in the context of a fun game.

At one point, the training project team discussed how to include Gagne’s nine events of instruction in each game, or whether it was even necessary to do so. For me, pondering that question reinforced the brilliance of using games for learning – games address most of the instructional events inherently.

I’ll take you through my thought process so you can see if you agree...

--1-- Gain attention.
If your game presents a compelling (and entertaining) story and challenge, you’ll easily capture learners’ attention and maintain it.

--2-- Inform learners of objectives.
Every game has an objective, regardless of whether it’s designed for learning. If learners must meet the instructional objectives to succeed in the game, then this event takes care of itself.

--3-- Stimulate recall.
The context of the game likely provides this intuitively. For instance, even if you’re new to selling, chances are that you’ve been a customer in a buying situation. So, building a game with relatable scenarios can tap into tacit knowledge related to how customers feel and what they may expect in a variety of situations.

--4-- Present content.
There are lots of clever ways to do this in a game, depending on the type of game you create. You might offer relevant content (e.g., sales tips) within scenarios or within the various challenges throughout the game. To encourage experimentation and discovery, prompt learners to make decisions based on their instincts, and then present all content within the game’s feedback structure.

--5-- Provide learning guidance.
This could occur by providing learners with a way to obtain hints throughout the game. If they need to reference job aids on the job, prompt them to reference those same job aids to succeed in the game. Or, you could create individual challenges within the game that illustrate examples and non-examples of a skill or concept they must learn.

--6 and 7-- Elicit performance (practice) and provide feedback.
This is what a learning game is all about! Constant performance, practice, and application. Feedback comes in the form of consequences throughout the game, such as favorable outcomes or additional points for making good decisions during gameplay.

Admittedly, the final two instructional events – assess performance and enhance transfer to the job – will likely occur in aspects of the training initiative that are separate from the game itself. Perhaps the game would be followed up by a relevant knowledge or skill assessment, along with a plan for managers to reinforce learnings on the job.

Have you done this?

At this point, I’m too early in the project to share my own concrete example of how all this plays out. But if you’ve dabbled in learning games in eLearning, please share your experience! Would you agree that the nature of a learning game’s design inherently accomplishes most of these instructional events?