Thursday, March 31, 2011

George Orwell's Advice for Writing eLearning Content

By Shelley A. Gable

Learners can be fickle, quickly slipping into distraction or boredom if we ask them to read too much.

Highly interactive, problem-based eLearning can help maintain engagement, but completing those activities still requires learners to read the words we write. In a culture of multi-tasking skimmers, we must write as clearly and concisely as possible to help keep learners motivated.

Practical Writing Advice from George Orwell

Admittedly, when George Orwell wrote Politics and the English Language to advocate for writing in plain English, improved eLearning engagement was not his primary goal.

Fed up with the intentionally vague and misleading language common among politicians – which he saw bleeding into mainstream communication – Orwell responded with an essay that called out examples of poor writing and offered advice for communicating clearly.

Though he published the essay over half a century ago, the advice is still practical. Especially for those of us who write training materials.

His essay builds a case for six simple writing rules.

--1-- Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.

Orwell explains that clichés are often ineffective because we tend to take their meaning for granted. They don’t prompt us to think.

--2-- Never use a long word where a short one will do.

The first example that comes to mind when I see this is utilize versus use. They mean the same thing.

And what about pulchritudinous versus beautiful? Okay, maybe nobody uses that one.

--3-- If it is possible to cut a word out, always cut it out.

Something that came to my attention about a year ago is our overuse of the word will. We often say things like, “When you click the Start button, Windows will display a menu.” But will is completely unnecessary here. It makes just as much sense to say, “When you click the Start button, Windows displays a menu.”

I know, that’s a REALLY simple example. But if you’re not already in the habit of avoiding the word will, I challenge you to watch for it in the next thing you compose – whether it’s instructional materials or something else. It’s shocking how often that pesky little word sneaks in.

Of course, Orwell’s point is to avoid unnecessary wordiness in general – he probably didn’t intend to specifically target the word will.

--4-- Never use the passive where you can use the active.

Using passive voice isn’t grammatically incorrect, but it can be vague and wordy, which is why it’s not ideal. Especially for instruction.

Compare...

PASSIVE: After entering an applicant’s data, proposed loan terms are displayed to discuss with the borrower.

ACTIVE: After entering an applicant’s data, the system displays proposed loan terms to discuss with the borrower.

Does one seem clearer?

--5-- Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.

This is so relevant for training!

Subject matter experts often use department-specific jargon when providing information. Not only is jargon potentially unfamiliar to learners, but there’s really no need for them to expend effort learning a term if they’re not going to use it on the job. Plus, using avoidable terminology that is specific to a single department in a company limits the reusability of that training for other parts of the organization.

--6-- Break any of these rules sooner than say anything outright barbarous.

In other words, if breaking one of these rules actually allows you to express an idea more clearly, do it.

This makes me think of sentence fragments. Technically, a sentence fragment is not a grammatically correct sentence. But sometimes we get away with fragments when we use them to emphasize a point and the idea is still understandable. Rightly or not, I frequently use fragments in posts on this blog. In fact, I’ve done it at least once in this post.

Coming clean...

Okay, I certainly can’t claim to follow all these rules all the time. But I try. And you can see how these basic guidelines can help simplify our writing to improve its readability.

Do you have examples to share of following (or breaking) these rules? Or do you have other writing-related pet peeves...I mean, rules...to suggest? If so, please utilize the commenting function provided here by the blog.

Sunday, March 27, 2011

ADDIE isn't Dead; it's just more Agile

By Jay Lambert

Readers of this blog know that I've been a big defender of ADDIE (Adapting 20th Century Training Models for the Future; ADDIE isn't Dead, how can it be?; etc.).

As a reminder, ADDIE stands for Analyze, Design, Develop, Implement, and Evaluate. Of course, we now use DADDIE, having added a Define phase to the beginning of each project. No, ADDIE isn't dead. But it is evolving.

ADDIE should be considered circular.

We should stop looking at training as a single point in time and instead view learning as never-ending. Once Evaluation ends, that data should feed right back into Define and further Analysis. This way, our learning initiatives continuously improve and, as business needs are met, evolve to address the next most critical need. Thanks to advances in technology, we have never had as much data readily available as we do now. We must use it to improve our efforts at every level.


This enables ADDIE to be more Agile.

And that's why the Agile method is so appealing. It seems everywhere we go these days, a major aspect of a project is speed. How fast can it be built and rolled out? In our frantic world, this is likely true no matter which industry you are in.

A circular ADDIE model meshes well with the rapid Agile development techniques so popular in software development today.  The focus of an Agile effort is to keep satisfying needs on an iterative basis. At its essence, Agile development is simple evolutionary design; get enough out to be useful, then come back and improve it, if necessary.

By merging this with ADDIE, you'll determine the next steps needed during Evaluation and use that data to start the next round of Analysis. As your program continues, you'll continuously improve your offerings and address the most immediate organizational and learner needs. You'll be implementing evolving rapid eLearning.

The speed at which we can effectively perform this with ADDIE will always be a factor. But I believe that by acting on only the critical needs identified in Analysis, we can move quickly using the technologies available to us today. Then we can go back and begin again or revise what we have. Needs that aren't critical might not require training at all; think of the Five Moments of Learning Need.

Imagine a scenario where we keep addressing critical needs until there aren't any. Wow, that would be performance improvement in an organization.

What do you think? Would Agile ADDIE work for you?

Wednesday, March 23, 2011

Technology Advancement and Learning – Help or Hindrance?

By Dean Hawkinson

On a recent trip to Jekyll Island, Georgia, I had the opportunity to reflect a bit on how far we have come with technology. On Jekyll Island, there is a monument set up by the Telephone Pioneers of America in remembrance of the first transcontinental phone call, made from right there on the island on January 29, 1915, by then AT&T president Theodore Vail. The call included Mr. Vail, Alexander Graham Bell in New York, President Woodrow Wilson in Washington, D.C., and Thomas Watson in San Francisco. I reflected on how that one phone call changed our lives forever, and how much we take the telephone for granted today. I also thought about how far technology has come in the nearly 100 years since that first phone call.

I have worked in a learning environment for just over 10 years – not a very long time when you think about the grand scheme of things. However, in those 10 years, here are a few of the learning advancements I have experienced in my career:

  • Transition from an overhead projector and transparencies to using a PC with PowerPoint and a connected projector
  • Transition from CBT (computer-based training) courses on CD-ROMs to internet-based WBT (web-based training) courses
  • Alternatives to classroom-based instruction such as synchronous and asynchronous training courses via web technologies
  • Introduction of Learning Management Systems (LMS) to deliver and track training of all types
  • Transition from “Web 1.0” – consumption only internet – to the more recent “Web 2.0” – an internet full of two-way interaction and sharing
  • Integration of social media into the learning environment
  • Introduction of smartphones (such as the iPhone) and tablets (such as the iPad), along with the integration of mLearning on these devices

Wow. There have been many leaps forward in technology for learning in the past 10 years.

What does all of this mean to learning?

There are some really neat tools out there, and it can be downright overwhelming sometimes. But guess what? Good ol’ Instructional Design principles and Adult Learning theories have not changed. Adults still learn in the same ways they did before all of this came about. So, why do we as learning professionals tend to ignore these principles just to use the latest and greatest trends in technology? For more information on this, read Adapting 20th Century Training Models for the Future.

It is crucial to set your instructional goals first – determine what it is that you are trying to accomplish through your learning initiative. Only when you have that established can you see how these technologies might support your end result. It does not work the other way around! Selecting a technology because it is “cool” or the “latest thing” before understanding your instructional goals and how that technology will support them will ultimately fail.

Here are some guidelines to keep in mind as you select various technologies in your learning:

  1. Know your audience – Who is your audience? Think about what technologies might work for that audience before choosing a technology. For example, mLearning might work for a diverse sales organization that is spread throughout the country with the tools to receive that training (smartphone, tablet, etc.). However, it would probably not work for a group of call center representatives who do not have any mobile devices on which to receive the mLearning and tend to stay at one PC.
  2. Understand what “fits” the technology – A 30-minute web-based course might work well on a PC over the web, but will it work on a smartphone? Will the media that you are using work on the smaller smartphone screen, or does it need a larger PC screen?
  3. Strategize for social media use – Understand your instructional goals first and then think about how social media can fit those goals, not the other way around. Look at how informal learning takes place today in your organization, and think about how social media can extend that. Have a written strategy in place and get buy-in from management or your client BEFORE planning any social media with your learning. There are a lot of great suggestions for Social Media use.
  4. Create Success Stories - After gaining initial buy-in from your management on the technology strategy, pilot it with a small group and create successes that you can share as you roll it out to the larger group. This will be key in getting everyone behind the new technology.

Bottom line – make your technology strategy fit your audience and instructional goals – not the other way around.

What are some of your success stories in implementing new technologies in your learning programs?

Sunday, March 20, 2011

ADDIE isn't dead; how can it be?

By Jay Lambert

There has been a lot of discussion, and an infamous article or two, in our field about the death of the ADDIE model. This came up again in the comments on my recent blog post, Adapting 20th Century Training Models for the Future.

As a reminder, ADDIE stands for Analyze, Design, Develop, Implement, and Evaluate. The naysayers have been saying that this process is too slow and archaic for our modern "this is due by 5pm today" times.

And if the attacks are based on the long timelines typically associated with ADDIE, then they make some valid points. It's sad, but true. Most of us have seen business sponsors' eyes glaze over when full project plans are introduced. Sometimes the business need warrants a more formal approach; other times rushing to production is a necessity. But we still need to ensure that those rush projects meet their objectives.

As an industry, we must adapt. And models such as ADDIE help us do so.

ADDIE is the basic backbone of our processes.

As mentioned in an earlier post on this blog, ADDIE functions as the basic backbone of our industry processes. Just about everything we do and every model we might use fits within one of the phases of ADDIE. Consider:

Analyze

Analysis enables us to identify the cause of a performance gap and formulate a plan for addressing it. It's during this phase that we define an organization's true business need and determine whether or not that need can be resolved via training. That's a pretty important task.

As learning practitioners, we must still have some form of Analysis in place; we just can no longer take three to six months to complete it. We must gather our data quickly and react to it even faster. Ignoring the Analysis phase is a recipe for disaster. Perhaps, out of necessity, we gather what information we have, make a best guess, and go for it a la the Agile development method, but we must still take that step. If there is no Analysis, then it's likely that an initiative won't satisfy the true business need.

Design

In my mind, Design, together with Analysis, is the most critical part of the process. For true learning to take place, the recipient must be engaged, motivated, pulled in. Whether you are delivering a 5-minute training snippet or a full curriculum, it must be designed well for both the content and the audience. We all know this is true by the number of anti-PowerPoint bullet rants we see on a weekly basis.

I'm a big fan of scenario-based and simulation eLearning; these take longer to Design, but the outcome is worth it. And I'd argue that even a quick 3-minute software simulation takes instructional design time to accurately show exactly what the user should be doing.

Develop and Implement

These two phases are likely where a good number of rapid eLearning development projects find themselves confined. To have eLearning, you have to have development and implementation. If a course is not built and subsequently rolled out, then no one is going to see it, much less benefit from it. But if a learning project starts and stops here, as tight timelines tempt it to, then problems are likely to ensue.

As with each step of our sacred training models, what has changed in these phases, and must be considered, is the speed. I advocate holding pilots, but that step is often dropped from project plans to enable faster implementation. But if we consider joining ADDIE with the Agile development model, then perhaps each implementation is a form of pilot anyway.

Evaluate

Evaluation is the phase that is truly suffering in our fast-paced world. We implement; we move on. But as evidenced in many, many posts, articles, conversations, and presentations, we have to find some means to evaluate and effectively measure our programs. It's the only way to prove that our industry is making an impact within organizations; and, let's face it, it's the only way to keep our budgets intact so that we can continue with our efforts.

Why are training departments often gutted during down times? Because we don't do a good job of following through with evaluation and proving our critical business worth. Our initiatives are supposed to make things better for both the learner and the organization; evaluation is the only way to know if that is happening. So yes, the need for Evaluate is still alive in the 21st century.

And it's interesting here locally that the March and April ISPI Atlanta and ASTD Atlanta chapter meetings both focus on learning measurement. And the May ISPI Atlanta topic is 'Predictive Evaluation.' Think evaluation is on our minds?

Define

I also like to include a phase before Analysis, the Define phase, as taken from the DMAIC model. This is all the extremely important pre-learning project information that can come back to haunt you if you skip over it. Believe me, I know it's easy at today's pace to blow past and think you'll come back later; but don't.

By the way, adding Define makes this the DADDIE model; my team dislikes this acronym. If you know a better one, please let me know.

So what are your views on ADDIE? Are you still following its phases in some evolved state? Just please don't tell me it's dead.

Thursday, March 17, 2011

eLearning as Part of an Informal Learning Strategy

By Shelley A. Gable

How did most of your workplace learning occur over the past year?

Much of my learning came from brainstorming with peers, participating in online forums, reading (articles, books, blogs, etc.), observation, experimentation, trial and error...and, of course, reflection.

Do you have a similar story of unstructured, experiential, informal learning?

Jay Cross, known for his informal learning research, tells us that approximately 80% of workplace learning occurs informally. In contrast, only 20% of organizations’ learning budgets go toward enhancing informal learning.

Does this suggest that organizations are missing an opportunity to take learning to the next level?

And can eLearning become part of an investment in informal learning?

I think Ray Jimenez (among several others, I’m sure) would say yes. He was on to something with his 3-minute eLearning methodology.

The idea is to offer short, focused snippets of instruction. If it takes a few minutes to complete, then it feels easy to access in the flow of work...or between tasks or meetings. Compare that to making time to complete a 1-hour lesson.

How might this work?

We could make short eLearning lessons available as a supplement to new employee training, or any other formal course. The formal course builds the foundational knowledge and skills required for the job, with short eLearning lessons available to help people advance their skills later.

Short eLearning lessons could also complement electronic job aids or a knowledge management site. While a job aid might outline a procedure and provide guidance for decision-making, optional eLearning might offer conceptual background, examples, or short practice opportunities to help people confirm understanding.

Do Gagne’s nine events fit in?

Probably...though not necessarily in the eLearning lesson itself.

A 5-minute eLearning lesson is unlikely to work through all nine events in a meaningful way. But, if eLearning is just one part of an informal learning experience, then learners will probably experience those events through other means.

For instance, if I’m seeking information about something, the informal learning experience already: (1) has my attention, (2) is aimed at some objective, and (3) has probably prompted me to recall prior knowledge, which likely led me to the point where I’m encountering the short eLearning lesson and deciding whether to complete it.

A combination of job aids, short eLearning lessons, and other available resources might present content and provide learning guidance. And chances are, I’ll apply my learning to my job...and hopefully receive some form of feedback.

But that’s just one way the pieces might come together.

If an eLearning lesson offers background or conceptual information to help a job aid make sense, the lesson’s role might be to stimulate recall and present content.

If the lesson shares examples, scenarios, demonstrations, or stories that help convey tacit knowledge about a concept or task, then the lesson’s role would align more with guiding learning.

Or, if a lesson provides quick practice opportunities to help learners ensure that they’re correctly understanding something new, then perhaps its role is to elicit performance and provide feedback.

Are you doing this?

Once you start thinking about it, there are numerous ways eLearning can enhance informal workplace learning. Do you have examples you can share? Or challenges you’ve encountered?

Sunday, March 13, 2011

What you don’t need to know about SCORM

by Jonathan Shoaf

The first time I heard the term SCORM was about 10 years ago.  I never paid it much attention.  I worked in the higher education industry and SCORM just wasn’t a big player there.  About 5 years ago the company I was working for had a product called Wimba Create and the term SCORM started swirling around in my head again.  Wimba Create is an add-on for Microsoft Word and can convert content to a SCORM package that can be used in a learning management system (LMS).  Anyone who uses Word can create a SCORM package.  But in higher education, not many of our customers were asking for SCORM.

It wasn’t until I started developing content for corporate training that SCORM became important.  While academic institutions have been slow to adopt self-paced learning (without a real live instructor), corporate institutions have seen the benefits of it for adult learners who have very specific job skills they need to learn.

My SCORM Story

SCORM is a standard that promotes interoperability of content between learning management systems.  I have recently benefited from developing content for this standard.  My organization was ready for online training, but was still in the process of choosing an LMS.  I was able to start developing content (to the SCORM standard) and use it with a temporary learning management solution while my organization went through the process of choosing an LMS.  When the new LMS was up and running, I was able to reuse the content I had developed for the temporary LMS.  Reusable content...that is the major benefit of SCORM.

While there are several versions of SCORM (the most common being 1.2 and 2004), two aspects common to all versions are important to understand: packaging and playing.  It is important to understand these basics; however, if you use a rapid development tool like Lectora or Captivate, you really don’t need to understand the technical details of SCORM.

The SCORM Package

A SCORM package is how all the content you have developed is tied together into a single file.  This is really just a compressed file with XML data that contains an organized map to the content files.  This is known as a package interchange file or PIF.  When this content is uncompressed on the LMS content server, it is considered deployed SCORM.  When this content is on a completely different content server than your LMS, it is often referred to as distributed SCORM.

A SCORM package must contain the following contents:
  • An XML manifest file called imsmanifest.xml
  • Any schema/definition files referenced by the manifest file (e.g., .xsd and .dtd files)
  • All content resource files to be used for the learning activity

Most developers who use rapid e-learning development tools don’t need to work on the details of the package itself.  You just build the project and let the tool handle the packaging.  There have been only rare occasions where I have had to manually add resource files to the zipped package.  If you are not using a rapid e-learning development tool, then you will need to know the details of how to create a SCORM package yourself.
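For the curious, here is a rough sketch of what a minimal SCORM 1.2 imsmanifest.xml can look like. The identifiers, titles, and file names below are hypothetical placeholders, and a real manifest also declares the ADL namespaces and references the schema files noted above; authoring tools generate all of this for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical minimal manifest; identifiers and file names are made up. -->
<manifest identifier="com.example.sample-course"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2">
  <!-- The organization is the "table of contents" the LMS displays. -->
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Sample Course</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <!-- Resources map each item to the actual content files in the package. -->
  <resources>
    <resource identifier="RES-1" type="webcontent" href="lesson1.html">
      <file href="lesson1.html"/>
    </resource>
  </resources>
</manifest>
```

Zip this file together with the content files it references and you have the package interchange file (PIF) described above.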

Running SCORM on an LMS

The SCORM standard provides a run-time Application Programming Interface, or API, for making your content work with an LMS.  SCORM-based content is broken down into Shareable Content Objects, or SCOs.  As the learner progresses through each SCO, the content uses the SCORM API to report the learner’s status to the LMS and to retrieve information such as where to resume in the content.

The LMS provides the SCORM-compatible API.  For example, if your LMS supports SCORM version 1.2, that means it provides the SCORM 1.2 API, and a SCORM 1.2 package should work with it.  Behind the scenes, the API is implemented in JavaScript (standardized as ECMAScript).

There are a few very basic calls you can make with this API.  These include initializing (starting communication with a SCO), terminating (ending communication with a SCO), getting or setting values (like the learner name, completion status, score, bookmarks, or other content variables), and committing data to the LMS (which essentially tells the LMS to save the data).

As with SCORM packaging, if you use a rapid development tool, you really don’t need to understand all the details here.  You just build the project and let the tool provide all the SCORM communication you need.  It is rare, when using a rapid e-learning development tool, that I have had to directly access the SCORM API.  The last time I made a direct call was to the LMSCommit function, which saves the current learner data to the LMS in SCORM 1.2.  This was done in the middle of a lengthy lesson so that the learner wouldn’t lose all their work if the system timed out or crashed.  If you are not using a rapid e-learning development tool, you will need to either use a third-party JavaScript SCORM library or create one yourself (not recommended unless you are an experienced JavaScript developer!).
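To give a feel for those calls, here is a minimal content-side sketch in JavaScript. The mock API object below stands in for the one a real LMS supplies (content normally locates it by searching for window.API in parent windows), and the bookmark and score values are made-up examples; the cmi.* element names are from the SCORM 1.2 data model.

```javascript
// Mock of the LMS-provided SCORM 1.2 API, for illustration only.
// A real LMS injects this object into the page; per the spec, each call
// takes/returns strings ("true"/"false" for success indicators).
const API = {
  data: {},
  LMSInitialize: function (s) { return "true"; },   // begin the session
  LMSSetValue: function (key, value) {              // store a data model value
    this.data[key] = String(value);
    return "true";
  },
  LMSGetValue: function (key) { return this.data[key] || ""; },
  LMSCommit: function (s) { return "true"; },       // ask the LMS to persist data
  LMSFinish: function (s) { return "true"; }        // end the session
};

// A typical content-side call sequence (values here are hypothetical):
API.LMSInitialize("");                                    // start communicating with the SCO
API.LMSSetValue("cmi.core.lesson_status", "incomplete");  // mark the lesson as started
API.LMSSetValue("cmi.core.lesson_location", "page-07");   // bookmark for resuming later
API.LMSCommit("");                                        // mid-lesson save, as described above
API.LMSSetValue("cmi.core.score.raw", "92");              // report a quiz score
API.LMSSetValue("cmi.core.lesson_status", "passed");
API.LMSFinish("");                                        // close the session

console.log(API.LMSGetValue("cmi.core.lesson_status"));   // "passed"
```

A rapid authoring tool emits essentially this sequence for you; the mid-lesson LMSCommit line is the kind of direct call mentioned above.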

Conclusion

I hope this gives you a better idea of the “magic” of SCORM.  Using a rapid e-learning development tool means you likely will never need to sweat the details.  Oh, and in case you’re wondering, SCORM stands for Shareable Content Object Reference Model.  Yeah...you can forget that.

Wednesday, March 9, 2011

Adapting 20th Century Training Models for the Future

By Jay Lambert

The Technology Association of Georgia's (TAG) Workplace Learning Society tried an interesting experiment recently by holding a discussion-only meeting on the topic of "Adapting 20th Century Training Models for the Future: Technology's Impact?" Clearly this is a hot topic, as close to 70 people attended, forming a standing-room-only crowd.

The questions posed were:
  1. What is your learning philosophy? For example, does your organization have a preferred model such as ADDIE, Kirkpatrick...? Are you learner-centric or value-centric?
  2. How has technology either assisted or become a barrier to executing that philosophy?

But while the conversation was supposed to cover adapting training models to our industry's current environment, it instead strayed into using new eLearning delivery tools, getting a seat at the C-level table, and philosophies on learning effectiveness. Looking back, I was one of the drivers of that detour.

So here are a few things I wish I had said about Adapting 20th Century Training Models for the Future.

We still use the ADDIE model.

For proof, see the 2010 recap of this blog organized around ADDIE.

Actually, we did modify it a bit and follow the DADDIE model. DADDIE stands for Define, Analyze, Design, Develop, Implement, and Evaluate. Define was taken from the DMAIC model and is basically all the important pre-project information that you need to understand before you can begin a learning project (things like goals, stakeholders, resources, where to begin, etc.).

The genius of the ADDIE model, in my opinion, is its adaptability. You can make it as complex or as simple as you need for a specific project. But in essence, how can you create eLearning, or any learning program, unless you have:
  1. Considered what content is necessary and gathered it (Analyze)
  2. Written the content to be effective and instruct what is necessary (Design)
  3. Actually built it (Develop)
  4. Put it somewhere where your intended audience can view it (Implement)
  5. Asked yourself whether what you did worked or not (Evaluate)

Perhaps what has changed with the new century is the speed at which we must perform these steps. Projects lasting 12+ months are mostly a thing of the past, except in rare circumstances. We live in an "I need this tomorrow" world.

So we still use ADDIE; we just move through the phases on a compressed schedule.

Technology enables us to be both learner-centric and value-centric.

A corporate eLearning director and I discussed this recently. One of his internal clients prefers publishing PowerPoint-driven lectures online with audio. They're very templated and look good, but you couldn't really call them eLearning by most standards. Still, he's created a rather large library that interested associates can refer to as needed. The topics are specialized, so people who want the information are finding it. The director suggested that perhaps that client is the true visionary; he's not creating anything extraordinary, but rather simply addressing the basic business need as efficiently as possible.

We write a lot of posts on this blog about using tried-and-true models and techniques to create better eLearning for the learner. It's vital that the learner be considered every step of the way. For example, what do they really need to know and apply to do their job successfully? What are their technology constraints for receiving training in the workplace? The list can go on and on.

Once we have the questions answered, technology enables us to quickly react and, as necessary, mass produce. To what level we mass produce should be part of the decision-making conducted in the Analyze phase. Some projects will demand that we fight for the learner and create true deep-dive learning experiences. For other projects, being value-centric might be sufficient and actually a good idea.

Think about this. Never before have we been able to so quickly respond to business learning needs and create training programs. If we can create process-driven solutions that address both the organization's and the learner's needs, then it's a win all around.

Value-centric advantages that are also learner-centric include:
  • Shorter, more specific training tends to be more effective
  • A templated approach often makes it easier for the learner to focus on the content
  • Reusable learning objects can be pulled into a learner's stream on an as-needed, just-in-time basis

Technology both assists and hinders our learning philosophies.
 
Learning strategist Erick Allen moderated the TAG Workplace Learning event. His opinion has long been that the emergence of rapid authoring tools is ruining our profession. Things are too fast; anyone can click Publish and now call it eLearning. It says so on the software box.

I think that there is definitely some truth to that. We must be careful and stay true to our core models. We must use technology to enable, not replace.

But if we do stay true to our models, such as ADDIE or DADDIE or whatever you follow, then technology can help us create some amazing things. We stand on the cusp of being able to develop whatever type of eLearning you can design -- immersive, augmented reality, mobile, just-in-time, and more.

Technology can make our training come to life. And we should embrace that throughout this century and beyond.

Sunday, March 6, 2011

Publishing in Adobe Presenter

By Dean Hawkinson

Adobe Presenter is a great option for creating SCORM-compliant eLearning material for a learning management system (LMS) using PowerPoint. You should, of course, follow all instructional design and adult learning principles, and be careful not to make your PowerPoint too wordy or uninteresting to your learners.

Your course should be engaging and keep the interest of your learners. Using audio narration with Presenter is a great option available for those presentations that might need it. In this post, we will look at some of the various LMS publishing options available in Presenter.

The Publish Menu

The Publish menu is located on the Adobe Presenter tab in PowerPoint. To publish your course, click the Publish button.

But wait!

Before you can publish, you need to configure your SCORM settings. These are found under the Quiz button by selecting Manage. This can be misleading: you always set your SCORM settings here, even if your course does not include an assessment.

If you do have an assessment, Manage Quiz is where you add and edit your questions. Once you are on the Quiz Manager page, selecting the Reporting tab takes you to the publishing and reporting options for the course.

A little disclaimer: Each LMS differs in reporting and requirements. Check with your LMS administrator for the proper settings for your LMS.

The first step is to ensure you have checked “Enable Reporting for This Presentation.” Then work through the following options (see image above):

• Select AICC or SCORM (depending on your LMS) under the Learning Management System (LMS) section. More on the Manifest in a bit.

• Select the appropriate option from Choose report data:

  • Report to Adobe Connect Pro – choose this only if you have Adobe Connect Pro and want to report using that software

  • Report quiz results only – the quiz score is passed on to the LMS

  • Report user access only – reports completion to your LMS if the learner accesses the course

  • Report quiz results and slide views – if your LMS can report both quiz score and slide views

  • Report slide views only – reports to your LMS the percentage of slides the user viewed in the course; you can set the percentage required for completion

• Set your Report Pass/Fail options in the Report Pass or Fail section, depending on whether you want pass/fail or complete/incomplete to show on the LMS for the course. This connects back to the report data option you just selected. If you are using a quiz score, for example, you’ll want pass/fail. If you are reporting slide views only, then complete/incomplete is your choice. You can also set the status to be based on the report data (in the Choose report data section).

• Select whether you want to report a score or a percentage in the LMS.

• The Reporting Level area lets you report either the score alone or the interactions and the score. This depends on the LMS; many, including Adobe Connect Pro, do not look for the interactions level and only report the score.
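These reporting choices determine what the published course writes to the LMS at runtime through the SCORM API. As a rough illustration only (this is not Presenter's actual generated code), here is a hedged JavaScript sketch of how a SCORM 1.2 course might set `cmi.core.lesson_status` for the two common cases above. The `api` object stands in for the API object the LMS provides, and the function names are made up for this example.

```javascript
// Stand-in for the SCORM 1.2 API object the LMS exposes to the course.
// A real course locates this object in a parent frame; here we mock it.
const api = {
  data: {},
  LMSSetValue(key, value) { this.data[key] = value; return "true"; },
};

// "Report quiz results only" with pass/fail: compare the quiz score
// against the mastery score and report a lesson status. (Hypothetical
// helper -- Presenter generates its own internal logic.)
function reportQuiz(score, masteryScore) {
  api.LMSSetValue("cmi.core.score.raw", String(score));
  api.LMSSetValue("cmi.core.lesson_status",
                  score >= masteryScore ? "passed" : "failed");
}

// "Report slide views only" with complete/incomplete: compare the
// percentage of slides viewed against the completion threshold.
function reportSlideViews(viewedPct, requiredPct) {
  api.LMSSetValue("cmi.core.lesson_status",
                  viewedPct >= requiredPct ? "completed" : "incomplete");
}

reportQuiz(85, 80);        // lesson_status -> "passed"
reportSlideViews(60, 100); // lesson_status -> "incomplete"
```

The point of the sketch is simply that pass/fail maps to quiz data while complete/incomplete maps to access or slide views, which is why the two dropdowns have to agree with each other.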

When you select SCORM in the Learning Management System (LMS) section, you need to enter the course information into the Manifest by selecting the Manifest button (see image below). This is the information that is passed to your LMS for display.

Select SCORM 1.2 or 2004, depending on your LMS. Add your identifier (usually something like a course code), the course title and description, and a version number if you are tracking versions. Your SCO identifier is used for sharing content across courses, with an identifier and title for searchability. Check with your LMS administrator to see if this is needed.
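Behind the scenes, the Manifest dialog populates an imsmanifest.xml file inside the published package. As a rough, simplified sketch of what that file looks like for SCORM 1.2 (the identifiers, titles, and file name below are made-up examples; Presenter's actual output contains considerably more detail):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Simplified SCORM 1.2 manifest sketch; values are illustrative only -->
<manifest identifier="COURSE-101" version="1.0"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Intro to Widgets</title> <!-- course title from the dialog -->
      <item identifier="ITEM-1" identifierref="SCO-1">
        <title>Intro to Widgets</title> <!-- SCO title, used for search -->
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="SCO-1" type="webcontent"
              adlcp:scormtype="sco" href="index.htm">
      <file href="index.htm"/>
    </resource>
  </resources>
</manifest>
```

The identifier and SCO identifier fields in the dialog correspond to the `identifier` attributes here, which is why your LMS administrator may have naming conventions for them.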

Once all of the information is set, you are ready to publish the course. Click Publish to open the dialog box (see image below).

You can save the published files to your computer, to Adobe Connect Pro, or even as a PDF file. The default location is the “My Adobe Presentations” folder on your own PC, which you can change. Once you select a location for the project, it remains the same every time you publish that project. It is still a good idea to double-check the location before you publish – especially if you sometimes publish to external drives.

Under Output Options, check “Zip package” to create a zip file with all of the necessary files for incorporation into an LMS. The first time you publish, however, it is a good idea to keep that unchecked and ensure that “View output after publishing” is checked. This opens the course automatically from the drive after you publish so you can perform quality assurance before creating the zip file for LMS integration.

When you’re ready, click the Publish button. That’s all there is to it.

Any other tips or tricks for publishing in Presenter? Feel free to share them below!

Wednesday, March 2, 2011

Inspiring the Passion to Learn

By Donna Bryant

Whether you’re in school or a professional on the job, learning something new can be challenging. Instinctively, we tend to stick to what we know rather than stretching to do something new. At any age, learning stretches the brain just as exercising stretches the muscles. This stretch is sometimes painful, but we all know it’s worthwhile.

Motivation to learn

When you set out to learn something new, what motivates you to even try? Why expend the effort, energy, and time? Often, learners have the same motivation to learn that designers have to design — it’s the job we have to do. This attitude can sap the passion out of learning. How can we get that passion back?

One major motivator is the confidence that comes with gaining new knowledge. Gagne’s sixth event (of his Nine Events of Instruction) is eliciting performance/practice. You can elicit performance through imaginative, guided play/practice. When learners can play with new knowledge using guided practice within a lesson, they can safely confirm that they’ve “got it.” Learners tend to have more passion and interest in tasks they do well. Successful practice builds confidence and enthusiasm for new knowledge. Learners are also more likely to remember what they learned because they have actually done it.

How can you build imaginative practice into your eLearning lessons?

Realistic scenarios that relate to your lesson’s subject are a good method, as are realistic games that support your learning objectives. Guided practice using online support information (such as a robust help lookup tool) can also provide enhanced practice time for learners. Why is play/practice time so important? Consider that learners:
• Need a safe place to try out new knowledge and gain confidence using it.
• Need a place where it’s okay not to have the right answers and they can safely learn to find those right answers.
• Need an opportunity to develop strategy and ideas for handling situations with a new tool or process.

Motivation to grow

How can learners continue growing in their overall knowledge and confidence as they transition to using new knowledge in the “real” world? Gagne’s ninth event, enhance retention and transfer, encourages providing ongoing support for learners as they transition to using their new knowledge in real situations. A few ideas to enhance retention and transfer:
• Offer follow-up support via an online Q&A or “community.” Most company intranets can host this type of site; check with your company to see what is possible. You could seed the site with stimulating, mind-stretching questions related directly to the new knowledge.
• Set up an expert panel about a month after the transition and arrange for a mini webinar or Q&A session. Allow learners to call in with their questions or participate in the webinar.
• Provide a play “sandbox” where transitioning learners can continue practicing new skills as time allows.

You might have other ideas for encouraging learning growth and continued confidence. If you do, please share them in the comments!