By Jay Lambert
There has been a lot of discussion, and an infamous article or two, in our field about the death of the ADDIE model. This came up again in the comments on my recent blog post, Adapting 20th Century Training Models for the Future.
As a reminder, ADDIE stands for Analyze, Design, Develop, Implement, and Evaluate. The naysayers have been saying that this process is too slow and archaic for our modern "this is due by 5pm today" times.
And if the attacks are based on the long timelines typically associated with ADDIE, then they make some valid points. It's sad, but true. Most of us have seen business sponsors' eyes glaze over when full project plans are introduced. Sometimes the business need warrants a more formal, model-driven approach; other times rushing to production is a necessity. But we still need to ensure that those rush projects meet their objectives.
As an industry, we must adapt. And models such as ADDIE help us do so.
ADDIE is the basic backbone of our processes.
As mentioned in an earlier post on this blog, ADDIE functions as the basic backbone of our industry processes. Just about everything we do, and every model we might use, fits within one of the phases of ADDIE. Consider:
Analysis enables us to identify the cause of a performance gap and formulate a plan for addressing it. It's during this phase that we define an organization's true business need and determine whether or not that need can be resolved via training. That's a pretty important task.
As learning practitioners, we must still have some form of Analyze in place; we just can no longer take three to six months to complete it. We must gather our data quickly and react to it even faster. Ignoring the Analysis phase is a recipe for disaster. Perhaps, out of necessity, we gather what information we have, make a best guess, and go for it, à la the Agile development method, but we must still take that step. If there is no Analysis, then it's likely that an initiative won't satisfy the true business need.
In my mind, Design, together with Analysis, is the most critical part of the process. For true learning to take place, the recipient must be engaged, motivated, pulled in. Whether you are delivering a 5-minute training snippet or a full curriculum, it must be designed well for both the content and the audience. We all know this is true by the number of anti-PowerPoint bullet rants we see on a weekly basis.
I'm a big fan of scenario-based and simulation eLearning; these take longer to Design, but the outcome is worth it. And I'd argue that even a quick 3-minute software simulation takes instructional design time to accurately show exactly what the user should be doing.
Develop and Implement
These two phases are likely where a good number of rapid eLearning development projects find themselves confined. To have eLearning, you have to have development and implementation. If a course is not built and subsequently rolled out, then no one is going to see it, much less benefit from it. But if a learning project starts and stops here, as tight timelines make tempting, then problems are likely to ensue.
As with each step of our sacred training models, what has changed in these phases, and what must be considered, is the speed. I advocate holding pilots, but that step is often dropped from project plans to enable faster implementation. If we consider joining ADDIE with the Agile development model, though, then perhaps each implementation is a form of pilot anyway.
Evaluation is the phase that is truly suffering in our fast-paced world. We implement; we move on. But as evidenced in many, many posts, articles, conversations, and presentations, we have to find some means to evaluate and effectively measure our programs. It's the only way to prove that our industry is making an impact within organizations; and, let's face it, it's the only way to keep our budgets intact so that we can continue with our efforts.
Why are training departments often gutted during down times? Because we don't do a good job of following through with evaluation and proving our critical business worth. Our initiatives are supposed to make things better for both the learner and the organization; evaluation is the only way to know whether that is happening. So yes, the need for Evaluate is still alive in the 21st century.
And it's interesting here locally that the March and April ISPI Atlanta and ASTD Atlanta chapter meetings both focus on learning measurement. And the May ISPI Atlanta topic is 'Predictive Evaluation.' Think evaluation is on our minds?
I also like to include a phase before Analysis: the Define phase, borrowed from the DMAIC model. This covers all the extremely important pre-project information that can come back to haunt you if you skip over it. Believe me, I know it's easy at today's pace to blow past this step and think you'll come back later; but don't.
By the way, adding Define makes this the DADDIE model; my team dislikes this acronym. If you know a better one, please let me know.
So what are your views on ADDIE? Are you still following its phases in some evolved state? Just please don't tell me it's dead.