In one very important way, instructional materials are similar to screenplays, concert set lists, and computer source code: whether we refer to it as a dress rehearsal, a run-through, or a quality check, each must be reviewed and tested thoroughly prior to “go live” if our goal is a high-quality result that meets our audience’s expectations.
For instructional materials targeted for asynchronous delivery, those tests fall into three categories. (Thinking of these tests as the three leaves of a clover can help us remember them.) The tests are:
- A navigation test, the goal of which is to make sure the materials are accessible and navigable.
- A content test, the goal of which is to make sure the content itself (irrespective of layout and design) is as accurate, clear, concise, and complete as possible.
- A usability test, the goal of which is to make sure that our audience—learners unfamiliar with the content and presumably unschooled in test scripts and use cases—can access, navigate, and successfully complete the materials with minimum frustration and few or no helpdesk calls.
This article explains how to conduct all three tests prior to release and also how to test materials targeted for synchronous delivery—the Cadillac version of instructional testing approaches, if you will. (If all you’ve got time for is the Pinto version, however, head straight to the SME-conducted content test, which will generate maximum value for the time and effort spent.)
How to conduct reviews & testing for ASYNCHRONOUS materials
Instructional materials commonly delivered asynchronously (online self-serve) include text, images, videos, audio clips, ungraded interactivities, and graded assessments.
1. Package your instructional materials exactly as you intend to package them for learners. For example, if you’ll be delivering any part of your instruction online, upload those materials to your production environment. If your audience will be accessing your materials through a learning management system, upload them to your LMS and configure them. The goal here is to set up the instruction just as completely and accurately as though you expect a paying audience to access it.
2. Conduct a NAVIGATION TEST and apply the results. Ask either a fellow instructional designer or someone who works in IT at your organization to exercise the packaged materials by clicking every link/button/hotspot and jotting down (and then giving you) the results. Your technical reviewer isn’t particularly interested in accuracy from a content perspective, but instead is testing navigation and playback by looking for issues such as:
- Are there any user ID or password issues getting into the materials?
- Do all the links work?
- Do all the images display (and, if not, do sensible ALT tags appear)?
- Do navigation buttons such as “next” and “back” work predictably?
- For videos, is the audio synchronized with onscreen visuals?
- Is the audio loud and clear enough to make out?
- For videos with closed captions, are the onscreen captions both accurate and synchronized with the audio?
- Are there any glaring typos (onscreen, in transcripts, or in any linked-to file) or other obvious errors, such as clearly mislabeled images or unexpected sequencing?
- Does anything else strike the tester as confusing? Good technical reviewers will jot down the point at which they struggle to figure out where to go or what to click next. Their job isn’t to explain how to fix the materials, but instead simply to call out the points of confusion.
3. Conduct a CONTENT TEST and apply the results. After you’ve had the chance to fix any navigation or playback issues your technical reviewer identified, ask one or more SMEs to review your updated materials for content accuracy, clarity, and completeness. Ideally, ask the SME(s) who helped you create the materials; doing so increases your SMEs’ feelings of investment in the project and communicates unequivocally that their opinions matter. Ask your SMEs to work through the materials and identify any inaccuracies, gaps, or points of potential confusion:
- In any of the instructional content (including all text, images, and videos)
- In any of the graded or ungraded assessments (including the relevance of each assessment question/activity and the accuracy of the feedback that displays for both correct and incorrect performances). SMEs should also be on the lookout for assessment questions that aren’t covered in the instructional materials.
PRO TIP: If you only have time and resources to conduct one round of review and testing, make it a CONTENT TEST. Your SMEs will likely catch at least some technical errors along with content-related edits. And you’ll be strengthening your relationship with your SMEs, too, which will benefit both this project and the next.
4. Conduct a USABILITY TEST. After thoughtfully updating your instructional materials based on SME feedback, ask one or more people as similar to your target audience as possible to participate in a short review session.
- Ask your volunteer testers to access your materials under conditions similar to those you anticipate when your project goes live. For example, if your target audience will be using laptops, ask your testers to use laptops; if you expect your audience to use iPads or phones, ask your testers to use iPads or phones; if you expect your audience to be logged into a virtual private network, make sure they’re logged in prior to testing; and so on.
- Sit with each beta tester individually and ask him or her to access and work through your materials independently, reacting and thinking out loud as much as possible while you act as a “fly on the wall” (in other words, while you avoid interacting, influencing, or explaining). Your role here is to be unobtrusive and take good notes. Can your testers find your materials easily? What’s your testers’ first reaction? Do they know where to start? Can they identify next steps easily? Are your testers able to view all the videos, download all the documents, scroll through all the text content, and complete and submit all the assessments? Can they figure out how to view assessment grades or other feedback? Do they find interacting with the materials straightforward and appealing? If they run into technical issues, can they resolve them alone, or (if not) can they identify whom to contact for assistance? Look and listen for signs that your testers are frustrated, bored, confused, or skipping content.
How to test SYNCHRONOUS materials
Testing the delivery of synchronous materials, such as live lectures, saves your audience time and helps you deliver a more polished, effective presentation.
1. Complete at least two dry-runs (volunteer audience members optional) all the way through all slide decks. As you present your slide deck (to a human volunteer, your dog, or the mirror) you’ll likely find you need to adjust slide content or order. Pay special attention to sequencing: on-slide content should always read left-to-right and top-to-bottom, so if you notice yourself changing the order in which you present content as you practice your presentation, change the slide content to fit the presentation order that makes sense. (Order might seem like a minor issue, but it actually plays a significant role in driving audience understanding and recall.) In addition, if you find yourself saying “we’ll cover that later” more than once or twice as you practice the presentation, review your slide deck with an eye toward reshuffling slides. Slides should always be organized so that the presentation flows logically, anticipating and proactively addressing likely audience questions.
2. For virtual lectures, demos, or meetings: invite three people to a test meeting set up in Zoom, GoToMeeting, Google Meet, Webex, or whatever other web conferencing software you intend to use for your target learner audience. If you’d like to present another dry run of your material, this would be a good time to do it! But if you or your volunteer attendees are pressed for time, just make sure that all three attendees can:
- Locate the link for, access, and log into the virtual meeting.
- See and hear the presentation.
- Interact via microphone and webcam.
- Submit messages to the meeting chat.
- Take polls (if you intend to poll audiences as part of your presentation).
- Meet in breakout rooms (if you intend to use breakout rooms, which can be useful for small-group discussions, in your presentation).
- Collaborate using a white board (or any other collaboration app you intend to use during your presentation).
- Download files (if you intend to provide downloadable files during your presentation).
The bottom line
Many of us work in environments characterized by short deadlines, understaffed teams, and challenging workloads—all of which can make testing our instructional materials seem like a luxury option, not a bare-bones requirement.
But here’s the thing. While it’s understandably tempting to want to skip the time and effort required to test our instructional materials and jump straight to rollout, the truth is that it’s impossible to skip the testing step. That’s because the definition of “testers” is “the first people to interact with our materials and shake out the bugs.” So if we choose not to work with volunteer testers, we’re actually choosing to require our first learner cohort to serve as testers. And that’s not fair. (It also doesn’t do a lot for our reputation.)
Taking a few days to deliberately test all the work that went into our project, using volunteer testers, won’t just deliver a higher-quality experience for our audience; it will also reflect well on us, our team, and our organization—and, as an important side benefit, it will help to strengthen our relationships with our SMEs.
What’s YOUR take?
Do you regularly conduct reviews or tests of your instructional materials? If so, how long does the process take? Have you found that getting manager buy-in is a factor in justifying the time to conduct reviews? Please consider sharing your best tip (or worst frustration) in the comment box below.