How to conduct qualitative evaluation interviews
When it comes time to evaluate the training we produce, many of us take a hands-off approach with quantitative instruments such as:
- Exit surveys
- Course completion reports
- Objective (and therefore auto-gradable) assessments
- Quantitative metrics from the field, if we’re very lucky
But quantitative data can only tell us so much.
Quantitative data—from the Latin quantus, meaning “of what amount” or “of what size”—can tell us there’s a problem. But it can’t tell us what kind of problem. For that, we need qualitative data (from the Latin qualis, meaning “what kind”). And one of the quickest, least expensive ways to gather qualitative data is to conduct real-time interviews with learners and their supervisors.
Quantitative data can tell us THAT our training failed. Only qualitative data can tell us WHY our training failed.
The value of qualitative interviews
Qualitative interviews conducted somewhere between a few days and a few weeks post-training can help us understand our learners’ knowledge and behavior gaps in a meaningful, actionable way. With the data we gather from qualitative interviews, we can determine where our training failed, and how to fix it.
For example, let’s say we know our latest training wasn’t particularly effective. Nearly all of our learners completed our training, earned 80% or more on our assessments, and—according to our exit surveys—enjoyed their time in training. Based on these quantitative measures alone, we could reasonably conclude that our training was a resounding success.
And yet we know, from business outcomes, that learners aren’t applying the skills we trained in their actual jobs.
Unless we ask direct, open-ended questions of the only people who know why learners aren’t applying the skills we trained, we’re forced to guess—or, more likely, to shrug our shoulders and assume we’ve done all we can.
Instead, we might consider the following approach.
Step 1. Prepare to conduct a qualitative post-training evaluation interview.
- Plan to conduct the interviews a few days to a few weeks after training completion. Why such a short window? Because our goal is to identify knowledge and behavior gaps tied to training. The longer we wait past a couple of weeks, the harder it will be to tease out what’s related to training vs. to other factors.
- Plan to spend 15-30 minutes per interview. You’ll want to conduct one-on-one sessions, either face-to-face or virtually, with at least five learners.* If learners’ supervisors have been in a position to observe learner behavior post-training, interview them, too. *While this excellent article refers specifically to website usability, it’s applicable in this instance, as well.
- Send out calendar invitations.
- Type and print a two-page answer sheet for each interviewee. (Alternatively, use pen and paper.) At the top, type (or handwrite) the interviewee’s name, the date and time of the interview, and each of the questions you’ll be asking, leaving a generous amount of space after each question. You’ll use this space during the interview to handwrite the interviewee’s answers.
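If you’re preparing sheets for more than a handful of interviewees, a short script can generate a printable text sheet for each one. This is a minimal sketch; the names, questions, and amount of blank space are placeholders, not part of the original post:

```python
# Sketch: generate a plain-text answer sheet per interviewee.
# The questions and spacing below are illustrative assumptions.

QUESTIONS = [
    "Now that you've been on the floor for a while, what's still fuzzy for you?",
    "What resources have you found most useful for completing process XYZ?",
    "What are you still struggling with?",
]

def build_answer_sheet(name, when, questions=QUESTIONS, blank_lines=8):
    """Return answer-sheet text with generous space after each question."""
    lines = [f"Interviewee: {name}", f"Date/time: {when}", ""]
    for i, question in enumerate(questions, start=1):
        lines.append(f"Q{i}. {question}")
        lines.extend([""] * blank_lines)  # room for handwritten notes
    return "\n".join(lines)

sheet = build_answer_sheet("Alex Rivera", "2024-05-06 10:00")
print(sheet)
```

From here you could write each sheet to its own file and print it, keeping one sheet per interviewee as described above.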
Step 2. Write effective interview questions.
- Focus on applied behavior, not the training. GOOD: “Now that you’ve been on the floor for a while, what’s still fuzzy for you?” BAD: “What part of the training helped you most?” GOOD: “What resources have you found most useful for helping you complete process XYZ?” BAD: “What do you wish the training had included?”
- Write open-ended questions. If we attempt to guide or corral interviewees, they’ll be more likely to tell us what we expect to hear; but that defeats the point entirely. The goal of conducting a qualitative interview is to put interviewees in charge and let those critical-to-understanding-and-fixing-the-issue curve balls come out of left field. GOOD: “What are you still struggling with?” BAD: “Which steps about the XYZ process are still unclear for you?”
- Stick to a handful of questions. Aim for somewhere between three and eight.
- Use the language and terms your interviewees speak (vs. edspeak). GOOD: “What gaps are you still seeing in your direct reports’ knowledge?” BAD: “What Bloom’s verbs are you seeing applied in authentic contexts?”
- Don’t send your prepared questions to interviewees prior to the interview. Top-of-mind responses are much more likely to be both accurate and relevant.
Step 3. Conduct the interview.
- Begin by thanking each interviewee for taking the time to meet with you. They’re doing you a huge favor; let them know it.
- Explain that your goal is to talk about their experiences and pain points in the field—not about the training. This helps interviewees focus on their daily gaps and successes vs. on training materials and facilitators. (It’s very common for employees to know and have positive feelings about training facilitators; but these feelings, while both positive and legitimate, are irrelevant to the discussion at hand and are likely to skew the interview results.)
- Reassure interviewees that there will be no blow-back. Experience has taught some employees that when they answer questions truthfully and express something that could be construed as negative, bad things happen. Reassure them that this will not be the case with this interview (and mean it).
- Ask clarifying questions if necessary, but don’t argue or defend. Your goal is to gather information, experiences, and opinions, not provide them.
- Don’t type answers while interviewees are talking. Look at them—engage—converse. Glance down briefly from time to time to jot their answers in pencil (just the main points, not entire sentences), taking advantage of the space you left on each answer sheet for that very purpose. Doing so shows your interest in connecting with, listening to, and learning from interviewees (vs. filling in blanks for yet another report that will be ignored).
Step 4. Analyze feedback & act.
- Type up your notes as soon as possible. If you can, type them up right after the interview. If you scheduled back-to-backs, do so later, but ideally before the end of the day. The act of typing up your handwritten notes while they’re fresh helps you decipher your handwriting, recall nuances, make connections, and spot patterns more easily.
- Look for patterns. An easy way to do this is to tally answers. For example, write down one answer (e.g., a call center employee struggled to figure out which system to use to answer a specific customer question). As you run across more interviewees who identify the same struggle with the same question, add to your tally. At the end of your analysis, you’ll have a list of specific gaps and the percentage of interviewees each gap affects.
- Tie patterns back to training specifics. If several people are unclear about a specific concept or struggle with a specific task, revisit how this concept or task was supported in training. You may find you need to clarify and repeat specific concepts; articulate steps more clearly/completely; provide more authentic practice for specific tasks; and/or provide better reference materials for specific concepts and tasks.
- Change your instruction based on interview feedback, even if it means starting from scratch. Otherwise, interviewing was a waste of everyone’s time. Worse—it taught your learners that any future feedback they provide will likely be ignored, too.
The bottom line (TLDR)
At best, all quantitative data can tell us is the extent to which there’s a problem with our training. If we want to identify the problem and fix it, we must gather qualitative data, analyze it, and act on it.
What’s YOUR take?
Do you have a different point of view? Something to add? A request for an article on a different topic? Please consider sharing your thoughts, questions, or suggestions for future blog articles in the comment box below.