7

Testing and Implementation

You’ve done your analysis and you’ve designed a job aid. Now comes the time to get feedback on the initial draft and build in improvements. This constitutes the validation and troubleshooting phases, also called formative evaluation.

Formative Evaluation

Formative evaluation is the process of gathering feedback on the job aid and identifying ways to improve the design. It is critical to do a formative evaluation with any job aid you design. No matter how confident you are in the initial design, a job aid prototype always benefits from validation and troubleshooting. This is because the test of a job aid’s quality is not your comfort level with the design, but the users’ reactions and how well the job aid fits the work circumstances. A good instructional designer of job aids is capable of seeing things from the performer’s perspective. But ultimately, no matter how empathic you are—no matter how good you are at visualizing the performer’s challenges—nothing beats having the user take your job aid and put it through its paces on the job. This is the time when you involve the users and seek their input regarding the initial design.

Basic Rule 10

Always test the initial design of a job aid. You must confirm how well the design fits the users’ needs and work demands.

To validate the design, you test the job aid prototype and get input from potential users. There are several factors involved in testing the job aid prototype and getting feedback. Specifically, you need to validate three elements of the job aid: accuracy, usability, and alignment.

Job Aid Accuracy

Accuracy looks at the job aid’s content. An accurate job aid is one that is technically correct; there are no factual or functional mistakes. Although all elements of the validation are important, this is the most critical. If your job aid isn’t usable, it will be a wasted effort. If your job aid isn’t aligned, it won’t solve the problem. But if it is inaccurate, you will actively produce failure: performers will follow advice that is wrong or incomplete, which could also generate legal or liability issues. As a result, job aid accuracy is incredibly important—it all starts with being sure that the job aid content describes the task correctly.

Surprisingly, achieving accuracy with the job aid isn’t as easy as you’d think. Often, the translation from SME to designer (and then conversion to job aid prototype) leads to errors or gaps in the content. Sometimes a SME unintentionally describes a job process incompletely or incorrectly. This may be because the process has changed since the SME learned to do the task. Remember, a SME is a good performer because of their ability to get the work done. It is a rare SME who can also explain the work process clearly, concisely, and explicitly.

You will often discover that an initial task analysis that you and the SME are happy with is incorrect because of the SME’s inability to explain how the work is actually done and the steps that are really part of that task. These lapses often occur because the SME or key performer is an unconscious performer, unaware of the critical steps of the work because they do them naturally and automatically. In any case, it’s critical to confirm the accuracy of the job aid. Unfortunately, accuracy is a frequent problem because of the difficulty of precisely identifying what is and is not part of the task that the job aid seeks to address.

Think About This

Once you’ve developed a job aid prototype, plan on showing it to a different key performer or expert from the one in your original task analysis. You might discover that a new set of eyes can identify gaps or inaccurate steps that the original creators failed to see.

Generally speaking, there are several issues you’ll want to consider when you validate the accuracy of the job aid. First, is it technically correct? Is there any way that a worker or professional could argue that the content is factually wrong? Second, is the job aid functionally correct? Job aid content can be correct technically but fail to acknowledge some of the practical realities of the job. The question of functional accuracy gets at whether the content is accurate in regard to how the job is actually done (rather than what the policy and procedure manual states). Recognize that sometimes there might be an internal conflict between the technical and functional accuracy; you’ll need to push management to resolve it. Additionally, you’ll check for completeness: Are there any gaps in the process? Sometimes what is on the job aid is accurate as far as it goes—but it’s incomplete.

Job Aid Usability

Usability in the validation and troubleshooting phases involves determining if the job aid is workable and user friendly. If you have done a good job on the task analysis, you are likely to find only minor problems in this area. However, absent a job aid that scores well in the usability column, you will have spent valuable time and resources on a piece of work that only collects dust.

One element of usability involves language. Obviously, you should check for correct spelling and grammar. You’ll want to make sure that the language and use of terms are consistent. Are abbreviations explained in the job aid? Is the language appropriate for not only the performer, but also the work setting? For instance, high-stress situations or ones in which the job aid is to be used during the task call for a much simpler and direct use of language. This may mean composing in key phrases rather than in sentences. Ultimately, the biggest test for language and usability is clarity: How clear is the job aid when it’s used in the performance environment?

You can identify usability concerns through a variety of tests. Some language issues can be handled individually by the designer, by showing the job aid to a new set of eyes, or by turning to a professional editor. It’s certainly possible to show the job aid to a performer and ask for feedback about how easy or difficult it will be to use at work. However, the best usability testing involves giving the job aid to performers and observing their use and application (or misuse and misapplication) of the job aid. A close second would be to have the performers use the job aid and then report back, often using a feedback form you’ve designed, to collect the data.

Think About This

I like to do user testing where I can observe the performer on the job with the job aid. All sorts of unanticipated topics come up through observation: Is there an easy place to conveniently store and access the job aid? How much fumbling goes on before the performer can make sense of what it says? What environmental factors did I fail to take into account? What other elements of the work setting should the job aid tie into?

Basic Rule 11

The best form of usability testing is to give the job aid prototype to actual performers to see how they use it.

It is important to understand that people change their behavior when they know they’re being observed. When you give a job aid to a worker, you can’t assume the performer’s behavior is natural. Performer feedback in an artificial setting has some value, but you also need to find out how naturally the worker turns to the job aid or how easily it is integrated into the task. For that kind of insight, you need to create as natural a work setting as possible. Try to have the performer actually use the job aid at work (rather than pulling the worker away from the job to provide feedback). Look for ways to minimize your presence during the data collection process. It may be preferable to give the worker the job aid, walk away, and then collect feedback after the task has been performed by debriefing the performer.

Noted

Don’t deploy a job aid without doing validation. No matter how good a job aid looks after the first draft, without user testing and refinement you will be deploying a flawed product. The only justification for rolling out a job aid without first seeking feedback and revising it is an emergency so urgent that the consequences of leaving the performance uncorrected outweigh the risk of mistakes an unvalidated job aid might cause.

When you are observing the worker use the job aid, there are several critical things to keep in mind. First, do not help the worker use the job aid. If the performer encounters difficulty or confusion, do not walk them through the problems. You need to see not only what problems arise, but also how (or whether) the performer overcomes those barriers. If the job aid isn’t user friendly, you need to observe that, along with just how much of a barrier it poses to any potential use. Answering questions or showing the user how it works defeats the point of this step. Do not rationalize the need to coach the user by arguing that unless the worker can open the help menu or decode the format, you can’t evaluate other aspects of the job aid. If the worker can’t figure out how to use it, you need to see how much frustration they’ll tolerate before discarding or ignoring the job aid. Second, always intervene if the worker is about to make a dangerous or costly mistake. Your job aid trial should not put people in danger or cause damage to the work; stop the trial if either appears likely.

Basic Rule 12

When you observe the performer using the job aid, do not help the worker with the job aid. Any struggles the user has provide invaluable data for you and your redesign of the prototype.

As you observe or debrief a performer, it is important to take notes during the trial or the debriefing. Do not plan on observing the user and jotting down your thoughts and observations after it is all over. One of the more common errors with observation is the failure to completely and accurately capture information, whether from your own observations or from the feedback the user provides. It’s often a good idea to have a copy of the job aid with you as you observe the performer, assuming the job aid is a text-based document you can copy. That way, you can jot down notes and observations directly on the job aid.

Be sure to arrange participants for usability trials before you start the design phase and certainly before you get to the validation phase. As a designer, it is easy to assume that you won’t have trouble getting volunteers to try using the job aid. But, if you wait until after the first prototype is nearly finished to set up the trial and seek volunteer testers, you’ll delay the project significantly. You’ll almost always discover that everyone is in the middle of other work and it isn’t convenient for them to test your job aid draft. Consequently, you can add weeks to your project delivery by waiting until the last minute to pick testers and arrange the validation protocol. Do this work up front before you’ve begun the actual design so volunteers or assigned testers know to expect this commitment and can prepare for it.

Sometimes it’s not possible to actually observe performers use the job aid. That may be because much of the work is cognitive, and observation won’t tell you what is going through the worker’s head. Or the work occurs in the field or in locations where an observer isn’t permitted. In those cases, you may want to provide a feedback form to help performers be accurate and complete in the feedback they give. This means it is critical for you to anticipate the issues you want reactions to and then design a feedback form that encourages a systematic response instead of just asking for summary reactions.
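
As a purely illustrative sketch (these prompts are invented for this example and are not taken from Figure 7-1), such a form might ask the performer to complete the task with the job aid at hand and then respond to items such as:

•  Which step, if any, did you have to read more than once before you could act on it?

•  Were any terms, abbreviations, or symbols unclear?

•  Where did you keep the job aid while working, and how easy was it to reach when you needed it?

•  Did anything in the work setting (noise, interruptions, gloves, screen size) make the job aid harder to use?

•  Is anything on the job aid wrong? Is anything missing?

Prompts like these steer reviewers toward accuracy, usability, and alignment instead of a general “it looks fine.”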

It’s a good idea to provide a form for SMEs who are testing the job aid draft. Figure 7-1 is an example of a combination step-worksheet job aid developed to capture input from performers. Even though there are no calculations necessary, the worksheet helps channel user feedback into a particular format, making it easier for the designer to use that input. In this instance, the reviewers were examining a series of help screens and pull-down menus that had been added to a database to remind users of a series of functions. You’ll notice how the job aid lays out the steps necessary to use it. It also includes a “thank you” reminder directed to the reviewers. No matter how willing individuals are initially to serve as reviewers, when the time comes for their input it is always a good idea to make it clear that you’re grateful for their help with this project.

Basic Rule 13

Always intervene if it appears that the trial of the job aid could result in a costly error or potential danger.

Figure 7-1. Example of a Feedback and Review Form

For usability testing, in particular, you’ll need to do some thinking about which users you want to get feedback from. It is tempting to sign up the first warm bodies who volunteer to help with the validation testing. It is true that sometimes you’ll struggle to get anyone at all to review the job aid draft. Nevertheless, it’s critical for you to get the right reviewers. At a minimum, you should look for a fresh set of eyes—someone different from the original SME used in your task analysis—who will be called upon to use the job aid at work, rather than another trainer.

Technology-based job aids require even more thought on user feedback. For instance, with an EPSS or technology-based job aid, you should identify a combination of users, including:

•  a novice user, who is uncomfortable or inexperienced with the software or system

•  a power user, who knows the technology well but is not necessarily an expert at the task

•  a SME, who is an expert at the task and perhaps a key performer or exemplar

•  an IT or systems representative, who might discover problems, not with the individual use of the job aid, but with how it must be integrated into the technology or maintained by the system.

You need this range of testers for technology-based job aids because even simple job aids, such as a pull-down menu or an initial information screen, usually raise a wider range of potential “fit” issues; you need to be sure the job aid works both for the universe of possible performers and within the technology systems.

Job Aid Alignment

Alignment is the third element that you check during the validation phase. Alignment assesses whether the job aid addresses the performance gap or task originally identified. If you’ve done a good job on the task analysis, alignment is not likely to be a significant problem. Nevertheless, this does not mean that you can skip this part of the validation phase.

It is not uncommon to discover that workers are using your job aid in a manner you did not anticipate. For instance, a laminated pocket card might become a bookmark, a letter opener, or an ice remover. More typical examples involve situations where the job aid is appropriate for a particular task, but not the one it was originally designed for.

Let’s say a sales-closing job aid on how to answer customer objections has ended up as a job aid that provides reminders for potential buyers about program benefits. Because the focus of the job aid has changed (probably accidentally), the original performance problem—low sale-close rates because of weak responses to customer objections—hasn’t been addressed. Another instance might be where instead of using a job aid to reinforce performance on the job, it is given out at new employee orientation to reduce training time. But, the result is that existing performers don’t receive the new job aid and the new performers never get the chance to build the skills that allow the job aid to be effective. Therefore, it is important to re-examine the original justification for the job aid, the task or performance gap that was supposed to be addressed, and what the job aid actually ends up being used for.

Think About This

Alignment problems can happen easily with job aids. Sometimes this is a function of poor focus on the design end. It may be a result of user confusion as to the job aid’s purpose. A clear title and initial explanation at the top of the job aid can often eliminate many alignment problems that are a result of the performer using the job aid for the wrong task. If workers are intentionally using job aids for purposes they were not intended for, such alignment problems can only be solved by a good understanding of the work environment so that the job aid is not susceptible to misuse.

Part of the challenge in assessing alignment is that workers may hide how they are misusing the job aid. Or, perhaps it is not obvious that the job aid is being misapplied. For instance, consider the example mentioned earlier of a job aid designed to improve a sales-close rate by providing answers to common customer objections. You might observe the sales rep during a client meeting and see the sales rep refer to the job aid during the sales close. But, because the employee is using the job aid to talk about benefits of the product (and not to counter customer objections), it ends up being misapplied and the sales-closing problem isn’t solved. Although you would see the job aid referred to and the content used in the sales pitch, unless you listened carefully and knew the sales task well enough, you wouldn’t realize that the job aid had been applied incorrectly. This also illustrates how, if you had to depend solely upon performer feedback, you’d likely miss this issue because the employee probably isn’t aware that they’re using the job aid inappropriately.

The alignment assessment usually requires perceptiveness on the part of the designer. You need to determine if the job aid was used correctly and for the proper task. You’ll also need to judge whether the job aid is being misused covertly. In some instances, the way to obtain this information does not come from observing the worker with the job aid but by checking it after the workday is complete. For instance, are there strange stains or wear patterns on the job aid? Are job aids ending up in locations away from the work site or with employees who don’t do that task? Is the workforce running through the job aids at a faster rate than you expected? Should any of these scenarios be true, it’s not necessarily a negative consideration. Perhaps the job aid is more useful or needed across a broader spectrum than you and the client anticipated. You just won’t know that until you assess for alignment.

Building in Improvements

After you’ve tested the job aid design, you’ll need to incorporate the feedback into the design and make changes. This should be followed with another set of testing and user feedback. Sometimes you’ll discover that feedback you received, despite being well intentioned, is erroneous. Thus, the validation and troubleshooting phases of the job aid development process might involve some repetition as you work to get everything right.

Rollout

You’ve got a job aid you feel good about, and you’ve validated the design with a series of user tests and editing reviews. You’ve made a few modest changes based on the user input, and it looks like the job aid is ready to go. What now? You can’t just start handing it out to workers—or can you? Rollout and implementation strategies vary tremendously depending upon the workforce and nature of the job aid.

For starters, you may have to provide some kind of training to workers on how to use this job aid. As discussed in chapter 5, the third phase of the job aid development process is to determine if your job aid will require training and, if so, just how much and what kind. It is extremely rare that a job aid can simply be distributed to the workforce with no preamble or advance setup and preparation. Many designers think that their role is to create the job aid, and once that’s done, their role is over. Nothing could be further from the truth. Absent intelligent planning about rollout, the job aid will end up being misused or ignored by those it’s intended to benefit.

The choice of roll-out and implementation strategies depends upon many different variables. There is no standard approach for implementing a new job aid. Here are some of the questions that you need to answer to determine how to best implement a new job aid:

•  How widespread are the potential users? Is this job aid for a limited pool of performers, such as a team or one category of employees like administrative assistants, or will it be distributed across the workforce?

•  Does the job aid require some sort of training or orientation? If so, how extensive? Does this training require practice and role plays?

•  Is this a new job aid or a replacement of an existing one? If it’s a replacement, is it simply updated information, or does it call for workers to use the job aid in a different manner, such as moving from a paper copy to a tablet-based aid? What will be done with the old job aids? Is there a negative result, such as incorrect work, if performers continue to use the old job aid in preference to the new version?

•  Is the workforce used to relying on job aids or will this be a new experience for them? Is the job aid to be integrated into an existing family of job aids (so it’s consistent in look and feel to other products the workforce uses)?

•  Was the trigger for this job aid initiated by demands from the workers? Does the workforce have any idea that the job aid is coming? How enthusiastic do performers seem to be about the job aid? What did the user tests tell you about acceptance and buy-in?

•  What is the nature of the workforce? Are the workers centralized or spread out? Is there shift work or a strong reliance on part-time, seasonal, field-based, or co-located employees?

•  What did your task analysis and validation phases show to be the biggest barriers to effective use of the job aid? What kinds of resistance (intentional or unconscious) will you need to overcome? In what ways will employees need to change how they do things if they are to use the job aid appropriately?

•  Is there likely to be any embarrassment associated with using the job aid? Will there be peer pressure to avoid using the job aid?

•  Is the job aid to be stand-alone, integrated, or embedded? If it’s going to be integrated or stand-alone, what storage provisions need to be made, such as applying Velcro tape to all computer monitors to attach the job aid or making sure all employees have a place to put the laminated sheet? If it is to be attached to another piece of equipment, like a dashboard, computer, or ladder, is there another department that owns that tool? If so, do you need their agreement before you can attach the job aid?

•  Does frontline management know about this? If employees come to them with questions, will they be able to answer them? Will they be advocates for the job aid?

•  Are there any incentives for employees to use the job aid? What happens to the worker who doesn’t use it?

Think About This

If this job aid replaces a previously existing one (or is likely to be updated in the future), then version control is critical. Develop some kind of version index that is consistent with your other job aids. You are probably not going to be able to track down every single copy of the outdated job aid, so you’ll need to rely on workers to do some of that policing for you. That means the workforce will need to be clear on which version is the current one.
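
For example (this is a hypothetical convention, not a required format), every copy might carry a small footer such as “Customer Objection Card, Version 3.1, replaces 3.0,” along with the issue date. With a line like that on both the old and the new versions, a worker who finds two copies in a drawer can tell at a glance which one to keep and which one to toss.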

Acceptance and Buy-In

You need to focus on acceptance and buy-in from several different levels. At the front end of the project, there is the issue of management support for the job aid process. You might face managers who want to buy new toys or spend a lot on training instead of going with a far more efficient job aid. You might have clients who attempt to short-circuit the process. They could argue there is no need for a task analysis or validation phase. Finally, there is the acceptance of the users themselves. It might be necessary to overcome user resistance to the job aid. How can you do this?

Noted

The key point to understand about any roll-out initiative is that just because the job aid is self-explanatory doesn’t mean implementation planning is unnecessary. Even if a job aid is simple and straightforward, performers might push back if they show up for work one day and find it attached to their computer screens. Or management might feel that corporate headquarters is meddling. Roll-out issues are affected by the nature of the workforce, its habits, the organizational culture, and experience with other job aids. The rollout might be very simple (people come to work the next day to discover the job aid on their desks with a short note) or very complex and involved. Do not assume that the extent and type of implementation planning should be based solely upon how clear and intuitive the job aid is.

Start by selling the job aid to the highest level in the organization that you can. The person who contacts you about this work is your client, but you want to go above your client for more background and support. As a rule of thumb, go at least two levels higher than where your client is organizationally. Ideally you should go as high as you can in the organizational structure.

Once you start talking to people about the job aid, focus on performance. Talk about the current performance gap and what it’s costing the organization. Describe the performance the job aid is to focus on. Job aids usually have a very high ROI, especially in comparison with many other solutions such as training, organization development activities, or equipment or resource acquisitions. Comparing potential ROI across a range of options is a very compelling way to generate support for job aids as well as to boost your perceived value within the organization.
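
Here is a purely hypothetical illustration (all of the figures are invented for the example): suppose a performance gap is costing the organization $50,000 a year in rework, a training class for the affected group would cost $20,000 to develop and deliver, and a job aid would cost $2,000 to design, validate, and roll out. If either solution closed the gap, the training would return roughly ($50,000 - $20,000) ÷ $20,000 = 1.5 times its cost in the first year, while the job aid would return ($50,000 - $2,000) ÷ $2,000 = 24 times its cost. Even rough estimates like these make the case far more vividly than a simple claim that job aids are “cheaper.”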

On occasion, you might get some pushback from clients about the job aid development process. Some clients might argue that there is no need for a task analysis, validation, or even rollout. You will probably find that it is easier to sell clients on the need for those phases not by arguing they’re critical for a well-designed job aid, but by talking about them in terms of potential resistance and buy-in. Don’t tell management you need to talk to a SME to design it correctly because management is probably already sure they know what it needs to look like. Argue that by talking to a SME and involving the workers, they will be happier with the result and more likely to accept it. Most managers can probably think of plenty of times when resistance and the lack of buy-in were deadly. Therefore, referring to those issues is likely to be more persuasive than trying to sell them on the integrity of the process.

Basic Rule 14

Involve the union at the earliest possible stage of the job aid development process.

Besides going as high as you can organizationally, there are others in the organization who could help build acceptance and even generate excitement for the job aid. If the client organization is unionized, be sure to include the local union representative at the earliest possible opportunity. You can’t afford to make the union an enemy when it comes to job aids; employees simply have too much discretion about when to use or not use a job aid. If the union passes down a negative word, your job aid will not be used.

Conversely, if the union becomes an enthusiastic proponent of your job aid, you probably don’t even need official management support; bootleg copies would magically find their way to the shop floor and mysteriously reproduce themselves. Additionally, it’s ideal if the SME you get involved in the task analysis also happens to be highly regarded or well respected within the workforce. Having someone on board whom other workers look up to goes a long way toward gaining support for the job aid.

ROI analysis is usually a selling point for any job aid, especially with management. Simply put, a job aid is normally substantially less expensive than most other solutions. In securing support for your job aid, remember to sell to the interests of the particular person or group you’re dealing with, because each has different concerns. A key performer might initially have little interest in cooperating on a task that’s extra work. Supervisors might throw up barriers, reluctant to lend you a top performer and uncertain about changing how work is done. Performers might believe it’s a cut-rate solution or that they already know how to do the work. Management may have its own notions about what will solve the problem or how to create the job aid.

Basic Rule 15

If you have a choice, pick a SME for your task analysis and validation phases whom co-workers respect and look up to.

You need to look at what drives the interests of each party. For management, it might be ROI and a quick solution. For supervisors, it might be less worker downtime or the ability to couple the job aid with another supervisor priority. For performers, it might be making work easier. For a SME, it might be pride in work or an appeal to expertise.

In all instances, the client needs to own responsibility for the use of the job aid. As a designer, you can’t control whether the job aid is used by workers (or if supervisors support or discourage use). You can design the job aid to make it as easy as possible to use. You can create a roll-out plan that enhances the likelihood of use. Regardless, the client still has to own this issue. Otherwise, the client will focus on other, more exciting issues and the job aid will drop by the wayside.

How can you keep the client responsible for usage of the job aid? Part of this is in managing expectations from the very beginning of the job aid development process. You need to continually remind clients that the problem isn’t solved as soon as the job aid is created. There needs to be a roll-out process, workers need to use the job aid, and then it needs to be supported and maintained.

You can focus this responsibility through measurement. What gets measured gets done. As the designer, you’re likely to be held accountable (and evaluated) on whether the job aid is created. Push to have someone else accountable for the degree to which it is used. Additionally, evaluate the success of the job aid. Measure how much the job aid reduced the performance gap. In conducting this evaluation, you inevitably end up looking at usage, because you can’t judge how much impact the job aid is responsible for without knowing how widely it was actually used.
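
A simple, invented illustration: suppose the error rate on the task drops from 12 percent to 6 percent after rollout, but your follow-up observations show that only about half the workforce ever consults the job aid. The improvement among actual users is probably larger than the overall numbers suggest, and the biggest remaining opportunity is driving adoption rather than redesigning the aid. Without the usage data, you could not tell whether the job aid worked and was ignored, or was used and fell short.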

Getting It Done

Now that you’ve learned more about improving job aid designs and getting buy-in, you’ll have an opportunity to put some of this information into practice. Exercise 7-1 offers some activities that will give you a chance to apply what you learned in this chapter.

The next chapter will present evaluation as a means of measuring the success of your job aid in closing a performance gap.

Exercise 7-1. Thinking About Testing and Implementation of Job Aids

1. How could you identify a potential SME who also has a great deal of respect from co-workers?

2. Examine the job aid you developed in chapter 6. What would an appropriate validation process look like for the job aid? How would you collect the user input? What potential problems do you anticipate with the first draft?

3. How could you sell this job aid to management? What are possible barriers to use or acceptance that you anticipate with this job aid?

4. Given what you know about your client, what would an effective roll-out process look like for your job aid?

5. Can you think of any instances where it might make sense to have a “stealth” rollout with minimal publicity or advance discussion before job aids are disseminated to workers?