CHAPTER 9

The Three Management Reports

In this chapter we explore the three TDRp management reports—the program report, the operations report, and the summary report—which we introduced in chapter 1. These reports are specifically designed to enable L&D leaders to run learning like a business, so it will be helpful to begin by briefly describing what we mean by that.

Running Learning Like a Business

A foundational belief underlying TDRp is that learning, and all HR functions, should be run like a business. By this we mean that HR and L&D leaders should manage their functions with business-like discipline. In its simplest form, this requires that leaders set a plan at the start of the year with specific, measurable goals, and then execute the plan with discipline throughout the year to come as close as possible to delivering the planned results. This is exactly the same discipline that our colleagues use in other departments such as sales and manufacturing. This approach creates accountability for setting a good plan and for executing it, both of which are necessary if L&D is to earn and keep a seat at the table.

We also assume that the appropriate performance consulting and needs analysis are done at the beginning of each engagement and both formal and informal learning solutions are considered. The expectation is that many learning programs will contain a blend of both formal and informal learning, so the resulting recommendations may include ILT, vILT, e-learning, mobile, content on the employee’s portal, communities of practice, and performance support.

We examine this approach for strategic programs (aligned directly to organizational goals to deliver agreed-upon outcomes), non-strategic programs (important programs but not usually aligned directly to a top goal), and department initiatives (designed to improve the efficiency and effectiveness of all programs or department processes) as shown in Table 9-1. For each, we discuss creating the plan first, followed by its disciplined execution, including the recommended management report.

Table 9-1. Recommended Reports

Type of Program           Type of Report
Strategic programs        Program report
Non-strategic programs    Program report
Department initiatives    Operations report

Strategic Programs

We define strategic programs as learning programs that directly align to the top goals of the CEO and head of HR. These programs will address the most important priorities of the organization included in its business plan. Examples of these high-level goals typically involve revenue, quality, efficiency, leadership, engagement, and retention.

The Plan

Running learning like a business starts with creating a good business plan for the year. The plan should support the vision and mission statements for both the department and the organization. Further, since L&D is a support function, the plan needs to align with the organizational goals for the coming year. Some L&D departments have the opportunity to be more strategic than others, but every department should look for opportunities to directly support the organization’s goals for the coming year as well as future years.

The first step is for the CLO to meet with the CEO well before the new year begins to discover three things:

•  the organization’s goals for the new year

•  the goal owner for each goal

•  the CEO’s priorities for the goals.

Following this discussion, the CLO should have a prioritized list of goals and goal owners. Next, the CLO needs to talk with the head of HR to learn about the HR goals for next year and their priorities. (In many organizations, the L&D function dedicates considerable effort to onboarding employees, providing basic skills training to new employees, advanced skills training to experienced employees, and compliance training to all employees. These important activities should also be captured and will be discussed in the next section on non-strategic programs.) An example of the results from the first step for an L&D department focused just on business and HR goals is shown in Table 9-2.

The second step is to meet with each goal owner to determine if L&D has a role to play in achieving the goal or meeting the need. Sometimes it will, other times it will not. If it appears that learning programs may help achieve the goal, the L&D team can pursue it further with the goal owner’s staff.

Table 9-2. Top Organizational Goals for the Next Fiscal Year

Priority   Business Goals                               Goal Owner
1          Increase sales by 10%                        SVP of Sales: Kronenburg
2          Reduce injuries by 20%                       SVP of Manufacturing: Swilthe
3          Reduce operating costs by 5%                 COO: Goh

Priority   HR Goals                                     Goal Owner
1          Increase employee engagement by 3 points     SVP HR: Wang
2          Decrease regrettable turnover by 5 points    SVP HR: Wang
3          Implement new performance mgt system         SVP HR: Wang

The third step, assuming learning has a role to play, is for the CLO to meet again with the goal owner and share the recommendations of L&D and the goal owner’s staff. Several meetings may be required before all parties are comfortable with the agreed-upon learning initiatives.

The fourth step is to reach agreement on all the specifics (such as target audience, timing, learning objectives, and modality), roles and responsibilities of both parties, an outcome measure (or other measure of success), and a plan or target for all key efficiency and effectiveness measures required to achieve the outcome. This step is critical to running learning like a business because this is where the learning function and the business agree on specific, measurable goals. L&D cannot set these without input from others since the goal owner and supervisors in the goal owner’s organization will play a significant role in communicating the reasons for the training, selecting the right audience, ensuring the audience takes the program, and most importantly, reinforcing the learning after the participants return to their jobs. In other words, a strategic learning program (one aligned directly to a goal of the organization) will only succeed if both parties plan and execute it.

Table 9-3 on the next page illustrates a well-conceived learning plan to help achieve the sales goal of the organization. At the top is the organizational goal or outcome (increase sales by 10 percent) and the name of the goal owner. (Since we are running learning like a business, the plan is business-centric, meaning business measures come before learning measures.) Immediately below the organizational goal is the learning outcome measure, which is “impact of learning on sales.” This is the Phillips Level 4 isolated impact of training, which the goal owner and L&D program manager have agreed upon. In this case, both are comfortable with a plan for the learning to contribute 2 percent higher sales, assuming the content is properly designed, developed, communicated, delivered, and reinforced.

Notice that the learning plan contains informal learning (community of practice and content available on the employee portal) as well as formal.

Below the outcome measures are the key learning programs and their associated efficiency and effectiveness measures, which L&D and the goal owner will manage monthly to deliver the planned result of 2 percent higher sales. The first program is new product features training, which L&D will develop and deliver. The second program is consultative selling skills, which L&D must also develop and deliver. Key efficiency measures for both programs are the completion dates for development and delivery, unique and total participants, and completion rates. L&D will also capture cost for both programs combined. Key effectiveness measures for both programs are Levels 1–3. Note that the Level 4 isolated impact (2 percent higher sales due to learning) already appears as the outcome measure for the impact of both programs together at the top of the report. Likewise, Level 5 net benefit and ROI measures are at the bottom showing the impact of both programs combined.

A roles and responsibilities document should complete the plan. We have provided an example in appendix B.

Table 9-3. Learning Plan for Achieving an Organizational Goal for Sales

Ideally, L&D should work with the goal owner to complete the plan and have the business approve it before the design and development begin. For programs that need to start early in the year, the plan should be completed and approved before the new year begins.

Note

The larger the agreed-upon impact from learning, the earlier in the year L&D must deploy the programs to have sufficient time in the fiscal year to deliver results. A program deployed in the fourth quarter of the fiscal year cannot have much impact in that year. Also, the larger the agreed-upon impact from learning, the greater the effort must be from both L&D and the goal owner’s organization, especially the supervisors.

Disciplined Execution

The second part of running learning like a business is disciplined execution of the plan, and this is where management reporting comes in. Disciplined execution requires the active, ongoing management of learning to deliver planned results. No matter how good the plan is, it will not implement itself. Furthermore, there will usually be unforeseen challenges and problems that L&D and the business must address to deliver the planned results. The only way to successfully execute a program is to continually manage it by comparing YTD results to plan and forecasting how the year is likely to end without any additional, unplanned effort.

Once the year is underway, there are only two questions a leader should be asking:

•  Are we on plan year to date?

•  Are we going to make plan for the year?

The leader should ask both questions for each measure. Start with the outcome measure if reliable YTD information is available. Typically, it will be available for an organizational outcome measure like sales but not always for the learning outcome measure. In this case, proceed directly to the efficiency and effectiveness measures. Employing a chain of evidence approach, if all these measures are on plan, we can be more confident that the outcome measure is on plan as well. Typically, some measures will be on plan, some will be exceeding plan, and some will be behind plan. Management attention should first focus on any measures that are behind plan.

It is possible, however, to be on plan YTD but still miss plan by year-end. Conversely, sometimes the programs take longer to launch than planned so measures are behind plan YTD; however, they are expected to still end the year on plan. This is why the forecast is so important. It represents the team’s best thinking of how the measure is likely to end the year, which in turn will dictate whether leaders need to take corrective action now. We illustrate the possibilities in Table 9-4.

Table 9-4. Disciplined Action

YTD Results     Forecast: Below Plan    Forecast: On or Ahead of Plan
Behind plan     Take action             No action required
On plan         Take action             No action required
Ahead of plan   Take action             No action required

There are three cases that warrant action, and all three occur when the forecast indicates results will fall short of plan. The first step is typically an analysis to better understand why the forecast is falling short of plan. Upon further examination, the leadership team may be comfortable waiting for more information or may change the forecast to be on plan. If leaders deem they need to act, the next step is to identify how to achieve plan for the year. The leadership team will have to evaluate the cost of implementing these steps and find the required additional resources, which may need to come from another program. This is the real work of active management: analyzing the issue, generating ideas to address it, identifying required resources, and reallocating resources from another program (if necessary) to get this program back on track, all of which entails trade-offs and opportunity costs.
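The decision rule in Table 9-4 is simple enough to express in a few lines of code. The sketch below is purely illustrative: the measure names and values are hypothetical, and treating cost as a measure where exceeding plan (rather than falling short of it) triggers action is our assumption, not something specified by TDRp.

```python
# Minimal sketch of the Table 9-4 decision rule: act whenever the forecast
# indicates the measure will miss plan, regardless of YTD status.
# Measure names and values are illustrative only.

def needs_action(forecast: float, plan: float, higher_is_better: bool = True) -> bool:
    """Return True when the forecast indicates the measure will fall short of plan."""
    return forecast < plan if higher_is_better else forecast > plan

measures = [
    # (name, plan, forecast, higher_is_better)
    ("Actual application rate (%)", 90, 85, True),
    ("Total participants", 6000, 6400, True),
    ("Cost ($ thousands)", 500, 520, False),  # assumption: overspending is the problem
]

for name, plan, forecast, better_high in measures:
    status = "Take action" if needs_action(forecast, plan, better_high) else "No action required"
    print(f"{name}: plan {plan}, forecast {forecast} -> {status}")
```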

As shown in Table 9-1, the measures for strategic programs are captured in the TDRp Program Report, which is explored in detail later in this chapter.

Non-Strategic Programs

Non-strategic programs are important, often enterprise-wide initiatives to address key organizational needs like onboarding, basic skills training, and compliance. Typically, neither the CEO nor the head of HR will have a high-level goal for these. Nonetheless, they are very important to the organization’s success, and the learning department is usually tasked with them. In addition, non-strategic programs encompass other courses or programs that may not have the same high level of visibility but are still important and need to be managed. Examples might include courses for business acumen, communication, and innovation. Of course, many other courses will be offered, but our focus here is on just those that L&D and the business need to actively manage (setting plans for all critical measures and reviewing progress against plan monthly).

The Plan

The plan to address these needs will be similar to that for the strategic goals. The CEO or head of HR may have raised the need for learning’s support in the discussions about strategic goals. More often, learning has been responsible for these needs for years, so it is simply a matter of confirming next year’s plans with the program owner, who may be the head of HR, the chief operating officer (COO), or some other high-level leader. For some programs, the CLO may be the owner. Table 9-5 lists some of these types of organizational needs.

Table 9-5. Important Organizational Needs

Item   Important Needs                              Program Owner
A      Provide basic training for new employees     COO: Goh
B      Onboard new employees                        SVP HR: Wang
C      Provide compliance training                  COO: Goh
D      Diversity and inclusion                      SVP HR: Wang
E      Business acumen                              CFO: Davis
F      Communication                                CLO: Parks
G      Innovation                                   Chief Engineer: D’Agoto

While the next step for strategic programs is for the CLO to meet with the goal owners to determine if L&D has a role to play, here we generally already know that it does and is expected to meet the need. (If this is not the case, then L&D should conduct a needs analysis and discuss with the owner the appropriate role of learning.) The next step is to discuss whether L&D should change the program and whether the business has emerging or new requirements (for example, new compliance training). Even if the business requires no changes to the program, L&D may need to make changes based on the size or location of the target audience or the manner in which learners want to consume the content.

After the program owner and L&D agree on the program for the coming year, L&D will need to establish plans for each key effectiveness and efficiency measure, just like for strategic programs. For most non-strategic programs, however, there will not be an outcome measure. L&D and the program owner may choose an effectiveness or efficiency measure as a headline measure of success to appear at the top of the report.

Table 9-6 is an example of a learning plan for onboarding.

Table 9-6. Learning Plan for Onboarding

There is no outcome measure shown in Table 9-6 because onboarding is not a strategic program. The goal here is to deliver the training as efficiently and effectively as possible. However, we do show two measures of success: number onboarded and leader satisfaction with the onboarding (the leaders receiving the onboarded employees). Note that the first is an efficiency measure and the second is an effectiveness measure. The program owner and L&D have decided to highlight these measures as the “headline” measures of success. The plan includes informal learning (content on the employee portal) as well as formal learning.

Table 9-7 shows a second example to illustrate the flexibility in creating a good business plan. In this case, the need is to increase the business acumen of leaders and associates throughout the enterprise. CFO Davis (the program owner) and L&D have chosen not to highlight any measures of success, but they have crafted a plan showing detail by program, with the first program focused on business acumen for leaders and the second focused on associates. Notice that the program includes efficiency and effectiveness measures for informal learning (performance support).

Disciplined Execution

Leaders should execute the plan for non-strategic learning just like the plan for strategic learning. Once the year is underway, leaders need monthly reports to compare YTD progress against plan and to compare forecast against plan. Whenever the forecast indicates that plan may not be achieved, learning leaders need to analyze the root causes, identify potential solutions, cost them out, and decide what action, if any, to take. The only difference is that with non-strategic programs, leaders will not be managing an outcome measure.

Non-strategic programs should be managed through the TDRp program report, just as strategic programs are. While there won’t be an outcome measure, an efficiency or effectiveness measure may be highlighted as a headline measure of success. Some organization leaders prefer to use owner satisfaction or leader satisfaction (both effectiveness measures) as measures of success in place of an outcome measure. If the program owner has a goal to reduce the duration of the program, duration may be used as a measure of success. For non-strategic programs it is possible to have multiple headline measures (like goal owner satisfaction and duration).

Department Initiatives

The purpose of department initiatives is to improve the efficiency and effectiveness of processes and systems (like informal learning, the LMS, or a help desk) or to improve efficiency and effectiveness measures across all programs (like the utilization rate of classrooms or e-learning, or Levels 1 and 3). While the management approach is different than that for strategic and non-strategic programs, department initiatives still require a plan and disciplined execution for success.

Table 9-7. Learning Plan to Increase Business Acumen

The Plan

At the department level, the CLO and leadership team will decide what efficiency or effectiveness improvements to make in processes or systems, or in all programs. These will be tactical rather than strategic decisions and are usually left up to the CLO. In other words, unlike the plan for strategic programs, the CLO does not need to start by asking the CEO for input, and the initiatives will not be aligned to organizational goals unless the CEO has a goal for increasing productivity. And, unlike strategic programs, there will be no outcome measures and no need to talk with goal owners. The focus here is on improvements internal to the L&D department or to aggregate efficiency and effectiveness measures.

To begin the planning process, the CLO simply needs to articulate what is targeted for improvement, perhaps after getting feedback from staff. If there are too many suggestions, the CLO will need to prioritize them.

The next step is to flesh out the initiatives so learning leadership has a clear picture of what they can accomplish, at what cost, and with what potential impact. This usually leads to some reprioritizing and eventually a short list of initiatives to include in the department plan for the coming year. A director, program manager, or staff person will be assigned responsibility to lead or at least coordinate the effort. Each department initiative should have specific, measurable goals to improve the primary measure and all important complementary and ancillary measures. (See chapter 7 for a discussion of primary, complementary, and ancillary measures.)

Table 9-8 shows a sample plan for department process and system improvement initiatives. This plan reflects the CLO’s desire to dramatically increase the use of three types of informal learning: communities of practice, portal content, and performance support. The CLO also wants much better performance metrics for the help desk and plans to provide additional training to four help desk employees. To improve the LMS, the IT function plans an upgrade at mid-year, which will require additional training for staff. As a result of both initiatives, L&D expects a significant improvement in user satisfaction.

This plan is organized by initiative, but it could also be organized just by efficiency and effectiveness measures.

While Table 9-8 focused on specific process and system improvements, the CLO also might choose to implement initiatives to improve efficiency or effectiveness measures across all programs. Table 9-9 shows a sample plan for achieving or improving key efficiency and effectiveness measures for all programs.

In this example, L&D has targeted five efficiency measures and three effectiveness measures for improvement in 2021. The CLO wants to significantly increase both unique and total participants and also wants to dramatically improve the on-time completion rates for development and delivery. L&D plans a small increase in reach as well. For the effectiveness measures, L&D has planned significant improvements at all three levels.

Table 9-8. Sample Department Plan for Process and System Improvements

Table 9-9. Sample Department Plan to Improve Efficiency and Effectiveness Measures for All Programs

Disciplined Execution

As with strategic programs, the success of department initiatives depends on disciplined execution as well as a good plan. The philosophy for disciplined execution is the same and requires reporting both YTD results and forecast so the initiatives can be managed to success. Each month, the CLO and department leaders must answer the same two questions:

•  Are we on plan?

•  Are we going to end the year on plan?

Inevitably, some, if not most, of the initiatives will show measures falling short of plan throughout the year. Leadership will then have to employ the same steps to decide whether to act:

1. Analyze the issue to determine the cause.

2. Identify possible solutions.

3. Identify the resources required to implement solutions.

4. Decide on action steps and re-allocate resources as needed.

5. Implement actions.

Measures for learning department initiatives to improve efficiency and effectiveness are captured in the TDRp Operations Report, which is explored in detail later in this chapter.

The TDRp Management Reporting Framework

With this background on running learning like a business, we are now ready to examine the TDRp management reporting framework in greater detail. The three TDRp reports were specifically designed to provide leaders with the measures and information they need to execute with discipline. Each report includes the measures at the appropriate level of aggregation and the following information about each measure:

•  unit of measure

•  last year’s results (if available)

•  plan for this year

•  YTD results for this year

•  comparison of YTD results to plan

•  forecast for this year (optional but strongly recommended)

•  comparison of forecast to plan.

Some practitioners may wish to add YTD plan and a comparison to YTD results to make it easier to determine if a measure is on plan. Management reports sometimes also include results for the most current month and plan for the month. (All four of these require seasonal adjustments to the data, which is covered in chapter 11.)
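For readers who keep these reports in a spreadsheet or a small script, the information listed above maps naturally to one row per measure. The sketch below is a minimal illustration rather than part of TDRp itself; the field names, the last-year value, and the convention of expressing both comparisons as simple differences from plan are our assumptions.

```python
# Minimal sketch of one measure row in a TDRp management report.
# Field names and the example values are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MeasureRow:
    name: str                   # e.g., "Actual application rate"
    unit: str                   # unit of measure, e.g., "% of participants"
    last_year: Optional[float]  # last year's result, if available
    plan: float                 # plan for this year
    ytd: Optional[float]        # YTD result for this year
    forecast: Optional[float]   # forecast for this year (optional but recommended)

    def ytd_vs_plan(self) -> Optional[float]:
        """Comparison of YTD results to plan."""
        return None if self.ytd is None else self.ytd - self.plan

    def forecast_vs_plan(self) -> Optional[float]:
        """Comparison of forecast to plan."""
        return None if self.forecast is None else self.forecast - self.plan

row = MeasureRow("Actual application rate", "%", last_year=62, plan=90, ytd=67, forecast=85)
print(row.ytd_vs_plan(), row.forecast_vs_plan())  # -23 and -5 points versus plan
```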

While all three reports are designed for the purpose of management, each has a different use and audience as well as other characteristics. Table 9-10 presents an overview of the TDRp management reports.

Table 9-10. Comparison of the Three TDRp Management Reports

The program report is designed to help the program manager and CLO manage the programs in support of a particular organizational goal or need. For example, if learning can support the goal to increase sales, there should be a program report to manage that effort. The operations report is designed to help the CLO and senior L&D leaders as well as designated leads manage improvement initiatives. The operations report will include only those measures to be managed—not monitored or used to inform. Last, the summary report is intended to share high-level, aggregate data for key measures with senior organization leaders as well as employees. It also shows the strategic alignment of programs to organization goals.

With this background, we revisit the TDRp management reporting framework first shared in chapter 1 (Figure 9-1).

Figure 9-1. TDRp Management Reporting Framework

The summary report is at the top left with all three types of measures feeding it. It will be at an enterprise level with aggregated data and no learning jargon. The program and operations reports are to the right with a narrower audience. All three types of measures feed the program report, but the operations report contains only efficiency and effectiveness measures.

We are now ready to examine each TDRp management report in detail, beginning with the program report.

Program Reports

Program reports apply to both strategic and non-strategic programs. The template is the same except that most non-strategic programs will not have an outcome measure. We start with the strategic program report and then move to the non-strategic report.

Strategic Program Report

The program report is a natural place to start since nearly every L&D department has at least one organizational goal it can support or one important need to address through training. The program report brings together all the data for the key programs and key measures in support of one goal. A department should have one program report for each organizational goal supported by learning. L&D should generate reports monthly and provide them to the program managers, their teams, the CLO, the goal owners, and the goal owner’s staff to manage the programs.

A program report should include the organizational goal supported by learning, the goal owner’s name, and the outcome measure for learning—all at the top of the report. The programs to achieve the learning outcome are shown next. Each program will typically have at least one efficiency measure and one effectiveness measure. Usually, there will be several of each and the most common for strategic programs include:

•  Efficiency measures

  Unique participants

  Total participants

  Completion date

  Completion rate

  Cost

•  Effectiveness measures

  Participant reaction (Level 1)

  Goal owner reaction (Level 1)

  Learning (Level 2)

  Intent to apply (Level 3)

  Actual application (Level 3)

  Net benefit and ROI (Level 5).

Note

Level 4, impact, is the outcome measure for learning. It appears at the top of the report and is almost always reported for all the programs combined in support of the goal rather than for each program.

The program report needs to include all the key measures that must be actively managed to deliver the learning outcome, which is why the list of measures is so important. It is hard to imagine how programs could be managed without these measures.

Simple Strategic Program Report

A simple program report is displayed in Table 9-11. Following the philosophy of running learning like a business, the organizational goal is front and center at the top of the report. In this case, the goal owner, Swilthe, and L&D have agreed that training should help reduce injuries. The organizational goal or outcome measure is to reduce injuries by 25 percent, a significant increase over the 8 percent reduction made last year. Both parties (Swilthe and L&D) have agreed that it would be reasonable to plan for a high impact from the learning, assuming it is properly designed, developed, communicated, delivered, and reinforced. High impact is a qualitative outcome measure, meaning the learning will be responsible for most of the 25 percent reduction in injuries. (Note in the report that “impact of learning on injuries” in row 3 is the learning outcome measure and that there is usually just one learning outcome measure per program report.)

Table 9-11. Example of a Simple Program Report

The programs to reduce injuries come next, along with their key efficiency and effectiveness measures. In this simple example, there are just two programs, with the first consisting of two courses. First, the two courses must be developed with a completion date (efficiency measure) of January 31. Second, the two courses must be delivered to 3,000 unique participants and 6,000 total participants (each unique participant will take two courses, so 3,000 × 2 = 6,000) by March 31. Furthermore, the two courses must be very effective to have a high impact on injuries. Specifically, 80 percent of participants should rate them favorably, 90 percent should pass the knowledge test on the first attempt, and most importantly, 95 percent should indicate they will apply it and 90 percent actually should apply it.

The effort also calls for performance support tools to be developed, deployed, and used. The plan is for 20 tools to be deployed, and all 3,000 unique employees are expected to use at least one tool. The plan calls for an 80 percent favorable rating.

These plans or targets for each measure represent the best thinking of both parties about what will be required for training to have high impact and significantly reduce injuries. If the training is not deployed to the entire target audience, if it is deployed later in the year, or if it is not applied at a very high rate, the learning will likely not have a high impact, and both parties as well as the CEO will be disappointed. Since this is a strategic program in support of one of the CEO’s top goals, both parties will need to actively manage the program for success.

The program report contains the information they need to do just that. In this case, the learning outcome measure is not directly observable, so there is no YTD result for it. However, we can look at the key efficiency and effectiveness measures to get a sense for whether the outcome measure is on plan. The two courses were developed on time, and by March 31, 92 percent of the total participants had completed the courses, falling just short of the planned 100 percent completion by that date. Participants have reacted very favorably to the courses, exceeding plan by five points. And they have had a higher than anticipated first-time pass rate at 95 percent, five points above plan. Use of performance support tools is also on plan. So far, so good. The bad news, however, is that the intent to apply rate is only 75 percent (20 percentage points below plan), and the actual application rate is only 67 percent (23 percentage points below plan). Without higher application rates, the training will not have its intended impact, and thus the YTD comparison to plan for the outcome measure is “below plan.” Furthermore, since the training was planned to be the dominant driver of reduced injuries, the organization is behind plan YTD with only a 10 percent reduction.

These results are for March, and it would have been apparent earlier in the year that application was below plan. Consequently, both parties would have already implemented additional steps to reinforce the learning and address any other causes of low application. The all-important forecast column shows that both parties are cautiously optimistic that the training may still deliver a reduction in injuries near plan (a 22 percent reduction, which would be only 3 percentage points below plan). The application rates are forecast to improve considerably, to 90 percent for intent and 85 percent for actual application, both 5 percentage points below plan but a big improvement from March. Swilthe has also identified an additional 200 employees for the training (for a total of 3,200 unique participants), which will help make up for the slow start.

So, had the program report been produced in February or March, it would have alerted both parties that the completion rate was a little slow, and more importantly, that the application rate was far below plan and likely to result in the organizational goal not being achieved. The report shows sample size, so the leaders would have known when the sample was large enough for the application rate to be statistically reliable and management action was required. (The March sample sizes allow a very high level of confidence in all the effectiveness measures. In other words, the actual application rate may not be exactly the 67 percent reported from the sample, but since the sample is 2,145 out of 2,800, the error margin around 67 percent would be very small and likely less than 1 percentage point.)
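The error margin cited in the parenthetical can be checked with standard survey arithmetic. The sketch below uses the normal approximation with a finite population correction; the 2,145 sample and 2,800 population figures come from the example above, and the 95 percent confidence level is our assumption.

```python
# Sketch: approximate margin of error for the 67% application rate,
# using the normal approximation with a finite population correction.
import math

p, n, N = 0.67, 2145, 2800                 # observed rate, sample size, population
fpc = math.sqrt((N - n) / (N - 1))         # finite population correction
se = math.sqrt(p * (1 - p) / n) * fpc      # standard error of the proportion
margin = 1.96 * se                         # 95% confidence margin (assumed level)
print(f"Margin of error: about +/- {margin * 100:.1f} percentage points")  # ~1.0
```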

Complex Strategic Program Report

Program reports often need to be more complex to contain all the important programs and measures. The format is the same but typically there will be multiple programs and more activities that need to be managed. More complex program reports also benefit from a summary at the bottom, which provides totals of some key measures as well as those like cost, net benefit, and ROI, which are best presented in total rather than by program.

A more complex program report with three programs and more measures is shown in Table 9-12.

Table 9-12. Example of a More Complex Program Report

In this example, the goal is to reduce injuries by 20 percent. Once again, Swilthe is the goal owner and both parties have agreed that learning will play a significant role in achieving the goal. This time, however, they agree on a quantitative outcome measure for learning; namely, that learning will contribute 70 percent to the reduction in injuries (Level 4 isolated impact), which translates to a 14 percent reduction in injuries due to learning alone (70% × 20% = 14%). Two formal learning programs are planned along with an informal learning program. The first is to deploy two existing courses to factory A. The second is to develop and deploy three different courses to factory B. Each program is designed to address the unique causes of injuries in the factories. Third, performance support tools will be developed and deployed in both factories to help reduce injuries.

Since Program A is already developed, there are no development measures. The efficiency measures for delivery focus on unique and total participants and completion date with a plan to be 100 percent complete by March 31. Effectiveness measures include participant and goal owner reaction, learning, and application. Program B has efficiency and effectiveness measures for both development and delivery. For development, the efficiency measure is the deadline of March 31 for all three courses and the effectiveness measure is goal owner reaction of 4.5 out of 5.0. For delivery, the plan calls for the same efficiency and effectiveness measures as Program A with similar plans for each except completion date, which for Program B is 100 percent by July 31. Program C to deploy performance support has two efficiency measures and one effectiveness measure. The efficiency measures focus on the number of performance support tools and the unique users.

The summary at the bottom captures the total number of courses developed as well as the total number of unique and total participants. The summary also shows the cost, net benefit, and ROI for all three programs combined. Some program reports might also show one measure for goal owner satisfaction in the summary rather than for each program.

Unlike the case in the simple program report, the YTD results look good, except for application, which is slightly behind plan. The new courses were developed on time, participation rates are close to plan, and Levels 1 and 2 are near or better than plan. Given the YTD results for key efficiency and effectiveness measures, we would expect the outcome measure for learning to be close to plan. If no reliable data were available, we would probably characterize the YTD impact of learning as slightly below plan just due to the application rate. However, assume in this example that reliable data do exist for the learning outcome as a result of a question about impact being asked in the post-event and follow-up surveys. The confidence-adjusted isolated impact from participants so far is running at 73 percent, which multiplied by the observable 13 percent reduction in injuries yields a 9 percent YTD reduction in injuries due just to learning.
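The arithmetic behind that last sentence is worth making explicit. The short sketch below simply multiplies the confidence-adjusted isolated impact by the observed reduction and compares the result to plan; the figures come from the example, and the rounding convention is ours.

```python
# Sketch of the YTD impact arithmetic for the complex program report example.
isolated_impact = 0.73     # confidence-adjusted contribution of learning (from surveys)
observed_reduction = 0.13  # observed YTD reduction in injuries
plan_contribution = 0.70   # planned contribution of learning
plan_goal = 0.20           # planned total reduction in injuries

ytd_due_to_learning = isolated_impact * observed_reduction  # about 0.095, reported as 9%
plan_due_to_learning = plan_contribution * plan_goal        # 0.14, the 14% plan
print(f"YTD reduction due to learning: {ytd_due_to_learning:.0%} against a plan of {plan_due_to_learning:.0%}")
```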

The forecast reflects that both Swilthe and L&D are confident the year will end on plan. Swilthe believes injuries will decline by 20 percent, and both parties agree that learning will contribute about 70 percent of the total, or a 14 percent reduction in injuries due just to learning. They have based their forecast for outcomes on the YTD results for impact as well as the forecasts for all key program measures, which show the year ending near plan. In this case, some actions were taken earlier in the year to address the application issue, but all other measures were on track without any special action. The summary shows that cost will be near plan, but net benefit will exceed plan, resulting in a slightly higher than plan ROI.

Creating the Strategic Program Report

Create the program report by working closely with the goal owner, as shown in Figure 9-2. The starting point, of course, is step 1: discuss with the goal owner (like the head of sales) whether learning has a role to play. If it appears that it does, then learning professionals can work with the goal owner’s staff to develop a recommendation for learning programs to help achieve the goal. At this point, you should share the recommendation with the goal owner for feedback and modification, and this may take several iterations before everyone is comfortable. In step 2, reach agreement on the measures to use. Step 3 focuses on reaching agreement on a plan (a number) for the learning outcome measure as well as for the key effectiveness and efficiency measures required to achieve the planned outcome. For example, how many participants must complete the training, by when, and with what level of application, if learning is to have the planned impact?

After you complete the first three steps, you can create the report. Start at the top and work your way down. In step 4, fill in the goal, the goal owner’s name, and the learning outcome measure (this is the impact of learning). Next, add last year’s actuals if they are available and the plans for the organizational goal (like a 10 percent increase in sales) and the learning outcome measure (like 2 percent higher sales due to learning).

Figure 9-2. Eight Steps to Create a Strategic Program Report

With the top portion complete, in step 5, add the names of the programs and the effectiveness and efficiency measures associated with each, along with their units of measure. Next, in step 6, add the plan numbers for each effectiveness and efficiency measure.

If there is a summary section at the bottom, complete it. In step 7, for all rows in the report, set the forecast column to equal plan to start. So, at the start of the year, the report will show forecast being on plan. You can update the forecast for a measure whenever you have new information that leads you to believe the existing forecast is no longer the best guess about how the year is likely to end.

Once you have the first month’s results, fill in the YTD column in step 8. Data on the organizational goal will come from a company system or from the goal owner. Data on the learning outcome measure will come from post-event or follow-up surveys if a question was asked about estimated impact. If not, learning professionals and the goal owner can make an educated guess about whether the programs are on track to deliver the planned outcome by looking at the effectiveness and efficiency measures. If they are all on plan, the assumption may be made that the outcome measure is also on track.

Non-Strategic Program Report

The non-strategic program report is just like the strategic report except that the program will address an important need rather than a high-level goal; consequently, there typically will not be an outcome measure. Instead of “goal owner,” the person with ultimate responsibility for the program’s success is the “program owner.” Generally, this will be a senior leader like the COO, head of manufacturing, or CFO, but in some cases may be the CLO. The program owner should not be the program director or manager within L&D.

Just like the strategic program report, the non-strategic program report is designed to bring together all the data for the key programs and key measures in support of the need. A department should have one program report for each organizational need supported by learning if the intent is to actively manage it. L&D would generate the reports monthly and provide them to program managers, their teams, the CLO, the program owners, and the program owner’s staff to manage the programs.

A program report should include the organizational need supported by learning and the program owner’s name—all at the top of the report. The programs to address the need are shown next. Each program will typically have several efficiency measures and several effectiveness measures. If there is only one program or course, the report may move immediately to the key effectiveness and efficiency measures. The most common include:

•  Efficiency measures

  Unique participants

  Total participants

  Completion date

  Completion rate

  Cost

•  Effectiveness measures

  Participant reaction (Level 1)

  Goal owner reaction (Level 1)

  Learning (Level 2)

  Intent to apply (Level 3)

  Actual application (Level 3).

The program report needs to include all the key measures, which leaders must actively manage to meet the need effectively and efficiently.

Simple Non-Strategic Program Reports

Next we show a simple non-strategic program report for onboarding (Table 9-13). In this example, the report highlights two headline measures of success: number successfully onboarded and an increase in the satisfaction of the business unit leaders with the onboarding. YTD results are available through June.

Table 9-13. Sample Non-Strategic Program Report for Onboarding

In this example, the need is to onboard 700 employees, an increase over the previous year. Wang, SVP of HR, is the onboarding process owner. Wang and L&D have agreed to use the number onboarded (an efficiency measure) and the satisfaction of the leaders of the new employees (an effectiveness measure) as the “headline” measures of success.

Both parties agree that two new courses must be developed by January 31 and in use by February 28 to provide important new content. These will replace three older courses and help achieve the plan of reducing the program’s duration by 11 days. Five new instructors will have to be hired and trained, and existing instructors will need to be updated. Furthermore, the plan calls for content on the portal to play an important role in onboarding and three efficiency measures have been identified to manage that. A cost target has also been agreed upon.

For effectiveness measures, both Wang and L&D believe it is possible to increase the participant reaction score by 10 points and to increase the first-time pass rate by four points. L&D hopes all these actions will lead to higher satisfaction on the part of the SVP of HR.

June YTD results show that 46 percent of planned participants have been onboarded and forecast shows everything is on plan to onboard all 700 by December 31. Receiving unit leaders are already much happier with the newly onboarded employees (75 percent favorable versus 67 percent last year), and forecast shows the year should end at 80 percent favorable, on plan. The two new courses were deployed ahead of plan, but L&D is having difficulty hiring new instructors. Nonetheless, the forecast shows that the remaining two are expected to be hired. New facilitator training and refresher training for existing facilitators are both on plan. The program has already met the 50-day target for duration, and cost is in line with plan.

The participant reaction measure indicates that this year’s participants are slightly more satisfied than last year’s (75 percent versus 70 percent), but greater improvement has been targeted and the forecast indicates the effort will fall short of plan for the year by 5 points. This may be caused by larger-than-planned class sizes due to the delays in hiring instructors, or there may be other issues; the program manager needs to investigate to see what can be done to get closer to plan in the second half. Level 2 has made good progress in the first six months and is on target to reach plan by year-end. Program owner Wang is happy with the progress made so far this year (4.2 versus 3.9 in December), and L&D is still hoping to get a 4.5 rating at year-end.

The example in Table 9-14 shows a business acumen course. Note that no headline measures of success are featured, but two courses are shown, each with its own effectiveness and efficiency measures. Since there are no headline measures, the need and program owner are centered. YTD results are available through April, which in this example means that results are not available (N/A) for numerous measures.

Table 9-14. Sample Non-Strategic Program Report for Business Acumen

In this example, program owner and CFO Davis has asked for L&D’s help to increase the business acumen of all employees. The two parties agree that learning has a role to play and that different programs should be deployed for leaders and employees. The plan is to deploy to leaders first, conveying both content and guidance on how to coach their employees and reinforce their learning. Once the leaders are ready, their employees will take a course. Performance support tools are also planned to provide help at the time of need. No measures have been singled out as “headline” measures of success.

Plans call for developing and delivering two courses to 100 leaders by May 31. All leaders are expected to complete the two courses, and CFO Davis will ensure that they do. There is no history to guide plan setting for the effectiveness measures, but similar courses have achieved 80 percent favorable ratings from participants with initial pass rates of 90 percent on the knowledge tests. A 90 percent intent to apply is very high but possible given CFO Davis’s commitment and the leader-first model. It is hoped that after 90 days, 80 percent will have displayed some business acumen from the training.

One course has to be developed for the associates by April 30; it needs to be tested and approved for production by May 31 when the leaders complete their two courses. Deployment to all 10,000 associates is planned by September 30, which will require a lot of coordination. All associates are also expected to complete the program. Total cost, including opportunity cost, is expected to be $300,000, with a net benefit of $200,000, resulting in a high ROI of 67 percent.
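The net benefit and ROI figures above follow directly from the cost and benefit estimates. A minimal sketch of the arithmetic, assuming the standard ROI formula (net benefit divided by cost) and inferring the gross benefit from the figures given:

```python
# Sketch of the ROI arithmetic for the business acumen example.
# Cost and net benefit come from the text; the gross benefit is inferred.
cost = 300_000                      # total cost, including opportunity cost
net_benefit = 200_000               # planned net benefit
gross_benefit = cost + net_benefit  # implied gross benefit of $500,000
roi = net_benefit / cost            # Level 5 ROI
print(f"ROI = {roi:.0%}")           # 67%
```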

April YTD results show that 85 of the 100 leaders have completed the first course and the forecast indicates that all will complete on plan, plus an additional five. Expectations are that the second course will also be delivered and completed on time by all 100 leaders. Level 1 for the leaders is running four points above plan and should end the year slightly above plan. (The sample size is large enough to confirm YTD results are truly above plan.) First-time pass rates aren’t meeting expectations, but there are plans in place to improve these slightly by year-end. Intent to apply is 6 points below plan, and L&D believes it is unlikely that the plan of 90 percent can be achieved for the year. In hindsight, the plan may have been too aggressive, and the forecasts for both the intended and actual application rates have been lowered to reflect that.

The performance support tools are still being deployed so YTD usage is low, but the forecast indicates they are on plan for the year.

Data for the associates reflects that this portion of the program has not yet been deployed. Forecasts for this section of the report indicate that L&D and CFO Davis continue to believe the plan is achievable. The same goes for the summary at the bottom.

Creating the Non-Strategic Program Report

L&D creates the non-strategic program report by working with the program owner, which in some cases may be the CLO (Figure 9-3). Step 1 is always to meet with the program owner to review expectations for the year. Even when the program has been conducted for many years, there will likely be some change required for the coming year if only in the number of participants. Often L&D will need to update course content or add new modules to keep the training current and relevant.

Figure 9-3. Steps to Create a Program Report for Non-Strategic Programs

Step 2 is to agree on the programs and courses for the coming year, including program specifics (such as objectives, modality, duration, and audience), key effectiveness and efficiency measures, and roles and responsibilities. Both parties might also choose to select one or two effectiveness or efficiency measures to use as headline measures of success, elevating them in importance over the rest of the measures. These measures will appear at the top of the report, where the outcome measure for strategic programs would normally appear. For example, L&D and the program owners might choose the number of participants or some measure of satisfaction for the program headline measures of success. In an existing relationship, the discussion is likely to focus on any changes that need to be made. If it is a new program or new owner, more discussion will be required and it may take several meetings before agreement is reached on all items.

Step 3 is to reach agreement on plans for the effectiveness and efficiency measures, including the headline measures. Once the first three steps have been completed, the report itself can be created. Start at the top and work your way down. In step 4, fill in the need, program owner name, and the headline measures if there are any. Next, add last year’s actuals if they are available and the plans for the headline measures, such as a 10 percent increase in number of participants or a 3 point increase in the program owner or senior leader satisfaction with the program.

With the top portion complete, in step 5, add the names of the programs (if there are multiple programs) as well as the effectiveness and efficiency measures associated with each and their units of measure. Next, in step 6, add the plan numbers for each effectiveness and efficiency measure.

For all rows in the report, in step 7, set the forecast column to equal plan to start. So, at the start of the year, the report will show the forecast being on plan. You can update the forecast for a measure whenever you have new information that leads you to believe the existing forecast is no longer the best guess about how the year is likely to end.

Once you have the first month’s results, fill in the YTD column in step 8.

Operations Report

The operations report is the next report most users are likely to create. The primary users of this report are the CLO and senior L&D leaders who manage efforts to improve efficiency or effectiveness measures across all programs as well as manage initiatives to improve the efficiency and effectiveness of department processes and systems. The operations report includes both efficiency and effectiveness measures but no outcome measures because department initiatives are not aligned directly to a top-level organization goal. There will be just one operations report, which L&D should generate monthly.

Simple Operations Report

A simple operations report might include efficiency and effectiveness measures only for improvement efforts aimed at all programs, or only for efforts aimed at learning processes and systems. It might also include a few measures from each type of improvement initiative. A more complex operations report would typically include more measures, such as measures from each type of improvement effort.

We begin with a simple operations report for program initiatives (Table 9-15). In this example, all the initiatives are directed at improving efficiency and effectiveness measures across all programs.

Table 9-15. Simple Operations Report for Program Initiatives

This example contains five efficiency measures and three high-level effectiveness measures (Levels 1, 2, and 3). The plan is to provide learning to more unique participants and to increase the average number of courses taken by each unique participant so that the number of total participants rises even more. The CLO also wants to significantly improve on-time course development and delivery and slightly improve the percentage of employees reached by learning. For effectiveness measures, the CLO wants to improve all three levels.

The operations report shows mixed YTD results for the efficiency measures. Depending on historic norms at this time of year, 49 percent of plan for unique participants may indicate the measure is on plan and likewise for total participants at 43 percent. Let’s say the CLO asked what these percentages have been the last few years and was told 47 percent and 41 percent, respectively. In that case, the measures appear to be on plan and a forecast showing the year ending near plan makes sense. The same analysis is needed to judge whether reach is on plan and likely to make plan by year-end. In this case, history shows reach is normally about 72 percent at this time of year, so it’s on plan and likely to make plan by year-end.
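The judgment the CLO is making here (comparing YTD percent of annual plan with the historical norm for this point in the year) is easy to mechanize. The sketch below uses the figures from the example; the two-point tolerance for calling a measure "on plan" is our assumption.

```python
# Sketch: judge whether a YTD result is on plan by comparing the YTD percent
# of annual plan with the historical norm for this point in the year.
measures = {
    # name: (ytd_percent_of_plan, historical_norm_percent)
    "Unique participants": (49, 47),
    "Total participants": (43, 41),
}

tolerance = 2  # percentage points treated as "close enough" (assumption)
for name, (ytd, norm) in measures.items():
    status = "on plan" if ytd >= norm - tolerance else "behind plan"
    print(f"{name}: {ytd}% of plan YTD versus a norm of {norm}% -> {status}")
```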

The two measures for percentage on-time are more problematic. While each has made good progress since last year, both are still considerably below plan, and it seems unlikely plan will be achieved. Consequently, forecast shows both measures falling short of plan, which may indicate the plan was too aggressive or simply turned out to be much harder to achieve than anticipated.

The report shows good YTD progress for Levels 1 (participant) and 2, but application rates are significantly below plan with little improvement over last year. Forecast indicates that all effectiveness measures are likely to end the year below plan except for Level 2 learning.

The next simple example explores department process initiatives. In this case, the department is seeking to improve its informal learning, help desk, and LMS. Table 9-16 shows the operations report.

The plans for these initiatives were shared in Table 9-8. For the informal learning initiative, three efficiency measures and one effectiveness measure have been identified for each component. For the help desk initiative, L&D identified four efficiency and two effectiveness measures. Three efficiency and two effectiveness measures are planned for the LMS initiative.

The operations report indicates that the informal learning initiative is making good progress for the communities of practice and portal content, and most measures are expected to make plan by year-end. The performance support initiative is struggling, however, and likely to end the year below plan for all measures except number of tools, where it should exceed plan. Further analysis is warranted here.

The help desk initiative is on plan YTD with training completed about a week ahead of schedule and for one more person than planned. Hold time has already dropped by about half, and dropped calls are running significantly below last year’s pace. User satisfaction has also improved dramatically, and the quality metric is halfway to plan (0.2 point improvement out of 0.4 point planned improvement). Forecasts for these key measures indicate ending the year very close to plan for hold time and dropped calls, and on plan for both effectiveness measures.

The LMS initiative also appears to be on plan. The upgrade has not yet occurred (planned for August 1), but the YTD and forecast columns show that the project leader believes everything is on plan. The effectiveness measures have not shown any improvement yet and have actually deteriorated slightly, but the training has not yet been completed and the upgrade is still a month away from going live.

Table 9-16. Simple Operations Report for Department Process and System Improvements

Complex Operations Report

A more complex operations report would have a greater number of initiatives and may contain measures for both program improvements and process and system improvements. Due to the complexity of planning and executing so many initiatives, this level of effort is usually reserved for very mature L&D departments with more staff and budget. Table 9-17 includes numerous program measures as well as the process and system improvement measures from Table 9-16.

Table 9-17. Complex Operations Report

Notice that the top section for effectiveness measures now includes both measures for impact (Level 4) and both measures for ROI (Level 5) in contrast to the simple report, which ended with application. The bottom portion is just the simple operations report for department process and system improvements.

The report indicates that most of the effectiveness initiatives for programs are behind plan YTD but are forecast to improve significantly in the second half and end the year near plan. Goal owner satisfaction is expected to fall short of plan by 5 points but still show improvement over 2020. In contrast, unique participants are on plan and forecast to end the year on plan, while total participants are forecast to fall 6 percentage points short of plan. On-time completion measures are much improved YTD but forecast to fall short of plan, while reach is expected to end the year on plan.

As discussed previously, the initiatives to improve communities of practice, portal content, the help desk, and the LMS appear to be on plan and are forecast to end the year near plan. Consequently, no special action is necessary for these initiatives, while performance support continues to struggle.

Creating the Operations Report

L&D creates the operations report through close collaboration between the CLO and senior learning leaders in the department. The report should capture measures for all the important initiatives planned for the coming year since it will be used monthly to manage these initiatives toward successful completion. The steps to create the operations report are shown in Figure 9-4.

Figure 9-4. Steps to Create the Operations Report

Step 1 of the process is identifying next year’s key efficiency and effectiveness measures (or this year’s measures if you are implementing partway through the year). These come from the CLO and senior learning leaders in the department. Typically, these leaders will have measures in mind to improve for next year or targets for particular measures (like number of participants or courses). So, the analyst simply needs to ask and then confirm with the CLO once the list of measures is complete. If it is a long list, the CLO will need to prioritize.

In step 2, list the measures as rows in the report. The CLO can decide whether to start with effectiveness or efficiency measures, but the measures should be grouped by type. Each measure should also have a unit of measure. In step 3, enter last year’s actual results if available for each measure.

In step 4, add the all-important plan numbers for each measure. These will come from the CLO and senior leaders who are responsible for the initiatives. The CLO should approve all the plan numbers. In step 5, set the forecast equal to plan at the start of the year, then update as necessary throughout the year. Last, in step 6, fill in the YTD column after the first month’s results are in.

So, the creation of the report itself is straightforward. The real work lies in identifying the measures to be managed for the coming year and setting a plan value for each. L&D will also need to update the forecast periodically during the year as YTD results come in and as new information about the rest of the year becomes available. Chapter 11 provides guidance on creating plan and forecast values.
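To make the mechanics concrete, here is a minimal sketch of steps 2 through 6 in Python. It is purely illustrative: the class, measure names, and values are assumptions for demonstration, not part of the TDRp specification.

```python
# Illustrative sketch of steps 2-6: measures as rows with the standard columns,
# forecast initialized to plan, and YTD filled in once monthly results arrive.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MeasureRow:
    name: str
    measure_type: str           # "effectiveness" or "efficiency" (step 2 grouping)
    unit: str                   # unit of measure (step 2)
    last_year: float            # last year's actual result (step 3)
    plan: float                 # this year's plan value (step 4)
    ytd: Optional[float] = None       # year-to-date result (step 6)
    forecast: Optional[float] = None

    def __post_init__(self):
        # Step 5: the forecast starts the year equal to plan.
        if self.forecast is None:
            self.forecast = self.plan

# Steps 1-4: hypothetical measures chosen by the CLO and senior leaders.
operations_report = [
    MeasureRow("Participant reaction (Level 1)", "effectiveness", "% favorable", 75, 80),
    MeasureRow("Application rate (Level 3)", "effectiveness", "%", 48, 60),
    MeasureRow("Unique participants", "efficiency", "participants", 8_200, 9_000),
]

# Step 6: after the first month, record YTD results and revise the forecast as needed.
operations_report[2].ytd = 1_100
operations_report[2].forecast = 8_800
```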

Summary Report

The last TDRp report is the summary report, which draws from both the program and operations reports. It contains all three types of measures and serves to brief senior leaders such as the CEO and CFO as well as governing bodies and the SVP of HR. The summary report is also excellent for briefing employees in the L&D function at monthly all-employee meetings and for use in presentations to other groups.

The summary report provides high-level, aggregate data on the most important outcome, effectiveness, and efficiency measures. It also depicts the alignment of learning to the organization’s most important goals. Given these target audiences, the report should not contain any learning or HR jargon. L&D should also produce the report at least quarterly.

In addition to being a briefing, the report helps the CLO manage expectations, reprioritize when necessary, and occasionally make the case for an increase in resources during the year. While the day-to-day management of L&D takes place through the program and operations reports, the summary report is the tool used by the CLO to manage at the highest level with the CEO, CFO, and governing bodies.

Format

In keeping with the TDRp philosophy to run learning like a business, the summary report has a business-centric format, meaning that the information most important to the senior business leaders (the key users) comes first. Consequently, the report begins with the organizational goals in the CEO's priority order. If the goal owner agreed that learning could help achieve a goal, the planned learning programs are listed directly below that goal. This clearly shows the strategic alignment of learning to the organization's top goals.

Next, the report lists HR and other important goals supported by learning along with the agreed-upon learning programs. Learning can usually support planned improvements in employee engagement, leadership, turnover, and diversity. Moreover, in some organizations learning also plays a very important tactical role in meeting important organizational needs like basic skills training for large numbers of employees. This is the case in retail, restaurants, and the military. Other important needs may be onboarding, compliance training, and advanced skills training. While there may not be a high-level goal to provide this training, it is understood that this is critical to the organization’s success and that L&D is responsible. In these cases, any need addressed by L&D should be added. And if meeting these more tactical needs is the primary mission of L&D, then put them at the top of the report.

Last, the summary report includes a few key efficiency and effectiveness measures from the operations report. This shows senior leaders that L&D is not only closely aligned to the goals of the organization, but that L&D is always working to improve its own efficiency and effectiveness.

Since the format includes the standard columns for last year’s results, this year’s plan, YTD results, and forecast, the report also clearly demonstrates that specific, measurable plans have been set for each measure and the CLO is willing to be held accountable for delivering the planned results. The report shows that learning will be managed like any other department with plans and disciplined execution.

Strategic Summary Report

First, we examine a sample summary report for an L&D department that has responsibility primarily for strategic goals (Table 9-18). By “strategic” we mean the CEO’s or senior leader’s business goals for the year. These would have been approved by the board of directors or by the CEO for business units. In either case, the CEO or business unit leader is accountable for delivering these goals and will suffer in terms of pay, bonus, or promotion if the goals are not attained. In this model, business units provide the more tactical training for new hires and others in need of upskilling or reskilling.

As you can see, there are four high-level goals for the CEO:

•  Increase revenue by 20 percent.

•  Reduce injuries by 20 percent.

•  Reduce operating costs by 15 percent.

•  Improve quality by 4 points.

The CLO or program managers would have met with the goal owners and decided if learning had a role to play. If it did, both parties would agree on the type of learning and all the specifics associated with the training, including the cost, planned impact, and their mutual roles and responsibilities.

In this example, learning content is planned for all four goals, and the agreed-upon learning programs are listed beneath each goal (shown in italics). A quantitative outcome measure has been agreed upon for the first goal, with learning expected to contribute about 25 percent toward the goal of a 20 percent increase in revenue. In other words, learning should increase revenue by about 5 percent (25% × 20% = 5%).

For the next two goals, a qualitative outcome measure has been chosen to define expectations. Both parties believe training will be the major contributor to a reduction in injuries and thus the “high” impact. A “medium” impact is planned on operating costs, meaning that training, while very important, by itself will not likely be responsible for most of the cost reduction.

Last, learning is also planned to improve quality, but the two parties could not reach agreement on an outcome measure. So, they settled on the next-best measure, application (Level 3), for which they could agree on a plan. A high application rate may not always lead to impact, but it usually does, and we know for sure that there will be no impact without application.

Table 9-18. Sample Summary Report With Focus on Strategic Goals

Note

This report is an example of a summary report with mixed outcome measures, meaning more than one type is used. Alternatively, a summary report might contain only quantitative or only qualitative outcome measures.

The next section of the report includes the three goals of the SVP of HR:

•  Increase employee engagement by 3 points.

•  Improve the leadership score by 4 points.

•  Improve retention of top performers by 5 points.

The SVP of HR and the CLO agree that learning can contribute to the first two but not to the third. They agree on qualitative outcome measures with a “low” contribution of learning planned for employee engagement, but a “high” contribution planned for leadership.

Agreement on the value of the outcome measure is a critical part of the planning process because it will directly influence decisions about the level of resources required to achieve the goal (higher impact requires more resources), timing of the program (higher impact requires an earlier deployment), and goal owner commitment (higher impact requires greater commitment by the goal owner).

The top two sections, then, show the alignment of learning to the CEO’s and SVP of HR’s goals. The bottom section completes the report by sharing five important effectiveness measures and five important efficiency measures. These will be aggregated across all programs for which data are available and show the commitment of the CLO to improvements in both areas. The CLO might also include measures showing planned improvement in department processes and systems.

The summary report shows that learning is on plan YTD for revenue impact and is forecast to exceed plan by year-end, contributing to higher than planned revenue. Learning is behind plan YTD with its impact on injuries but is expected to make plan for the year, so no further special action is necessary. Learning to reduce operating costs, however, is significantly behind plan YTD and is expected to fall short of plan by year-end. In other words, the ground lost in the first six months cannot be made up without special action, and the organization is likely to miss plan for the year unless something changes, in part due to a smaller contribution from learning than planned. This is an example of where the CLO, program manager, and goal owner need to understand the reasons for lagging performance, identify options, and decide what special action to take to get back on plan. In contrast to operating expenses, the quality initiative appears to be on plan YTD and should end the year just slightly below plan, so no special action is necessary.
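The judgment being applied here, comparing the year-end forecast with plan to decide whether special action is warranted, can be made explicit with a simple rule. The sketch below is only an illustration; the tolerance and the measure values are hypothetical, not TDRp prescriptions.

```python
# Illustrative sketch: flag measures whose year-end forecast misses plan by more
# than a chosen tolerance, signaling that special action may be needed.
def needs_special_action(plan, forecast, tolerance=0.05):
    """True if the forecast falls short of plan by more than the tolerance."""
    return (plan - forecast) / plan > tolerance

# Hypothetical plan and forecast values for two measures.
measures = {"Measure A": (100, 99), "Measure B": (100, 88)}
for name, (plan, forecast) in measures.items():
    flag = "special action" if needs_special_action(plan, forecast) else "on track"
    print(f"{name}: {flag}")
```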

Looking at the next section of the report for HR goals, learning appears to be on plan and the organization is forecasting that both goals supported by learning will be achieved.

The last section includes 10 important effectiveness and efficiency measures, highlighting efforts by the department to improve these measures across all programs. These measures are mixed:

•  learning is on plan and forecast to end the year on or near plan

•  the percentage of employees reached by learning is behind plan but forecast to catch up

•  actual application rate, unique participants, total participants, and unique informal learning users are behind plan but expected to end the year near plan

•  goal owner reaction and percentage of courses developed on time are behind plan and not expected to catch up

•  participant reaction to formal learning, participant reaction to informal learning, and percentage with development plan are above plan YTD and forecast to end the year above plan.

The CLO and senior L&D leaders may consider special action for those not forecast to end the year on plan or may simply explain to the CEO that the plans were too aggressive or unforeseen issues developed that will take longer to address.

Strategic and Non-Strategic Summary Report

We can now build on the strategic summary report in the last section by adding some non-strategic programs. Most L&D departments will have at least one or two strategic goals as well as HR goals they can support, so the strategic and HR goal sections remain. Many L&D departments, however, spend the majority of their resources on basic training of new employees or skill enhancement for advancing employees. The example in Table 9-19 provides several programs of this type, which are not directly aligned to the CEO’s or SVP of HR’s goals but are critical nonetheless to the organization’s ongoing operations and long-term success.

Table 9-19 shows a summary report for an organization in the fast food industry where thousands of new and advancing employees need to be trained. In this report, the non-strategic programs are shown first because they represent most of the L&D group’s efforts. The plan calls for providing basic skills training to 9,000 employees, a significant increase from 2020. The program owner also wants the duration shortened by two days while increasing the satisfaction of the receiving managers from 78 percent to 85 percent. The plan also focuses on the leadership training provided to new and existing leaders. For 2021, 600 leaders will need to be trained with a manager satisfaction score of 85 percent. The three measures (number trained, duration, and manager satisfaction) are the agreed-upon or headline measures of success for these two important programs. Since these are not strategic programs, there are no outcome measures.

Table 9-19. Sample Summary Report With a Focus on Non-Strategic Programs

While the YTD results for employees are only 42 percent of plan, forecast shows that not only will plan be reached by year-end, but an additional 1,000 employees will be trained, exceeding plan by 11 percent. The plan to reduce the program by two days has been achieved, and manager satisfaction is forecast to average 85 percent for the year. Leadership training is ahead of plan YTD, and forecast shows an additional 50 leaders will be trained by year-end. The planned level of manager satisfaction has also been achieved and is expected to hold for the remainder of the year.

The business goals, HR goals, efficiency goals, effectiveness goals, YTD results, and forecast are from Table 9-17. You can refer to that section for explanation and analysis.

How to Create a Summary Report

Create the summary report by leveraging the program reports and the operations report. If you have already created these reports, you will not need any new information; simply transfer it to the summary report. If you have not yet completed the program and operations reports, you can create the summary report from scratch. Refer to the steps shown in Figure 9-5.

Figure 9-5. Steps to Create a Summary Report

For step 1, enter the CEO’s top goals for the coming year. These need to come from the CEO and should be in priority order. Along with the goal (such as increase sales by 10 percent), enter the unit of measure and last year’s actual results if available. Typically, a CEO has four to seven key goals that have been approved by the board of directors. The CEO will be evaluated by the board on these goals.

In step 2, enter the learning outcome measure for each goal where the goal owner and CLO have agreed that learning has a role to play. The learning outcome measure will usually be the impact of learning on the goal, but sometimes it's a next-best measure, such as the application rate, instead. The goal owner and CLO need to agree on the measure and the plan value for the measure. If impact is used, state the percentage contribution from learning next to the learning outcome measure and the resulting impact in the plan column. For example, if learning is expected to contribute 20 percent toward achieving the goal of a 10 percent increase in sales, state the 20 percent planned contribution next to "impact of learning" and place 2 percent (20% × 10% = 2%) in the plan column. The percentage contribution and the resulting impact should be agreed upon by the goal owner and the CLO.
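The planned-impact arithmetic in this step can be captured in a couple of lines. The figures below are the illustrative ones from the text, not actual data.

```python
# Illustrative calculation of the planned impact of learning on a CEO goal.
goal_increase = 0.10          # CEO goal: increase sales by 10%
learning_contribution = 0.20  # agreed contribution of learning toward that goal

planned_impact = learning_contribution * goal_increase   # 0.02
print(f"Planned impact of learning on sales: {planned_impact:.0%}")  # 2%
```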

For step 3, add the names of the programs planned to support the goal under the outcome measure for each goal. Use italics to make the report more readable. This shows the direct alignment of programs to the CEO’s top goals. The information for this step and the prior step can be found in the program reports if they have been created. If not, the information comes from the CLO and program manager working closely with each goal owner.

In step 4, repeat the same basic process for HR goals that will be supported by learning. The CLO will get the HR goals from the head of HR and, if both parties agree that learning has a role to play, show the goal, the agreed-upon learning outcome measure, units of measure, and last year’s results.

To complete the alignment portion of the summary report, in step 5, list other important needs supported by learning. These might include compliance training and onboarding if they're not already included under HR goals, and may also include basic or advanced skills training. In some organizations the bulk of the training effort is directed toward new hires, so it is important to capture this activity. Add the need and whatever measures of success have been agreed upon, along with the unit of measure, last year's results, and plan for the coming year.

In step 6, add two to four key effectiveness measures and two to four key efficiency measures at the bottom of the report. These should be stated in plain language with no learning jargon. Add the unit of measure, last year’s results, and plan for the coming year. These will come from the operations report if it exists. If not, they will come from the CLO.

Step 7 is a best practice: share the summary report draft with your CEO and governing bodies to see whether they have any questions or would like to see any additional information. Given that the purpose of the summary report is to share results with senior leaders and manage their expectations, it is important that they are comfortable with it and that it meets their needs.

In step 8, set the forecast values equal to plan to start the year. Then finish with step 9, adding YTD results after the first month's results are available.

Conclusion

This chapter focused on the three management reports as well as the concept of running learning like a business. A management report is the highest-level and most complex type of report. Consequently, it will contain only those measures to be managed, requiring a plan, YTD results, and forecast for each measure. This information answers two fundamental questions each month: “Are we on plan year to date? Are we going to end the year on plan?” Answers to these questions often require analysis, which in turn may lead to more analysis to determine root causes and potential solutions.

Each management report serves a different purpose, and we recommend using all three in a learning department. The program report enables the program manager, CLO, goal owner, and staff in the goal owner’s organization to manage learning programs to deliver the impact agreed upon by the goal owner and CLO. L&D should generate and share the report monthly, showing the latest YTD results and forecast. A program report should be created for each top CEO goal, HR goal, and any other important goal or need supported by learning. A mature organization would typically have five to 10 program reports.

The operations report provides the CLO and senior learning leaders with information monthly to manage planned improvements in efficiency or effectiveness measures or to achieve planned targets (like number of participants). Unlike the program reports, which focus on specific programs, the operations report contains data aggregated across the enterprise. Along with the program report, this is how the CLO runs learning like a business. Specifically, the CLO has set measurable goals and allocated resources, and the learning leadership team is executing them with discipline using monthly reports to ensure that the plan numbers are achieved.

The actual management of the function occurs using the program reports and the operations report; the summary report serves a different purpose and audience. The summary report briefs senior leaders, governing boards, learning department employees, and other groups interested in learning. The report clearly and simply demonstrates the alignment of learning to the goals and needs of the organization. It shows that the learning department worked closely with goal owners to agree on the role of learning and the planned impact of learning on the goal. In addition, it shows that the CLO is working to improve effectiveness and efficiency across all programs and internally within the department as well. In short, the summary report shows that the CLO is running the learning function with business-like discipline, just the same as sales, manufacturing, and other departments.

Now that we have a good understanding of the five general types of reports from chapter 8, coupled with the in-depth examination of the three types of management reports from this chapter, we are ready to create our reporting strategy.