Insights: A Hybrid Workforce Wants (and Deserves) Hybrid Training

Two years of quantitative and qualitative training evaluations point to the same conclusion: participants appreciate the “optionality” of hybrid training.

MARCH 2023 | 15-MINUTE READ | DENIS COOK, PGMP & ADRIANA PERATSAKIS


Topics: HYBRID TRAINING, OPTIONALITY, EVALUATIONS


What is “hybrid” training?

A basic definition.

In its simplest form, a hybrid training combines a classroom and a virtual training. You have an instructor and a producer in a “real” classroom with attendees, and the producer ports the training to virtual attendees via an online training platform. All attendees see and hear the instructor and see real-time training materials — some in person, some through their computers.

Our hybrid solution — simple, scalable, and deployable just about anywhere.

Recognizing that hybrid trainings are relatively new in the federal training space, we designed our solution to incorporate emerging best practices while still being simple, scalable, and deployable in most training cases and environments.

How? By removing as many dependencies on our hosts as possible. As long as we have power and high-speed internet, we can deploy our hybrid solution just about anywhere.

A basic list of tools and resources.

A hybrid delivery usually involves the following tools and resources:

  • Instructor

  • Producer

  • Instructor computer with training materials (to join the virtual classroom and show training materials to all participants). The instructor’s computer speakers are off, but the microphone is on

  • Producer computer with camera on the instructor (to host the virtual classroom and show the instructor to all participants). The producer should have headphones plugged into their computer, but their microphone muted

  • Physical training space, like a hotel conference room

  • Virtual training space, like Zoom for Government or Microsoft Teams

  • Projector (that displays the training materials for in-person attendees)

  • Microphone for the instructor

  • Bluetooth “clicker” for the instructor

  • Headphones for the producer to listen to the virtual classroom without creating an echo

  • Camera for the producer to record the instructor

  • Printed training materials for in-person attendees

  • Electronic training materials for virtual attendees

  • Reliable (and accessible) high-speed internet

Now that we have a general sense of our hybrid solution, we can review how we developed and implemented it.

Background

Built during Covid-19 with an eye to the future.

In fiscal year 2022, Colleague Consulting managed a nationwide training program involving a two-day course on grants and cooperative agreements management. Our audience consisted of state and federal employees with a range of experience levels and work domains.

When we designed the course, during the latter-half of the Covid-19 pandemic[1], we wanted to support near- and long-term delivery requirements. We recognized the need for virtual deliveries today but also wanted to support a return to the classroom tomorrow.

We did not know, of course, when tomorrow would be. So, alongside our clients, we built a “format-agnostic” delivery model that would serve learners across a spectrum of operating environments. Ultimately, we wanted our instructors and producers to be able to facilitate the course in a classroom, virtual, or hybrid format with only minor mechanical changes.

The future came quickly – but in different places at different times.

As soon as we finished developing and piloting the course for a virtual audience, we received a request to deliver the course to a hybrid audience. Candidly, we hadn’t expected that call for another six to 12 months. But we were excited to put our brand-new, format-agnostic course to the test – even if we only had one pilot under our belt.

Different locations, different requirements.

The speed with which the hybrid request arrived reminded us of an insight that harkened back to the days of classroom instruction: different locations and audiences will have different requirements.

In this case, our training locations were all in different states, each of which was subject to markedly different requirements and constraints based on a state’s Covid-19 mitigation strategies. What we could do in one state, in other words, might not fly in another state.[2]

Hybrid deliveries threaded the needle nicely.

Data

Yes, we were able to accommodate disparate stakeholders and requirements with hybrid deliveries. But did they “work”? Did our hybrid deliveries create effective learning experiences, irrespective of the audience?

To answer this question, we turned to 314 participant evaluations.

An “actionable” sample size.

We delivered the course 21 times across the United States in FY22,[3] deferring to each training location’s format preference. While most locations selected a classroom or virtual delivery, several preferred the “optionality” of hybrid.

Format      No. of Deliveries
Classroom   9
Virtual     7
Hybrid      5

1,192 participants completed the training program in FY22, of which 314 completed our end-of-course evaluation – a response rate of 26%. (We prefer not to make evaluations mandatory because doing so leads to noisy data. A non-trivial number of participants, typically around 5%, will complete the survey in a matter of seconds, selecting one number for all questions and skipping narrative questions. In our experience, mandating a response tends to artificially inflate evaluation data.)
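For the curious, the response-rate arithmetic above can be reproduced in a couple of lines (a minimal sketch using only the counts quoted in this section):

```python
# Response-rate arithmetic from the FY22 figures quoted above.
completions = 1192  # participants who completed the training program
evaluations = 314   # participants who completed the end-of-course evaluation

response_rate = evaluations / completions
print(f"Response rate: {response_rate:.0%}")  # prints "Response rate: 26%"
```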

Considering the volume and “quality” – that is, thoroughness – of our evaluation data, we were comfortable extracting generalizable conclusions.

Participants assigned the highest “overall” rating to hybrid in FY22.

To our initial surprise, when we compiled the quantitative evaluation data at fiscal year-end, we found that participants assigned the highest “overall satisfaction” rating[4] to hybrid deliveries.

Format      "Overall" Rating
Classroom   4.34 / 5.00
Virtual     4.39 / 5.00
Hybrid      4.54 / 5.00

And they still assign hybrid the highest “overall” rating in FY23.

Once again to our surprise, this trend has persisted into FY23 – that is, past the point of many Covid-19-related mitigation mechanisms. As of nine deliveries this fiscal year,[5] we’ve compiled the following data:

Format      "Overall" Rating
Classroom   4.45 / 5.00
Hybrid      4.55 / 5.00

We assumed that classroom deliveries would eclipse their hybrid peers once we had returned to a “normal” operating posture. We were wrong. Participants continued to thank us for equipping them with options.

What’s behind the relatively high evaluations? It’s all about optionality.

Hybrid deliveries create a more learner-centric experience. Participants have more choices in terms of where and how they learn. Some learners appreciate the quieter aspects of attending a course virtually, for instance, while others enjoy shaking the hands of former colleagues during breaks. Either way, we see this trend in the open-ended, narrative feedback that participants provide in our end-of-course evaluations. We regularly receive comments to the effect of, “Thank you for the hybrid option.”

The “medium and technology” is getting better.

Hybrid wasn’t all sunshine and popsicles, however. We faced plenty of challenges, many of which were related to the technology stack available in each training location. The scores we received in response to the statement, “The medium and technology used for course delivery was effective,” appear in the footnotes.[6]

With that in mind, we’ll turn to our recommendations for a successful hybrid training. Our recommendations stem from our lessons-learned, all of which we’ve implemented. Through FY23, these changes have increased our median “medium and technology” score by 10%.

Learning outcomes

Before we detail our recommendations, we should address the logical question: “How do hybrid deliveries affect learning outcomes?” (A quick “thanks” to a key stakeholder who challenged us with this question.)

To do so, we’ll once again turn to data — this time, the average scores from our end-of-class test. The test is mandatory, so the “Total Avg. Score” is based on 1,617 test submissions from FY22 and FY23. All scores are based on 20 multiple-choice questions.

Format      Avg. Test Score   Total Avg. Score   % vs. Total Avg.
Classroom   18.1              17.6               +2.9%
Hybrid      17.2              17.6               -2.2%
Virtual     17.7              17.6               +0.9%
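The final column is simply each format’s average compared against the all-format average. A minimal sketch of that calculation follows; note that it uses the rounded averages printed above, so its results may differ slightly from the table’s percentages, which presumably reflect unrounded figures:

```python
# Percent difference between each format's average test score and the
# total average across all 1,617 submissions (values as printed above).
TOTAL_AVG = 17.6

format_avgs = {"Classroom": 18.1, "Hybrid": 17.2, "Virtual": 17.7}

for fmt, avg in format_avgs.items():
    delta = (avg - TOTAL_AVG) / TOTAL_AVG
    print(f"{fmt}: {delta:+.1%} vs. total average")
```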

Fair warning — our test data is subject to several factors for which we cannot reliably control.

In a perfect world, we could attribute the difference in scores among the three delivery formats to nothing but the delivery format. But we can’t: there are simply too many other factors at play, and we are not confident controlling for them in a manner that necessarily produces actionable conclusions. Here are a few examples of influencing factors:

  • Number of participants: Our classes vary in size — 13 in the smallest instance, and 105 in the largest instance. Broadly speaking, the smaller the class, the more instructors can engage with each participant.

  • Participant experience levels: As we mentioned before, this course attracts a diverse student body. In terms of the test, participants who have “been in the seat” for years or are taking the class as a refresher are theoretically more likely to score higher than their greener peers.

  • Differing instructor styles: We rely on a pool of eight instructors for this program, and they all have different styles. While their instructor scores are all comparably high, we cannot say, from an effectiveness perspective, that they’re all “equal.”

  • Electronic vs. manual access to materials: The test is open book for all participants, but those attending virtually can more quickly find relevant information using an electronic, versus manual, search.

  • Evolving test questions: Over the course of the 29 deliveries from which this data is pulled, we’ve adjusted the test items based on participant feedback, instructor feedback, and programmatic priorities.

Considering these and other factors, we cannot definitively conclude, for instance, that classroom deliveries are the “best” because they have the highest average test scores.

End-of-class tests are not effective predictors of learning effectiveness.

We are confident that all learning and development professionals are all too familiar with this statement, but it’s worth stating anyway: end-of-class tests are not the most effective predictors of learning effectiveness. We know plenty of folks who can retain information for a test, score a 100%, and discard the information by dinner time.

We believe the “closeness” of the scores is what matters most.

At the risk of over-simplifying things, we generally assume the factors and limitations we discussed above are a wash in the context of a single delivery format. We’ve found, for example, that the ratio of first-time-to-refresher students is consistent across all delivery formats, suggesting this factor is immaterial. Similarly, while we’ve periodically refined our test questions, the average difficulty, as measured by average score and average completion duration, has remained constant.

What we can reliably claim, then, is that all three delivery formats offer similar learning outcomes. The spread among average scores is 0.9 — that is, 18.1 (classroom) minus 17.2 (hybrid). Had the spread eclipsed 2.0, we would not feel comfortable drawing this conclusion.

Headline recommendation — embrace hybrid!

If, as we believe, all else is equal, then our overarching recommendation is for federal learning and development professionals to embrace hybrid training. Not offer — embrace! We’ve heard it over and over: participants appreciate options.

Execution-based recommendations

If you agree with, or are at least intrigued by, our headline recommendation, then we encourage you to review the following execution-based recommendations to get your hybrid training program started off on the right foot.

Visit the training site at least one business day in advance.

Showing up an hour early, a standard practice for traditional classroom deliveries, is no longer enough. We recommend performing a two- to three-hour site visit at least one business day before the training to ensure everything is ready for prime time. The goal is to remove any day-of surprises.

Once an instructional team has become familiar with a training room/environment, they can revert to arriving the day of class – albeit 90 minutes early.

Get the right folks in the room.

When issues arise during the pre-class site visit, it’s critical to have the right resources, the right people, in the room to fix them. Fixing an IT issue, for instance, likely requires a member of the IT team. What’s more, this IT person may normally work from home or take every other Monday off – know their schedule, availability, and work locations.

We recommend having the following people attend the pre-class site visit:

  • Instructor(s)

  • Producer

  • Government training sponsor/program manager

  • Government IT resource

  • Government facilities resources

Have someone pretend to be a participant in the virtual classroom.

We also recommend having someone who’s at home or at their office test the virtual classroom. They should log in to the training environment – Teams, ZoomGov, WebEx, etc. – and test the full virtual learning experience, including audio, video, and chat. This means the producer should activate the classroom, and the instructors should share their learning materials and walk through a few activities. You want the test to be as real as possible.

Test everything.

The pre-class visit shouldn’t focus exclusively on the virtual experience, however. Testing should cover everything. On several occasions, our instructors were the first people to use a classroom in years. Dust had collected. Projector bulbs were burnt out. Batteries were missing or dead. In one instance, our instructors arrived at a training site to find that over half of the chairs in the training room – which were ostensibly more comfortable than their office peers – had been “repurposed” elsewhere in the building.

Ensure the instructor and producer understand each other’s styles.

It’s likely, if not certain, that the producer will have to remind the instructor to repeat in-classroom questions for virtual participants, and doing so may require the producer to interrupt the instructor. For this to occur smoothly and respectfully, the producer should understand the instructor’s flow and cadence. At the same time, the instructor should appreciate the subtle visual cues a producer may offer when he or she needs something from the instructor.

Share exams, tests, and evaluations with links and QR codes.

We don’t know of a single instructor or facilitator who wants to return to the days of grading hardcopy tests and exams by hand. (If you do, they’re probably not a great fit for hybrid instruction.)

As we said before, don’t rely too heavily on your host facility.

While we in no way want to downplay the critical role our training hosts play, we also want to be mindful of the resources they have available and our shared desire to minimize executional risk. Therefore, we recommend partnering with a training provider that can bring a complete, turn-key solution to you.

As we said before, all our lessons-learned pointed to developing a hybrid solution that minimizes external dependencies. Our instructors carry their own laptops and microphones, just as our producers bring their own video cameras. We’ll even bring a wireless hotspot just in case, though hotspots are far from reliable.

Tell us where to be. We’ll take care of the rest.

Conclusion

If we may extract one positive from the Covid-19 pandemic, it’s that the need to work from home undeniably demonstrated that federal employees (just like their private sector peers) were ready and able to embrace hybrid working arrangements — just as they were ready and able to embrace hybrid learning arrangements.

As we look into the future, both near- and long-term, we believe the federal government will necessarily embrace hybrid working arrangements to continue to attract, retain, and develop the best. And this hybrid workforce wants (and deserves) hybrid training.

Once again, it’s all about optionality. Hybrid training allows participants to learn and engage in the manner that best suits their personal and professional well-being. While not without its shortcomings, hybrid training does not pose “new” shortcomings with which we weren’t already familiar from classroom and virtual trainings. Hybrid trainings, in other words, present nothing but upside for participants, providers, and sponsors.

Footnotes

[1] At the time, we did not know we were in the latter-half of the pandemic.

[2] Some states, for example, had a cap on the number of unrelated individuals you could collocate in a physical space, such as a hotel conference room.

[3] Technically, we delivered the course in CONUS and OCONUS locations.

[4] On a five-point Likert scale, we asked participants to evaluate this statement: “Overall, I was satisfied with this course.”

[5] We have not delivered a virtual training in FY23 because our stakeholders now opt, by default, for hybrid instead of virtual.

[6] On a five-point Likert scale, we asked participants to evaluate this statement: “The medium and technology used for course delivery was effective.”

Format      FY22 "M&T" Rating
Classroom   4.51 / 5.00
Hybrid      4.34 / 5.00