Improving the Performance Improvement Process Model

Each time I present the Performance Improvement Process Model (PIPM), and as it becomes more widely used, I receive feedback from colleagues on how to improve it. Some I accept and apply; others I file away for future reference, ’cause you just never know when that suggestion might fit! Looking back, I have written about the model a lot, starting with “Needs Assessment or Needs Analysis” (still my most popular blog post to date), followed by “Just because it says performance doesn’t mean it’s there (sadly),” then “Opportunities vs. Good Ideas” and finally “Putting the NEED in Needs Assessment” waaayy back in February.

Since then – there has been lots of feedback from presentations, practitioners and readers, so it is time to officially share version ten of the model!

The Performance Improvement Process Model (V10)

What’s different? I’m glad you asked! First, I always appreciated that Van Tiem, Moseley and Dessinger’s (2012) HPT Model was “wrapped” by Change Management (CM) to indicate that it is something that has to be considered throughout. This was an important improvement on their earlier versions. Enter Change #1: the PIPM needed that too! I also believe that Project Management (PM) is equally important, as all of our work is project based. Like CM and PM, formative evaluation is an activity that occurs throughout. Van Tiem et al. show this with a box at the bottom of the model that spans all the other phases. I added it to the three foundational practices and show summative evaluation as the “Measure Performance” step in the process.

Big Change #2: This all occurs after the intervention analysis step. I always felt this portion of the model was thin. For major interventions, like rolling out a new enterprise customer relationship management platform, we need to do a Cost Benefit Analysis (CBA) to verify the ROI and then put it into a business case for sponsor/client approval. Fail to do that before jumping into the design and development phases and you might be doing some serious backpedaling that could have been avoided!
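
If it helps to make that CBA step concrete, here is a minimal sketch of the kind of back-of-the-envelope ROI check I have in mind. The benefit and cost figures are invented placeholders, not numbers from any real rollout:

```python
# Minimal, illustrative cost-benefit check before committing to design and
# development. The benefit and cost figures below are hypothetical.

def roi(total_benefit: float, total_cost: float) -> float:
    """Simple return on investment, expressed as a fraction of cost."""
    return (total_benefit - total_cost) / total_cost

projected_benefit = 1_200_000  # assumed three-year productivity gains
projected_cost = 850_000       # assumed licences, configuration, training, support

print(f"Projected ROI: {roi(projected_benefit, projected_cost):.0%}")  # Projected ROI: 41%
```

If the number (and the assumptions behind it) survive scrutiny, into the business case it goes; if not, better to find out now than after design and development.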

Big Change #3: A little more clarity on the split between training and non-training interventions. BOTH require objectives and metrics if we want to be able to measure performance during and after implementation, so defining user requirements was added to the non-training intervention side. Defining user requirements always reminds me of this well-known cartoon:

Source: https://i.pinimg.com/originals/77/2b/4f/772b4f5055ca898a809f8c64903360f4.jpg

It is another piece of the puzzle that is crucial to get right. The picture explains it best. Right? Right!? If you have never experienced one of the situations pictured above, please stand up. I see we are all still sitting.

Change #4: Not a huge change, but I added the infinity symbol between “Measure Performance” and “results improved – or not.” It is obvious to some that we need to keep measuring, as this creates the systemic feedback loop that monitors performance; not so obvious to others. So it’s a handy little reminder.

Last Change: In earlier versions I had “Mega, Macro, Micro,” as originated by Kaufman (1996), beside the Needs Assessment step and as layers behind “Gaps.” If you aren’t familiar with “Mega Planning” you can get a brief overview here: https://en.wikipedia.org/wiki/Roger_Kaufman#Mega_Planning. I changed to worker, work, workplace and world, based on Addison, Haig and Kearney’s (2009) work, as I felt it was easier for people outside of the field to understand. Don’t get me wrong, I am a Kaufman fan and a believer in Mega… but if you have never heard of it, those terms won’t make much sense.

Okay – let’s wrap this up. I started writing about each of the major steps in the “Opportunities vs. Good Ideas” and “Putting the NEED in Needs Assessment” articles earlier this year before getting sidetracked by that 4-letter word “work.” My near-term schedule is a little lighter, so I am hoping to get back to the deeper explanations of the remaining steps.

If you like it, please pass it along. If you see something missing, have a question or want to talk about fishing, drop me a line. The model continues to improve because of all the fantastic feedback. Thanks!

References

Addison, R., Haig, C., & Kearney, L. (2009). Performance architecture: The art and science of improving organizations. San Francisco, CA: Pfeiffer.

Kaufman, R. (1996). Strategic thinking: A guide to identifying and solving problems. Arlington, VA, & Washington, DC: Jointly published by the American Society for Training & Development and the International Society for Performance Improvement.

Van Tiem, D. M., Moseley, J. L., & Dessinger, J. C. (2012). Fundamentals of performance improvement: Optimizing results through people, processes and organizations. San Francisco, CA: Pfeiffer.


Debunking bad training practices is never-ending work!

I recently did a keynote and session for a large government organisation. As a Debunker Club member – I always try to add a little bit of debunkification to my sessions. After I was finished – I had the opportunity to attend another session delivered by some senior leaders within the organisation. Their goal was to tell all the training professionals in the room about their experiences with this organisation’s training system and how the system needs to change.

I have always found it interesting that there are people who believe that because they have attended training, they have expertise in the analysis, design, development, delivery and evaluation of training. Of course everyone has valuable feedback to offer, but in this case, the senior leader stood in front of the same audience I had just told to stop worrying about generational differences and claimed we have to treat the youngsters differently because she has three boys and they are different. Oh my. Stand by for a small rant – which I saved for here rather than embarrassing this person in front of about 80 training professionals.

She started by educating us on learning styles and how we all learn differently. My migraine was beginning. I’ll leave that one alone. If you still believe in Learning Styles – here is an article that I hope will help you think differently!

Then it was on to generations and the need to change the way we train the younger generations. The example provided was changing a tire. Her son had a flat. She told him what to do. He didn’t believe her and went to YouTube to find a video that showed him. It provided the same information. SO because the boy used YouTube, training professionals need to change the way we “do” training. HOWEVER, in the next breath she told us that SHE used YouTube to learn how to repair her toaster! (I think it was her toaster – could have been something else.) Two different generations using the same medium to access information to learn something they needed at the moment to complete a task. Video may or may not be the right way to support a desired learning outcome. Patti Shank (2019) explains it very nicely in her article. It’s about what needs to be learned, not the age of the learner.

Example #2 of why organisations need to change for the millennial… they want organisations to be invested in their personal and professional growth. Okay. Doesn’t everyone – regardless of age, gender, race, etc. – want their employer to be invested in their development? This isn’t exclusive to one group, no matter how you want to divide up your workforce!!

Example #3. Millennials learn at different paces. Yes. Yes they do! So do Boomers, Gen Xers and Gen Ys. I am a horribly slow reader and have an even worse memory for certain things (names, titles of books, etc.). That slows me down. I have friends who I am pretty sure have near-eidetic memories. That helps them get to where they need to be faster. Lucky them. Thankfully Google and Kindle are technologies that help a young Boomer (or old Gen Xer) like me close that gap!

Example #4. Millennials want learning to be FUN! I am sure they do! So do I. The problem here is – not all learning can be “fun.” Some learning is hard work, extremely stressful, even painful and there is no way to design fun into it. Aspects of infantry training, paramedic training and surgery come to mind. Do you want your surgeon to have fun or be the best surgeon she or he can be? Fun doesn’t necessarily increase engagement or improve learning outcomes. Sure – there is a time and place for it. It depends on the desired learning outcomes and the best methods and media to “get there.”

I think that’s enough on generations. As Christensen and Tremblay (2013) said… “The most current research has shown that generational differences between learners [or race or gender sic] do not in and of themselves warrant the specification of different instructional designs or the use of different learning technologies. Rather than focusing valuable energy on determining if different generations will learn more from direct instruction, e-learning, blended instruction or gaming, instructional designers should continue to work closely with subject matter experts to identify the required objectives of the curriculum.”

References

Christensen, B. D., & Tremblay, R. (2013). Generational learning differences: Myth or reality? In R. Sottilare (Ed.), Fundamental issues in defense training and simulation (pp. 21-30).

Shank, P. (2019). Does video improve engagement and learning? Retrieved from https://elearningindustry.com/engagement-and-learning-does-video-improve

The Debunker Club (n.d.). Retrieved from https://debunker.club/2015/05/22/learning-styles-are-not-an-effective-guide-for-learning-design/


Employee Appreciation Day!

Employee Appreciation Day is observed annually on the first Friday in March. This day was created as a way of focusing the attention of all the employers in all industries on employee recognition. Businesses and organizations plan celebrations across the country recognizing the achievements and contributions of their employees. 

Employees are one of your greatest assets – regardless of the size of your company. Personally, I believe they are your single most important asset. Without your employees – where would you be? Recognition and appreciation are among the key motivational factors in the workplace!

You can show your gratitude for your employees’ efforts and contributions to the goals of the company in a variety of ways, from rewards to verbal interactions. Expressing employee appreciation increases employee job satisfaction. Plain and simple!

HOW TO OBSERVE

Great Ways To Show Your Employees Some Appreciation

  • Be Flexible – Flexibility goes a long way in this virtual world. If possible in your industry, allowing a little flexibility can reap huge benefits when you need last-minute work done.
  • A Thank You Note – When a job has been done well, a heartfelt, hand-written thank you means more than a slap on the back or an e-mail sent off at the end of the day.
  • Team Effort Celebration – If the team pulled together and made it happen, reward them with an office pizza party, casual dress day or even close the office early so they can spend some well-earned time with family.
  • Get Caught – Make sure the employee hears you telling someone else you thought they did a great job.
  • Create a Culture of Encouragement – Employees who expand their horizons bring new skills to your workforce and will encourage others to do so too. Praise their achievements and encourage others to pursue their goals.

If you have employees, be sure to show them some appreciation and use #EmployeeAppreciationDay to post on social media.

HISTORY

National Employee Appreciation Day was created in 1995 by Bob Nelson, a founding board member of Recognition Professionals International, together with his publishing company, Workman Publishing.

Reference: https://nationaldaycalendar.com/

Putting the NEED in Needs Assessment

It’s time for the next installment of the Performance Improvement Process Model, or PIPM. A couple of weeks ago, I talked about the difference between an opportunity and a “good” idea. This post will address the next step in the model, “Want or Need.”

I’m an old dog. As hard as it might be to teach me a new trick, my good friend and mentor Dr. Roger Kaufman keeps trying. He has written extensively about the difference between needs and wants. The problem starts when we use need as a verb instead of a noun. I do this ALL the time – publicly and in private correspondence with Dr. K. Thankfully he is patient and reminds me of the error and we keep moving forward. One day it might just stick!

When we use need as a verb, like “I need a new boat” (my wife may have heard this once or a dozen times) we are going right to the solution and not considering other potential options. Seriously! Look at it! In dire need of replacement.

My 2008 Lowe FS175

I know… it’s a sweet boat and I have caught a lot of fish in it. There is no need here whatsoever. There were some issues with the old gal (my boat, NOT my wife). Mostly ancillary equipment: the trolling motor, bilge pump and *gasp* the stereo didn’t work. Long story short, I didn’t need a new boat; rather, I wanted to get all the little irritants fixed so my fishing trips would be more enjoyable. I have talked about this misuse of need and the jump straight to solutions in past blog posts as well. See Just gimme training and Just because it says performance doesn’t mean it’s there (sadly).

How did I get onto boats and fishing!? Okay – seriously, if I keep using need the wrong way, what’s the right way? It’s so simple. Kaufman (1998) has been trying for decades to get everyone on the same page and define it as “a gap in results.” For example, I want to catch more fish. I currently catch an average of 20 a summer. I want to catch 50. The gap between my current and desired results is 30 fish. A new boat may or may not close that gap. In reality, the best way to close in on 50 is simply to spend more time fishing.

Let’s shift over to a work related example of needs. Did you know that cashiers get measured on the number of items they scan per hour? It’s called the ISAH or “Items Scanned per Active Hour” and it is calculated by averaging the total items scanned per hour when cashiers are actively signed into their registers. Generally, good industry performance is 500 ISAH.

In this fictitious example, our experienced cashiers have an average ISAH of 900. The rock stars of retail. Cashiers with 3 months of experience or less have an average ISAH of 250. Based on customer feedback, there is dissatisfaction with slower transactions; customers prefer to get through the checkout line fast. What’s the need?

The gap in results at the worker (cashier) level is to increase the cashiers’ ISAH from 250 to 500 or better. The gap in results at the workplace or organizational level is the level of customer dissatisfaction. Improving the cashiers’ ISAH will contribute to increased customer satisfaction.
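
For anyone who likes to see the arithmetic spelled out, here is a minimal sketch of the need expressed as a gap in results, using the fictitious numbers above (the shift figures in the ISAH call are made up for illustration):

```python
# Need = gap in results (Kaufman): the desired result minus the current result.
# The ISAH numbers are the fictitious ones from the cashier example above.

def isah(items_scanned: int, active_hours: float) -> float:
    """Items Scanned per Active Hour."""
    return items_scanned / active_hours

def gap_in_results(desired: float, current: float) -> float:
    """A need, stated as the gap between desired and current results."""
    return desired - current

print(isah(items_scanned=1000, active_hours=4))  # 250.0 (a made-up shift for a new cashier)

current_isah = 250   # new cashiers (3 months experience or less)
desired_isah = 500   # good industry performance

print(f"Worker-level need: close a gap of {gap_in_results(desired_isah, current_isah)} ISAH")
```

Stating the need as numbers like this also gives you the metric to come back to at the “Measure Performance” step.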

If you can’t describe the problem or opportunity in terms of a gap in results, it’s a want, not a need and you should proceed directly to the stop sign, take a breath and give your problem a second look. It’s probably not the real cause of whatever is giving you business pain. Next up – the Needs Assessment. Stay tuned!

References

Kaufman, R. (1998). Strategic thinking: A guide to identifying and solving problems (Rev. ed.). International Society for Performance Improvement and the American Society for Training & Development. ISBN 1-56286-051-8.

Opportunities vs. Good Ideas

I’ve decided to embark on a series of posts that walk through my Performance Improvement Process model (below) to add some narrative to each of the steps. First up is “Performance problem or Opportunity.” I want to look directly at the opportunity – and then tackle the problem in the next post.

Have you ever had a boss who comes to you and exclaims, “HEY! I’ve got a great idea!!”? You have suffered the boss’s good ideas before and you know it’s going to be a long day.

The key difference between an opportunity and a good idea is its alignment – or not – to the individual, organizational and societal contributions that your organization exists to produce.

Oh no! It’s the Good Idea Fairy!

An opportunity will increase the value of your organizational outcomes by improving results at one or more of the three levels. A good idea on the other hand will make work, expend resources and possibly look productive in the short term, without contributing real value at any level.

Describe the Good Idea Fairy? Okay then! This example comes from a large government organization. I was approached by my boss with his weekly great idea. “I want you to find out how much it will cost to get ten video cameras for each school.” The red flags immediately started flying! Having been sent on a number of these “missions” in the past, I knew it was time to dig a little bit and get some details before expending any effort. “Why?” I asked. “I have solved the problem of converting classroom training to e-Learning!” My boss beamed proudly. The senior managers in the organization had directed that as much training as possible should be converted to online delivery without understanding the resource and funding requirements to achieve this lofty goal. “How will we do that with video cameras?” I pressed. “Simple. We will record the instructors giving the lectures in the classroom and put it in the Learning Management System (LMS). Then the instructors will have time to add to the videos and make them into better e-learning while the next group of students watch the videos!”

If you, my reader, come from the learning field – you already see a host of problems with this “good idea.” If you don’t, imagine that the training for your new job consisted of watching someone give a presentation in a video. No interaction. No feedback. No activities. Just video, test, video, test, repeat. Other issues with this plan included the level of effort required to put the videos into the LMS with no extra resources to do the work, and the fact that the instructors who would do the e-Learning development to “improve” the videos had zero training in e-Learning design or development. They are truly experts in their field of work who are brought in to do a tour of duty as an instructor before going back to the field.

The boss had the best intentions and was trying his best to accomplish an impossible task. I explained to him that in the short term, more e-Learning would be created; however, the immediate negative impact on the students’ learning and the longer-term impact of reduced organizational capability created by poorly trained personnel was a significant risk. By the end of the day, the video idea was shelved.

An opportunity is something new that is aligned to the individual, organizational and societal contributions that your organization exists to produce. It will increase the value of your organizational outcomes by improving results at one or more of the levels of worker, work, workplace and the world within which we exist.

Get to Know Your Customers Day

Customers in a coffee shop

Get to Know Your Customers Day is observed on the third Thursday of each quarter (January, April, July and October). This is a day to reach out to your patrons and get to know them better!

When businesses get to know their customers, they can also learn a lot about where they need to grow – or in performance technology language, they can identify needs (gaps between current and desired results) which, when addressed, will help them grow. Do you have a favourite locally owned and operated business where you get exceptional service? Where you are known by name and the owners know your shopping habits? I have a few of those, because we live in a relatively small town. When they don’t have what I want, they are generally willing to get it for me.

With the advent of the Internet and big-box stores, much of that personal attention has gone by the wayside. Get to Know Your Customers Day is a day to turn that around. While we should be doing this every day, make it a point today to get to know a little more about your customers and make each of them feel like they are your most important customer of the day!

A great book I discovered last year, “The Absolutely Critical Non-Essentials” by Dr. Paddi Lund, a dentist in Australia, provides some fantastic strategies – and hard evidence to help you get to know your customers. I highly recommend it.

OBSERVING GET TO KNOW YOUR CUSTOMERS DAY

Grow your business by taking the time to get to know your customers. You’ll be planting a seed that will flourish! Use #GetToKnowYourCustomersDay to post your best interactions with your customers on social media.

A Better World: Are you Adding or Subtracting?

The topic of sustainability keeps coming around in my reading as of late. I was first introduced to sustainability during my Commerce program at Royal Roads in 2003. Darlene and Garry McCue were our profs, and the text we used was their own, called “The Spiral Stair.” The course was very environmentally focused, which at the time put me off somewhat, as I was not of the same thinking as environmentalists.

Fast forward ten years or so and I was studying Systems Thinking at Boise State University, and our text was “Thinking in Systems: A Primer” by Donella Meadows. My studies at BSU literally changed the way I look at the world. Systems Thinking, Human Factors Engineering, Design Thinking, MEGA planning, Behavior Engineering… all these different models, each a new lens through which to examine the world.

Meadows taught me that to understand how systems work you must see the relationship between structures and behaviours. Gilbert’s Behavior Engineering Model (BEM) taught me how to sort between behaviours and the environmental factors (structures) that drive or restrain performance. The view from the BEM is more at the individual level, which reminds me of John Maxwell’s “The 21 Irrefutable Laws of Leadership: Follow Them and People Will Follow You.” Maxwell’s 5th law is “The Law of Addition,” which states that “Leaders add value by serving others.” When explaining this law he challenges the reader:

“If you are a leader, then trust me, you are having either a positive or a negative impact on the people you lead. How can you tell? There is one critical question: Are you making things better for the people who follow you? That’s it. If you cannot answer with an unhesitant yes, and give some evidence that backs it up, then you may very well be a subtractor. Often subtractors don’t realize they are subtracting from others. I would say that 90 percent of all people who subtract from others do so unintentionally. They don’t recognize their negative impact on others. And when a leader is a subtractor and doesn’t change his ways, it’s only a matter of time before his impact on others goes from subtraction to division (p. 51).”

In all my adult years I have worked for some great adders and some real big subtractors. In retrospect, I am pretty sure I have been on both sides of the equation at different times in my life. I am also pretty sure that at the end of the day, my balance sheet will be in “the black” and overall I will have added more than I subtracted. As I get older and wiser, I am looking for more opportunities to add, not only at the individual level, but at all levels.

Kaufman (2011) has challenged us to ask ourselves: if we are not adding value to our shared society, how are we assured that we are not subtracting value? I think about that a lot. As we see above, we can add or subtract value from the societal level down to the individual level. Kaufman’s “MEGA” has strong alignment with what I have learned about sustainability, i.e., if the results of our actions increase sustainability, we are “adding.”

To put a business spin on the connection between adding value and sustainability, let me share a coaching session I recently attended. Our coach talked about “Critical Non-Essentials” (CNEs), an idea developed by an Australian dentist, Paddi Lund. Lund developed processes that add value for his customers. They weren’t essential to the dental issue being treated, but the CNEs differentiated his practice; he became very successful and his practice achieved sustainability! I just gave my copy of the book to MY dentist. I’m hoping to see an espresso machine at my next visit (read the book to find out what I mean).

What I am seeing is that when we look for opportunities to add value, or increase sustainability, at any level – friends and family, stakeholders, clients or the whole organization – there should be a trickle effect that contributes to the sustainability of society as a whole. Small actions add up.

References

Kaufman, R. (2011). The manager’s pocket guide to mega thinking and planning. Amherst, MA: HRD Press.

Maxwell, J. C. The 21 irrefutable laws of leadership: Follow them and people will follow you. Thomas Nelson. Kindle edition.

Meadows, D. H. Thinking in systems: A primer. Chelsea Green Publishing. Kindle edition.

Networking Power

Social media is under fire right now, and this article is NOT about THAT! I have always been pretty open in the digisphere (buzzword bingo!) because I believe the benefits far outweigh the risks. A couple of days ago, I experienced some great examples of why.

I was in Seattle for the International Society for Performance Improvement’s (ISPI) 2018 conference. To kick off day two, Dr. Will Thalheimer, founder of the Debunker Club, sent out a tweet inviting fellow debunkers to join him at the Starbucks Reserve Roastery for a coffee prior to the opening session. Lo and behold, there we all were, meeting other folks who all shared the same desire to bust performance and learning myths!

debunkers

But wait! One of these things is not like the other. Who was this twenty-something sitting here with us!? Will asked, “Are you a member of ISPI?”

“No…” she responded.

“How did you learn about this meeting!?” we asked. Turns out, our new friend and debunker Mel (@MelMilloway) saw a retweet from a colleague about the planned meet-up and thought, “Hey, I walk right by there on my way to work. I’ll just stop in and see what this is all about.”

A new member of the 600-plus-member Debunker Club, with her own following of over 11,500 Twitter users! A significant influencer in the Learning and Performance field in her own right, and I hope a future member of ISPI. Our chance encounter became part of the closing story for the conference and a lead-in to next year’s conference in New Orleans, where the theme will be storytelling. Apropos!

Another example was my online encounter with @TriciaRansom on Twitter who was following #ISPI2018 for the two days of the conference. She wasn’t able to join us, so she was watching what the conference delegates were sharing online and learning! Tricia has almost 3,000 followers!

For my last example, we are leaving Twitter and jumping over to Facebook. A good friend and colleague of mine who is currently serving with NATO in the US was also tracking what was going on. I have my Twitter account set up to re-post all my tweets on my Facebook personal and business pages. My retweet on “Learning Analysis of a Technology Supported Learning Environment” by Angela Low from the ISPI Potomac Chapter caught his eye on Facebook and he asked if I could get that paper for him. More sharing!

I haven’t been much of a Tweeter up to this point and only started using Twitter seriously at last year’s ISPI conference in Montreal as a bit of a personal experiment. My desire to spread the word about ISPI and all the amazing learning that happens at the annual conference had me tweeting away again this year. Seeing the above examples first-hand has me more convinced than ever that we need to continue to embrace social media in order to grow professionally – and make great new friends along the way!

Turning Brick into Paper: An Example

A colleague of mine shared this article, Leaders turn brick into paper, in our professional network for military Training Development Officers last week. The article brought back a lot of memories from my days in the Navy. As I read it, I reflected on great, good and not-so-good leaders at all levels and some real leadership challenges for me personally. Under the section of the article called “Common brick walls,” the example of “that piece of equipment that hasn’t worked since I’ve been here and my people tell me it’s never worked” was a real flashback.

When I was assigned to be the Chief Instructor of Acoustics, the standard practice was to send each class down to the dockyard in the evenings to do their active sonar training on-board one of the ships in harbour. It had always been done this way as there was no active sonar simulator at the school, so at first, with many other more pressing items getting my attention, no problem!

Canadian Forces Fleet School Esquimalt Engineering, Weapons and Acoustic Schools

Then we were “asked” if we could increase the throughput of the school by about 35% for the next two to three years. If you have read the brick-to-paper article (it’s really good – you should), you’ll have seen the five no’s method for “honing your fortitude.” I may have been guilty of immediately throwing out “no” #1 at the start of the conversation. I’ve always been a can-do sort of fella, so I told my boss I would look into it, sat down with the senior staff and asked, “Well, can we do this?” Their immediate reaction was NO, and a lot of “bricks” were being thrown into a rapidly growing wall as the discussion progressed. Uh-oh. Problem.

The school is a fairly simple system as shown in the Logic Model below. Students are the inputs, staff and facilities are the resources and trained sailors are the outputs. This system was designed for a throughput of 80 sailors a year. Eight groups of ten.

The Kellogg Foundation Logic Model

That number was based on the number of classrooms, instructors and the various trainers/simulators needed to prepare each individual to meet the Navy’s standard for a new Sonar Operator. We sat down with a calendar, a calculator and a bag of chicken bones and started trying to fit different scenarios together to make 120 sailors go through the system each year. A rough sketch of that arithmetic is below, and then we’ll look at some of the No’s being presented.
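
For illustration only (the class size of ten is from above, but the 48 training weeks per year and the stagger values are assumptions, not the school’s real figures), the throughput question really boils down to how often a new class has to start:

```python
# Back-of-the-envelope throughput arithmetic. Class size (ten) comes from the
# post; the 48 training weeks per year and the stagger values are assumptions.

def annual_throughput(class_size: int, stagger_weeks: float, training_weeks: int = 48) -> float:
    """Graduates per year when a new class of `class_size` starts every `stagger_weeks`."""
    return class_size * (training_weeks / stagger_weeks)

print(annual_throughput(class_size=10, stagger_weeks=6))  # 80.0, roughly the design point
print(annual_throughput(class_size=10, stagger_weeks=4))  # 120.0, roughly the new target
```

Shortening the stagger is easy on a calculator; every resource it collides with in the real school shows up as one of the No’s below.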

NO!! #1: Secure Facilities

Some of the material taught is classified, which makes the Acoustic School a secure area, and therefore the number of classrooms available is fixed. The number of classrooms was insufficient, but there were classrooms in other parts of the Fleet School we could use during the heaviest parts of the schedule to teach the unclassified portions of the curriculum. We even developed a Plan B where we could run two shifts with day and evening classes. One brick turned into paper.

NO!! #2: Schedule and Teaching Assignments

Training in the Canadian Forces is designed very rigorously, and sequencing is one of the many design considerations. For this course in particular, Oceanography always came first because you need to understand how sound works in the ocean before you can start using sonar. Makes total sense. So the courses run three at a time with staggered start dates, so one group finishes oceanography and then the next group comes in and starts with oceanography. A well-oiled machine.

The instructional staff were assigned to teach one of the three main subject areas: Oceanography/Ancillary Equipment, Active Acoustics and Passive Acoustics. Because passive acoustics is two-thirds of the entire course, more staff are assigned to that subject. To meet the objective, we would have to start more courses with a shorter stagger between starts, and someone teaching one topic might have to teach one of the other subjects. There were also a couple of courses where it just wasn’t possible to start with oceanography, so we looked at the next best sequencing option and made it work.

The number of staff assigned to each course was temporarily adjusted as well. The standard required that two junior instructors be assigned to each class, with one lead instructor supervising two classes and four junior instructors at a time. Depending on the course load and staff available, this was adjusted as required to ensure that there was a minimum of one junior and one senior instructor assigned to every class. A single lead instructor may have been supervising three different classes, but everyone was “represented.”

NO!! #3: Staff

Staffing – as you can see from above – was a big issue. The school runs in a perpetual state of staff shortages. As noted earlier, the Acoustic School is designed for a throughput of 80 trainees with (if I remember correctly) a staff of 12. What was never included in the fine print was that, from those 12 staff, the Navy would pluck out people for career training courses, higher-priority taskings, etc. Looking at the schedule and the number of courses compared to available instructors, plus current and anticipated staff shortfalls, it was clear we would burn out the team and have periods where, even with me teaching in the classroom, we would be short. This took a few e-mails and phone calls to work out with the various levels of HQ to make sure that when our staff were selected to go on their own training or for other tasks we would get replacements (normally we just handled it internally). Another brick gone.

NO!! #4: Active Sonar Training

No matter how many ways we looked at the situation, the active sonar training was an issue. There would never be enough ships in the harbour to get 120 sailors trained a year. But wait!! There was a new multi-million-dollar trainer called the Naval Combat Operator Trainer, or “NCOT,” sitting in the basement of our building for training navy combat operators, including active sonar. When I first arrived at the school I had asked why we weren’t using this trainer and were going to the ships instead.

MDA’s Naval Combat Operator Trainer (NCOT) delivered to the Royal Canadian Navy in 2000. 
Photo: MacDonald, Dettwiler and Associates Ltd.

The answer was all kinds of no no no’s from the staff. The software was no good, there were too many errors in it. The trainer crashed all the time and on and on went the reasons. A big brick wall – and like I said earlier, there were more pressing issues when I first arrived. A common (and simple) method for conducting root cause analysis is the “5 Why’s.” Just keep asking why until you find the true root cause. I didn’t learn this method for another seven years, but I was applying it here! No – why? No – why? Repeat. Ultimately, the staff’s position was that the NCOT’s computer based training (CBT) and sonar simulator could not meet the requirements of the training standard. However, my staff – God bless them – couldn’t cite one specific example of where the standard would not be met. No data.


So we tasked one of my sharpest instructors to take the training standard and all the active sonar lesson plans down to the NCOT trainer and do a gap analysis. This was another term I didn’t learn until later in my career, but we were doing it here instinctively. “Here is the requirement – test the system and tell us where the issues are. THEN we will talk again.” We gave him two weeks. Twice (that I recall) in that two-week period, my active sonar expert popped his head in my office and relayed the great results he was seeing. Not only did the NCOT emulate the sonar very well, it also had self-paced lessons (CBT) that might reduce the amount of face-to-face instruction, reducing the staff burden noted above. Well! Imagine that. Data.
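
If you have never done one, the shape of that gap analysis is simple enough to sketch: walk every requirement in the standard against the trainer and keep a record of what it cannot do. The objectives below are invented for illustration, not the real training standard:

```python
# Illustrative gap analysis: each training-standard requirement is checked
# against the trainer. The objectives and results here are invented examples.

trainer_meets_requirement = {
    "Conduct an active sonar search": True,
    "Classify an active sonar contact": True,
    "Operate the sonar in a shallow-water environment": False,  # hypothetical shortfall
}

gaps = [objective for objective, supported in trainer_meets_requirement.items() if not supported]

if gaps:
    print("Gaps between the standard and the trainer:")
    for objective in gaps:
        print(f" - {objective}")
else:
    print("No gaps: the trainer can meet the standard.")
```

Either way, you finish the two weeks holding data instead of opinions, which is exactly what we needed to take the conversation forward.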

After the analysis was complete, we sat down again and looked at the entire situation. All the No’s had been addressed AND – added bonus – as part of this “naval gazing exercise” (pun definitely intended) we identified some other places where we could further refine the system. While instructors had their “subject area” of expertise, moving forward they would have to stay up to date on the other areas as well in order to increase our flexibility with manning issues. All our bricks were paper. We increased the throughput as “asked” and the school entered its busiest period in recent memory, fully prepared thanks to the dedication and professionalism of everyone on our team! When I left the school in 2002, we were still spitting out 120(ish) sonar operators a year.


Just gimme training…

Author’s note: The names have been changed, faces blurred and voices altered to protect the innocent in this post. 

It’s a story I have heard many times. The boss wants training; the instructional designer, performance technologist or analyst KNOWS that training isn’t going to fix the problem but is told to “gimme the training.”


A client told me a while back that they needed a course. That’s always the first red flag, right!?

I asked “Why? What’s the problem?” Turned out that my client was told that a group in another part of the organization wasn’t performing as needed when requesting and using my client’s services to get their job done. Their boss wanted my client to train his people. Classic example of picking the solution before understanding the problem.


To make the story easier to understand, let’s call the group that is experiencing the performance issue the “Jocks.” We’ll call my client – the one told to fix the problem – the “Wizards.” The Wizards are responsible for producing some pretty cool magic that helps the Jocks prepare for their games. The problem is that the Jocks don’t understand exactly what type of magic the Wizards can provide, or how to ask for and use the magic once they do get it, and as you can imagine, Jocks and Wizards don’t exactly speak the same language, so they don’t even know how to interact with each other at this stage.

As a good performance technologist or CPT, I asked for more information from the Wizards and the Jocks – starting at the end – the outcome – the desired level of performance. In this case, that would be the Jocks being ready for the game. Two other key pieces of the puzzle were the Jocks’ “playbook” and the Wizards’ “spell book.” Much to my surprise, the playbook used by the Jocks to get ready for games had much the same process as the one used by the Wizards. It may look familiar:

  1. Define the objective
  2. Plan
  3. Prepare
  4. Execute
  5. Analyse/Evaluate results

Yes… at a deeper level of fidelity there are some differences, but overall, the Jocks and the Wizards are using the same process, with the same goal. So after carefully reviewing the plays and the spells, it was pretty apparent to me that if we added the information that the Jocks had to give to the Wizards at each step above, training really wasn’t necessary. All we had to do was update the playbook!
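
To make “update the playbook” concrete, here is a minimal sketch of the annotated process as a data structure. The five steps are from the list above; the information items beside them are invented placeholders for whatever the Jocks would actually need to hand over:

```python
# The shared five-step process, annotated with the (hypothetical) information
# the Jocks would give the Wizards at each step: the "playbook update."

playbook = [
    ("Define the objective",     "the outcome the Jocks need, and by when"),
    ("Plan",                     "what magic is being requested, and why"),
    ("Prepare",                  "the inputs the Wizards need to build it"),
    ("Execute",                  "a point of contact while the magic is in play"),
    ("Analyse/Evaluate results", "feedback on how well the magic worked"),
]

for step, info_for_wizards in playbook:
    print(f"{step}: Jocks provide {info_for_wizards}")
```

One added column in the playbook, and most of the “training need” evaporates.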


A meeting was held! After showing the Jocks and the Wizards that what they are doing is similar enough that the time and effort required to build a course might be better spent on improving the playbook, I was not at all surprised to be told, “just gimme the training.”

It might be because the direction given to my client was for a training solution. It might be because the Jocks don’t know what they don’t know at this point and they want to have the training first, work through the process a couple of times and THEN update the playbook. It might be because the Wizards don’t want to pay me to update the Jocks’ playbook. I’m not sure at this point. I noted in a previous post that you may just end up getting a training solution when you really need a performance solution. It’s killing me that we are going to spend a lot of time and energy developing this course, knowing that the solution is an updated process.

In 2007 I attended the International Society for Performance Improvement’s (ISPI) “Principles and Practices of Performance Technology Workshop.” Geary Rummler and Roger Addison were my teachers. To this day I remember our discussion about this very situation and Roger’s advice that no matter how much training you develop – always leave them a job aid.


In 2012, my good friend and mentor Dr. Roger Kaufman – a legend in the field of performance technology (far right) – introduced me to another legend, Dr. Joe Harless (middle). I had read many of Joe’s articles, which all hold true today. As Guy Wallace explains in the linked article about Joe, you never say no to training. Harless (1985) is also credited with coining the phrase “Inside every fat course there’s a thin Job Aid crying to get out.”

So I am bashing ahead with designing and developing the course. I’m going to add in a “Student Manual” that will contain one page that explains the whole process. Thanks Geary, Roger A., Roger K. and Joe.

Reference

Harless, J. (1985). Performance technology and other popular myths. Performance & Instruction Journal, July 1985.