This will be a short and sweet post! Yesterday I was asked if I knew the location of a video produced by BSU OPWL on “scalaring.” The term was new to my learned colleagues at BSU so I had to translate it into normal speak and explain it is a task analysis method. My curious mind wondered why they didn’t know this term, so ingrained in my own vocabulary.
Task analysis is defined by BusinessDictionary.com as:
“the systematic identification of the fundamental elements of a job, and examination of knowledge and skills required for the job’s performance. This information is used in human resource management for developing institutional objectives, training programs, and evaluation tools. See also activity analysis, job analysis, and performance analysis.”
In my time working in the Canadian military training system, we used “scalaring” to assist in completing a task analysis. This often resulted in a wall full of post-it notes providing a visual breakdown of the tasks and how they are grouped into performance objectives, “duty areas” (groups of related tasks), and then specific jobs.
I am not aware of the history behind the military use of the term scalar or its verb form “scalaring,” which obviously hasn’t taken off like “googling” something. A brief search on the web leads me to believe this usage is unique to the military training realm, as everywhere else I looked the term relates to mathematics, computing, or physics.
The Canadian Forces Individual Training and Education System manual Design of Instructional Programmes describes a scalar diagram as:
The recommended means of conducting and documenting an instructional analysis. A scalar diagram clearly defines the overall structure of the course content by graphically illustrating the hierarchy of EOs and teaching points for each Performance Objective.
The video in question features a task analysis performed by Drs. Don Stepich and Steve Villachica. One of the great challenges in task analysis is getting the expert to fully explain all the required knowledge, skills, and abilities involved in successful task completion. The BSU video below provides a great demonstration of an expert analyst – Steve on the left – asking a subject matter expert – Don on the right – all the probing questions needed to thoroughly deconstruct a task.
I thought rather than lose this video link in my digital pile of reference material, a quick blog post would make it simpler for me (and you) to find in the future. Happy Friday and Happy 2018!
A-P9-050-000/PT-004, Manual of Individual Training and Education, Volume 4, Design of Instructional Programmes.
How many training sessions have you gone to where you received a one-sheet evaluation form that asked you to rate your instructor, the course, the room, the chair, and the snacks provided at ten a.m.? Chances are you have filled out a few of these over time. In my own experience I have probably filled out hundreds, and after a while there is a tendency to just tick off “strongly agree” on everything, especially if it’s getting close to supper time!
Once in a while, a new method comes along that radically changes the way we do things. Fire, the wheel, smartphones… you get the idea. Have you heard about Dr. Will Thalheimer’s book Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form? Will is one of my top go-to guys in evidence-based performance improvement and for busting myths about methods used in the field that aren’t so evidence-based.
Will explains why the current design of end-of-training evaluations is actually counterproductive, and sums it up nicely with this list of nine points:
They are not correlated with learning results.
They don’t tell us whether our learning interventions are good or bad.
They misinform us about what improvements should be made.
They don’t enable meaningful feedback loops.
They don’t support smile-sheet decision making.
They don’t help stakeholders understand smile-sheet results.
They provide misleading information.
They hurt our organizations by not enabling cycles of continuous improvement.
They create a culture of dishonest deliberation (Thalheimer, 2016, Kindle Locations 137-143).
That’s just in the book’s introduction! Will uses the rest of the book to show us all a better way for “creating smile sheets that will actually help us gather meaningful data— data that we can use to make our training more effective and more efficient” (Thalheimer, 2016, Kindle Locations 2646-2647) by targeting training effectiveness and actionable results.
Here are the results from question #1, which Will calls “The world’s best smile-sheet question.” By asking this question, we get a measure of the trainee’s potential for improvement back on the job.
37.5% responded “I have GENERAL AWARENESS of the concepts taught, but I will need more training/practice/guidance/experience TO DO ACTUAL JOB TASKS using the concepts taught.”
62.5% answered “I am ABLE TO WORK ON ACTUAL JOB TASKS, but I’LL NEED MORE HANDS-ON EXPERIENCE to be fully competent in using the concepts taught.”
Dr. Thalheimer provides a rubric or set of standards in the book to measure the responses for each question. The standards for question 1 are shown in Table 1 below.
Not bad, but not something I can accept either. Two-thirds of the respondents felt they would be able to employ the methods taught in the workplace with more practice, while one-third had only a general awareness and wouldn’t yet be able to apply what they learned. This is a one-day workshop that covers a lot of ground, and arguably the goal is to raise awareness of performance improvement. There is also a follow-on “at-work” component that learners can complete to further increase their skills and earn a certification if they choose. My goal is to have all the learners choosing C or D. Clearly, I have some work to do in the design and delivery departments for this offering.
The result above made me question if the learners had enough hands-on practice with the case study and the exercises. That takes us to question 9 shown below.
The averaged response was 54%. Will believes (and I agree) that the absolute minimum for time devoted to practice is 35%. Given the number of practical exercises, I think this number needs to be higher – in the 65% range – so that gives me some quantifiable results to work with and changes to make to the design before the next session. More practice – less lecture. Check!
One final example. Have you heard of spaced learning theory? Casebourne (2015) provides a good overview of the body of research that suggests that by spacing learning over time, people learn more quickly and remember better. Will has designed questions such as #11 below to measure spacing.
The results were interesting because one respondent apparently went back to the training facility the following day! Overall, it seems that the spacing designed into the workshop was effective and 69% of the respondents recognized that topics were covered more than once. As noted above, the learners do have the opportunity to apply what they learned back on the job and submit it to ISPI to earn a certification which is another spacing strategy, but one I have little control over.
In order to get a measure of actual performance improvement for question #1, and to accurately measure the spacing effect, I will need to conduct the survey again after the learners have had sufficient time to apply the skills they were taught on the job. That’s still to come.
If you are still using level one evaluations or smile sheets that ask if the learning was fun, if the learner liked the instructor, and if the facilities were comfortable, it’s time to re-think your approach. If you attend a training session and still receive those old-style smile sheets, you might also ask yourself how effective the training design really was. I hope this example has shown you enough evidence to convince you there is a better way. If it has – please share it with your friends and colleagues. Heck – share it with your enemies; they might become your friends!
Casebourne, I. (2015). Spaced learning: An approach to minimize the forgetting curve. Retrieved December 6, 2017, from https://www.td.org/insights/spaced-learning-an-approach-to-minimize-the-forgetting-curve
Thalheimer, W. (2016). Performance-focused smile sheets: A radical rethinking of a dangerous art form. Work-Learning Press. Kindle Edition.
On the 22nd of October 2017, MSN Money reported that McDonald’s McCafe coffee is now Canada’s favourite coffee chain. Tim Horton’s, which I think most people would assume to be the front runner in this contest, was FOURTH! Second Cup and Starbucks were second and third respectively. In 2014, Canadian Business reported that Timmies was #1 for coffee and the Tim Horton’s brand was rated #2. I came up empty-handed for 2015/2016 results, but the slide from 1st to 4th in three years is dramatic for a franchise that is really a Canadian cliché.
Having grabbed lunch at Tim Horton’s just the day prior, this got me thinking about WHY? I thought it was proper to do some quick data collection on the product itself. I stuck to the medium size, as I already drink way too much coffee and, to keep my physician happy, I drink half-decaf now. As I suspected, on size, price, and rewards, Mickey D’s edges out Tim’s.
Lid: McDonald’s – easier to open/stays open; Tim Horton’s – lid doesn’t stay open consistently (for me)
Rewards: McDonald’s – yes (buy 7, get 1 free); Tim Horton’s – Roll up the Rim
Taste*: McDonald’s – yes
*Taste is totally subjective
I was a faithful Timmies drinker until around 2010 when I was on a trip to New Brunswick (Yes – Gagetown for the military folks) and McDonald’s had one of their free coffee giveaways for the entire two weeks I was there. Being a frugal fella, my mates and I picked up our free coffees every morning and discovered that the new McDonald’s bean was pretty darn good. But that wasn’t all.
There is a human factors engineering aspect to this too! It’s all about the CUP and the LID. Props to Timmie’s for the Canadian hockey logos at the bottom (Go ‘nucks).
For whatever reason, my fingers have always been pretty sensitive to heat, and I need to let a Tim Horton’s coffee sit for five minutes or so before I can comfortably hold it. Those little cardboard sleeves help – but they fall off all the time and are generally a pain. Poor engineering. The McDonald’s cup, on the other hand, has TWO walls and doesn’t toast my tender fingertips. Far superior engineering. Human factors engineering is one of the components performance analysts consider when looking at workplace performance problems. Even though the volumes differ by only 17 ml (just a few drips), the McDonald’s cup LOOKS a lot bigger too, eh!?
Now, the lid. The Timmies lid is on the left, McDonald’s on the right. I have always had trouble tearing open the Tim’s lid and getting it to stay open with that little bump thing in the middle. It never seems to work quite right. The McDonald’s lid just flips back and sticks – every single time. Again, better engineering. Added bonus: the double-walled cup seems to keep the coffee hot longer. I haven’t measured this – so that’s anecdotal.

So we have looked at the product and the design… let’s consider service and the value proposition. Remember when I was in New Brunswick in 2010? Tim Horton’s was king of coffee. Even with the free giveaway at McDonald’s, cars were lined around the block for a double-double. My pals and I couldn’t figure it out, but we were happy our line was short. Did I mention it was free? Oh yeah – I did.
I have noticed over time that the service at Tim Horton’s has declined somewhat. Again – no science here – this is just my experience. Yesterday – they gave my sandwich to someone else and after three people behind me got served I asked where my meal was… they had no idea, but there was this crispy chicken sandwich that no one wanted – no doubt belonging to the fella that scooped my ham and cheese. Disappointing. Then there are those lines! I am always amazed at how long the lines are at the Tim’s drive thru compared to McDonald’s (anecdotal). I know both chains study this stuff and they time everything. I would love to see the data from both to see who is ahead. I think I know the answer.
Finally, let’s consider the value proposition: 442 ml for $1.75 at McDonald’s vs 425 ml for $1.79 at Tim’s – and you also get the 8th cup free at McDonald’s. That drops the effective price per cup to $1.53 (if you collect those stickers – and I do). See. Only one to go.
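The back-of-the-napkin arithmetic above can be sketched in a few lines of Python, using the medium-size prices and volumes quoted in this post (the helper name is just illustrative, not from any library):

```python
# Back-of-the-napkin value comparison using the medium-size numbers above.

def effective_price_per_cup(price, cups_paid, cups_received):
    """Average price per cup once a loyalty reward is factored in."""
    return price * cups_paid / cups_received

mcd_price, mcd_ml = 1.75, 442
tims_price, tims_ml = 1.79, 425

# McDonald's "buy 7, get 1 free" card: pay for 7 cups, drink 8.
mcd_effective = effective_price_per_cup(mcd_price, 7, 8)

print(f"McDonald's effective price per cup: ${mcd_effective:.2f}")  # $1.53
print(f"McDonald's: {mcd_price / mcd_ml * 100:.2f} $/100 ml")       # 0.40
print(f"Tim's:      {tims_price / tims_ml * 100:.2f} $/100 ml")     # 0.42
```

Even before the sticker card, McDonald’s is cheaper per millilitre; the free eighth cup just widens the gap.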
In summary, less expensive, more coffee, better cup, better lid, cool fingers and a taste I like better. I know – taste is subjective. The rest of it though is quantitative. If service is slipping like I have experienced personally (anecdotal) then that is just further exacerbating the issue. That’s my back of the napkin analysis.
If you found some value in this post – please feel free to “like” and “share!”
If you read my post on the Education Revolution, you will remember my question “Do students (of any age) prefer one form of media more than another for learning? Do we need e-Learning for children and chalkboards for Boomers?” My answer was “of course not!”
There has been more and more information shared about the multiple generations in the workplace and the need to treat each generational cohort differently. For example, the American Management Association (N.D.) says “Each group has its own distinct characteristics, values, and attitudes toward work, based on its generation’s life experiences. To successfully integrate these diverse generations into the workplace, companies will need to embrace radical changes in recruitment, benefits, and creating a corporate culture that actively demonstrates respect and inclusion for its multigenerational work force.” Do we really need to treat Millennials differently than Boomers? Of course not! I’ll explain why in a moment. But first…
I am currently working my way through a business boot-camp provided by the Leeds Grenville Small Business Enterprise Center (LGSBEC). After a little more than a year of running my own business, I needed something to motivate me to get my business plan done and the LGSBEC has met that need! There were four days of face-to-face instruction provided by Karen McDonald of the Opportunity Group to walk my group of nine entrepreneurs through the ins and outs of business plan writing. Six of the nine will be awarded a grant to kick start their business. What a great program!
One of the entrepreneurs, Holly, had a daycare disruption in week two and had to bring her four-month old daughter Jillian to class with her. That’s Holly, Jillian and Karen looking at cash flows in the picture above. I believe that baby Jillian was pointing out an error with the formula that carried the cumulative cash-flow from the previous period into the worksheet for year two.
I have to say, Jillian is the BEST baby! Very quiet and happy. We hardly knew she was there. So, where am I going with this? These three got me thinking about the generational noise again. Recall that the AMA said “Each group has its own distinct characteristics, values, and attitudes toward work, based on its generation’s life experiences.” Let’s consider THAT.
First, we have to define what a generation is. The Center for Generational Kinetics uses this definition: “A generation is a group of people born around the same time and raised around the same place. People in this ‘birth cohort’ exhibit similar characteristics, preferences, and values over their lifetimes.”
There are reams of studies that have defined those “similar characteristics, preferences,” etc. The ones I use for argument’s sake were published by Greg Hammill (2005) and are adapted here to show a summary of personal, lifestyle, and workplace characteristics by generational cohort. Hopefully they aren’t too hard to read for you folks in the Veterans Generation. :-O
Did you take a moment to look at the charts? Do it! Look at “your generation” and see if you agree with the characteristics assigned to you. Do you agree? Are you 100% aligned? 80%? Do you feel like maybe you were born in the wrong era? (If you are having trouble reading these tables, right click and open the image in a new tab and you can zoom in to increase the font size).
We should also be aware that generational differences in attitudes toward the balance between work and other parts of life such as family may vary to some degree by gender. The charts above don’t take THAT into account.
And there’s the rub! There are glaring weaknesses in the generational research, especially with respect to the understanding of generational differences among people in the blue collar and service industry work forces, and with regard to people of lower socioeconomic status. That’s a lot of variables that keep me wondering about the validity of these categorizations of people by age.
Weiss (2003) notes that most attitudes and distinguishing characteristics attributed to the generations are identified during childhood and adolescence, but these characteristics may undergo adjustment as people experience life stage changes such as marriage, childbearing, and challenges of adulthood.
Hmmm, so as people age and experience “things,” they change? That seems pretty radical. Is it possible that all the Boomers didn’t always see work as an exciting adventure for their entire work lives?
Wellner (2003) acknowledges as well, that demographic projections are fallible since they are assumptions based on past behavior, and future behavior may or may not follow the same patterns. More concerns about validity. If you judged me on my past behaviour as a 20 year old in the Navy, you would never have predicted that I would be sitting here writing this! Maybe we do change…
The Center for Creative Leadership says that, despite what is seen on television, heard on radio, and written in newspapers, magazines, and books, the differences between generations are not as stark as we have been led to believe.
Here’s my favourite. Jennifer Deal, author of Retiring the Generation Gap (2008), argues that we all want essentially the same things at work. [My emphasis] Her assertion is based on seven years of research in which she surveyed more than 3,000 corporate leaders. Deal says that the conflicts have to do with influence and power—who has it and who wants it. And in some ways, the negative stereotypes about each generation are the byproducts of defense mechanisms used by the competing age groups.
So – it is definitely not recommended to make assumptions in the workplace OR in the training environment about any one individual based upon his/her membership in a chronological generational cohort or gender, age, learning style, personality characteristics or other factors.
Well then – what DO we do? Marcia Zidle provides a list of ten principles to “help you look past the stereotypes and become a more effective leader to people of all ages:
All generations have similar values. In fact, they all value family, the most. They also attach importance to integrity, achievement, love and competence.
Everyone wants respect – they just don’t define it in the same way.
Trust matters, especially with the people you work directly with. Everyone wants to trust and to be trusted.
People of all generations want leaders who are credible and trustworthy. They also want them to listen well and be farsighted and encouraging.
Office politics is an issue – no matter what your age. Most realize that political skills are a critical component in being able to move up and be effective.
No one really likes change. Resistance to change has nothing to do with age; it is all about how much one has to gain or lose with the change.
Loyalty depends on the context not on the generation. People stay or leave a company based on their boss, opportunities, stage of life and other factors.
It’s as easy to retain a young person as it is to retain an older one. It depends on what’s important to them. Age defines a demographic not a person.
People of all generations want to make sure they have the skills and resources necessary to do their jobs well. The ability and desire to learn continues throughout life.
Everyone wants to know how they’re doing. Feedback is desired, but no one likes only negative feedback; they want positive feedback as well.” THAT relates back to my last post!
Did you identify more strongly with Zidle’s ten principles or Hammill’s generational characteristics? To circle back to the boot-camp, which is what got me all fired up… There were five, probably six generations in the room with little Jillian. We all shared the entrepreneurial spirit, a common goal, clear and caring leadership from Karen and from what I could see, age was never a factor, except we all wanted to hold the baby!
In summary, you will get better results by (1) applying effective leadership and (2) creating a high performing work environment than you will by labeling people in your workforce by age, gender, personality or favourite colour!
If you liked the article, please feel free to share and/or leave a comment! Although WordPress gives some great stats on how many have visited… it only gives a number and a country. I’d love to hear from you!
American Management Association. (N.D.). Leading the Four Generations at Work [web log]. Retrieved from: http://www.amanet.org/training/articles/leading-the-four-generations-at-work.aspx
Deal, J. (2008). Retiring the Generation Gap: How Employees Young & Old Can Find Common Ground. Personnel Psychology, 61(1), 202-205.
Hammill, G. (2005) Mixing and Managing four generations of employees. FDU Magazine. [online, Winter/Spring 2005].
Weiss, M.J. (2003). To be about to be. American Demographics, 25(7), 29-36.
Wellner, A. S. (2003). The next 25 years. American Demographics, 25(3), 24-29.
Zidle, M. (2013) Working with different generations. [web log]. Retrieved from: https://managementhelp.org/blogs/supervision/2013/08/22/working-with-different-generations/
A lot has been said on the topic of feedback. Google will give you about 2,070,000,000 results in 0.82 seconds. There are different kinds and different definitions… my interest is in the realm of workplace performance, where feedback is defined as “the transmission of evaluative or corrective information about an action, event, or process to the original or controlling source; also: the information so transmitted.” (Thanks, Merriam-Webster)
I have been seeing more and more of the traffic signs that provide feedback on your actual speed popping up as of late and it has definitely affected MY behaviour every time I come into my sleepy little town. When approaching these 40 km/h zones, I now set my cruise control at 42 and breeze by the local constabulary with an ear to ear smile!
On a recent visit to British Columbia, I saw more of these signs, with a little twist. Instead of just the hard data (your speed), a positive or negative reinforcing stimulus was also included in the form of a happy or sad face, similar to the image to the left. Psychologists call this “operant conditioning” and these signs are an effective application of feedback and reinforcement. This video of my mom driving into Vancouver is a great example of the application and a successful outcome!
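The sign’s behaviour is a textbook feedback rule: compare the measured speed to the limit and pair the number with a reinforcing stimulus. A minimal sketch of that logic in Python (the threshold rule and wording are my assumptions, not the actual sign firmware):

```python
def sign_display(speed_kmh: int, limit_kmh: int) -> str:
    """Return what a driver-feedback sign might show: the measured
    speed plus a happy or sad face as the reinforcing stimulus."""
    face = ":)" if speed_kmh <= limit_kmh else ":("
    return f"YOUR SPEED {speed_kmh} km/h {face}"

print(sign_display(38, 40))  # YOUR SPEED 38 km/h :)
print(sign_display(47, 40))  # YOUR SPEED 47 km/h :(
```

The whole intervention fits in one comparison: the sign measures the output (your speed), shows it back to you, and attaches a consequence.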
Reflecting on my own experience with these signs, I realized that the sign positioned by our elementary school on the north side of town has had the desired effect, while the ones on the east and west sides of town haven’t. Initially, when the eastern and western signs went up, I complied. Now I am less likely to slow down as much as I do for the sign at the school. That got me wondering why. No kids? Less hazard?
There is (thankfully) more scientific data than a video of my mom and my own reflections. If you are so inclined, I have provided some references below (Ebrahim, Z. & Nikraz, H., 2013 and Shinar, D., 2017) that show an overall positive effect in the reduction of speeding and accidents in areas that used digital signs instead of standard signs. There are many other studies (Chhokar, 1983; Goldhacker et al., 2014; Pritchard et al., 2012; Park et al., 2011) that report the positive effects of feedback on performance.
A Transportation Alberta guideline for the placement of these signs notes that “permanent installations may lead to a proliferation of Driver Feedback Signs which could lessen the visual impact of the signs when they are needed most” and recommends that the signs be used in one location for no more than 30 days. This explains my declining compliance with some of the signs. All the signs in my town appear to be permanent, and the two not situated by a school zone are losing their impact on me. I am sure this article will make it top of mind again for a while! I suspect that the combination of the sign at the school and the fact that the police are often parked on a side street nearby contributes to me slowing down there more often.
Long story short, these signs got me thinking about feedback and how important it is in the workplace and any system. In the most basic sense, when a system output is measured, the result (positive or negative) can be fed back into the system to make adjustments as required to improve performance.
The measurement and change aspects often become the weak spots in the system, as determining what to measure isn’t always easy and, as M.W. Shelley said in Frankenstein, “Nothing is so painful to the human mind as a great and sudden change.”
When we think about performance in the workplace, it is helpful to look at it from different perspectives. I use the four listed below (Kaufman, 2006; Addison, Haig & Kearney, 2009):
World (or societal impact from organizational outputs);
Workplace (the organization as a whole);
Work (processes and practice level); and
Worker (teams and individuals).
How we measure outputs and apply changes to the system, based on the feedback received, will obviously be different depending on the perspective, the type of output and the change(s) required.
The frequency of feedback provided will also change depending on the factors involved. In an automated system such as a thermostat connected to a furnace, the frequency is almost continuous while feedback on employee performance should definitely not be continuous.
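The thermostat case can be sketched as a simple closed loop: measure the output, compare it to the goal, and feed part of the error back as an adjustment. The temperatures and the gain below are invented for illustration:

```python
# A minimal closed-loop feedback sketch: measure the output, compare it
# to the target, and feed a fraction of the error back as an adjustment.

def feedback_step(current: float, target: float, gain: float = 0.5) -> float:
    """One feedback cycle: adjust the output toward the target."""
    error = target - current       # measurement compared to the goal
    return current + gain * error  # feed part of the error back in

temp, target = 16.0, 21.0          # room temperature vs thermostat setting
for _ in range(8):                 # near-continuous adjustment cycles
    temp = feedback_step(temp, target)
print(round(temp, 2))              # 20.98 -- converging on the setting
```

Each cycle closes the gap between output and goal; the same measure-compare-adjust pattern underlies workplace feedback, just at a far lower frequency.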
When I started out in the military, we had an annual performance feedback session which I felt wasn’t enough. Over time that changed to quarterly, which I also felt wasn’t always enough. When I was the Chief Instructor of Acoustics, our students got feedback at least weekly and more often if they had exams or had committed some horrible sin like letting their hair get too long or putting two creases in the sleeve of their shirt! For a poor performer it could be relentless. My instructors had a feedback session on a monthly basis which seemed about right.
Chhokar (1983) notes that “more [feedback] may not always be better” and finds that “the existence of some optimum frequency of feedback (not necessarily, the most frequent)” would result in a desirable level of performance. So how do you find that sweet spot? Is it the same for every performer? Is all feedback effective feedback?
Brethower (2006) cautions that “data dumps are not feedback” and “Intelligent value-adding performance is possible only with adequate feedback; defective feedback yields defective performance, always” (p. 126). There are lots of ideas on how often feedback is required. I can’t find a study that says “X is the optimum frequency,” likely because there isn’t one.
Of course, organizations must establish a minimum to ensure that feedback is being provided. Much like trying to address every individual’s learning style when designing training, creating an individualized feedback system for every employee would be impossible.
The key is (1) appropriate feedback can increase performance, (2) too much won’t have that same positive effect and (3) when you are the person providing the feedback, asking your employee how much is enough could help you find that sweet spot!
Addison, R., Haig, C., & Kearney, L. (2009). Performance architecture: The art and science of improving organizations. San Francisco, CA: Pfeiffer.
Brethower, D. M. (2006). Systemic issues. In Pershing, J.A. (Ed.), Handbook of human performance technology: Principles, practices, potential (pp. 111-137). San Francisco: Pfeiffer.
Goldhacker, M., Rosengarth, K., Plank, T., & Greenlee, M. W. (2014). The effect of feedback on performance and brain activation during perceptual learning. Vision Research, 99, 10, 99-110.
Kaufman, R. (2006). Change, Choices, and Consequences: A Guide to Mega Thinking and Planning. Amherst, MA. HRD Press Inc.
Park, J. H., Son, J. Y., Kim, S., & May, W. (2011). Effect of feedback from standardized patients on medical students’ performance and perceptions of the neurological examination. Medical Teacher, 33, 12, 1005-1010.
Pritchard, R. D. D., Weaver, S. J. J., & Ashwood, E. (2012). Evidence-Based Productivity Improvement: A Practical Guide to the Productivity Measurement and Enhancement System (ProMES). Hoboken: Taylor & Francis.
I was recently invited to the NATO School in Oberammergau Germany to deliver instruction on evaluating E-Learning. As an added bonus I was asked to present the closing keynote speech to the class on the subject “The Education Revolution.”
While I am comfortable on the topic of evaluation, I am (was) not as familiar with the keynote topic so research began! The regularly scheduled keynote presenter is a big supporter of learning technology and that in part shapes his view. I too am a supporter of Learning Technology and have been branded by some as a “Techie” but you just have to see the WiFi go down in my house to know that’s not necessarily true. Anyway… as I sat down to begin my research, I wondered “Is there an education revolution underway?”
I have said “words are important” in previous posts. It’s a lesson I keep learning and applying, so no different here. I thought I had better look at the definitions of education and revolution and make sure I understood what I was looking at – and for. Thank you Merriam Webster!
1a: the action or process of educating or of being educated; also: a stage of such a process
1b: the knowledge and development resulting from an educational process
2: the field of study that deals mainly with methods of teaching and learning in schools
Pretty straight forward right? Albert Einstein’s explanation here strikes a chord with me. So what about revolution? Whenever I hear that word, images of civil wars, the Arab Spring and so on, pop into my mind. Merriam Webster‘s definitions of revolution are…
a: a sudden, radical, or complete change
b: a fundamental change in political organization; especially: the overthrow or renunciation of one government or ruler and the substitution of another by the governed
c: activity or movement designed to effect fundamental changes in the socioeconomic situation
d: a fundamental change in the way of thinking about or visualizing something: a change of paradigm
e: a changeover in use or preference especially in technology
Definitions A and B didn’t seem relevant to me. I am not aware of a sudden, radical, or complete change in the education system, or of a fundamental change in political organization (in the West) that has impacted education in a revolutionary way, so I struck those off the list and homed in on definitions C to E.
The Australian Government implemented an Education Revolution in 08-09 which had a digital component, and an infrastructure component so when you google the term – you get a lot of hits related to Australia. It was assessed just three years later that this “revolution” was not successful (Author, 2011). As a counter-point to the government’s description of the program, historian Geoffrey Blainey argued that there has only been one education revolution in Australia and it occurred in 1870 (Vanstone, 2009). Now I am no historian because remembering who did what to who – and when – has always been a challenge for me so I set off to find out what happened in 1870.
As it turns out, Blainey’s argument related to definition C: “activity or movement designed to effect fundamental changes in the socioeconomic situation.” In the early to mid 1800’s, Western countries began to mandate education for children, and in the later 1800’s state-funded school boards, mandatory attendance and secondary schools were implemented. This meant that children subjected to child labour were now moving from mines, factories and fields into school rooms. That’s pretty fundamental socioeconomic change, so I agree with Blainey’s assertion that there was a revolution in the 1870’s (each country’s timeline is a little different). This gave me a “baseline” from which to view the education system.
With the shift in policy, a private system for the privileged was being supplanted by a public system for all (in the West). The education system had to be engineered for efficiency to handle the influx, and there are always pros and cons to any system design. A core curriculum that everyone must follow, with standardized testing to ensure that students are learning what the system requires (and that teachers are teaching what they are supposed to), provides efficiency, but there are going to be compromises. The two cartoons show the trade-offs of an efficient education system pretty well, but I digress… I could not find anything to indicate that between the first revolution of the 1870’s and now there has been any new activity or movement designed to effect fundamental changes in the socioeconomic situation via the education system. (If you know of something, please feel free to share your thoughts so I can become more educated.) Before I struck definition C off the list, I looked into the current child labour situation. In 2014 there were approximately 168 million child labourers in the world! Based on this, it would seem that moving children from mines, factories and fields into school rooms is not yet complete, and therefore the socioeconomic change started in the 1870’s (even in parts of the West) was not entirely sudden, radical or complete (definition A).
At this point, I struck off definition C and pondered: between the 1870’s and now, has there been a fundamental change in the way of thinking about or visualizing education, a change of paradigm (definition D)? Prior to 1543, man believed that the Earth was the center of the universe (and Toronto was the center of Canada – still a belief). Nicolaus Copernicus proposed a new model with the sun at the center (Wikipedia). A true paradigm shift! The question of what sits at the center of the universe is somewhat analogous to one in training and education… is the student or the teacher at the center? In my experience there is strong agreement that it should be the student, but as we can see below, in many cases the environment today (top) is much the same as it was in 1901 (bottom) and the teacher is still up front being the sage on the stage. So what about content?
Prior to 1870, the core curriculum was the “Three R’s” of reading, writing and arithmetic. I burned more than a few brain cells as a kid wondering why they were referred to as R’s… In the 1870’s the curriculum was expanded to include the sciences, history and geography. The current system still uses the same core curriculum. I’m not seeing a fundamental change in what is being taught. What about the media?
Media, or the replicable “means,” forms, or vehicles by which instruction is formatted, stored, and delivered to the learner, has changed and continues to change dramatically, especially in the past 20-30 years. With the exception of the Pressey Learning Machine in the top right of the media examples pictured here, I have been subjected to them all. The next aspect I considered was learning methods, or the “conditions which can be implemented to foster the acquisition of competence” (Glaser as cited in Clark, 2011). Examples of learning methods include Action Learning and Coaching. There has been, in my time in the field, a steady flow of “new” methods; however, in my humble opinion, many are a re-packaging or slight tweaking of existing methods aimed at generating revenue for entrepreneurial folks in the field. I am not aware of any revolutionary methods that have turned the education world on its ear. Again, if you are reading this and know of something, I’d love to learn about it!
When you put media and methods together with a learning strategy that identifies activities to motivate and engage learners, plus formative and summative assessments, to provide a program that meets organizational needs, you have a great instructional design. Nothing revolutionary here either; this has been the practice for at least 60-70 years. At this point, I concluded that a fundamental change in the way of thinking about or visualizing the education system overall – a change of paradigm – has not happened since the 1870’s. Yes, media, specifically technology-based media (we still have teachers, books and whiteboards), continues to advance and influences methods and strategies, but is it revolutionary?
Merriam Webster’s last definition of revolution, “a changeover in use or preference especially in technology,” has two key words: USE and PREFERENCE. This made me think of the Digital Immigrant and Digital Native debate that was popular at the turn of the century, when proponents like Tapscott (1999) and Prensky (2001) believed there was a generation of technologically adept learners that required a radical transformation of the education system. At the time it felt a bit panicky to me… like there was a crisis, and if we didn’t “revolutionize” the educational system for the digital natives they were all going to fail miserably. Others, like researchers Bullen, Morgan and Qayyum (2011), took the view that technology should simply be used to enhance current practices.
No doubt that younger people love their Information and Communication Technology (ICT). But how and why do they use it? What are their preferences with regards to ICT in learning? Before I go further, an important note here about generations and generalizations. “The popular press, scholarly publications, business leaders and social pundits have all used the inherently weak practice of grouping individuals into broad generational categories to support speculation that millennial students enrolled in today’s higher education institutions, as well as different generations of employees in the workplace, require different approaches to education and training” (Pedro as cited in Christensen and Tremblay, 2013). Research has shown that “leisure time use of ICT doesn’t necessarily translate into effective use of technology in education and training” and not all members of a generational cohort have the same access to ICT (Christensen and Tremblay, 2013). Trying to generalize USE by generation is tricky business or maybe bad design (or science!?).
What about PREFERENCE? Do students (of any age) prefer one form of media more than another for learning? Do we need e-Learning for children and chalkboards for Boomers? Of course not! After chasing e-learning as the holy grail from the mid-90’s onward, the field began to realize in the early 2000’s (Pappas, 2015) that a blended learning approach is often better. Research has shown that “students use a limited range of mainly established technologies such as search engines, e-mail, mobile telephony and SMS messaging frequently, while ‘Web 2.0’ technologies such as blogs, wikis and social bookmarking tools were only used by a relatively small proportion of students” (Christensen and Tremblay, 2013). Other studies (Corrin, Lockyer & Bennett, 2010; Lohnes & Kitzner, 2007) have shown that younger students use ICT more for social purposes and older students use it more for study. I also wonder if the use of newer media has been limited by instructional designers who stick to what they know and are not using the newer technologies yet. Another potential topic!
If there has been a changeover in use or preference especially in technology (for learning), it seems it may be with older learners like me. AND it still doesn’t seem revolutionary. I did an online course for math in 1996 using a Commodore 64 computer. Khan Academy is certainly slicker, but I was learning the same gizintas (2 gizinta 4 twice) on a 64K machine with a dial-up modem. I took psychology courses in 1997 and 1998 using books and a telephone. The LMS and Skype are also slicker, but the same design methods still apply.
There has been steady progress and incremental change, but there is a long way to go. That said, there are also interesting things happening that have me wondering if we are on the cusp of a true revolution. Finland‘s education system is switching from the standard core curriculum to an interdisciplinary approach. Definitely revolutionary – if it works and is applied broadly.
Another interesting initiative is Victor Saad’s Leap Year Project where he took a year off from work to create his own MBA education, described in this (20 min) Tedx Talk. From this, he created the Experience Institute to “establish experience as a credible form of education and equip students with the tools necessary to transform our world. Through apprenticeships, self-guided projects, meetups, and coaching, we create a space within higher education that helps individuals build creative confidence, agency, and a compelling portfolio.” In addition to the cohorts following the Ei program, Saad has also partnered with Stanford University to integrate his approach into educational institutions. The Leap Course is explained in this short (2 min) video.
If you made it this far – thanks for sticking it out to the end. I know this was a long post. My final conclusion is that, at this moment in time, while there are a lot of good things happening in our field, there doesn’t appear (to me) to be a revolution underway. I agree with Bullen, Morgan and Qayyum (2011) that we should continue to apply new technology (and methods) to enhance current practices – through the deliberate application of instructional design, of course!
Bullen, M., Morgan, T., & Qayyum, A. (2011). Digital learners in higher education: Looking beyond stereotypes. Proceedings of the ED-MEDIA Conference, Lisbon, 1 Jul 2011.
Christensen, B. D. & Tremblay, R. (2013). Generational learning differences – myth or reality? In Best, C., Galanis, G., Kerry, J., & Sottilare, R. (Eds.), Fundamental Issues in Defence Training and Simulation. Farnham, UK: Ashgate.
I was walking into my current client’s building from the parking lot the other day, and this path got me thinking about why everyone takes the shortcut across the grass. That got me thinking about the movie A Few Good Men and the scene where Corporal Jeffrey Owen Barnes is questioned by Captain Ross… if you haven’t seen the movie or don’t recall the scene, you can watch it on YouTube.
Captain Ross (Kevin Bacon) was trying to make the point that if something isn’t written down as policy, it doesn’t exist. Lt. Daniel Kaffee (Tom Cruise) crushes the prosecution’s strategy when he asks the Corporal to show him where in the book it tells you how to get to the mess hall. I always loved that scene.
That connection made me think that knowledge management is kind of like that path, and knowing where the mess hall is in “Gitmo.” This worn path exists because parking behind the client’s building is limited, but there is free parking a half block away (a rare thing in Ottawa). On my first day there, I didn’t know about this, and my client had to explain how to get to the free parking spot and then to the building via the worn path. It’s not written down anywhere but everyone knows – after a day or so.
As with any term these days, there are a hockey sock full of definitions. I like the one in Wikipedia because it has withstood the review of many experts.
Knowledge management (KM) is the process of creating, sharing, using and managing the knowledge and information of an organization. It refers to a multi-disciplinary approach to achieving organizational objectives by making the best use of knowledge.
Knowledge Management is relatively new, having only arrived on the scene in the early 1990’s. I had the opportunity to become a certified knowledge manager back in 2010. One thing that always stuck with me from that course was when someone asked what knowledge we should focus on creating, sharing, using and managing… the response was “if you got hit by a bus today, what would the person coming in behind you need to know to do your job?”
Clearly, where the free parking is doesn’t fall into that category. But it gives you an idea of where your KM requirements should start!
It’s Saturday night movie night and I was browsing Netflix for something stimulating (mentally). I stumbled across “Experimenter” a biography of Stanley Milgram. If you have done any psychology courses you probably already know of him.
If you haven’t heard of him or have ever wondered why German soldiers in WWII willingly participated in the concentration camps, why US Guards at Abu Ghraib abused prisoners or why people in your workplace will turn a blind eye to practices they know are wrong – this is a great introduction – in only 98 minutes!
Words are important. I hate to admit it – but they are. There are some folks who love to sit and debate from dawn until dusk about the best verb to use in a performance objective statement. I am 180 degrees opposite and want to get the verb that the majority agree on and move on! In my experience, the subject matter experts are pretty good at choosing a verb that works.
There are two terms that continue to get used interchangeably, Needs Assessment and Needs Analysis. Even worse is the fact that both can get shortened to “NA” by practitioners which can lead to even more confusion. Want to get crazy? Add in Training Needs Analysis (TNA) which is also often described as the “NA.”
If you want to find “the” explanation of NA and NA, be prepared for an arduous search through the Internet and many, many texts where authors have put their own spin, tweaks, massaging and a coat of Rust-Oleum paint on the definitions. I say “the” because there is no single definitive explanation.
While I was analyzing a previous client’s training system, the NA and NA terms were being used in very interesting but not necessarily accurate ways. To help clarify how the terms are related (but different), I headed to NeedsAssessment.org to do my own research. Watkins, West Meiers and Visser’s (2012) FREE book A Guide to Assessing Needs: Essential Tools for Collecting Information, Making Decisions, and Achieving Development Results is an exceptional resource which helped me develop the first version of this diagram:
With some feedback from my colleagues John Egan and Julie Maillé (merci mes amis), the diagram was tweaked, spun and massaged into the picture above, which tells this story:
A performance problem or new opportunity starts with a Needs Assessment. When you do a Needs Assessment you will (should?) use both needs analysis and performance analysis. The Needs Assessment’s results then drive improvement through the implementation of non-training and/or training interventions.
If a training intervention is required, then you will have to do a Training Needs Analysis. The TNA uses task analysis to determine what has to be trained and what doesn’t.
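The flow just described can be sketched as a toy model in code. Everything here – the function names, the hard-coded symptoms and rules – is my own hypothetical illustration, not anything from the book or from a real NA framework:

```python
# Toy model of the flow described above. All names and rules are
# hypothetical, purely for illustration.

def needs_assessment(symptoms):
    """Recommend interventions for a performance problem or new opportunity.

    A (very simplified) stand-in for needs analysis and performance
    analysis: only a knowledge/skill gap points to training; everything
    else points to non-training interventions.
    """
    interventions = []
    if "skill gap" in symptoms:
        interventions.append("training")
    if "no feedback" in symptoms:
        interventions.append("non-training: feedback system")
    if "wrong tools" in symptoms:
        interventions.append("non-training: better tools")
    return interventions


def training_needs_analysis(tasks, already_mastered):
    """Task analysis: split tasks into what must be trained and what need not."""
    to_train = [t for t in tasks if t not in already_mastered]
    not_trained = [t for t in tasks if t in already_mastered]
    return to_train, not_trained


# A TNA only happens if the Needs Assessment recommends training.
recommended = needs_assessment(["skill gap", "no feedback"])
if "training" in recommended:
    to_train, skipped = training_needs_analysis(
        ["operate equipment", "complete reports", "greet customers"],
        already_mastered={"greet customers"},
    )
```

The point of the sketch is the ordering: the assessment comes first and can recommend non-training fixes on its own; the analysis (TNA) is a downstream step that only fires when a training intervention is actually required.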
One of the big ah-hah’s in this client’s situation was that the Needs Assessment function resided within the training system and was being done by training specialists. How many non-training interventions do you think get recommended?
This is a very macro view of Needs Assessment aimed at making us all a little wiser about when we should use NA or NA… or maybe never use the acronym at all? If you want to learn more, go get that book! Did I mention it’s free!?
We are adding a garage onto the front of our house. Being a telecommuter, and with warmer weather starting to appear off and on, I have been keeping the front door open and as such, I catch the occasional discussion going on between my good friend Phil aka Da Boss and Lanny, his trusty sidekick.
Last week, they were starting to close in the walls and Lanny was happily tapping away with the nail gun making that shhht-thunk noise when Phil hollered “Lanny! There’s no nails in that gun!” I wondered how Phil knew that when they were working at opposite sides of the “site.” Lanny looked puzzled because he had just loaded the gun and had no idea why the nails weren’t coming out. A quick bit of troubleshooting by Phil determined that an adjustment of the thingymajiggy had to be made because he was using longer nails. Phil showed Lanny the fine art of thingymajiggy adjustments – rapidly tapped three nails into the top plate and passed the gun back (which is when I snapped the shutter.)
That brief exchange set me off thinking about experiential learning vs. on-the-job training (OJT), apprenticeships and the like, and my own preference for learning by doing. If I spent a tenth of my green fees on golf lessons, and half the time I spend on the course at the driving range instead of just whacking that darn ball, I could probably break 90.
Tell me and I forget, teach me and I remember, involve me and I will learn.
~ Benjamin Franklin
Experiential Learning has been around for a long time. Kolb (1984) proposed a four-stage learning cycle, shown below. Simply put, it’s learning that is designed so that students are directly involved in the learning experience. Korth & Levya-Gardner (2006) note that while the model was designed for educators (read: in the classroom), it has also “been applied to a variety of professional, organizational and managerial situations” (p. 1124). These different applications are represented by the different groups in the center of the model below.
When Kolb’s theory is applied, the learner starts by (1) having a concrete experience, followed by (2) observation of and reflection on that experience, which leads to (3) the formation of abstract concepts (analysis) and generalizations (conclusions), which are then (4) used to test hypotheses in future situations, resulting in new experiences.
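Because stage four feeds back into stage one, the cycle can be modelled as a simple wrap-around list. This little sketch is my own illustration (the helper function is mine; only the stage labels come from Kolb’s model):

```python
# Kolb's four stages, in cycle order. Completing stage four produces a
# new concrete experience, so the sequence wraps around.
KOLB_STAGES = [
    "concrete experience",
    "reflective observation",
    "abstract conceptualization",  # forming concepts and generalizations
    "active experimentation",      # testing hypotheses in new situations
]


def next_stage(current):
    """Return the stage that follows `current` in Kolb's learning cycle."""
    i = KOLB_STAGES.index(current)
    return KOLB_STAGES[(i + 1) % len(KOLB_STAGES)]
```

For example, `next_stage("active experimentation")` wraps back to `"concrete experience"` – the “resulting in new experiences” part of the cycle.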
So this has been a bit of an epiphany if you like big(ger) words – or an ah-hah moment if you prefer the shorter ones. I have been misusing the term experiential learning for quite a while. My scenario with Phil and Lanny is not an example of experiential learning. Lanny did not have the opportunity to reflect on his experience with the nail gun thingymajiggy, and he definitely did not conduct any analysis or arrive at any conclusions to test hypotheses in the future. I have been incorrectly attributing learning from any life experience to experiential learning. My bad!
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
Korth, S. J. & Levya-Gardner (2006). Rapid reflection throughout the performance-improvement process. In Pershing, J. A. (Ed.), Handbook of Human Performance Technology (pp. 1122-1146). San Francisco: Wiley.