Tuesday 12 May 2015

Learning and Development - Strategy, Evaluation and Analytics


It's definitely learning & development season in HR at the moment.  Today is the online CLO Symposium, tomorrow the CIPD L&D Show and then next week ATD ICE in Orlando.  Oh, and today I ran one of my development sessions on L&D for Symposium.

Linked to these conferences is plenty of new research (and some old findings) looking at the effectiveness of L&D.


One of the areas which everyone seems to agree needs to be improved is evaluation of learning and development.

The CIPD's 2015 annual survey finds that just 7% of L&D professionals evaluate the impact of their initiatives on the wider business or society, and 37% only measure the satisfaction of those who take part in L&D initiatives, rather than their wider impact.  (Interesting too to see a pie chart in a commentary on effective use of data!)




CLO notes that:
"Chief learning officers are often asked to demonstrate the value of training. But most aren’t satisfied with the tools, resources or data available to them. Therefore, they can’t properly establish training impact. 
Organizations increasingly leverage analytics as a decision-making tool, but only 40 percent report their measurement programs are “fully aligned” with their learning strategy. This reflects an ongoing trend: The state of measurement in learning and development is falling behind other areas of the business. CLOs are more dissatisfied with their organizational approach to measurement this year than last, continuing a trend of the past three years."

However, they also look at the HR and business measures used to inform evaluation, which seems to suggest more use of higher-level evaluation (e.g. Kirkpatrick levels 3 and 4, plus Phillips' level 5, vs levels 1 and 2) than is found by the CIPD:





I've also had an email from HR analyst Lauri Bassi suggesting that:
"In most organizations a different approach is needed.  More of the same (which typically means using Kirkpatrick levels 1-5, with an emphasis on the lower levels) won't get CLOs where they want to be - understanding what's working (and what isn't) in learning and development initiatives and targeting resources at the most fruitful areas for improving business results.  Instead, a more modern, "analytics-enhanced" approach is necessary."

Lauri provides several recommendations for dealing with the CLO findings, all of which I agree with:

  • Create an "authentic" learning impact evaluation by embedding it in a more holistic framework
  • Stop waiting for the perfect data warehouse.  Instead, create a "data hut."
  • Don't let the perfect become the enemy of the good.
  • Choose your initial analytics project carefully.
  • Start under the radar.
  • Remember: insightful reporting trumps data dumps.
  • Use learning evaluation to improve the effectiveness of learning.


However, I do think Lauri's suggestions go wrong when it comes to the opportunities she proposes for the initial analytics project:
"The best place to start is with a burning business issue.  Examples might include one or more of the following:

  • Customer satisfaction problems
  • Lackluster sales
  • Safety
  • High levels of regretted turnover
  • Failure to achieve diversity and inclusion goals
  • Stagnant or declining employee engagement

Design your initial analytics project to provide actionable insight on issues that are front-and-center for senior executives, and you will find yourself in a much better place."


The problem with this approach is that it hardwires in measurement areas which may or may not be relevant, which has always been the problem with the Kirkpatrick model too.

Customer satisfaction and sales may be relevant to an L&D strategy, but of course they may not.  Similarly, reaction, learning and application may be relevant to evaluation, but they don't have to be.

The real opportunity for both evaluation and analytics is to focus on strategy first and then focus measurement, evaluation and analytics on the elements of that strategy.

Ie:

1.  Develop an L&D strategy, defining the inputs, activities, outcomes and business objectives / impacts which are the elements of the organisational / HCM strategy map (and which have also been used within the CIPD's Valuing Your Talent framework).
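Just as a rough sketch of what this first step produces (the four perspectives follow the strategy map above, but the example elements are entirely hypothetical), the map can be captured in something as simple as:

    # A minimal, hypothetical L&D strategy map: four perspectives, each holding
    # the strategic elements chosen for this particular organisation.
    strategy_map = {
        "inputs": ["L&D budget", "Learning team capability"],
        "activities": ["Leadership development programme", "On-the-job coaching"],
        "outcomes": ["Improved people management capability"],
        "business_impacts": ["Higher team productivity"],
    }

    for perspective, elements in strategy_map.items():
        print(f"{perspective}: {', '.join(elements)}")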




2.   Use these four perspectives to identify measures to support the strategy - these are then the basis for evaluation (and they may relate to reaction, learning and performance, but they may not).

This then provides an L&D scorecard (and see my Slideshare presentation on why other forms of scorecard don't work, but this one does!).
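Continuing the sketch above (again, the measure names and targets are purely illustrative assumptions, not taken from the CIPD framework or the Slideshare presentation), the scorecard simply attaches measures to each element of the map, rather than starting from a predefined set of evaluation levels:

    # Hypothetical L&D scorecard: one or more measures attached to each
    # strategic element; the measures come from the strategy, not from a
    # predefined evaluation model.
    scorecard = {
        "Leadership development programme": {
            "perspective": "activities",
            "measures": [("Participation rate", "%", 90)],
        },
        "Improved people management capability": {
            "perspective": "outcomes",
            "measures": [("Upward feedback score", "1-5", 4.0)],
        },
        "Higher team productivity": {
            "perspective": "business_impacts",
            "measures": [("Revenue per employee", "£k", 150)],
        },
    }

    for element, detail in scorecard.items():
        for name, unit, target in detail["measures"]:
            print(f"{detail['perspective']:>16} | {element} | {name} (target {target} {unit})")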




3.   Use this evaluation as the basis for your initial analytics project (which may link to customer satisfaction and sales, but again it may not - it depends upon what you put in your strategy!).
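Purely as an illustration of what that initial project might look like (the data and measure names below are made up for the example), the analysis can start as simply as testing whether movement in an activity measure from the scorecard relates to movement in one of the outcome or impact measures the strategy actually cares about:

    import pandas as pd

    # Hypothetical monthly values for two measures taken from the scorecard sketch.
    data = pd.DataFrame({
        "coaching_hours_per_manager": [2, 3, 3, 4, 5, 5, 6, 7],
        "upward_feedback_score":      [3.1, 3.2, 3.2, 3.4, 3.5, 3.7, 3.8, 3.9],
    })

    # A simple correlation is only a starting point, but it keeps the analysis
    # anchored to measures that are already part of the strategy.
    print(data.corr().loc["coaching_hours_per_manager", "upward_feedback_score"])

Of course, a correlation isn't causation - but because both measures were chosen from the strategy map, even this crude check says something about the elements the strategy depends on, rather than about a generic list of business issues.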




As the reports point out, doing this would hopefully help L&D improve its effectiveness in driving and supporting the learning strategy too.




