One of the many benefits of digital learning is its ability to generate and store both qualitative and quantitative data to help evaluate a learning activity.
A survey conducted by the CIPD in 2015 found that half of respondents did not conduct any evaluation of the majority of their L&D activities, and just 7% evaluated the wider impact of those activities on the business (CIPD Learning and Development Annual Survey Report, 2015). With L&D departments under increasing pressure to justify budgets, it is vital that results are monitored and presented to senior stakeholders in language they understand.
Using the Kirkpatrick model as a base, I have illustrated below how digital learning can assist at each level of evaluation.
Many L&D professionals do employ "happy sheets" to measure satisfaction after traditional learning events, but how many are properly analysed, and how useful are they?
Polls, article ratings and short surveys are quick and easy for the learner to complete and can add a social element to the learning. They also produce quantitative data that the L&D professional can analyse and manipulate, with the option of free-text boxes for rich, qualitative data.
Platform analytics can also be used to measure satisfaction. For example, how long do people spend on an eLearning module? How much do they interact with discussion boards?
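As a minimal sketch of that kind of analysis, the snippet below computes average time on a module from session records. The record layout and module names are illustrative assumptions, not a real platform's export schema:

```python
from datetime import datetime

# Hypothetical session records exported from a learning platform:
# each entry is (learner_id, module, start, end). The fields and the
# "gdpr-101" module name are illustrative, not a real schema.
sessions = [
    ("u1", "gdpr-101", datetime(2023, 5, 1, 9, 0), datetime(2023, 5, 1, 9, 25)),
    ("u2", "gdpr-101", datetime(2023, 5, 1, 14, 0), datetime(2023, 5, 1, 14, 10)),
    ("u3", "gdpr-101", datetime(2023, 5, 2, 11, 0), datetime(2023, 5, 2, 11, 40)),
]

def average_minutes_on_module(sessions, module):
    """Mean time (in minutes) learners spent on a given module."""
    durations = [
        (end - start).total_seconds() / 60
        for _, mod, start, end in sessions
        if mod == module
    ]
    return sum(durations) / len(durations) if durations else 0.0

print(average_minutes_on_module(sessions, "gdpr-101"))  # 25.0
```

An unusually short average might suggest learners are clicking through rather than engaging, which is exactly the kind of signal a happy sheet cannot capture.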
Using an LMS to monitor assessment results is a great way to confirm that learning has been acquired. Even without an LMS, something as simple as Google Analytics can track overall pass rates anonymously to prove learning has taken place.
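The anonymous pass-rate idea can be sketched in a few lines: keep only the scores, drop the identities, and report the aggregate. The scores and pass mark here are made up for illustration:

```python
# Hypothetical assessment scores with learner identities stripped,
# so the rate can be reported anonymously. The 65% pass mark is an
# illustrative threshold, not a standard.
scores = [82, 45, 90, 67, 74, 58, 88]
PASS_MARK = 65

def pass_rate(scores, pass_mark):
    """Fraction of attempts at or above the pass mark."""
    if not scores:
        return 0.0
    return sum(score >= pass_mark for score in scores) / len(scores)

print(f"{pass_rate(scores, PASS_MARK):.0%}")  # 71%
```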
In order to measure learning transfer to the workplace, surveys can be pushed automatically a set period after the learning event. To get a complete picture of behavioural change, these surveys can be sent to both the learner and their line manager, and the results stored on the learner's record.
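The scheduling logic behind those automatic pushes is simple date arithmetic. This sketch assumes a completion log keyed by learner and a 90-day follow-up window; both are illustrative choices, not a prescribed standard:

```python
from datetime import date, timedelta

FOLLOW_UP_DAYS = 90  # illustrative delay between course completion and survey

# Hypothetical completion log: learner -> date the course was finished.
completions = {
    "learner_a": date(2023, 1, 10),
    "learner_b": date(2023, 3, 1),
}

def surveys_due(completions, today, delay_days=FOLLOW_UP_DAYS):
    """Learners (and, by extension, their line managers) whose
    follow-up survey is now due."""
    return sorted(
        learner
        for learner, finished in completions.items()
        if today >= finished + timedelta(days=delay_days)
    )

print(surveys_due(completions, today=date(2023, 4, 15)))  # ['learner_a']
```

A real platform would run this check on a schedule and send the survey links automatically, but the due-date calculation is the core of it.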
Benchmarking desired behaviours and outcomes before and after learning events, and analysing these against data from other departments, is the key to understanding the project's ROI and therefore proving success.
For example, a programme aimed at improving customer service could quiz learners before and after they complete the learning and record the improvement. These results can then be tracked against customer satisfaction scores and sales data to prove the impact on the wider business.
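The before-and-after calculation itself is straightforward. This sketch, using invented quiz scores paired per learner, shows the average gain you would then set alongside customer satisfaction and sales figures:

```python
from statistics import mean

# Hypothetical pre- and post-course quiz scores, paired per learner,
# for an imagined customer-service programme.
before = [52, 61, 48, 70, 55]
after = [68, 75, 63, 82, 71]

def mean_improvement(before, after):
    """Average per-learner gain between pre- and post-course quizzes."""
    return mean(post - pre for pre, post in zip(before, after))

print(mean_improvement(before, after))  # 14.6
```

An average gain on its own only shows learning happened; lining it up against the same period's customer satisfaction scores is what makes the business case.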
Evaluation is an area that all L&D professionals should be thinking about, and digital technologies really can help. By embracing modern learning platforms, the process of evaluation can be simplified and automated, and reliable results delivered in a ready-to-use format.