
Rosie Earl's Design Narrative: Making DPA Interesting

Cloud created by:

Rosie Earl
25 March 2013

I created a new e-learning course after identifying that the existing course was not engaging, and therefore not fit for purpose.

In October last year the company I work for suffered some serious DPA (Data Protection Act) breaches, and I felt that not only was a DPA refresher needed, but also that the existing DPA training needed to be made more interesting, and therefore more memorable. As we needed to reach over 700 people across two sites, I suggested that e-learning would be the most sensible approach.
The key people involved were myself and another trainer as designers, my manager overseeing the work, and the compliance manager overseeing the project for sign-off. The other trainer and I were very interested in creating an interactive e-learning experience. Our manager was sceptical of the value of an e-learning project, and the compliance manager had no previous knowledge of e-learning.

We wanted to create an interactive and memorable e-learning DPA refresher course aimed at existing staff. The e-learning activity lasted about 20 minutes, followed by an 'open book' knowledge check of 20 questions. The knowledge check acted as a quick measure of success, with a pass mark of 85% (17 of the 20 questions). In the long term, the measure of success would be a fall in DPA breaches.

1. Training Needs Analysis - this took place in the form of discussions with the Quality Team (who monitor such things as DPA breaches) and Operations Managers within the business.
2. Research - discussion with the Compliance Officer to find out what needed to be included, and to fact-check any existing material. This took a long time, as the Compliance Officer is based at a different site.
3. Discussion with our manager over deadline and budget. There was no budget, so my colleague and I decided to build the e-learning activity in Microsoft PowerPoint.
4. Initial design phase - we put together some content-only slides to make sure the information we were using was correct, keeping it as high level as possible. When it was submitted for sign-off, our manager added a lot of extra information. This was difficult for us, as we felt much of the new information would confuse the target audience; we had to negotiate with the manager and explain why certain pieces of information were unnecessary, and that information was subsequently taken out. The material was then sent to four different managers for sign-off. They each added and amended content, and my colleague and I had to keep a close eye to ensure the material did not become too complex.
5. Once the content had been signed off, we added engaging elements, such as pictures, video clips and quizzes. One challenge here was that the Compliance Officer thought the material was a PowerPoint presentation rather than an e-learning module, and felt it would be confusing to watch on a projector screen. I phoned him, explained exactly how the module worked, and made amendments based on his notes.
6. First wave - we started by asking all of the managers and coaches to complete the knowledge check. We received feedback that they found it difficult to fill in the answer sheet online while referring to the presentation, so for the second wave we changed to paper answer sheets. We also found that a number of managers failed or lost marks because they had answered from their existing knowledge rather than referring to the e-learning. Unfortunately, that knowledge was inaccurate.
7. Evaluation - the first-wave answers were evaluated and a report was produced stating our findings and any appropriate action to be taken. This allowed for personal development for the management staff.
8. Second wave - once all of the managers and coaches had completed the module, the e-learning was distributed to the advisors within the contact centre. This was done one team at a time, so that we could mark and log the results accurately.
9. Evaluation - the scores were entered for each question, so that as well as logging the overall score, we could break down the number of people who answered each question correctly or incorrectly. This allows us to give more specific feedback. This action is ongoing.
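The per-question evaluation in step 9 can be sketched in code. This is a minimal illustration, not the tooling actually used on the project: the 85% pass mark and 20-question length come from the course design above, while the answer key, response format, and function names are hypothetical.

```python
# Sketch of the step-9 evaluation: score each delegate against the pass
# mark, and tally how many delegates answered each question correctly.
# The answer key and responses here are hypothetical placeholders.
from collections import Counter

PASS_MARK = 0.85      # 85% pass mark from the course design
NUM_QUESTIONS = 20    # 20-question knowledge check

# Hypothetical answer key: question number -> correct option.
ANSWER_KEY = {q: "A" for q in range(1, NUM_QUESTIONS + 1)}

def score_delegate(responses, key=ANSWER_KEY):
    """Return (score, passed, per-question correctness) for one delegate."""
    correct = {q: responses.get(q) == ans for q, ans in key.items()}
    score = sum(correct.values()) / len(key)
    return score, score >= PASS_MARK, correct

def question_breakdown(all_responses, key=ANSWER_KEY):
    """Count how many delegates answered each question correctly."""
    tally = Counter()
    for responses in all_responses:
        _, _, correct = score_delegate(responses, key)
        tally.update(q for q, ok in correct.items() if ok)
    return {q: tally[q] for q in key}
```

A breakdown like `question_breakdown(...)` makes it easy to spot questions that most delegates got wrong, which is the kind of specific feedback the step describes.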

The results in respect of the objectives are quite difficult to evaluate, as the project is ongoing; however, we did succeed in creating a more engaging training module. This was evidenced by feedback from the first-wave delegates, who said they enjoyed the pace and presentation of the module.
So far the knowledge check results have been positive, with a 90% pass rate, so I feel the knowledge-transfer element of the course has been successful.

The experience was difficult because so many people involved in the sign-off process did not understand the e-learning methodology. If I were to undertake the project again, I would ensure that all of the managers were aware of exactly how the e-learning would work, and of the audience it was aimed at.
It would also have been easier if we had been given specialist e-learning software, such as Moodle, to carry out our design work. This would have made it less confusing for the delegates and the managers.
