
Jonathan’s Design Narrative – Refreshing the Command, Leadership and Management (CLM) Programme

Cloud created by:

Jonathan Harding
9 April 2016

Narrator

I was the CLM Refresh project lead, co-ordinating and controlling activity across the full range of actors up and down the chain of command. This was a large additional task on top of an already busy job.

Situation

  • The project took place between October 2014 and July 2015.
  • The CLM Part 3 Programme (3 courses of between 5 and 8 days, each delivered at a different rank) had not been updated at all for 5 years, and had not been fully updated for 7 years. (The CLM Part 3 Programme is part of a bigger CLM Programme owned by Army HQ Trg Branch.)
  • Feedback from both instructors and students was that elements of the course were very outdated and/or not required or duplicated elsewhere.
  • Instructors were tending to develop their own courses, meaning that the mandated product was not being delivered and that a different course was being delivered in each education centre. There was therefore no consistency of output, and no effective means of assessing what effect the CLM Programme was having or whether it was a useful investment of resources.
  • Instructors felt that they had no ownership of the product they were mandated to deliver and were consequently discontented with the quality of the CLM Part 3 Programme – the Education Service’s flagship product – and with how its delivery and outputs reflected on the Education Service.
  • There was no central or dedicated resource available to refresh the CLM Part 3 Programme. Resources had to come from within Education Centres, where personnel were already running at stretched capacity trying to clear a backlog of students who required the courses for promotion.
  • Additional course product in the form of cultural awareness and thinking skills had to be inserted into all levels of CLM Part 3 but no additional time was given, meaning time allocated to other Training Objectives had to be reduced.
  • The refresh had to be conducted within a timespan of 8 months; we managed to extend this to 10 months, but that was still a short timeframe for a complex task being conducted in addition to people’s daily work.

Actors 

Geographically, the actors were located mainly across the UK, from Edinburgh and Belfast all the way down to London and Andover. Some actors were also based in Germany and Cyprus. Apart from the small groups they were in at each location, and two CPD events which brought some actors together, the project was conducted across a significant geographical spread, using Microsoft SharePoint and email as the means of co-ordination.

The key actors were (in hierarchical order):

 1.  Training Branch – owners of the CLM Programme and the overall training requirement.

 2.  Director Education – owner of CLM Programme Part 3 and the Training Objectives for that programme; lead for cultural awareness and thinking skills; also responsible for quality assurance of the CLM Part 3 Programme.

 3.  Comds Education North, South, Germany and Cyprus – owners of the education centres that delivered CLM Part 3, who thus controlled the resource available to conduct the refresh and to trial the new courses. I worked for Comd Education South.

 4.  Officers Commanding Education Centres – responsible, with their staff, for refreshing the part of the course allocated to them and for conducting pilot courses. (I was one of these, in addition to being the project lead.)

 5.  Instructors – the course deliverers, responsible for carrying out much of the detail of the refresh on my direction, and for delivering, and providing feedback on, the pilot courses.

 6.  Students – responsible for providing feedback on the pilot courses.

Task

The aim of the refresh was to:

 1.  Review the Training Objectives (we were not allowed to change these) and, within them, the Enabling Objectives and Key Learning Points (these we could adjust within reason) to update them appropriately, ensure progression both within and between courses (utilising Bloom’s taxonomy as a frame), and remove unnecessary duplication.

2.  Create Enabling Objectives and Key Learning Points for Cultural Awareness and Thinking Skills at all three course levels.

3.  Adjust time allocations for delivery of each Enabling Objective and its associated Key Learning Points and produce new model timetables.

4.  Write new Instructional Specifications for each Enabling Objective which set out the key information and content (these are the baseline from which instructors write their own individual lesson plans on a case by case basis).

5.  Write new Assessment Specifications, Assessments, Marking Guides and Assessment Record Sheets.

6.  Conduct pilot courses and make adjustments in accordance with instructor and student feedback prior to releasing the courses as ‘live’.

7.  Put in place a continuous update process to ensure that the CLM Part 3 Programme remained relevant and up to date.

8.  In addition, course instances were created on the Defence Learning Environment and all the materials uploaded.

Within the short timeframe available, the measure of success was positive feedback from across the range of pilot courses, giving confidence that the changes we had made delivered the requirement, enabled the students to engage with relevant and up-to-date learning, and resulted in positive learning outcomes for the students.

Actions

  1. Wrote a detailed project brief that was circulated to all actors (less students) to ensure they were fully aware of, and content with, the parameters and timescale of the project.
  2. Using a small group of instructors, we reviewed the Training Objectives and Enabling Objectives, adjusting the Enabling Objectives to ensure appropriate progression across the courses and to include the new requirements on cultural awareness and thinking skills. We also produced model timetables at this stage, to ensure an appropriate flow of learning and to give a timeframe for the delivery of each Enabling Objective. (This latter step was not ideal – it would have been better done after the Instructional Specifications had been rewritten – but was forced upon us by the decision not to allocate any additional time to the overall courses.)
  3. The revised Enabling Objectives and model timetables were issued to all Education Centre actors and tasks for reviewing/rewriting the Instructional Specifications (including Key Learning Points), Assessment Specifications and associated materials were allocated on an Education Centre basis with a nominated lead (usually the 2IC) in each Education Centre.  A deadline was set for the return of the new material in draft (eight weeks was given for the review/rewrite).
  4. My 2IC and I acted as quality control for the new materials, and there was considerable back-and-forth negotiation over the materials produced. This extended the timeframe for the rewrite beyond eight weeks.
  5. Once we had a set of materials we were content with, pilot courses (between 3 and 5 iterations per level) were run across the Education Centres. Instructor and student feedback was gathered and collated from these courses, resulting in adjustments to the materials where there was appropriate and sufficient evidence for a change.
  6. The materials were uploaded to the Defence Learning Environment (over a period of time, as we were relying on a Reserve Officer with the required skillset to do this).
  7. An update brief was delivered at a CPD event and issued in writing to all those not present. This dealt with a number of contentious issues that had been raised during the redesign.
  8. A framework set of documents was created for the launch of the refreshed CLM Part 3 Programme, ensuring that appropriate direction was available to all delivery actors.  This included a process for Education Centres to ‘own’ specific Enabling Objectives on behalf of the Education Services and to update those Enabling Objectives and associated material as necessary, submitting changes for approval to a central control mechanism. The ‘go live’ date for all Education Centres to deliver the CLM Part 3 refreshed courses was set and authorised by the Chain of Command.
  9. Lessons learnt were captured and sent in a short note to Director Education.

Results

 Expected Outcomes:

  1. A refreshed set of course materials fit for purpose.
  2. Courses that were relevant and engaging for students.
  3. Courses that instructors were happier to deliver and felt they had a stake in designing and developing.
  4. Completion of a key outstanding task for the Chain of Command, for which resources did not otherwise exist to complete.
  5. The creation of course instances on the Defence Learning Environment, enabling instructors to use the technology to enhance their lesson and overall course delivery, and enhancing the learning experience for the students.
  6. Clear evidence that some instructors were too inexperienced to deliver the material competently or confidently and tended to resort to what they were capable of rather than what was required.  This raised the need for internal instructor training and sufficiently robust instructor coaching and mentoring programmes at each Education Centre.

 Unexpected Outcomes:

  1. The resistance to specific changes from some actors (mainly individual instructors).
  2. The difficulty of achieving reliable and valid assessment within limited timeframes.
  3. Ensuring that the changes complied with the external accreditation frameworks associated with the CLM Part 3 Programme (e.g. CMI, ILM) – this was achieved but not entirely by design.

The objectives of the project – to deliver a refreshed CLM Part 3 Programme – were met entirely. However, there were some contentious issues, particularly around progression, assessment, and instructor competency to deliver the materials, where the directed practices were not as embedded as I would have liked before I was posted on in July 2015. Also, as a result of being posted, I had to pass the task of longer-term internal and external evaluation on to others.

Reflections

On top of a busy daily job, this project was an exhausting task. The lack of spare capacity, for myself and for the majority of instructors, to carry out the redesign work resulted in extended timeframes and, because of intermittent focus, reduced quality of work. The challenge of co-ordinating this project across large distances, while never meeting some of the instructor actors, was significant.

 However, looking back(!), I thoroughly enjoyed the experience and learnt a great deal from it.  Some of my observations are as follows:

  1. Learning Design needs time and dedicated resource to focus on designing the learning required.
  2. Clarity of intent and direction is critical, as is a central control mechanism and quality assurance of the Learning Design.
  3. When a project is conducted across distance, collaborative tools are critical – SharePoint was very limited in this regard. Both desktop videoconferencing and a tool like OU Live would have improved the learning redesign process, and I am sure there are plenty of other tools that would also have helped.
  4. Instructor competency to both design learning and to deliver learning needs serious consideration.  Some of the initial refreshed materials were very poor because instructors did not have any experience of learning design – consequently this placed a burden upon more experienced instructors to redesign and improve these materials.
  5. There was a need for both myself and the instructors to be more familiar with the Moodle-based Defence Learning Environment; looking back, I think we could probably have used it as the forum for the redesign had we known its range of capabilities.
