My Experiences of evaluating learning design (Jane Nkosi)
Introduction
An example of my experience in evaluating a learning design was when I evaluated a learning module for student teachers. The focus of the evaluation was the interactivity of the product. I used the GesSCI framework for evaluating content, which is a Likert-type evaluation instrument with a rating scale of 1-4.
The following aspects of the design were evaluated (a small scoring sketch follows the list):
- Quality and Comprehensiveness of Content
- Ease of use, Functionality, Navigation and Orientation
- Attractiveness of design and quality of Craftsmanship
- Appropriateness of content for intended purpose
- Value added through interactivity and multimedia
- Maintenance and support
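To make the scoring concrete, here is a minimal sketch of how a Likert-type (1-4) instrument like this could be tabulated across evaluators. This is my own illustration, not the GesSCI tool itself: the criterion labels simply paraphrase the list above, and the function names (`validate`, `summarise`) are hypothetical.

```python
# A minimal sketch (not the actual GesSCI tool) of a Likert-type (1-4)
# content-evaluation checklist: collect one rating per criterion from each
# evaluator, then summarise with a per-criterion mean.
from statistics import mean

# Hypothetical criterion labels, paraphrasing the aspects listed above.
CRITERIA = [
    "Quality and comprehensiveness of content",
    "Ease of use, functionality, navigation and orientation",
    "Attractiveness of design and quality of craftsmanship",
    "Appropriateness of content for intended purpose",
    "Value added through interactivity and multimedia",
    "Maintenance and support",
]

def validate(ratings: dict[str, int]) -> None:
    """Check that every criterion has a rating on the 1-4 scale."""
    for criterion in CRITERIA:
        score = ratings.get(criterion)
        if score is None or not 1 <= score <= 4:
            raise ValueError(f"Rating for {criterion!r} must be 1-4, got {score!r}")

def summarise(all_ratings: list[dict[str, int]]) -> dict[str, float]:
    """Mean score per criterion across all evaluators (experts and learners)."""
    for ratings in all_ratings:
        validate(ratings)
    return {c: mean(r[c] for r in all_ratings) for c in CRITERIA}

# Example: two evaluators rate the module; interactivity scores lowest,
# flagging that aspect for revision.
evaluators = [
    {c: 3 for c in CRITERIA} | {"Value added through interactivity and multimedia": 2},
    {c: 4 for c in CRITERIA} | {"Value added through interactivity and multimedia": 2},
]
for criterion, score in summarise(evaluators).items():
    print(f"{score:.1f}  {criterion}")
```

In practice a low mean on one criterion (as with interactivity here) points the revision effort at that aspect of the design.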
This is the link to the tool: http://www.gesci.org/assets/files/Knowledge Centre/content-evaluation-tool.html
The process involved evaluating the design with experts and prospective learners. The experts included content experts, material design experts and an ICT expert. The learners were students in a programme similar to that of the intended users of the design.
The experience was very enriching. It enabled me to stand back and look at the design with fresh eyes, and I was able to improve the design based on the comments and views of the people who helped me evaluate the product.
Extra content
Comment 1 by Yishay Mor
10:13am 22 February 2013
Thanks Jane! Can you add the tool to our toolbox? http://cloudworks.ac.uk/cloudscape/view/2049 Yishay
Comment 2 by Yishay Mor
10:22am 22 February 2013
Jane, This sounds very much like a heuristic evaluation method. Am I right? Was it conducted as a summative (i.e. post-hoc) or formative evaluation (i.e. corrective feedback on work in progress)? Yishay
Comment 3 by Tom Reeves
12:52pm 22 February 2013
Hi Jane, Thank you for sharing this excellent example of an evaluation experience. The link to the GesSCI framework is very useful. I was not familiar with it. There are numerous tools out there (checklists, questionnaires, interview protocols, etc.) and we are hoping to collect a lot of them in the toolbox Yishay mentioned above. - Tom Reeves
Comment 4 by Tiffany Crosby
11:09pm 23 February 2013
Jane,
I hadn't heard of this tool before. Does this tool allow for user explanations for the ratings, or do you have to get insight into the ratings through other means? I often find the most value not in the rating itself but in the rationale for why it was rated as it was. The why helps you home in on the changes that would increase the scores, while also making sure you don't remove a feature that was driving a higher rating.
Tiffany
Comment 5 by Ida Brandão
12:13pm 24 February 2013
Dear Jane,
I wasn't acquainted with this UN-funded tool and I find it very interesting. I like the colour scale!
I suppose that the rationale is in the link Jane presents above.
Comment 6 by Jane Nkosi
2:50pm 24 February 2013
Hi Tiffany and colleagues,
I found the tool to be very useful. You can actually add qualitative questions where participants can give more explanation and insight into some aspects (a small illustration follows below).
Jane
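Building on Jane's point about adding qualitative questions, here is a possible extension of the earlier sketch that pairs each 1-4 rating with an optional free-text rationale. Again, this is illustrative only; the `CriterionResponse` class is hypothetical, not part of the GesSCI tool.

```python
# A possible extension of the sketch above (illustrative only, not the
# GesSCI tool itself): pair each 1-4 rating with an optional free-text
# rationale, so the "why" behind a score is captured alongside it.
from dataclasses import dataclass

@dataclass
class CriterionResponse:
    score: int          # Likert rating on the 1-4 scale
    rationale: str = "" # optional open-ended explanation

    def __post_init__(self):
        if not 1 <= self.score <= 4:
            raise ValueError(f"score must be 1-4, got {self.score}")

# One evaluator's response for a single criterion:
response = CriterionResponse(
    score=2,
    rationale="Video clips play well, but the quizzes give no feedback.",
)
print(response.score, "-", response.rationale)
```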