
A real life case study of an evaluation from the past

This is a description of an experience I had in a real world evaluation almost 20 years ago.

Cloud created by:

Tom Reeves
21 February 2013

Here is an excerpt from a book I wrote with an Aussie colleague, John Hedberg, about ten years ago (Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.). This excerpt describes a real life case study of an evaluation of an interactive training program. All six evaluation functions (review, needs assessment, formative evaluation, effectiveness evaluation, impact evaluation, and maintenance evaluation) that were mentioned in the video introduction to Week 7 of the OLDS MOOC were involved in this project.

When the Macintosh computer was first introduced by Apple Computer, Inc. in 1984, the interfaces for the operating system and the few software programs then available for the novel machine (e.g., MacWrite, MacPaint, and MacDraw) were so simple and intuitive that Apple promoted the idea that no training was necessary to use a Macintosh. However, within a few years, the complexity of the different Macintosh models that became available, accompanied by the proliferation of sophisticated peripheral devices and powerful software packages, changed the situation dramatically. In the face of these changes, the people marketing, selling, supporting, or just using Macintosh computers began to demand more and better training. In response to these needs in the United States, Apple established a network of education centers around the country where new employees, vendors, and users could experience hands-on training with the Macintosh. As Apple sales went up, the demand for basic training increased at the education centers. At the same time, the need for higher level courses was identified as even newer programs and peripherals for the Macintosh came on the scene.

In the face of unrealistic demands on the live trainers at the Apple Education Centers, the training development group at Apple headquarters in Cupertino, California began to explore the idea of developing an interactive learning system to deliver basic Macintosh training. Eventually, a large-scale training package called Macintosh Fundamentals was developed under the leadership of Jim Laffey, a multimedia pioneer at Apple Training Support (and now a professor at the University of Missouri). Although it may seem like ancient history to today’s e-learning designers, the development of the award-winning Macintosh Fundamentals program provides a rare and noteworthy example of the integration of evaluation throughout the ISD process. The initial conceptualization of the interactive learning system included thorough reviews of existing interactive training materials as well as a comprehensive literature review. In addition, external evaluators were contracted to conduct a needs assessment at Apple Training Centers to identify the key goals and objectives that should be included in this first comprehensive interactive learning system for the Macintosh computer. 

Actual design of the training system commenced once the specifications for the delivery platform were identified. The design and development phases of the process were informed by a sequence of alpha, beta, and field tests conducted both in-house at Apple headquarters in California and at three Apple Education Centers in different states. Alpha tests normally involve assemblage of a working version of the interactive learning system so that it can be tested in-house. During beta tests, a relatively complete version of the system is provided to all internal clients as well as to selected external clients. During field tests, well-tested versions of the training system are released to most customers with the understanding that changes may still be made in the program. Numerous other formative evaluation activities drove the development effort. 

Although it took almost an entire year to complete, and cost nearly a million dollars, Macintosh Fundamentals was used by more than 10,000 trainees in the first two years of its use. During the first year of use, effectiveness evaluations were conducted by external evaluators at several training centers. After effectiveness was demonstrated, limited impact evaluations were conducted. Overall, these evaluations indicated that Macintosh Fundamentals was as effective as Apple’s leader-led training had been, yielded more consistent results, and had significant long-term impact on the knowledge, skills, and attitudes of the trainees who had used the interactive learning system. Ongoing maintenance evaluation activities eventually indicated the need for a major reconceptualization of introductory training for the Macintosh, and a new learning system was developed. The subsequent program did not have to be delivered at Apple Training Centers because it could be distributed via CD-ROM to trainees at their own local sites. (Apple now delivers much of its training over the Web.)

Of course, not everything about the development of Macintosh Fundamentals proceeded as smoothly as described above, nor was every decision made solely on the basis of evaluation data. Internal politics as well as design prejudices and other factors influenced some of the decisions made by the project managers and team members, but most of the critical decisions were informed by timely and accurate evaluation data. All six functions of evaluation that we have identified (review, needs assessment, formative evaluation, effectiveness evaluation, impact evaluation, and maintenance evaluation) were represented in this case study.

Kelly Edmonds
11:04pm 28 February 2013

Tom, good story and one that shows evaluation is important versus relying on hearsay. However, it is difficult to get organizations to budget money or spend time on evaluations. Though late to this week's postings, I am going to review some of the resources and comments to see if anyone has a few tips on implementing effective yet quick course/program/learning evaluations.
