
Cloud created by:

Robert Farrow
26 October 2011

John Rinderle & Norman Bier
Carnegie Mellon University

"Conventional wisdom in the OER community maintains that one of the more important features of the open education approach is the malleability and customizability of materials, allowing freely available component resources to be remixed, adapted and modified to suit specific institutional directives, student needs or faculty interests.  These features are important enough that the ability to revise and remix content is a core part of the commonly accepted 4R framework that defines open content.  While the ability to tailor OER to meet changing or specific needs is one compelling part of the open model, the infinity variety that this encourages creates serious obstacles for another expected benefit of openness: using learning analytics to drive adaptive teaching and learning, support iterative improvement, and demonstrate effectiveness.

This presentation will explore the benefits and trade-offs to be made between adaptability and analytics.  In the course of this exploration, we will argue that the benefits to be had from an approach that places a higher priority on analytics may outweigh those to be gained from endless variety in the OER space.  Similarly, we will discuss some approaches to better harness open education’s promised ability to drive learning analytics, with greater and lesser compromises to the adaptability of OER. We will propose open communities of use and evaluation coalesced around individual OERs using learning analytics to improve the resource through coordinated revision and remix."

[The speakers profess a 'scientific' approach to learning design in which hypotheses are advanced, data are gathered and evaluated, and new hypotheses are formed.]

Understanding challenges and possibilities in learning analytics:

  • Driving feedback loops: design, performance, activities, learning
  • 19,310 modules of content
  • 600 free courses at the OU
  • Potentially infinite proliferation
  • 4 Rs: Reuse, Revise, Remix, Redistribute - but NOT Recreate, and ADDING Evaluate
  • General proliferation of statistical approaches
  • Change driven by data / intuition / market demand / instructor preferences
  • Problems of variety: hard to evaluate
  • Effectiveness is hit-and-miss
The speakers present the hypothesis that "OER is effective when it demonstrably supports students in meeting articulated, measurable learning outcomes in a given set of contexts."

Delegates questioned this definition in a couple of different ways (mainly around the words 'demonstrably' and 'measurable').

"Collecting the data is not enough.  We also need to make sense of it in ways that are actionable." i.e. to lead to interventions.
Common mechanisms are thus needed for data collection, with agreed standards and methods for collection.  Data collection provokes issues around:
  • Ownership
  • Privacy 
  • Security
In any case, working with communities of evidence is a way that seems to help the sense-making process.  We also need better ways of sharing data, such as through grantee contracts.

It is proposed that the 'reusability paradox' can be ameliorated through a community-based approach.

"This approach forces us to change our mind based on evidence..."
