Learning Analytics: CAL 2011
13 April 2011
Manchester Metropolitan University, Manchester, UK
Cloudscape created by: Tony Hirst, Simon Buckingham Shum, Gráinne Conole, Rebecca Ferguson, Denise Whitelock & Anna De Liddo
8 April 2011
‘Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning’ (Siemens, 2010). Educators have a long history of using data to examine, analyse and evaluate teaching and learning. The recent increase both in the number of students engaged in online learning and in the variety of tools available to support this activity means that the amount of data and resources available can prove confusing and overwhelming. Educators and learners urgently need analytics that will help them to evaluate and use these new tools and resources effectively to support teaching and learning.
This symposium presents research carried out at The Open University, the UK’s largest provider of online education with over 180,000 students currently registered, and many thousands more accessing its learning materials through open learning programmes such as OpenLearn, iTunes U and iSpot. The OU has been producing wholly online undergraduate courses since 1999 and has, since its foundation, made use of learning analytics to support its students and tutors.
The four papers in this symposium deal with issues that have prompted the current worldwide interest in learning analytics: the need to improve learners’ experience, the emergence of new tools to support data analysis, the overwhelming increase in online resources, and the development of new tools for learning and teaching.
Paper 1: Analysing Tutor Feedback for e-Assessment Systems: Denise Whitelock
The need to improve learners’ experience, coupled with pressure on time and resources, prompts a demand for reliable and trustworthy e-assessment systems. This paper demonstrates how a well-established and verified model (Bales 1950) has proved successful in classifying feedback in a range of different academic disciplines, supporting development of an automatic classification system that can be used in all fields.
Paper 2: Learning analytics to identify exploratory dialogue within synchronous text chat: Rebecca Ferguson
The current abundance of online tools and resources can leave learners adrift in a sea of information. This paper describes work to identify learning dialogue taking place within the Elluminate text chat at online conferences and seminars. These analytics have potential to recommend sessions that inspire learning dialogue, and to support the effective use of dialogue by learners and teachers within synchronous online learning settings.
Paper 3: Towards discourse-centric learning analytics: Simon Buckingham Shum and Anna De Liddo
In addition to the use of pre-existing tools, learning analytics involves the creation of new tools to support learning and to deliver better learning analytics. This paper describes the development of Cohere, a prototype tool that supports learners in collaboratively annotating Web resources with their ideas, classifying these ideas according to their rhetorical role, and connecting them into a network of meaningful rhetorical moves. This network of learners’ annotations provides both a visual representation of the online discourse and an analytical artefact which can be used to generate what we define as discourse-centric learning analytics.
Paper 4: Visual Analytics around the Edges: Tony Hirst
New tools to support the visualisation and analysis of online data appear frequently but, without expert users or well-considered use cases, they are difficult for educators to use effectively. This paper employs Google Analytics and Gephi (a tool for analysis and visualisation of social networks) to examine online learning activity, both formal and informal, and shows how these tools can be used to understand and support learning and teaching.
Discussion: Gráinne Conole
This section of the symposium will extend and build on discussion of learning analytics that has taken place online and at the recent Learning Analytics and Knowledge conference (Banff, February 2011). Drawing on the evidence presented in the four papers, together with the expertise and experience of participants, it will focus on two issues: (1) How can we integrate the technical, social and pedagogical dimensions of learning analytics? (2) What are the relevant considerations when developing, evaluating and employing learning analytics?
Clouds in this Cloudscape
- Analysing Tutor Feedback for e-Assessment Systems
- Learning analytics resources and links
- Learning analytics to identify exploratory dialogue within synchronous text chat
- Presentation: Conole - learning analytics to foster good pedagogy
- Towards discourse-centric learning analytics
- Visual Analytics around the Edges