Social Learning Analytics
Cloud created by:
14 June 2011
Presentation at CALRG 11 by Simon Buckingham Shum and Rebecca Ferguson
Simon starts off with a scenario: analytics report on a prospective student, indicates that there is a high chance of them dropping out (37%) - do you accept or reject?
Response from audience: Third option of defer? Has happened at the OU already - we understand these sorts of predicted chances, but we don't use them for screening. Show it back to the student, offer alternatives that may be a better fit?
Second scenario: Feedback to student on progress in learning.
Reaction: Scary! All that information known about you in the system. May become repetitive on frequent use. Turning the tables - student may feel need to comment on tutor's activity. Might expect these dialogues to become reciprocal.
These are a bit science fiction, but most of what's in the scenarios can be done. Horizon 2011 puts learning analytics in the 4-5 year window for mainstream adoption. Upcoming 2nd Int Conf on Learning Analytics and Knowledge. Doug Clow's learning analytics cycle. 'Academic Analytics' - Goldstein (2005), articulated by Campbell & Oblinger (2007) in an Educause paper. This goes on at schools level too. IET Student Statistics and Survey Team have predictive models; previous OU study data are the best predictors of future success. We do this retrospectively - we don't filter applicants, we use it to reflect on how good the outcomes were compared with what we expected. Purdue has a Signals project (traffic lights) - a live predictive model feeding back to students. Prediction: students who are more active in the VLE are more likely to succeed. Students like it - feel like they're being more cared for.
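A Signals-style indicator - VLE activity combined into a live prediction, fed back as a traffic light - can be sketched in a few lines. This is a minimal illustration only: the activity measures, weights, and thresholds below are assumptions for the sketch, not Purdue's actual model.

```python
# Minimal sketch of a Signals-style traffic-light indicator.
# The weights and thresholds are illustrative assumptions,
# not Purdue's actual predictive model.

def engagement_score(logins_per_week, forum_posts, assignments_submitted, assignments_due):
    """Combine simple VLE activity measures into a 0-1 engagement score."""
    submission_rate = assignments_submitted / assignments_due if assignments_due else 1.0
    activity = min(logins_per_week / 5, 1.0) * 0.4 + min(forum_posts / 3, 1.0) * 0.2
    return activity + submission_rate * 0.4

def traffic_light(score):
    """Map an engagement score to the red/amber/green feedback shown to students."""
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "amber"
    return "red"
```

The point of the traffic-light framing is that a coarse, interpretable signal reaches students directly, rather than a raw probability reaching only administrators.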
Academic analytics - more from business analytics world. What models of learning currently underpin analytics? And what should analytics track?
Rebecca takes over to talk about a taxonomy of social learning analytics
Learning analytics tends to be individualised. On SocialLearn, interested in social interaction. Taxonomy of five types:
1. Social learning network analytics - social network analysis. Strength-of-weak-ties idea. Examples: SNAPP - Social Networks Adapting Pedagogical Practice - can apply it to Moodle forums (but not OU Moodle forums!), or Blackboard, and gives you a network map of them. Tony Hirst using Gephi to analyse large Twitter and delicious networks. Useful for identifying interactions that promote learning, and interventions that can support them.
2. Social learning discourse analytics - Neil Mercer, Karen Littleton - importance of the quality of educational dialogue to learning. Examples: Cohere - can see individuals, or groups of learners, and map by type of interaction. Rebecca's work looking at keywords in discussions, 'exploratory dialogue' in Elluminate chat - feed back to learners, also for finding route through resources - can pick out key knowledge-building moments in, say, a long conference presentation. Denise Whitelock's work on assessment feedback to learners - Open Mentor.
3. Content analytics - Not detailed analysis of forum content, but pictures, PowerPoint, etc. LOCO-Analyst (Jovanovic et al., 2008) - shows interactions in a forum. Suzanne Little (KMi) on visual search - look at one picture, find other usages of it - can be a social tool: if you are interested in this resource that uses these pictures, you may be interested in other resources that use them. iSpot - uses analysis - logos next to people's names showing their expertise; can interact with the views, iterative process.
4. Social learning dispositions analytics - Ruth Deakin Crick on learning dispositions - qualities of a good learner - seven dimensions of 'learning power'. Dynamic assessment fed back to learners. Not learning styles (fixed); more 'this is how you represented yourself at this moment in time, here's how you may want to move on'. Gives you a profile. ELLIment tool to help have this discussion online. Used in a tool called EnquiryBlogger - sits alongside a blog - shows how the learner categorises their own posts to a learning blog. Gives a dashboard view.
5. Social learning context analytics - haven't found a developed instance of this. Think it is important to take context into account; much data available about context. Possibilities - My OU Story as informal feedback - e.g. learners on this course are very sociable, or all learners on this course have to encourage each other to hang on in there. Also location data - could recommend groups, other learners, resources (based on Linked Data - e.g. you're near Hadrian's Wall, why not look at the OU video about it?). James Paul Gee's work on affinity groups - learning online in informal settings.
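The keyword-spotting approach to 'exploratory dialogue' described under discourse analytics (type 2) could be sketched as follows. The marker phrases here are illustrative assumptions, not the actual lexicon from the Elluminate study.

```python
# Sketch of keyword-based spotting of 'exploratory dialogue' in a chat log.
# The indicator phrases below are illustrative assumptions, not the actual
# lexicon used in the Elluminate study described above.

EXPLORATORY_MARKERS = [
    "because", "i think", "what if", "do you mean", "for example",
    "agree", "disagree", "why", "how about",
]

def exploratory_turns(chat_log):
    """Return the (timestamp, speaker, text) turns containing exploratory markers."""
    hits = []
    for timestamp, speaker, text in chat_log:
        lowered = text.lower()
        if any(marker in lowered for marker in EXPLORATORY_MARKERS):
            hits.append((timestamp, speaker, text))
    return hits
```

Fed back as a timeline, hits like these would let learners jump straight to the key knowledge-building moments in, say, a long recorded conference presentation.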
Talking to CALRG about this!
Simon again, on reflections
Big shifts in landscape - OU, schools - technology shift, move to free/open expectations, social learning central to innovation (for companies), values, post-industrial role for institutions. Reshaping our conceptions of learning ... so should reshape our conceptions of learning analytics.
Business intelligence, management information perspective - is an industrial C20th model of hierarchy and a stable environment. Doesn't work well in the complex, changing landscape now facing us. Won't serve us in education. Important things - social capital, critical thinking, etc - are harder to count in analytics.
It's not just what they do (taxonomy), it's how we use them with credibility and integrity. An institution could roll these out and get a view of it ... but it's really about putting them in the hands of the learner, reflecting the social web/participatory model. Not just senior decision-makers who have access to these, but people at every level. Place them in the hands of those being tracked. Concerns about the abuse of analytics - e.g. looks a bit scary - may rest on the old power configuration of an institutionally wielded instrument. We want to make credible, ethically sound use of them.
Social Learning Analytics about helping people grow as learners through personal (and collective) formative feedback.
Steve: Any views about brand-new students and continuing, about which applications you'd like them to use?
Simon: Haven't rolled these tools out yet; some being used by early adopters. Not just in the research labs - are in Enterprise 2.0 tools, are in corporations already. Not deployed in systematic way in the OU.
Perry: Late response to the scenarios: the second one, the personalised feedback, made me very angry at the time; trying to work out why. Generated by a system - irritated that it's pretending to be a human. Would prefer something more depersonalised. Reminded me of personalised marketing material, the cold-calling experience - associate it with being screwed. So moving away from that, putting information in the hands of learners is very important.
Jon: To counter that, if you present the user with more raw data, an assumption that they're skilled enough to know how to take it and do something with it. That personalised feedback is doing that for them. Could be a value in going down that route. Worry about whether there is one canonical way.
Simon: Could be personal preference - I might just want a visual dashboard, you might want the matey approach.
Jo: Found it patronising, like Big Brain Academy telling you to pull your socks up. Knowing it was a system that produced it is somehow irritating, runs the risk of it being dismissed.
Perry: Amazon recommendations - in plain English, not raw data, but nobody would confuse it with a person: those who bought this also bought.
Someone: Encourage people to use a personal learning environment; could help us give feedback there. For the professor/teacher it is very helpful - gives you information you've lost now that students are in their own environments. Is otherwise almost impossible to track.
Simon: Skill of interpreting. Reading a visual analytics dashboard is something you may need to be trained up on - ALs could learn it, make effective use of their time and attention.
Jonathan: Big fan of analytics, see it as feedback. Question is about what kinds of feedback are needed, rather than about what visualisations. Issue of representations, ownership. Disagreement around using the hapTEL tool for monitoring. Is it really analytics, or a feedback mechanism?
Simon: Whether formative or summative is a question. Varies across their examples. Issue of literacy - children in school who understand their ELLI profiles, get insight through it, growing up thinking about themselves in those terms. May grow up thinking about their place in a social network. Parsing a social network diagram may be like reading a bar graph, which is a basic literacy thing we teach in schools.
James: Schools dashboard for Governors and teachers - children think of themselves as 'I'm a level 5 in maths, that's what I can attain'
Simon: As soon as there's a high-stakes analytic, you start gaming the analytic. Alarm bells start ringing. If we're going to count stuff, what's the academic and ethical integrity? Is it possible to reduce intangible social learning dispositions to things that can be counted in a digital world, or is that profoundly mistaken?
11:21 on 14 June 2011
A technical report setting out the line of argument in more detail…
Buckingham Shum, S. and Ferguson, R. (2011). Social Learning Analytics. Available as: Technical Report KMI-11-01, Knowledge Media Institute, The Open University, UK. http://kmi.open.ac.uk/publications/pdf/kmi-11-01.pdf
Abstract: We propose that the design and implementation of effective Social Learning Analytics presents significant challenges and opportunities for both research and enterprise, in three important respects. The first is the challenge of implementing analytics that have pedagogical and ethical integrity, in a context where power and control over data is now of primary importance. The second challenge is that the educational landscape is extraordinarily turbulent at present, in no small part due to technological drivers. Online social learning is emerging as a significant phenomenon for a variety of reasons, which we review, in order to motivate the concept of social learning, and ways of conceiving social learning environments as distinct from other social platforms. This sets the context for the third challenge, namely, to understand different types of Social Learning Analytic, each of which has specific technical and pedagogical challenges. We propose an initial taxonomy of five types. We conclude by considering potential futures for Social Learning Analytics, if the drivers and trends reviewed continue, and the prospect of solutions to some of the concerns that institution-centric learning analytics may provoke.
Simon Buckingham Shum
11:27 on 22 June 2011