Future Grand Challenges Arising from Methods and Models of NGTEL Workshop

Cloud created by:

Gill Clough
29 March 2011

This cloud is to collect the proposed grand challenges for technology enhanced learning over the next ten years. These arise from the four strands of the Methods and Models of Next Generation Technology Enhanced Learning workshop.

These are being firmed up and will be used to guide research across Europe in the coming years, so feedback is important. Please comment in the discussion area.

Proposed Evaluation Grand Challenges

  • How can we construct evaluations of TEL that allow the complexities of interaction between policy, strategic leadership, teachers and students to be negotiated successfully?

The challenge is to establish a common awareness that allows each set of stakeholders to feel that their stake is being addressed. This requires methods that take into account the differing motivations of the various stakeholders. Without that consistency, you can end up with an effective methodology for evaluating an initiative that the stakeholders do not buy into.

  • How can you make evaluation adaptive and integrated with evolving designs of learning over the entire development cycle?

Proposed Design Grand Challenges

  • To create socio-technical environments in which people of all ages are inspired to learn rather than have to learn.
    • To give learners enough control to become active in the process of pursuing personally meaningful problems as well as providing enough support for their activity to result in the construction of useful knowledge and artefacts.
  • How do we create a platform for an open, live, malleable, dynamic representation of design knowledge in TEL, supporting collaborative processes of design for learning, learning to design, and learning by design, and including the broadest possible community in these processes?
    • How do we capture design knowledge and represent it in a sharable (universal) language?
    • How do we bridge the gap between expert designers and user designers?
    • Can we create open living systems that allow users to participate in different ways in the design process?

Proposed Assessment Grand Challenges

  • Assessment literacy for all
  • How can learning be assessed in an open, social TEL environment?

Proposed Research Grand Challenges

  • Develop new technology to harness the power of emotions for learning

Extra content

Proposed Evaluation Grand Challenges  

  • How do you identify learning success from data collected during everyday informal learning?
  • Assessment is a fundamental grand challenge as we move into more open environments.
  • Evaluation needs to be adaptive and agile, to match what is happening at the design and development level.
  • Evaluation objectives need to be aligned with the learning, and the methods we use need to measure what matters, not just what can be measured.
  • Who owns the evaluation – who is it for, and whose interests does it serve?
  • Standardisation whilst at the same time allowing space for evaluation.
  • We need to allocate adequate resources so that evaluation and assessment are meaningful.
  • How can we carry out effective evaluation when the learner becomes the expert and may know more than the teacher?
  • How can we teach governments to ask better questions?
  • How can we construct evaluations of TEL that capture complexities of interaction between policy, strategic leadership, teacher and student?

Gill Clough
10:27 on 29 March 2011 (Edited 10:52 on 29 March 2011)

Proposed Design Grand Challenges

  • To create socio-technical environments in which people of all ages want to learn rather than have to learn.
  • To give learners enough control to become active in the process of pursuing personally meaningful problems as well as providing enough support for their activity to result in the construction of useful knowledge and artefacts.
  • We need to understand the drawbacks associated with interest-driven learning. Is a fragmented culture better or worse for enhancing learning? How do we assess the quality of resulting artefacts? Will we drown in irrelevant information?

Gill Clough
10:34 on 29 March 2011 (Edited 10:55 on 29 March 2011)

Assessment grand challenges (proposed)

  • What counts as evidence?
  • Look for guidelines on assessment for elearning
  • How does openness challenge the ways we demonstrate our credentials when assessing students' performance in elearning?
  • What can we do with mobiles and assessment?
  • Development of an evidence-based assessment system for cognitive, affective and psychomotor learning, providing learners with timely feedback at the right moment and leading, finally, to society-wide assessment literacy and a changed perception of assessment.

Rebecca Ferguson
10:40 on 29 March 2011 (Edited 10:44 on 29 March 2011)

Research Grand Challenges (proposed)

  • How do you create platforms that are open, dynamic, malleable and live, which allow people to engage with these representations [of the learning design process] and the processes that they support?
  • Can we design technology to harness the power of emotions for learning?
  • How do we capture design knowledge and represent it in a sharable (universal) language?
  • How do we bridge the gap between expert designers and user designers?
  • Can we create open living systems that allow users to participate in different ways in the design process?
  • Can we create a system that provides an open, dynamic, malleable, live representation of design knowledge and that supports design processes? One that allows people to engage at different levels with the representations it supports – as users of design products, as critics, or by drawing on these materials to support their own projects.

Rebecca Ferguson
10:48 on 29 March 2011 (Edited 10:51 on 29 March 2011)

And here's how one of those challenges looked a year later...

New forms of assessment of learning in TEL environments

Develop learning analytics and ethical frameworks that will enable us to change our assessment practices to support learning and teaching in data-rich environments.

Our current model for the assessment of learning was developed when groups of learners were taught and examined at the same time in the same physical location, teachers and learners were clearly differentiated, and both online collaboration and publication were unknown. This model of assessment is primarily individual and is firmly bound to hierarchical educational structures.

The past decade has seen worldwide growth in the use of online and blended learning, together with increasing interest in the possibilities offered by informal learning and lifelong learning. Open social TEL environments provide learners and teachers with global access to a range of people and mediating artefacts. At the same time, digital media offer new modes of representation and expression and the opportunity to reach a wide audience. Within these environments individuals may move rapidly between the roles of novice and expert as they work on different tasks, and their collaborations can extend across time and space.

During the same period, the use of virtual learning environments (VLEs) and content management systems (CMSs) such as Moodle and Blackboard has become widespread. Educational institutions therefore have access to increasingly large sets of data about students – including interaction data, personal data, systems information and academic information. This is ‘big data’ in the sense that the amount involved is too large to be managed or analysed by typical database software tools. Nevertheless, it is clear that it could be used to support the assessment of learning processes and learning outcomes.
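
To make this more concrete, here is a minimal sketch (an illustration added here, not part of the original workshop material) of how exported VLE interaction logs might be turned into simple per-student engagement indicators. The CSV layout and the column names user_id and timestamp are assumptions made for the example; real Moodle or Blackboard exports differ, and data at genuine ‘big data’ scale would need more scalable tooling than a single CSV file.

import csv
from collections import defaultdict
from datetime import datetime

def engagement_summary(log_path):
    """Count logged events and distinct active days per student in a CSV interaction log."""
    events = defaultdict(int)        # total logged actions per student
    active_days = defaultdict(set)   # distinct days on which each student was active
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            user = row["user_id"]                                  # assumed column name
            day = datetime.fromisoformat(row["timestamp"]).date()  # assumes ISO timestamps
            events[user] += 1
            active_days[user].add(day)
    return {u: {"events": events[u], "active_days": len(active_days[u])} for u in events}

# Hypothetical usage: engagement_summary("moodle_log.csv")
# might return {"student_42": {"events": 118, "active_days": 9}, ...}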

Together, these changes provide opportunities to extend our current repertoire of assessment techniques. At the same time they present the challenge of using this data to support assessment in ways that are meaningful, effective and ethical.

These opportunities and challenges are already being explored by learning analytics researchers – including the Society for Learning Analytics Research (www.solaresearch.org) – and by the educational data mining community (www.educationaldatamining.org/). Recently, the Gates Foundation has invested millions of dollars in order to ‘dramatically improve college readiness and college completion in the United States’ and has identified learning analytics as one of the key ways of doing this. While this level of interest and funding is welcome, it is important to ensure that the development of learning analytics is not driven by the economic agenda of a single country.

Bringing together practice, research and perspectives from many countries will enable the development of new models of assessment that support the changes in education that are taking place across Europe.

Main activities to address this Grand Challenge Problem

  • Learning network analysis – assessing networks and driving the development of groups and networks that provide effective support for learners (a short code sketch follows this list)
  • Learning dialogue analysis – assessing the quality of dialogue, and using this formative assessment to guide the development of learning dialogue
  • Learning dispositions analysis – assessing the activity of individuals and groups, and using this formative assessment to guide the development of skills and behaviours associated with learning
  • Learning content analysis – assessing the resources available to learners and teachers, and using this information to recommend appropriate materials, groups and experts
  • Summative analysis of networks, dialogue, behaviour and content that are valued by learners, teachers and society.
  • Development of recommendation engines that use these analytics to provide personalised recommendations that support learning and that are valued by learners, teachers and society
  • Development of visualisations and dashboards that present these assessments to learners and teachers in accessible and meaningful ways
  • Development of ethical frameworks that guide the collection, manipulation, analysis, storage and reporting of data in the context of learning analytics and assessment.
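
As a concrete illustration of the learning network analysis activity listed above, the following sketch (illustrative only, using made-up data and assuming the networkx library is available) builds a who-replies-to-whom network from forum reply pairs and uses degree centrality as a crude indicator of how connected each learner is. Richer analyses of brokerage, clustering or dialogue quality would build on this kind of representation.

import networkx as nx

# Hypothetical (replier, original_poster) pairs extracted from course forum data.
replies = [
    ("anna", "ben"), ("carl", "ben"), ("ben", "anna"),
    ("dina", "anna"), ("anna", "carl"), ("ben", "dina"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# Degree centrality: a simple proxy for how connected each learner is in the reply network.
for learner, centrality in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{learner}: {centrality:.2f}")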

Timeframe for the Grand Challenge Problem

Initial work on learning analytics is currently underway, providing analysis, visualisations and recommendations that support learners and teachers and help to develop meta-cognitive skills, educational dialogue and learning. Within five years these initial explorations could be trialled, developed, validated and scaled up for widespread use.

Measurable progress and success indicators

Measurable improvements in:

  • Engagement with learning – supported by directed feedback
  • Quality of online learning dialogue
  • Engagement with online learning networks
  • Retention – due to appropriate and personalised feedback
  • Class management – due to development of students-in-trouble alerting systems
  • Learners’ and teachers’ awareness of the value of learning analytics

Attraction of funding

A potential funder is ‘Next Generation Learning Challenges’: a collaborative, multi-year US grant programme aimed at dramatically increasing college readiness and completion through applied technology. Grant money is issued in multiple funding waves launched every 6-12 months. Wave 1 included a call to research learning analytics (http://nextgenlearning.org/).

Another possible funder is Google, which offers Research Awards in several relevant areas, including machine learning and data mining, and educational innovation (http://research.google.com/university/relations/research_awards.html).

While learning analytics can be developed to run on specific VLEs, a large-scale research effort would be needed to bring together different forms of learning analytics and to make them available to learners and teachers working on different platforms.

Connected research questions

  • How can we assess learning in open, social TEL environments?
  • How can we use assessment to provide measurable improvements in learning and teaching in open TEL environments?

Rebecca Ferguson
20:24 on 15 March 2012

Contribute

Rebecca Ferguson
1:53pm 29 March 2011 (Edited 2:15pm 29 March 2011)


Mike explained the ideas behind a grand challenge:

  • Inspiring enough to attract interest and funding
  • Easily explainable
  • Addresses a fundamental issue of TEL within Europe
  • Achievable within 5-10 years, but not immediately
  • Timely, measurable and incremental
  • Should relate to the needs of learners
