
Evaluation

Evaluation session at JISC Cluster C 11th November 2009

Cloud created by:

Gráinne Conole
11 November 2009

Evaluation

  • How have the baseline reports informed your evaluation plans?
  • The baseline as the starting point – the first set of evaluation data
  • How to measure time on course creation activities?
  • Understanding what people are doing with that time?
  • How to measure the softer aspect?
  • The project is a messy, complex environment; it doesn’t occur in isolation, so how can we really attribute outcomes to this specific project?
  • Can we pin down our core concepts, and can we look at them in terms of our evaluation?
  • Time is a really interesting aspect of looking at design….
  • Measure of time or quality of outcomes
  • It’s all about helping teachers to improve learning and teaching
  • How do we assess the quality of the designs?
  • How do you assess how much L&T strategy has been taken into account?
  • Is quality about meeting a certain benchmark?
  • Measuring impact of intervention (on learners)
  • Evaluating design not in development
  • Strathclyde know the REAP principles work, but it’s getting them into practice that is the tricky bit
  • Is it about measures of risk and measures of usefulness to the different stakeholders? It’s about measures of application
  • Effect of other variables
  • Getting principles into practice is tricky!
  • There may well be conflicting views amongst stakeholders, and how do we address this?
  • How do we evaluate sharing?

Extra content

Three main themes to explore in terms of evaluation

  • Quality and acceptability - how to assess the quality of the designs
  • Principles and embedded policy
  • Problem of evaluating impact in delivery

Gráinne Conole
15:43 on 11 November 2009 (Edited 15:46 on 11 November 2009)

Sharing - how do we evaluate "sharing"?

  • Communities of practitioners or more?
  • Also about how the educational practice unit works across the board; the co-development is important
  • Types of stakeholders: learners, teachers, support staff, senior managers

Sharing - what do we mean?

  • A two-way process where both ends are picking things up and embedding them in their practice; sharing as a dialogue/conversation
  • Does the sharing bring real benefit
  • Sharing of designs and the sharing in terms of dialogue
  • What assumptions do we make about what people are talking about in terms of learning and teaching, and do we assume that will be taken into their practice? Can we have a proxy in terms of level of engagement and discussion?
  • How do you describe the attributes of something on Cloudworks, and why does that get people engaged?
  • Can we look at the quality of designs on Cloudworks and the level of detailed discussion

Attributes

  • Attributes of communities: beginner teachers, subject communities, specialists, how cohesive the communities are

 

Can we come up with levels of sharing and categorise them?

  • Basic exchange of information
  • Using examples from their practice
  • Examples with reflection and action
  • Augmenting content/co-operation
  • Peer reflection 
  • Feedback and suggestions
  • Contextualising in terms of theory
  • Sharing and application across different disciplines
  • Socialising/small talk

Could have a profile of these for a cloudscape and see how it changes over time
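To make the profiling idea concrete, here is a minimal illustrative sketch (ours, not part of the session notes). It assumes each contribution in a cloudscape has been hand-coded against the sharing levels above; the monthly_profile helper and the example data are hypothetical.

    # Illustrative sketch: tally hand-coded sharing levels per calendar month for one cloudscape
    from collections import Counter
    from datetime import date

    def monthly_profile(coded_contributions):
        """coded_contributions: iterable of (date, sharing_level) pairs coded by a researcher."""
        profile = {}
        for when, level in coded_contributions:
            month = (when.year, when.month)              # group by calendar month
            profile.setdefault(month, Counter())[level] += 1
        return profile

    # Hypothetical coded contributions
    example = [
        (date(2009, 11, 11), "peer reflection"),
        (date(2009, 11, 20), "feedback and suggestions"),
        (date(2009, 12, 2), "augmenting content/co-operation"),
    ]
    print(monthly_profile(example))
    # {(2009, 11): Counter({'peer reflection': 1, 'feedback and suggestions': 1}),
    #  (2009, 12): Counter({'augmenting content/co-operation': 1})}

Comparing such profiles at two points in time would show whether a cloudscape is shifting from basic information exchange towards the deeper levels.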

Can we use this to evaluate other activities, not just online environments?

We also want to capture the follow-on activities beyond the workshop; we could build this into follow-on evaluation data capture activities

Can we identify the kinds of networking tools, what works in which contexts? When should we run a workshop, use Cloudworks, use pedagogical patterns?

Can we identify a set of MAs across the three projects – workshops, patterns, Cloudworks etc – and can we identify some measures of what to use, when and why, and match those to particular stakeholders?

  • Workshops: relevance needs to be high, potential of addressing a problem/need they have, has to be fun, size and structure are important, mix of people, facilitators
  • Design representations: from abstractions through to rich narrative case studies/pedagogical patterns, abstraction of good practice/pedagogy, ease of use, familiarity, value, a particular view of the design, it might provoke deeper thought, it highlights a particular aspect of a design, can you see quickly whether it’s worth looking at the design in more detail/see if it is useful/relevant for you, level of granularity, context vs. abstraction, can you see it from a student’s perspective
  • Cloudworks:  finding things of relevance to you, serendipity, finding other relevant people, find things of interest to you quickly, motivating, addictive, fun, of value/use,  layers of participation – lurkers want different things to Cloudworks champions, timeframe to become engaged with the site
  • Informal and formal institutional mechanisms: with other colleagues, by cascading, in design teams, through institutional events – subject teams, institutional meetings
  • Professional networks – conferences, professional subject meetings, papers

Also there is an issue about transferability?

What criteria support sharing:

Measures

  • Cloudworks – number of people engaging, quantitative data
  • If someone has logged on, have they logged on again within a month?
  • What do people do with their sharing?
  • Being successful in emergent ways that you might not have thought of
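As a minimal sketch of how the "logged on again within a month" measure might be operationalised (our illustration, not an agreed Cloudworks metric; the function name and the shape of the login data are assumptions):

    # Illustrative sketch: for each user who has logged on, did they log on again within a month?
    from datetime import date, timedelta

    def returned_within_a_month(logins_by_user):
        """logins_by_user: dict mapping a user id to a sorted list of login dates."""
        return {
            user: any(later - earlier <= timedelta(days=30)
                      for earlier, later in zip(logins, logins[1:]))
            for user, logins in logins_by_user.items()
        }

    # Hypothetical login data
    logins = {"user_a": [date(2009, 11, 11), date(2009, 11, 25)],
              "user_b": [date(2009, 11, 11)]}
    print(returned_within_a_month(logins))   # {'user_a': True, 'user_b': False}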

Gráinne Conole
16:31 on 11 November 2009

Getting principles into practice (principles in this context will mean viewpoints/ theoretical models/ methodology)

We discussed the key principles associated with each project. This discussion will focus on how we will capture the impact of principles on practice.

Issues

  • Getting academics to 'buy-in' to principles or recognise opportunities and benefits
  • Cultural issues within institutions such as resistance to new approaches, time issues, established roles etc
  • Success will come from a shift in approach to planning, attitudes to reflective practice and also knowledge/ content
Solutions?
  • Avoid being didactic in approach to staff/ avoiding 'should' type statements
  • Train staff on volunteer only basis (can't force uptake) - champions
  • Design of tools responsive to staff need (and make this process explicit)
  • Articulate underlying principles to staff so they can make links between theories that they recognise and principles/ tools
We have set out to evaluate positive impact on practice. Specifically:
  • efficiency, flexibility and innovation (OULDI)
  • creativity and innovation (Viewpoints)
Evaluation methods
Informal, verbal feedback from focus groups, which is used to inform the design process - continuous
Use self-recording/ questionnaires so people can identify what changes they have experienced as a result of the intervention
Ask institutions if we can get feedback from people who didn't come to the workshop and interview around design practices (control groups)
We recognised that this discussion has opened up another huge range of issues around how we might define/measure, e.g. innovation.

 

Rebecca Galley
16:38 on 11 November 2009 (Edited 16:43 on 11 November 2009)

Improving quality in design?

  • What are we approving and when?
  • How can we improve quality, peer review of designs, checklists/series of questions?
  • Pedagogy and learning isn’t a key feature of what is presented for approval
  • Levels of acceptability/quality
  • One objective measure – number of rejections at the approval stage?
  • Impact on the design practice (uptake of principles)

Gráinne Conole
16:43 on 11 November 2009

Getting principles into practice

  • Identified what we mean by principles – Ulster Viewpoints: the theoretical models; OULDI: the learning design methodology
  • What are the key issues in moving the principles into practice – getting uptake from staff, avoiding being too didactic in approach, difficulty of some of the cultural issues, resistance to new approaches, time issues and established roles. Success will come from a shift in academics’ approach and their attitude to critical reflection on what they are doing and how they understand knowledge and content. Improvements in efficiency, improvements in creativity. Short regular focus groups, self-recording or questionnaires so they can identify the changes themselves, feedback from people who didn’t attend interventions

Gráinne Conole
16:43 on 11 November 2009


Contribute

Alison Muirhead
5:07pm 8 December 2009 (Edited 3:49pm 21 December 2009)


I took part in the discussion around improving quality in designs. You can see the mindmap created here, and some of the key points emerging were:

  • The moment of approval may be key in assessing quality, but we must consider what comes before and after this
  • Are we approving the right things and asking the right questions in the approval process?
  • At the moment pedagogy is not key in the approval process, so we need to think about how this could be evaluated before and after the project interventions
  • Some kind of checklist for peer review of designs could be used
  • The number of rejections may be a measure: we anticipate that (if the approval criteria are right) as quality increases, rejections (or requirements for resubmission) will decrease
  • Consider the possibility of having benchmarked designs to set standards for design quality

 

Rebecca Galley
3:36pm 19 April 2010


Update on a framework for evaluating Cloudworks (indicators of community) here
