THU: Role of Assessment in MOOCs - Exploring the Challenges (Heather Bloodworth)

Cloud created by:

Heather Bloodworth
8 January 2017

Abstract

MOOCs (Massive Open Online Courses) are part of a new paradigm of education that provides instructors and teachers with the opportunity to teach a global and varied student population (Luo et al., 2014). Nevertheless, MOOC initiatives also present many pedagogical challenges, particularly in relation to devising assessment strategies that are appropriate and meaningful for a diverse group of students (Admiraal et al., 2014; Luo et al., 2014). According to Sandeen (2013), assessment needs to be a key feature of a MOOC’s design from its inception. Sandeen also states that currently one of the most exciting developments in MOOCs is the “high level of experimentation and rapid prototyping of technology-based assessment that is occurring” (2013, p. 2).

The aim of this project is to review relevant literature on emerging trends in MOOC assessment and their application in supporting student learning. While assessment is often equated with tests, exams and evaluations, according to Erwin (1991), cited in Swan et al. (2006, p. 1), the term ‘assessment’ may be used more broadly. He defines assessment as: “a systematic basis for making inferences about the learning and development of students. More specifically, assessment is the process of defining, selecting, designing, collecting, analysing, interpreting, and using information to increase students' learning and development”. The importance of assessment as an educational experience cannot be overstated, and there is general consensus that assessment shapes learning (Garrison, 2011).

A scoping exercise will be undertaken to elicit the breadth and depth of the different MOOC assessment strategies currently in use. Most MOOCs incorporate and offer some form of assessment. They tend to utilise strategies such as automated grading tools, which include multiple choice or true/false questions; these questions are intended to gauge students’ knowledge of the course content, and the scores are returned to students as formative feedback (Suen, 2014). There is also a large body of literature (Admiraal et al., 2014; Balfour, 2013; Ruggiero and Harbour, 2013; Suen, 2014) suggesting that some form of peer review, e.g. calibrated peer review, is also an effective method of assessment, as it enables students to hone their writing skills by evaluating other students’ work. Chew et al. (2016) argue that peer assessment practices enhance the assessment and feedback experience of international students.
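To make the automated grading described above more concrete, the short sketch below shows how a platform might score a small true/false item bank and return item-level formative feedback to the student. It is a minimal illustration only, written under assumed details: the question bank, feedback strings and function names are invented for this example and are not taken from any particular MOOC platform's tooling.

```python
# A minimal sketch of automated quiz grading with formative feedback.
# The items, answers and feedback text are hypothetical examples.

QUIZ = {
    "q1": {"prompt": "A MOOC is open to anyone with internet access. (T/F)",
           "answer": "T",
           "feedback": "MOOCs are, by definition, open-access courses."},
    "q2": {"prompt": "Peer review can only be used for summative grading. (T/F)",
           "answer": "F",
           "feedback": "Peer review is widely used formatively as well."},
}

def grade_quiz(responses):
    """Score the responses and build item-level formative feedback."""
    correct = 0
    feedback = {}
    for qid, item in QUIZ.items():
        if responses.get(qid, "").strip().upper() == item["answer"]:
            correct += 1
            feedback[qid] = "Correct."
        else:
            feedback[qid] = "Incorrect. " + item["feedback"]
    return {"score": correct, "out_of": len(QUIZ), "feedback": feedback}

if __name__ == "__main__":
    result = grade_quiz({"q1": "T", "q2": "T"})
    print(result["score"], "/", result["out_of"])
    for qid, msg in result["feedback"].items():
        print(qid, "->", msg)
```

Real platforms add retake limits, randomised item pools and learning analytics on top of this basic score-and-feedback loop, but the loop itself is the core of the formative use described above.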

A formal evaluation of the various assessment strategies will be undertaken to assess their capabilities and limitations, and the findings will then be presented to the local MOOC design team so that the relevant assessment and feedback strategies can be incorporated into the new Study Skills MOOC currently being developed in the Healthcare Science department where the author works. The project will be presented in a multimedia format and the findings from the literature review will be discussed at length at the H818 conference.

 

References

Admiraal, W., Huisman, B. and Van de Ven, M. (2014) ‘Self- and peer assessment in massive open online courses’, International Journal of Higher Education, vol. 3, no. 3 [Online]. Available at http://files.eric.ed.gov/fulltext/EJ1067524.pdf (last accessed 22 November 2016).

Balfour, S. (2013) ‘Assessing writing in MOOCs: automated essay scoring and calibrated peer review’, Research & Practice in Assessment, vol. 8, pp. 40-48.

Chew, E., Snee, H. and Price, T. (2016) ‘Enhancing international postgraduates’ learning experience with online peer assessment and feedback innovation’, Innovations in Education and Teaching International, vol. 53, no. 3, pp. 247-259.

Garrison, D. (2011) E-Learning in the 21st Century: A Framework for Research and Practice, London, Routledge.

Luo, H., Robinson, A. C. and Park, J. Y. (2014) ‘Peer grading in a MOOC: reliability, validity, and perceived effects’, Journal of Asynchronous Learning Networks, vol. 18, no. 2, pp. 1-14 [Online]. Available at https://olj.onlinelearningconsortium.org/index.php/olj/article/view/429 (last accessed 22 November 2016).

Ruggiero, D. and Harbour, J. (2013) ‘Using writing assignments with calibrated peer review to increase engagement and improve learning in an undergraduate environmental science course’, International Journal for the Scholarship of Teaching and Learning, vol. 7, no. 2, pp. 1-15.

Sandeen, C. (2013) ‘Assessment’s place in the new MOOC world’, Research & Practice in Assessment, vol. 8, pp. 1-5 [Online]. Available at http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF1.pdf (last accessed 2 December 2016).

Suen, H. (2014) ‘Peer assessment for massive open online courses (MOOCs)’, The International Review of Research in Open and Distance Learning, vol. 15, no. 3, pp. 312-327.

Swan, K., Shen, J. and Hiltz, S. R. (2006) ‘Assessment and collaboration in online learning’, Journal of Asynchronous Learning Networks.

Contribute

Dr Carol Waites
9:38pm 26 January 2017


Interesting paper.  Do MOOC instructors currently give feedback on the peer assessment part?  Or is it considered sufficient to have students assess each other?  Is there a cost involved?

Andy Brooks
1:49pm 27 January 2017


Hi Heather, did you look at assessment with regards to accreditation also?

Heather Bloodworth
5:01pm 27 January 2017


Hi Carol, some instructors do offer feedback, particularly if the numbers undertaking the course are reasonably low/manageable. My colleagues run an infection control MOOC and they do give peer assessment feedback; however, there are several instructors and only 138 students. Luo et al. (2014) looked at a calibrated peer review system that was being used within one of the Coursera courses. The students undertaking the course were taught how to use a grading rubric to grade an assignment submitted by their peers. The results suggested that it did provide consistent grading results.

From a cost perspective, most MOOC platforms do offer some assessment tools that can be used; some even have a rubric tool! I think the biggest costs are instructor time and IT support.
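To illustrate the general idea behind the calibrated peer review discussed above (and studied by Luo et al., 2014), here is a rough sketch of one possible grade-aggregation step: reviewers first mark calibration submissions with known instructor scores, their accuracy yields a weight, and each submission's mark is then the weighted mean of its peer marks. The weighting formula, names and numbers below are hypothetical assumptions for illustration, not a description of Coursera's or any platform's actual implementation.

```python
# A minimal sketch of grade aggregation in calibrated peer review.
# All marks, reviewers and the weighting formula are hypothetical.

def calibration_weight(reviewer_scores, instructor_scores):
    """Weight a reviewer by average absolute error on calibration essays."""
    errors = [abs(r - i) for r, i in zip(reviewer_scores, instructor_scores)]
    mean_error = sum(errors) / len(errors)
    return 1.0 / (1.0 + mean_error)   # smaller error -> larger weight

def aggregate_peer_grade(peer_marks, weights):
    """Weighted mean of the peer marks for one submission."""
    total_weight = sum(weights)
    return sum(m * w for m, w in zip(peer_marks, weights)) / total_weight

if __name__ == "__main__":
    instructor = [70, 55, 85]                            # instructor's calibration marks
    w_a = calibration_weight([68, 57, 84], instructor)   # accurate reviewer
    w_b = calibration_weight([90, 40, 60], instructor)   # less accurate reviewer
    final = aggregate_peer_grade([72, 80], [w_a, w_b])
    print(f"weights: {w_a:.2f}, {w_b:.2f}; aggregated mark: {final:.1f}")
```

The design choice here is simply that a reviewer who grades the calibration pieces close to the instructor's marks counts for more, which is one way the "consistent grading results" reported above can be encouraged.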

Heather Bloodworth
5:29pm 27 January 2017


Hi Andy, most MOOCs aren't accredited but do provide a certificate on completion. Coursera are now hoping that they will be able to offer some credits (2-3) for their premier courses, while others are looking at using 'open badges'.

Julie Skeats
5:37pm 27 January 2017


Coursera offer the option of obtaining a certificate on successful completion of the course. The cost of this service starts at $49 per certificate and includes ID verification to validate that it was you who submitted the assignments, i.e. that there was no cheating involved.

Heather Bloodworth
10:08pm 27 January 2017


Authentication and plagiarism are both issues that MOOC providers are trying to minimise. Giving students some sort of ID verification probably does reduce the risk of cheating. Some MOOC providers are even using  Skype or webcams when students are sitting exams so that they can see that the students aren't cheating!

Mary Howell
8:23pm 28 January 2017


Hi Heather

It looks like you are well on with this. We used peer review in the last MOOC I mentored. We also have an element of self-assessment of own practice and development, which people are encouraged to post thoughts about in the forums. It works to some extent, but cultural differences and contexts vary widely among MOOC students and that can be problematic. Your course sounds much more specialised and targeted at a much more specific audience - I suspect that will increase the usefulness of peer assessment.

Sarah Adrienne Hughes
11:09am 29 January 2017


Dear Heather,

I am really looking forward to this. As numbers of PQN students may dwindle in favour of 'Nursing associates' and 'Apprenticeships in nursing', this sort of delivery may enable more flexibility in study patterns.

What feedback have you had from NMC validation processes?

Regards,

Sarah

Andy Brooks
12:36pm 29 January 2017


Hi Heather, thanks. Did you look at the FutureLearn platform also? Things are changing so fast in this area, aren't they?

Jude Toasland
3:05pm 29 January 2017


I find the concept of peer review/assessment very helpful. Although I appreciate the need for an overall quality assurance mechanism to be built in, students can gain so much from the process of reading, reflecting on and evaluating one another's work. This has been one of the most useful aspects of the H818 course for me.

Heather Bloodworth
10:14pm 29 January 2017


Hi Mary, as you know from experience, there are pros and cons to every assessment strategy. Utilising more than one assessment (as you've done) will hopefully keep the students motivated and engaged.

Heather Bloodworth
10:18pm 29 January 2017


Dear Sarah, thanks for your comments. We haven't taken this course through the NMC validation process yet, as it's mainly geared toward postgrad students, and not even the advanced clinical practitioner pathway that I run has to be validated through the NMC. If this module is successful then we'll need NMC approval if we incorporate it into the undergraduate programme.

Heather Bloodworth
10:19pm 29 January 2017


Hi Andy, yes I looked at 'Preparing for University' - this was excellent!

 

Heather Bloodworth
10:21pm 29 January 2017


Hi Jude, I agree that peer review in any form is a good way of improving critical thinking and writing and I've found my peers on the H818 to be very supportive.

Uffe Frandsen
3:17pm 1 February 2017


Hi Heather

I am very much looking forward to learning more about your take on the different tools of evaluation. I also agree with Jude on the potential for learning through reviewing others' work, such as here on H818. Will you also consider different grading/evaluation tools with respect to the subject of the MOOCs? It seems to me that, for the purposes of this module at the OU, critical discussion plays an important role; it might not be the same in all areas of, for example, Healthcare Science?

Heather Bloodworth
4:57pm 1 February 2017


Thanks for your comments, Uffe. Critical discussion and analysis is also important in healthcare. We use a rubric to assess levels of critical discussion within our healthcare courses. The rubric is tailored to each assessment and module. Calibrated peer review seems like a good option because I'm familiar with grading using a rubric.

Dr Susan Morris
12:45pm 2 February 2017


Hello Heather, accreditation and assessment is an interesting one. On FutureLearn, the Programs of MOOCs are an innovative move, and I've yet to see the impact on, for example, Chartered Manager status accreditation through this route and how 'industry' feels about this.

I'm looking forward to your presentation. Enjoy the conference experience,

Susan

Heather Bloodworth
4:39pm 2 February 2017


Thanks Susan. I agree; even with accreditation I'm not sure it improves MOOC retention. If the course is essential for professional development then the student might well be motivated to do well and complete the module. I think it is essential to have good course content and relevant assessments, whether the course is accredited or not.

Sarah Adrienne Hughes
6:21pm 2 February 2017


Hi Heather,

'We use a rubric to assess levels of critical discussion within our healthcare courses. The rubric is tailored to each assessment and module.' (Bloodworth, 2017)

I would be very interested to hear more about the rubric that you have developed, as we use them for formal 'written' assignments, but it would be excellent to have one that relates to critical discussion online.

 

Regards,

Sarah

Heather Bloodworth
6:21pm 3 February 2017


Hi Sarah, we've just started to use a blog as part of a summative assessment for the general practice nurse module; they're graded (80%) on their critical discussion and how it relates to their clinical practice. Our rubric had to be modified to reflect what we were trying to assess. We don't get them to grade each other's work, but they do have to post comments.

David Jenkins
10:58am 4 February 2017


Hi Heather, I am looking forward to hearing about these other means of assessment - they sound quite liberating and affirmative! I know some people feel that MOOCs have struggled to provide the pastoral support associated with traditional/face-to-face courses and I will be interested to see how this is being addressed re: assessment.

Heather Bloodworth
1:09pm 5 February 2017


Hi David, thanks for your comments. There's going to be a group of us (lecturers) overseeing the students' contributions and supervising the assessments. We do this currently with our other online courses, but this has the potential to be on a much larger scale!

Dr Simon Ball
4:26pm 9 February 2017


Hi Heather
Please find below the main questions and comments from your live presentation. It's up to you how to answer them, whether you wish to group them, or whether you wish to point to an answer already given above, for example.
Best wishes
Simon

  • Did you consider reflective tasks or self-evaluation of progress, and reporting this in discussion forums, as a means of assessment?
  • Was this also tested against dropout rates as well, Heather?
  • That could be an interesting study, assessment and retention. Does offering an assessed MOOC increase/decrease completion rates?
  • Is it dropout, or did they just register for the resources? Need to assess intentions at the start.
  • I really liked the idea of giving students 'test' assessments to do before a live peer review/grade.
  • I would love to think that Nursing 'schools' or faculties could work together to share the work of designing MOOCs...

jan turner
8:23am 10 February 2017


Hi Heather, fascinating presentation.  Sadly my machine 'played up' during the conference so I was not able to 'chat' the following question:

I wondered how long a delay you found it effective to build into assessment? I think there is a matter of 'long enough' and 'too long', isn't there?

Pat Townshend
3:40pm 10 February 2017


Hi Heather

I've just watched the recording so well done!  [round of applause] 

I sympathise with your challenge with authenticating students' work. It's a constant problem for me and has taken up quite large parts of my working weeks in recent years.

Thinking about the timing of tests, I think that a rapid quiz at the end of a portion of study can be a great motivator as it gives immediate feedback, at least as far as knowledge learning is concerned. Implementing that knowledge as part of a skill set would be a longer-term assessment - "what difference did knowing X make to my practice?"

Just a thought anyway.

Regards

Pat

Heather Bloodworth
9:30pm 13 February 2017


Hi Jan & Pat, thanks for your questions re taking tests immediately or delaying them. I agree with Pat that taking a test immediately after undertaking a task does motivate students because they get immediate feedback. However, with one of our blended learning courses we delayed the test for a week and the students did reasonably well. We tend to leave some of our test sites open so that students can have multiple attempts to improve and test their knowledge.

Dr Simon Ball
6:08pm 20 February 2017


Many Congratulations Heather! Your presentation has been voted by delegates to be one of the most effective of the H818 Online Conference 2017 and you are officially one of our H818 Presentation Star Open Badge Winners! Please see how to Apply for your Badge here: http://cloudworks.ac.uk/badge/view/33

Well done!

Simon

H818 Conference Organiser

Danny Ball
11:42am 21 February 2017


Congratulations Heather!

Leanne Johnstone
12:29pm 21 February 2017


Well done Heather :-)

Heather Bloodworth
6:28pm 21 February 2017


Thank you for all your support - really enjoyed the conference!

Sarah Adrienne Hughes
9:45am 22 February 2017


Well done!

Sarah

Pat Townshend
8:24pm 22 February 2017


Well done Heather!
