Giving teachers the learner's perspective on e-assessment
I am an elearning advisor and I work with learning providers in the FE and skills sector, basically any government-funded learning between school and university, including work-based learning. I have a recently appointed colleague with the same remit, and to let him experience what it is like to deliver a webinar we came up with the idea of exploring eAssessment tools.
The learning activity was timed to fit in with the half-term holidays, a point in time when teaching staff may be able to join in more easily than during term time. The people we were aiming to connect with are mostly unknown to us: they see the session advertised and decide whether it interests them. There are some regular participants, and these help you gauge the session as they will interact during it.
All the participants are part of the same educational sphere and work mainly with 16-18 year olds, alongside a range of other learners aged 19-90! Learner ability varies, as does social background. One recurring theme, though, is that many of the learners these educators work with are disaffected by education, or may have 'failed' at school. To counter this there is a need to look at ways to engage learners. Alongside this is the pressure of looming Ofsted inspections, and the requirement to have elearning embedded into teaching.
These people are pushed for time and have no budget to spend, so any tools suggested to them need to be easy to use and free. We make an assumption that there is a degree of digital literacy and technical ability; after all, they have signed up online for an online session.
The only requirements set for the participants are a device with web access and either speakers or headphones to listen in. We don't enable microphones for participants, so that we can control the session length and keep it on track. However, all those involved are encouraged to use the chat pane to contribute and discuss.
The idea was to run two sessions of 45-60 minutes' duration, delivered via the Adobe Connect platform. All participants were sent the link in advance and again on the day as a reminder. When they log in they enter their name.
The thinking behind these two sessions was to give educators an opportunity to experience a range of eAssessment tools from the learner's perspective, to see how engaging the tools were and to make judgements about the interfaces. The second part was to show what information the task setter could collect from the assessment. Educators tend not to see what learners see, and unless you have several people testing a tool at the same time you can't see how it truly works. Also, by using a distributed group of people with diverse systems, people could see what operational issues might occur before using a tool with a live student group.
For session one three quizzes were set up on three different platforms. These tools were predominantly ones for PC/Mac access.
For session two, four quizzes were set up on four different platforms. For this set the tools all allowed multiple-device access, so PC, Mac, tablet, smartphone etc. could be used.
For each session participants were told of the system requirements.
Between us, my colleague and I set up the seven quizzes. The topics were very general (TV and driving, for example), as we wanted topics that wouldn't exclude participants but would be sufficiently challenging to engage them.
Each session began the same way. The online room was opened around 15 minutes before the advertised start time to allow people to log in and iron out any issues. We did have a phone helpline available, but no one made use of it.
We sat side by side on a webcam. There was a shared mic, and audio and video were tested before opening the room. An iPad was used as a monitor to see what the participants could see during the session. The PC was set up with two monitors, so that we could have the web room open on one and any screens we wanted to share on the other.
As people joined the session we chatted to them. This has two functions: it makes them feel welcome, and it serves as a way for people to test their audio. We also oriented people to the various tools in the platform. At this point you also get some chat between participants who already know each other; although we cover a region of around 140 miles by 70 miles, people have met up at face-to-face events we hold periodically.
We had a pre-loaded PowerPoint to summarise the main points and act as a backbone screen for the activities and chat.
Within the sessions the activities were repeated for each assessment tool. I will describe the generic procedure we used for each one.
Each tool was introduced, how to log in was shown, and any access code for the specific quiz was given. The access code was also pasted into the chat pane. At this point the participants were invited to open a new browser window and complete the quiz. We asked them to let us know when they had returned to the presentation room by adding a tick to their name.
Completing the quizzes on the whole worked well. Occasionally there was an issue with a participant's network not allowing access; fortunately this happened in only a couple of instances, and we coped with it by showing the tutor's perspective of results coming in through our shared screens. The bit that did fail was people indicating they had done the quiz and returned to the main room. This was partly because they got excited and forgot. However, they did start to communicate in the chat pane, so we just used that to gauge when to move on.
We then showed how we could see the results and the level of information you could get. This was followed by an opportunity for participants to ask questions about the tools and comment on them.
At the end of the session we gathered overall reactions from participants. We also sent out a feedback form to collect reactions and see if anyone had had issues with the sessions.
For the second set of quizzes, the only difference was that the participants could do the quizzes on a variety of devices. We polled which devices were used and whether there were any problems in using them.
There were a lot of very positive comments from participants. There was a sense of delight and fun as they saw their own and each other's results. They felt it was a really good way to experience the potential of the tools.
An unexpected result was the identification of connectivity issues for participants to be aware of before using the tools with a class. It also allowed people to see when it was their network, and not the tool, that was problematic. This gives them grounds to ask network managers to change settings to accommodate the tools.
The bit we need to work on is setting the ground rules and showing people where to indicate their return to the room. However, this didn't affect the overall experience.
The survey returns were also very positive, and we have had email feedback from people adopting some of the tools into their own practice.
It was an untested idea but it worked well. My colleague found the experience valuable as an introduction to presenting and running webinars. We found we worked well as a double act, and have since followed up with a face-to-face event presentation delivered the same way. Those participating found the activity worked and gave them a different view from the one they would normally get.
If you are doing H800 then you can find the links to the recordings in one of the early forums, as I posted them there. Sorry I can't get them off the iPad that I'm writing this on.