Over the last seven years, I’ve had the privilege of working with the majority of universities in the UK and Ireland regarding course evaluation. I like to think we have made a positive impact and ultimately have improved the student experience, which for me makes it worth the effort.
As I look out at you, it's becoming apparent that whilst I have loads of experience in this area, I do not in fact work at a university.
I am also aware that you might have an underlying concern that, as a vendor, I will break into a sales pitch and attempt to flog my software for three easy, interest-free payments. I promise to do my best to resist the urge to make this a thinly veiled product promotion – in fact, feel free to hiss or boo if I start to do so!
So, rather than give a pitch on how to implement best practice around course evaluation, I've decided it might be fun instead to write about how to get the absolute worst out of your investment in technology and positively guarantee useless results: meaningless metrics, disengaged students, angry academics, and management teams set adrift with no idea of what is happening in their classrooms.
I’ve decided to call this the Seven Deadly Sins of Course Evaluation.
The First Deadly Sin – Ensure you invoke survey fatigue
Has anyone heard of the concept of survey fatigue? Yes? Well, then you’ll know that in order to increase survey fatigue amongst your students, whatever you do, do not try to identify what surveys are being run across your institution. I’ve met with one HE provider that had a very successful survey fatigue programme, with students being asked to complete 31 different surveys throughout the year – well done! I refer to this as survey abuse. Just keep the surveys rolling – students love them, and vendors don’t mind supplying systems for them either!
The Second Deadly Sin – Be very sneaky
When running online surveys, be very sneaky! Don’t worry about data protection, and don’t mention that there is a team of learning technologists with a live feed monitoring the time, location, gender, pass rate and attendance of each participant. Students don’t mind – after all, it’s the era of social media, and they certainly don’t care about their personal data.
The Third Deadly Sin – Make sure your academics collect the surveys personally
Along the same lines, when using paper surveys, make sure your academics collect the surveys personally. It always makes students feel better about giving honest feedback when they experience the personal touch of an academic breathing over them while they complete the survey.
The Fourth Deadly Sin – Don’t close the feedback loop with your students
When running surveys, make sure you never actually tell the students what measures or actions will be put in place as a result of their feedback. Just keep sending out surveys – hopefully, they will give up eventually.
The Fifth Deadly Sin – Academics are not too busy
Contrary to popular opinion, academics are not too busy. Yes, I know many of them have to produce research and are now being pushed to improve their teaching, so why not get them to run their own surveys as well? Just give them a bit of software and a log-in and let them go at it – they will appreciate it! Additionally, why not drive all your enhancement efforts off the comments of students? There is no better way to create happy academics than to escalate the vindictive comments of a few disaffected students.
The Penultimate Deadly Sin – Let the students fill in the surveys when they want
For online surveys, don’t use in-class university time. Just send the surveys out and hope the students complete them – after all, they are tech savvy and live on their smartphones. Of course, they will find it virtually impossible to resist chatting with their friends and checking social media, thereby putting the survey off later and later – result!
The Final Deadly Sin – Utilise negative incentives
Finally, here is one of the best techniques if you want a 100% response rate with questionable integrity. Try this: the student goes into their VLE but disregards their pending survey; the next time they log in, they get locked out unless they complete it. There is no better way to ensure rubbish results than to blackmail students into providing ‘thoughtful’ feedback.
In all seriousness, the examples I’ve mentioned are real, well-meaning attempts at measuring and enhancing the learner experience at university. You can imagine how successful they’ve been!
So let’s address these challenges by focusing on three key areas: students, academic staff and management.
Firstly, when arriving at university, students have very little experience of participating in market research, and a trusted advisory relationship has to be cultivated to convey the importance and responsibility they hold as stakeholders in their education. Efforts to gain trust and buy-in must be sustained, and the reasons for participating must be clear and mutually beneficial, which is why it’s so important that students see tangible, timely outcomes from their efforts. Seen in this light, it makes no sense to use negative incentives to encourage participation, disregard the protection of their personal data, be vague about anonymity or forget to close the loop on the results of their feedback.
Now let’s look at the academic staff. Given the great diversity of cultures and identities within HE provision, and the variety of expertise that needs to be harvested across disciplines, consultation is vital in winning the hearts and minds of academics. Equally important are choosing the right questions, sensitivity around the use and visibility of open comments, and transparency concerning the implications of course evaluations for performance monitoring. Tempting as it might be for the analysts to indulge in ‘sneakiness’, module evaluation is at its heart an instrument designed to provide academics with useful feedback to improve the delivery and design of their courses. Academics need to believe and trust in the course evaluation policy, or they may actively work against its success – and they certainly won’t be bothered to give up class time for the survey!
Lastly, management teams have a responsibility both to support the academics and to ensure the overall quality of teaching and learning for their students. It is right, therefore, that they have access to timely and meaningful metrics enabling them to identify excellence as well as areas that need additional development or resources. The policy must be clear, especially with regard to controlling the proliferation of surveys – what I refer to as ‘survey abuse’ – how the results, including student comments, will be used in performance monitoring, what the key performance requirements are for academic staff and administration, and how the results are to be communicated to students.
To conclude, here is what these two different perspectives demonstrate.
Firstly: be careful of the unintended consequences of technology. Just because you can doesn’t always mean you should.
Secondly: though it pains me as a vendor to say it, acquiring the right software is simply not enough.
However, if you disagree, let me remind you: my software is available in three easy, interest-free instalments.