Many organisations struggle to choose the best methodology for running course evaluations and the other survey projects needed to drive quality enhancement and strategic planning. There is no easy answer or quick fix. Many institutions adopt paper or online methodologies on the recommendation of another institution or on the strength of influential vendor claims. In reality there is no one-size-fits-all method; rather, the decision should be based on the specific circumstances unique to each institution. Vendor claims can be especially dubious, as they are heavily influenced by the vendor's particular product focus and market experience. The claims to watch out for fall into five main categories.
Claim #1 – Generic hosted online survey tools can handle course evaluations.
Hosted survey services offer generic survey authoring and publishing tools with basic .csv or statistical output. A typical response rate for an online survey is between 15% and 30%, and extensive additional work is required to create aggregate reporting for curriculum managers, senior department heads and executive stakeholders. The biggest challenge with these offerings, however, is providing instructor-level feedback, as individual instructor reports need to be created one at a time.
Claim #2 – Paper-only data capture is easy to use and reduces the workload.
Again, these products are generic applications designed to be run centrally by a designated administrator. Paper data collection offerings claim to reduce data entry by up to 90%; however, these products are characterised by highly complex user interfaces, and the claims of reduced data entry do not address the many other manual constraints inherent in distributing surveys, for example to individual instructors who teach multiple courses or who lecture in teams with other instructors. These paper data capture products typically output simple .csv or ASCII data and do not attempt to address the highly complex reporting requirements of higher and further education.
Claim #3 – Hybrid paper and online survey offerings make survey administration easy.
Several software manufacturers have tried to address organisations' need for both paper and online surveys by adding on functions after the fact or through business alliance partnerships. Key phrases to watch for from these vendors are “seamless integration” and “connect agents”. The point here is that paper and online surveys are run as separate processes with separate underlying data structures, which are then combined at some point after the fact. The end result is a highly complex process for the administrative staff who must manage it centrally.
Claim #4 – Add-on VLE (Virtual Learning Environment) surveys make course evaluation easier.
One only needs to look at the marketplace to see that developing an online survey tool is not a huge undertaking. Generic online survey tools are introduced to the market seemingly daily, typically named after a tropical animal or fruit. Clearly these tools are not designed to address the challenges of course evaluation automation or module-level analysis and reporting in higher and further education. Confusion arises about the viability of an online survey tool when it is offered as an upgrade module by a VLE or LMS vendor, or by the creator of the core student information system. Surely these hosted survey tools are less generic and better designed to meet the real requirements of higher and further education? Unfortunately, the issue with the add-on approach is that the survey plug-in is not the core innovation focus of these products. The effort that goes into developing a virtual learning environment (VLE) or a student information system (SIS) is extensive, and the team or developer designated to create the add-on survey module will play only a small part in the overall product release effort. This is concerning for organisations waiting on additional functionality or experiencing technical issues, as these vendors will not have the capacity or focus to respond effectively. Lastly, there is a high likelihood that an add-on VLE/SIS survey tool was acquired from one of the many online tools popping up in the market and will, like hybrid offerings, falsely promise “seamless integration” or “connect agents.” These tools will in fact be based on a separate underlying data structure with limited generic statistical output and reporting.
Claim #5 – Outsourcing course evaluation will save time and money.
As a last resort, many organisations give up on administering core survey processes in-house and choose to outsource them. This decision is understandable for certain targeted qualitative market research initiatives, such as focus groups, where vendors have developed specific talent and specialised skills. Outsourcing quantitative course evaluation to an outside vendor, however, is both very expensive and problematic. This approach creates significant delays in gaining access to aggregate reporting, limiting the viability and effectiveness of the outcomes. In addition, much of the value of the survey process lies in the administration and the choices made during analysis; this knowledge about the data is then lost to the organisation and held by the outside vendor.
We are a full-service provider for Higher Education in the area of Survey Data Analytics, powered by the EvaSys Enterprise Survey Management software. Achievability helps improve the design and delivery of programmes through the EvaMetrics cloud-based platform, which drives high survey response rates, delivers powerful reporting for all stakeholders, and provides context and comparison through our community-based module benchmark for universities. Achievability is an Electric Paper company.
To find out more, visit us at http://www.achievability.co.uk/