A response to “Are we missing too many alumni with web surveys? (Part 2)”

I was interested to read today a guest blog on the Cool Data site by Peter B. Wylie and John Sammis. It is called “Are we missing too many alumni with web surveys?” and is part 2 in a series (part 1 was published in 2012).

In summary, they looked at a North American university’s recent survey data (and presumably the institution’s full constituent data) and compared respondents, non-respondents and email-uncontactable alumni with regard to age, event attendance and giving. They were looking to identify and demonstrate demographic or behavioural differences in the survey respondents as compared with those who were unable to be invited to complete the survey (no email address) or those who chose not to.

Their key finding (broadly speaking!) is that those who were not invited to complete the online survey (due to not having an email address) were also less likely to have attended an event or to have given significant lifetime amounts. They also note that older alums are more likely to not have an email address on file.

This sounds about right to me for a few reasons:

  • If you have attended an event, you are more likely to have up-to-date contact details. This is either because you may have been prompted to update your details at event registration, or because you are more likely to have received an invitation in the first place thanks to already having up-to-date contact details – i.e. those who attend events tend to be those who are successfully invited. ‘Engagement’ could also be a factor here (those who attend events are more engaged and therefore more likely to return the survey), though this is perhaps more difficult to prove. Please see my post on measuring engagement for more on this.
  • As with event attendance, if you have given to your alma mater you are more likely to have an email address on file. You will most likely have added your full contact information to the donation form (online or paper) when you gave, and you are more likely to be receiving solicitations and ongoing stewardship/engagement information from your university if you have up-to-date contact details. Those who receive solicitations are also more likely to be donors, because the university can’t ask you to give if they can’t contact you. Engagement may be a factor, as above.
  • Older people are less regular users of the internet.

So what’s the problem with that?

Peter and John are highlighting that you could be missing out on the views of particular types of people if you conducted a web survey alone – and your survey results therefore may not be representative of the entire alumni population.

I agree with their advice that any survey analysis should take into account how representative the sample is (or is likely to be) of the demographics of the potential population. This is crucial context to have when interpreting the results – I find this context invaluable as a researcher, and I know clients appreciate this transparency.

However, despite the limitations, online surveys have big benefits and should not be discounted as a viable research tool. They tend to be easy and convenient for the respondent to complete and are lower cost and less resource intensive to run than postal, telephone, focus group or face-to-face research methodologies.

Depending on the target audience of the market research project and the questions you’re trying to answer, you should always consider if adding other methodologies to the mix would bring value – you don’t have to rely on web surveys alone.

Market research limitations in general

Firstly, my philosophy is this: the results of any market research study, no matter how robust, will not give you absolute certainty of what your entire alumni population think or feel about your institution, nor what they will do when prompted.

Regardless of the methodology used, there are always a few limitations to market research (I’m sure there are many more, but these are the first that come to mind):

  • You only get to hear from those who do respond to your research invitation and that’s what you have to work with. People who don’t routinely respond to surveys may be of a certain personality type with very different views to those who do participate – but short of forcing them to participate, this is a limitation that you have to accept and account for.
  • People have a tendency to say what they think you want to hear – this may be particularly true for donors given their investment in your work.
  • Completing a survey asking alumni what they might do in a certain situation is very different to them actually being in that situation and making the decision in real life – e.g. people often think they are concerned about ‘quality’ when they are actually more likely to make a decision based on price when they’re in the shop. Or to use a more relevant example, significantly more people will say they’re interested in volunteering than those who actually take you up on the offer post-survey.

Market research is something you should undertake with your eyes open: treat it as one source of information for decision-making (albeit a very powerful one), ideally not the only one.

There are several things you can do to improve the accuracy of your results and reduce bias: you have levers to pull in methodology (qualitative and quantitative, perhaps), questionnaire design and sampling. In order for you (or the agency you brief) to determine how the project should be designed, you will need to be clear on these:

  • What your objectives are
  • What the key questions are that you want answered
  • How you want to use the answers to these questions to make decisions
  • How accurate you want the results to be – usually this relates to the above: the weightier the decision and the more reliant you are on the survey (i.e. there are no other ways to answer the question or other information sources to consider), the more accurate you’ll want the results to be.

‘Questions to Ask Survey Vendors’ – answered

The authors also propose some questions to ask potential survey agencies. I’d like to have a go at answering them here:

“What about the alums who are automatically excluded from web surveys because they can’t be reached by email? What do you do about them?”

Firstly, the survey may not have to reach everybody. For example, if you’re evaluating the experience/quality of your programmes, you won’t need to deliver your survey to lost alumni – you’ll just want to hear from those who have participated recently and who are therefore likely to have more complete contact details.

What I mean is, it really depends on the objectives of your survey and the questions you are trying to answer. Rather than start with ‘I want to do an online alumni survey’ I would always recommend starting with ‘I want to know what X audience thinks about Y so I can Z’. Then an agency can recommend the best methods to engage the audience whose opinions will be most useful for your business decision – and a web survey may well be part of that.

If you want to be more inclusive of older age demographics, say, and the agency is considering a phone or postal survey, they’ll also need to know if phone and postal data is reliably held. Bear in mind that printed surveys, telephone interviews and focus groups all have drawbacks in terms of cost and survey length.

A missing email address is also not necessarily a reason to despair. There are lots of creative things you can do to cross-promote an online survey through print and social media to get wider reach. An agency can discuss these options with you to make sure you get the most useful insights for your investment.

“Generally speaking, what percentage of the entire alumni database can’t you reach because these alums don’t have an email address?”

I don’t know this for sure, but you may be able to access benchmarking reports – such as those published by CASE – to find out the answer to this question. The average is around 70% contactable in the UK, but I believe that figure combines email with postal details etc.

In terms of the percentage of your own database that is uncontactable via email, you would ideally provide this stat to the agency as part of the briefing stage. The agency will need to know this in order to determine the best methodology.

“What are the typical response rates you get from the alums you can reach?”

There are many things that affect response rates and there are also several techniques you can use to raise response rates. For an online survey, it usually comes down to:

  • the quality of the email invitation
  • the communications/marketing plan
  • the online survey software
  • the usability of the online survey
  • the incentive offered
  • and, crucially, the survey’s length.

It’s very difficult to estimate a response rate at the outset of the project; however, the agency may be able to prepare a rough estimate or target based on your objectives and what information you can offer them about the audience, such as the alumni giving rate. The target survey response is useful for working out the likely overall accuracy of the results.
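To illustrate how a response target translates into accuracy, here is a minimal sketch (not from the original post) using the standard margin-of-error formula for a proportion, with an optional finite population correction. The alumni numbers are entirely hypothetical:

```python
import math

def margin_of_error(n_responses, population=None, p=0.5, z=1.96):
    """Margin of error for a proportion estimated from n_responses.

    p=0.5 is the conservative (worst-case) assumption; z=1.96 gives ~95%
    confidence. If the size of the contactable alumni population is known,
    a finite population correction tightens the estimate slightly.
    """
    moe = z * math.sqrt(p * (1 - p) / n_responses)
    if population:
        # finite population correction
        moe *= math.sqrt((population - n_responses) / (population - 1))
    return moe

# Hypothetical: 400 responses from 20,000 contactable alumni
print(round(margin_of_error(400, population=20000) * 100, 1))  # 4.9 (percentage points)
```

So a survey of that hypothetical size could only pin down a "% of alumni who agree" figure to within roughly ±5 points – which may be fine for a broad-brush decision, but not for a fine-grained one.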

I would be very cautious about taking the response rates of other institutions and assuming they will apply to you (in my experience they can vary from about 5% to 20%). A good agency will consider your circumstances carefully and provide a custom estimate for you.

“Generally speaking, what are the differences you find between responding and non-responding alums? For example, how do they differ on age, giving amounts, and event attendance?”

Beyond the differences mentioned and explained above (and in Peter and John’s blog), in conducting the analysis and producing the results for you, the agency should compare the survey sample’s demographics and behaviours with the proportions of the potential population. This can help with the analysis, as a statistician can weight results for demographics that are under- or over-represented to better reflect a result for the full population. Any weightings used should always be disclosed – transparency is key.

The future of online surveys

I’d like to reiterate that market research is by no means a silver bullet and does not replace the need for other information sources to be considered (where possible) when making big strategic decisions. If delivered well, a survey can challenge preconceptions, generate new ideas and eliminate some of the guess-work where trial and error isn’t cost effective. It isn’t, however, a complete substitute for your experience and knowledge of your alumni.

What I’m anticipating in terms of industry trends is a move away from the run-of-the-mill ‘census’ type alumni surveys that try to capture information about anything and everything. These surveys tend to produce results that tell you simultaneously everything and nothing (information overload), and they are huge team undertakings without a proportional pay-off.

I’m predicting we’ll see universities incorporating market research more seamlessly into their programmes, with targeted studies dotted through the calendar to answer timely business questions. When deployed with careful consideration and with alumni and supporter relationships at the heart, market research will pay huge dividends in terms of donor retention and alumni satisfaction to those who use it wisely.

If you have any questions or thoughts about this blog please comment below or drop me a line.
