Safe surveys for evidence-based decision making


In our increasing drive for data, we want to know more from people. The move is toward evidence-based decision-making, and, increasingly, we have tools that make it easy. Surveys are one powerful tool. You likely have a feedback capability built right into your LMS; if not, you can use a third-party app and provide a link to a web page to start the process.

Anyone, these days, can generate a survey, which means we’re at risk of two things: too many surveys, and badly written ones. And that’s a problem.

We can’t necessarily stop the prevalence of surveys (though we’ll address that too), but we can do something about their quality. There are principles from marketing, from cognitive science, and from assessment design from which we can synthesize good practices. Here we review general survey principles, dig into question design, and discuss precautions and things to avoid.

Note that while surveys are typically used externally to the organization, there are roles for them within L&D as well. For one, a survey may be a useful way to evaluate the impact of a course. While you likely want to evaluate participants at the end of a specific learning experience, you should also be looking to see whether it’s leading to a change in the workplace. Self-evaluation can be tricky, but you can ask appropriate questions, as Will Thalheimer documents in Performance-Focused Smile Sheets, particularly if you delay them a while and focus on behaviors, not opinions. And, as you move broader in the ecosystem, you can start evaluating social and cultural elements as well.

General principles

Overall, we know we have to be targeted in the design. Time is precious, and people’s minds are (increasingly) distracted. We can draw on what we know about how we think, and on lessons from previous attempts, to craft an overall approach. We start with what and why and, ultimately, dive into question specifics.

To start with, you should know what you want to know. That is, be very clear on what you need to find out. While you can start general, you should end up getting specific about exactly what’s important. What are the core questions that will enable the decisions hinging on this survey?

You should also be clear about who can provide that knowledge. Who do you want? Who has the insight you need? And, of course, who don’t you want? That includes making sure your language is pitched appropriately throughout so you capture the right data.

Finally, you need to know how many responses it will take to yield valid data. If you get an unrepresentative sample, you could present results that undermine your credibility and waste the effort you’ve invested. Will an incentive help, and do you have one to offer? Providing a reward, or even the chance of one, will likely increase uptake.
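If it helps to put a number on “how many,” the standard sample-size formula for estimating a proportion gives a rough floor. Here’s a minimal sketch in Python; the 95% confidence level and 5% margin of error are illustrative assumptions, not recommendations specific to this piece:

```python
import math

def required_sample_size(margin_of_error=0.05, z=1.96, p=0.5):
    """Rule-of-thumb sample size for estimating a proportion.

    Implements n = z^2 * p * (1 - p) / e^2, with p = 0.5 as the
    most conservative (largest-n) assumption, and z = 1.96 for
    95% confidence. For small populations you'd apply a finite
    population correction; this sketch skips that.
    """
    return math.ceil((z ** 2) * p * (1 - p) / margin_of_error ** 2)

print(required_sample_size())                      # 385 responses
print(required_sample_size(margin_of_error=0.03))  # 1068 responses
```

The point isn’t the exact number; it’s that “enough responses” is a question you can answer before you send anything out.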

How will you reach this audience? You need to communicate the value of this interaction to them. Determine the channels you can use, and what message will make them engage. Timing can be important: catching people in an appropriate emotional state, e.g. right after an event, can be fortuitous if you can leverage that focus.

Understand the context in which the message will be received. What’s competing for their attention? You need to plan to penetrate the distractions and get sufficient numbers to make your survey valid.

Keep it short. Don’t load it up with extra questions just because you can. Survey accuracy, and even completion, drops with length. Ask what you need, and no more.

In question order, start easy. Ask at least a couple of simple questions at the beginning, and save your demographic questions for the end. People are more likely to give up that data when they’ve already committed some initial effort.

Address any concerns about questions explicitly. If the survey is asking for complicated or more personal data, let people know why it’s important. It may even be necessary to explain how it will be used.

One of the worst offenses in organizations is ‘drive-by’ surveys. Pretty much everyone in the organization can ask questions, and they might! Have a survey policy in place across the organization that guards against survey fatigue.

The questions

Diligence in survey design carries over to the questions. How do you ask good questions? Good questions ensure good data.

For one, be specific. Know what you want to ask, and ask just that. Don’t get too fancy; keep it short. The less verbiage, and the clearer the intent, the better the responses you’ll get. It all comes back to your intent: the answers you need. Keep your focus.

Ensure that it’s easy to respond. Offer concise response options that are easy to discriminate between. Scales are better than a bare yes/no: you can discern trends better with ‘strongly’, ‘somewhat’, ‘neutral’, and the like than with just like/dislike. And clicks are better than typing.
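To make that concrete, here’s a hypothetical question definition with a balanced five-point scale, sketched in Python; the structure and field names are illustrative, not any particular survey tool’s schema:

```python
# A hypothetical scaled question: one click to answer, no typing,
# and a balanced set of options around a neutral midpoint.
question = {
    "text": "The course changed how I approach this task at work.",
    "type": "single_choice",
    "options": [
        "Strongly disagree",
        "Somewhat disagree",
        "Neutral",
        "Somewhat agree",
        "Strongly agree",
    ],
}
```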

Avoid grids of questions, too. While some respondent audiences can deal with the complexity, grids are ripe for errors. Ideally, separate out each question: one question per ‘page’, or just a few that share a semantic relationship.

And make sure your response options are balanced and unbiased. You don’t get good data asking about performance with “Great”, “Good”, “Okay”, and “Other” as options; that’s an agenda, not a survey. A balanced set offers as many negative options as positive ones, e.g. “Poor”, “Fair”, “Okay”, “Good”, “Great”.

Precautions

There are some ‘don’ts’ as well as ‘dos’. Bad practices can undermine the validity of your survey, and you want to increase the likelihood of getting answers you can trust.

You shouldn’t ask for personal information. Unless you’re a bank or conducting a commercial transaction, you don’t ask about credit cards. Similarly, if you’re asking about height, weight, income, savings, or any other data people might feel embarrassed about, they’re likely to lie. If such data is critical, ensure it’s anonymized or that providing it is to their direct benefit, and help them understand why they should share it.

Similarly, don’t ask for information people aren’t qualified to give. You can ask for opinions, or about their direct experience, but don’t ask the general public how to remedy plumbing or electrical problems, for instance. Keep it within their area of competence.

Don’t release it without review. Have some colleagues go through the content, ideally including someone with assessment-design expertise. And trial it as well, both to see how the experience goes and to test the implementation. A good strategy is a phased release: trial it yourself or with your team first, then with someone down the hall who’s agreeable but not part of the project, and then with a focus group. As you go broader in audience, go narrower in focus, getting closer and closer to your target audience. Depending on the importance and the breadth of the audience (if you’re going public, for instance, instead of staying within your org), you may want a number of trials. The more it reflects on you, the more you want to make sure it’s valid.

And, for goodness’ sake, protect their data! Make sure access is limited to ‘need to know’, and keep it that way. And keep it in a secure system. No one likes to find out their data’s been stolen; it’s a headache for them and bad press for you.

Good surveys are professional and ethical. Done right, they’re a valuable tool for decision-making. There are many survey tools, and most make it easy to do the right things and support the data analysis afterward. But the initial design is critical to getting data that’s worth evaluating. Practice safe surveys; at least, that’s what my survey of the area says!