Survey design fundamentals

*This post was written by Customer Researcher and Best Buyer Persona founder Adrienne Barnes.*

In this post, we'll cover everything you need to know to create surveys you can use to guide your strategy and make confident decisions backed by customer data.

You have to know how to ask the right questions

Surveys are probably the easiest and most flexible form of feedback to implement, which means a lot of people just open Google Forms or Typeform, plug in some questions, and ship them out to their audience.

It's easy!

You'll have tons of responses in a matter of minutes, hours, or days. Then boom: every decision you make from there will be data-driven and accurate.

Well… not exactly. 

Let’s make sure you’re asking the right questions in the right way. 

Write a rough draft of your survey

Write out all the questions you think you need. This isn't the time to consider bias or format; just get the questions on paper.

This is when you want to get clear about what you need to learn from your survey. 

Grab a whiteboard, a scratch pad, or a blank Google Doc (or use this HBR brainstorming tactic) and brainstorm the lessons you need to take away from this exercise.

Allow yourself to brainstorm freely, but try to be as specific as possible with the questions you need answered. The more specific the better. 

For example, you might want to know "why is our churn so high?", but you'll have difficulty getting a clear answer to such a broad, heavily nuanced question.

But if you want to find out why so many customers churn after signing up but before uploading their first photo, you'll be able to write specific questions with specific answers.

Edit your questions

This is the part where you cut out any unnecessary questions. Don’t waste your time or your respondents’ time asking non-vital questions.

Your survey has to pack a punch. It needs to gather as much accurate data as possible with as few questions as possible, because people are less likely to complete a long survey, no matter how much they love your product or service!

Only ask the questions that you really need the answers to. If it doesn’t matter how old the respondent is, then don’t waste the question space asking for their age. 

Every respondent is doing you a favor when they take your survey, and they have just as many daily tasks and distractions as you do. Keeping it short means you only ask the really important questions. 

SurveyGizmo says, “you should aim for 10 survey questions (or fewer, if you are using multiple text and essay box question types).”

Now, I don't agree that you have to ask 10 or fewer questions, but there is such a thing as response burden: ask too many questions and your respondents will get annoyed and simply stop answering.

The response burden study examined people's willingness to answer survey questions about their health. So it's fair to assume that if someone isn't willing to answer 75 questions about their current health for their doctor, you have very little chance of getting them to answer that many about your company.

The best thing you can do is tell your respondents how many questions to expect and how long the survey will take before they open it.

If you really need to ask 15 questions, then you should ask them. 

If it's a long survey, it's a good idea to provide an incentive: a drawing for a gift card, or a coupon code on completion. An incentive lets your customers know you value their time.

What Gets Cut?

Biased questions 

Bias creeps in when the way you phrase a question leads the respondent to a particular thought or assumption. There are a few ways to introduce bias into your questions. Here are some examples I've come across.

Loaded Questions

A loaded question makes an assumption before gathering the relevant information. Here's an example of a loaded question Airtable sent me:


This question assumes that Airtable provides me value. Airtable would have gotten better, more accurate data from me had they asked:

"Do you find Airtable valuable?" (left open-ended), or "On a scale of 1-10, how valuable do you find Airtable?" followed by a question about what, specifically, I find valuable about Airtable.

Omit Double-Decker Questions

Because you want your survey to be as clear as possible, make sure you only ask one question at a time. This may seem obvious, but double-decker questions (also called double-barreled questions) can easily sneak in.

A double-decker question is one question with two or more components, such as...


DataCamp wants to know how much I know about its competitors, but the question asks about subscriptions *and* product features, for teams *and* businesses. Each *and* is an opportunity for confusion.

A better way to ask this question: 

“Which of the following options do you recognize?” 

Then…

"Of those brands, which ones offer a subscription for businesses?"

"Which ones offer a subscription for teams?"

"Which ones offer product features for businesses?"

"Which ones offer product features for teams?"

When we break down their one question, it’s actually asking five different questions. No wonder I was confused! 

Leading Questions

A leading question prompts a specific answer. It creates inaccurate information because all it does is support the assumptions or feelings of the survey creators. Political surveys provide a great example of leading questions.

Here’s one from the Trump administration that was trying to gather feedback on Americans’ ideas of “mainstream media”.

Source: The President’s Biased Survey on Media Bias

This question is leading in a couple of different ways. 

First, it assumes we all agree on the definition of mainstream media. It doesn't give an example of how the term is being used; it leaves it up to the respondent to decide what "mainstream media" means.

Second, the phrase "reported unfairly on our movement" uses the word "our," which automatically introduces bias. If this is "our movement," the respondent is assumed to play a part in it. That small word of ownership skews all the responses.

A better way to ask this question would have been: "When you watched (name the news channels), did the reporting appear fair?"

Motivated Forgetting

Motivated forgetting comes into play when you ask a respondent to remember a recent purchase or action. Unless you can ask the question as the action is happening (with a website pop-up survey at checkout, for example), you're likely to get inaccurate recollections. (There's a real example of this from Walmart later in this post.)

How to Format Your Questions for the Best Results

Now that you know how to ask the questions, it’s time to consider how you want to receive the answers. Here are the most popular ways to format your questions.

Open-ended questions

Open-ended questions allow the respondent to answer in their own words inside a text box. Most survey templates allow long-form answers (like a paragraph) and short text (like a sentence). 

Open-ended questions are great when you need to hear the customer’s voice in the answer and learn the why behind their answer. 

Reasons you may want to hear the customer’s voice:

  • You’re learning new information
  • The answers will be useful in copy and content creation
  • You’re unclear on what the answers could be

Airtable asked two open-ended questions in their customer feedback survey.

These are great examples of when to use an open-ended survey question. They're looking to hear the words their customers use to explain the product to friends and colleagues.

I love how they divided the question by industry knowledge. The insights discovered here will be helpful in all kinds of marketing and advertising copy.

But most of your questions shouldn't be open-ended. If most of your questions are open-ended, you'd be better off scheduling phone calls with a few customers instead of releasing a survey.

Knowing the difference between open-ended and closed-ended questions (and the best time to use them) is an important part of a successful survey. 

According to Hotjar, "Open-ended questions are broad and can be answered with detail, while closed-ended questions are narrow, multiple-choice questions that are usually answered with a single word or selection."

Airtable has another good example of mixing open-ended and closed-ended questions in its survey.

Question seven asks about favorite radio programs and offers a short text answer box, while question eight lets you check multiple boxes to answer.


Closed-ended questions

Closed-ended questions allow only certain kinds of answers: multiple choice, checkboxes, Likert scales, and nominal questions.

Most survey questions are closed-ended. In fact, SurveyMonkey recommends asking mostly closed-ended questions.

Ask closed-ended questions when:

  • You know the possible answers and need to know which one is true for the respondent
  • You only want answers from a specific set of choices
  • You're looking for a customer satisfaction score

Here's an example of a closed-ended question from Adobe:

Had they left this question open-ended, their responses wouldn’t have been clear or easily organized. 

When asking closed-ended or multiple-choice questions, give the respondent the opportunity to choose "other" or "does not apply." If you pigeonhole someone into an answer, you won't get accurate information, and you're likely to frustrate the respondent.

Likert Scale

Chances are, if you've ever taken a survey, you've answered a Likert scale question. A Likert scale asks you to rate your response on a numbered scale anchored at each end, for example from "not at all likely" to "extremely likely," or from "very dissatisfied" to "very satisfied."

I love to buy my groceries online, and my HEB sends me a customer satisfaction survey after each purchase. (Which means I’m getting surveys once a week in my inbox, but that’s a whole other post about survey frequency and survey fatigue!) 

Here’s a great example of a Likert Scale question: 

Likert scale questions are best kept specific to one topic, and they can provide insight into the overall attitude your customers have about that topic.

HEB asked me about my overall satisfaction with their substitution experience. They didn’t ask about my overall online shopping experience, or my overall satisfaction with my online order. They kept the question very specific.
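If you're building the form yourself rather than using a survey tool, a Likert item is simple to model as data: one topic, a numbered scale, and labeled endpoints. Here's a minimal sketch; the field names are illustrative, not any survey tool's schema.

```typescript
// A Likert-scale item modeled as data: one specific topic, a
// numbered scale, and labeled endpoints. Field names are
// illustrative, not any survey tool's actual schema.
interface LikertQuestion {
  topic: string;      // keep each item to a single topic
  prompt: string;
  scaleMin: number;
  scaleMax: number;
  minLabel: string;   // anchor for the low end of the scale
  maxLabel: string;   // anchor for the high end of the scale
}

const substitutionSatisfaction: LikertQuestion = {
  topic: "substitutions",
  prompt: "Overall, how satisfied were you with your substitutions?",
  scaleMin: 1,
  scaleMax: 10,
  minLabel: "Very dissatisfied",
  maxLabel: "Very satisfied",
};

console.log(substitutionSatisfaction.prompt);
```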

Choose the best format for each question and give your respondents variety! 

Make it look pleasing to the eye

If you've spent hours crafting the perfect questions, don't throw them onto a plain template and call it a day. Use great design to help guide respondents through the survey.

When designing, keep in mind who is taking the survey. Are you sending this to high school seniors? Mid-level managers at work? Stay-at-home moms?

There's a huge difference between the two survey examples below. Dedoose serves scientific researchers, so many of its respondents are scientists or academics. Its design is very basic, maybe even a little bland.



Contrast it with the Fort Worth Museum of Science and History, whose survey went out to members of the children’s museum. 

The gist of the two surveys was the same: both polled their audience to learn how customers use the product. But the design and tone were completely different!

Test the Survey

Before you send your survey out into the world, you'll want to give it a test run. Choose a small group of people to take the survey: internal employees, friends, or a select group of customers.

Let the testers know this is a test run, and that you’d appreciate their feedback on any confusing questions, wording or format. 

Then, when you receive their feedback, make the appropriate changes. 

So you’re just about ready to send the survey out into the wild. But, there’s still one more detail… How are you going to get the survey in front of your audience?

When we shared our first survey, we simply emailed a link to our subscribers, but there were so many other ways we could have reached our audience that we didn’t even consider!

How to get the survey in front of your audience

Exit survey pop-up

An exit survey pop-up is timed to appear when the visitor's cursor moves quickly toward the close button, or at the average time-on-page at which most people bounce from your website.
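If you're wiring this up yourself rather than using an off-the-shelf widget, exit intent is typically detected by watching for the cursor leaving through the top of the viewport. Here's a minimal sketch; the showExitSurvey function and the 30-second fallback are assumptions for illustration, not any vendor's API.

```typescript
// Minimal exit-intent trigger, assuming a browser environment.
// `showExitSurvey` is a hypothetical placeholder for whatever
// your survey tool renders.

let surveyShown = false;

function showExitSurvey(): void {
  // Render your survey modal here (tool-specific).
  console.log("Exit survey shown");
}

function triggerOnce(): void {
  if (surveyShown) return;
  surveyShown = true;
  showExitSurvey();
}

// Cursor leaving through the top of the viewport usually means the
// visitor is headed for the close button or the tab bar.
document.addEventListener("mouseout", (event: MouseEvent) => {
  if (event.relatedTarget === null && event.clientY <= 0) {
    triggerOnce();
  }
});

// Fallback: fire near your average time-to-bounce. The 30-second
// value is an assumption; pull the real number from your analytics.
setTimeout(triggerOnce, 30_000);
```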

According to Survicate, exit survey response rates can vary from 5% to almost 60%. 

That's a huge variance, but it shows there's a real opportunity for quality responses if you ask the right questions, of the right customers, at the right time.

Here's an exit survey pop-up I received while buying business cards from Vistaprint:

 

I'd love to see the response rates on these surveys. Vistaprint asked this before I had purchased the business cards, and it felt premature. I took a screenshot of the pop-up and then quickly X'd out to keep shopping.

Email

SurveyGizmo found that email response rates can “soar past 85% (about 43 responses for every 50 invitations sent) when the respondent population is motivated and the survey is well-executed.”

Response rates can also fall below “2% (about 1 response for every 50 invitations sent) when the respondent population is less-targeted, when contact information is unreliable, or where there is less incentive or little motivation to respond.”

Moral of the story: write great email copy to an engaged audience.

The Fort Worth Museum of Science and History sent me their survey via email. 

I’ve been a loyal member for eight years; I take my kids there frequently, so when they asked what matters to me about my membership, I was happy to take the survey. 


During onboarding

Onboarding is a great time to ask your customers a couple of quick questions. 

They've just started a trial or signed up for your product. They know exactly why they need you and what convinced them to purchase at this moment. Since it's fresh in their mind, why not ask, so you can capture all that precious information too?

Here's an onboarding survey question from Demio. This pop-up appeared right after I signed up for an account.



The question is simple, “Is your company currently running webinars?” 

The answers they receive will help them segment their audience with greater accuracy and ultimately give each customer a better experience. 
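To make that concrete, here's a sketch of how a single onboarding answer might map to a segment. The answer values, segment names, and the idea of tagging the account are all hypothetical, not Demio's actual implementation.

```typescript
// Map a one-question onboarding answer to an audience segment.
// Answer values and segment names are hypothetical examples.
type WebinarAnswer = "yes" | "no" | "planning-to";

function segmentFromAnswer(answer: WebinarAnswer): string {
  switch (answer) {
    case "yes":
      return "active-webinar-host"; // ready for setup/migration guidance
    case "planning-to":
      return "evaluating";          // nurture with getting-started content
    case "no":
      return "top-of-funnel";       // educate on the use case first
  }
}

// Example: tag a new account with its segment at signup.
console.log(segmentFromAnswer("yes")); // "active-webinar-host"
```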

After a completed sale: the thank-you page survey

The thank-you page survey is crucial for ecommerce businesses for the same reason an onboarding survey is critical for subscription businesses: learning more about your customers at that moment in their buyer journey helps you create a better experience for them and for future customers.

Social media

Asking customers to complete a survey on your social channels should be a given, because those are highly engaged customers who enjoy your product enough to follow you on social. And those are exactly the customers you want to hear from.

This is a great example from Candid Athletic Training asking people on Twitter to complete a survey. What I love about this ask is that it specifies the number of responses they still need.

Letting people know they need 22 more athletes helps prevent social loafing: the idea that, in a large social setting (like Twitter), someone else will do the work.

Pointing out that they're close to their goal helps motivate athletes who haven't yet taken the survey.



Back to bias for a moment: here's an example of a motivated forgetting question. Walmart wanted to know if I would recommend them to a friend, but asked me to consider the purchase experience for just one item.

Now, I've been to Walmart thousands of times. Do you think I was able to recall one specific instance and base all my answers on it, or was a lifetime of Walmart trips skewing my response? Most definitely skewed.

Asking About the Future

Similar to motivated forgetting, you invite bias when you ask someone to tell you about the future. Joel Klettke tweeted about this after hearing Els Aerts speak at Learn Inbound:

“Asking your survey respondents questions about the future is basically asking to be lied to.”

This is so true. 

Here’s an example of a company asking me about what I *might* be willing to pay in the future. 


Currently, this museum offers free parking as part of its membership. While I don't know their results, I can only imagine that very few respondents, if any, agreed to be charged more money in the future.

Question Order

Our brains are constantly looking for patterns and relationships between things. This is why question order has to be considered a potential source of bias in survey design.

If you mention a product, experience, or person in an earlier question, the respondent will likely carry it into all of the questions that follow.

Pew Research found this to be true in a 2008 political poll they conducted.

They asked:

“All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after having been asked, “Do you approve or disapprove of the way George W. Bush is handling his job as president?” 

88% said they were dissatisfied, compared with only 78% when the Bush question wasn't asked first.

Mentioning George W. Bush in the preceding question shifted responses to the following question by 10 percentage points! That's a huge effect for a single change of context.
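One common way to blunt order effects, beyond simply thinking hard about sequence, is to randomize the order of questions that don't depend on each other, so any residual order bias averages out across respondents. The post doesn't prescribe this; here's a minimal sketch of the idea.

```typescript
// Fisher–Yates shuffle: give each respondent an independently
// randomized question order so order effects average out across
// the sample. Only shuffle questions that don't depend on each other.
function shuffle<T>(items: T[]): T[] {
  const result = [...items];
  for (let i = result.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}

const questions = [
  "How satisfied are you with onboarding?",
  "How satisfied are you with support?",
  "How satisfied are you with pricing?",
];

console.log(shuffle(questions)); // a fresh order per respondent
```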

Bias is like a quiet toddler playing in the other room: you don't realize something has gone wrong until it's way too late. Prevent it at all costs, and if you notice it after the fact, be willing to either discard those answers or, at the very least, not make big business decisions based on them.
