The Gist
- Survey results about fear can be misleading. Early polls exaggerated the fear of public speaking because of flawed question structure, not a genuine ranking.
- Four major biases affect survey accuracy. From wording confusion to social desirability, biases can distort data if not addressed up front.
- Design matters more than you think. Marketers must ask the right questions the right way—starting with how they frame response options.
Don’t Believe Everything You’ve Heard About Public Speaking
Everybody knows that people fear public speaking even more than they fear dying. Right?
Well, like many such “facts,” this idea is actually the result of a flawed survey question. In 1973, a poll taker asked more than 2,500 Americans to identify their fears from a list of options, and public speaking came out on top. Forty-one percent of respondents said they feared speaking in front of a group, compared to just 19% who said they feared dying.
(Insects and bugs, cited by 22% of respondents, also edged out dying, but no one ever talks about that.)
The problem? Respondents weren’t asked to rank these fears against each other. When a research team replicated the survey in 2010, public speaking once again topped the list. But this time, researchers asked a second group of respondents to rank their top three fears. When the fears were positioned against each other like this, death claimed the top spot.
Table of Contents
- What Marketers Can Learn From Flawed Survey Design
- Better Surveys Start With Better Bias Awareness
- Core Questions About Survey Bias
What Marketers Can Learn From Flawed Survey Design
Many people don’t realize that the way a question is asked can have a dramatic impact on answers. To ensure high-quality research data, marketers should watch out for these four common types of bias in their survey questions.
Wording Bias: Watch Your Wording and Avoid Terminology Traps
At the risk of sounding obvious, it’s important to make sure your respondents understand the key words in your questions. In my MIT class on market research and survey design, I worked with a group of central bankers, and their survey questions were full of industry-specific technical terms.
This might have been OK if they were surveying other central bankers, but their questions were aimed at politicians, who likely didn’t have any better understanding of “supplementary longer-term refinancing options” or “collateral eligibility frameworks” than I did.
Also, watch out for terms with vague definitions, such as “sustainability,” “digital transformation” and “middle class.” Rather than forcing people to guess what you mean, consider including definitions.
Finally, watch out for ambiguity in response options. If you ask people to rate a product or service on a scale of 1 to 5, you might discover that respondents have wildly different interpretations of the number 4 – a phenomenon that researchers call “interpersonal incomparability.” Instead, offer more specific response choices to make sure everyone is on the same page—perhaps asking whether respondents “agree” with a statement, whether they find a feature “useful,” or whether the service they received was “excellent.”
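For teams that generate surveys programmatically, here is a minimal Python sketch of the idea: swapping an ambiguous 1-to-5 numeric scale for fully labeled response options. The question text, labels and helper function are illustrative assumptions, not part of any particular survey platform.

```python
# Illustrative sketch: replace an ambiguous 1-5 numeric scale with
# explicitly labeled response options so every respondent reads the
# same meaning into each choice. Question text and labels are examples.

AMBIGUOUS_SCALE = [1, 2, 3, 4, 5]  # respondents may interpret "4" differently

LABELED_AGREEMENT_SCALE = [
    "Strongly disagree",
    "Disagree",
    "Neither agree nor disagree",
    "Agree",
    "Strongly agree",
]

def build_question(statement: str, options: list[str]) -> dict:
    """Bundle a statement with fully labeled response options."""
    return {"statement": statement, "options": options}

question = build_question(
    "The checkout process was easy to complete.",
    LABELED_AGREEMENT_SCALE,
)

if __name__ == "__main__":
    print(question["statement"])
    for i, label in enumerate(question["options"], start=1):
        print(f"  {i}. {label}")
```

Because every option carries its own label, two respondents who pick the fourth choice are agreeing to the same words, not to their private reading of the number 4.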
Related Article: Building Winning Customer Satisfaction (CSAT) Surveys
Acquiescence Bias: Why Survey-Takers Tend to Agree With Everything
This is basically just a fancy term for “saying yes to everything.” Survey respondents often want to be seen as agreeable, particularly when they don’t have strong feelings on a topic.
Consider a market research study on cell phone features. If you ask people whether a large screen, superior performance, long battery life and low cost are important to them, they are likely to simply say “yes” to each feature and benefit. You can get more reliable data by mixing positive and negative statements, avoiding leading questions and asking questions in a way that puts multiple priorities in conflict with one another.
For instance, you might ask if respondents would prefer a larger screen with shorter battery life, or the opposite. These hypothetical trade-offs push respondents to actually think about their answers, rather than simply saying “yes” or “agree” to every question.
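If you script your questionnaires, a rough sketch like the following (in Python, with hypothetical feature names and wording) shows one way to turn a flat feature list into forced trade-off questions so respondents cannot simply agree with everything.

```python
# Illustrative sketch: turn a flat list of features into forced trade-off
# questions so respondents cannot simply agree that everything matters.
# Feature names and wording are examples, not a real survey instrument.
from itertools import combinations

FEATURES = ["a larger screen", "longer battery life", "lower cost", "faster performance"]

def trade_off_questions(features: list[str]) -> list[str]:
    """Build one forced-choice question per pair of competing features."""
    questions = []
    for first, second in combinations(features, 2):
        questions.append(
            f"If you had to choose, would you prefer {first} "
            f"even if it meant giving up {second}?"
        )
    return questions

if __name__ == "__main__":
    for q in trade_off_questions(FEATURES):
        print(q)
```

With four features this yields six pairwise trade-offs; in practice you would likely sample a subset per respondent to keep the survey short.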
Social Desirability Bias: Virtue Signaling and the Urge to Look Good
People care about how they appear to others, even in anonymous surveys. This means they’ll sometimes give an answer that sounds virtuous, rather than one that reflects their actual thoughts and behavior. Ask someone how often they recycle, and you may get an answer based on best intentions. Ask someone how often they drink, and they may understate the amount.
Basically: If someone has 50 beers a week and then throws all of the cans into the stream behind their house, they probably aren’t going to tell you that unless you ask the question with extremely forgiving wording. One simple tactic is to preface the question with a normalizing statement such as “Many people struggle to find time to recycle” or “People often find that they drink more than they would like due to stress and other factors.”
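As a rough illustration, here is how a survey script might pair sensitive questions with normalizing prefaces. The prefaces and questions below are example wording only, not validated instrument items.

```python
# Illustrative sketch: prepend a normalizing statement to sensitive
# questions so respondents feel less pressure to give the "virtuous"
# answer. Prefaces and questions are example wording only.

SENSITIVE_QUESTIONS = {
    "How often do you recycle household waste?":
        "Many people struggle to find time to recycle.",
    "How many alcoholic drinks do you have in a typical week?":
        "People often find that they drink more than they would like "
        "due to stress and other factors.",
}

def with_normalizing_preface(question: str, preface: str) -> str:
    """Combine a normalizing preface and the sensitive question."""
    return f"{preface} {question}"

if __name__ == "__main__":
    for question, preface in SENSITIVE_QUESTIONS.items():
        print(with_normalizing_preface(question, preface))
```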
Related Article: Harnessing Customer and Employee Feedback for Better Experiences
Question Order Bias: How Sequencing and Fatigue Skew Responses
When people make their way through surveys with multiple questions, they don’t treat each question as its own blank slate. Rather, earlier questions can shape how they interpret later questions, even if that influence is subtle or unintentional.
For example, imagine a survey that begins with questions that trigger a sense of personal responsibility about the environment. After five or six questions about a warming planet and oceans full of microplastics, respondents may be much more inclined to say they care about corporate sustainability programs.
There’s also the simple matter of fatigue. Some schools rotate students’ schedules so that no one gets math at the end of every day. Similarly, it can be good practice to rotate survey questions to avoid a concentration of low-effort responses. It’s an even better practice to keep surveys to a handful of questions. The more questions you ask, the less effort respondents are going to put into their answers.
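For surveys delivered through code, rotation can be as simple as giving each respondent a reproducible shuffle of the question list. The sketch below is a minimal Python illustration with made-up questions; seeding the shuffle with a respondent ID is an assumption, chosen so the same respondent always sees the same order.

```python
# Illustrative sketch: give each respondent a different question order so
# fatigue-driven low-effort answers are not concentrated on the same
# items. Uses a seeded shuffle so each respondent's order is reproducible.
import random

QUESTIONS = [
    "How satisfied are you with our onboarding process?",
    "How likely are you to recommend us to a colleague?",
    "How useful do you find our monthly reports?",
    "How responsive is our support team?",
]

def rotated_order(questions: list[str], respondent_id: int) -> list[str]:
    """Return a respondent-specific ordering of the questions."""
    rng = random.Random(respondent_id)  # seed per respondent for reproducibility
    shuffled = questions[:]
    rng.shuffle(shuffled)
    return shuffled

if __name__ == "__main__":
    for respondent_id in (1, 2):
        print(f"Respondent {respondent_id}:")
        for q in rotated_order(QUESTIONS, respondent_id):
            print(f"  - {q}")
```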
Summary of Common Survey Biases
Each of these biases can skew data if not addressed in the design phase.
| Bias Type | Definition | How It Affects Data |
| --- | --- | --- |
| Wording Bias | Using vague, technical or ambiguous language in questions or answer choices | Respondents may misunderstand what’s being asked, leading to inconsistent or inaccurate responses |
| Acquiescence Bias | Respondents tend to agree with statements regardless of their true feelings | Can inflate support for features or opinions due to a lack of critical engagement |
| Social Desirability Bias | Respondents give answers that make them look good, rather than telling the truth | Overreports of virtuous behavior and underreports of socially frowned-upon actions |
| Question Order Bias | Earlier questions influence how respondents interpret later ones | Skews responses due to priming effects or respondent fatigue in long surveys |
Better Surveys Start With Better Bias Awareness
Great surveys don’t just ask questions—they ask the right questions in the right way. Eliminating bias requires not only thoughtful design but ongoing awareness. Even seasoned researchers benefit from revisiting these principles regularly to ensure their insights remain accurate and meaningful.
Core Questions About Survey Bias
Editor's note: The questions below address how marketers and researchers can eliminate bias from survey design and improve data quality.
How can marketers reduce bias in their surveys?
Strategies include pre-testing questions, rotating question order, offering neutral framing, limiting the number of questions and anchoring sensitive topics with normalizing statements.
Why does question wording matter so much?
Vague, technical or emotionally loaded wording can confuse respondents or lead them toward certain answers. Clarity, specificity and contextual definitions help eliminate this distortion.
Which biases most often distort survey data?
Survey data can be distorted by wording bias, acquiescence bias, social desirability bias and question order bias. Each interferes with accurate interpretation of what respondents really think or do.
Why do respondents tend to agree with everything?
Known as acquiescence bias, this tendency to agree is especially common when respondents don’t have strong feelings. Mixing positive and negative phrasing, or posing trade-offs, can prompt more thoughtful responses.