Four truths you should know about surveying

Nov 11, 2020

Shelton Stat of the Week

25% of Republicans are very or extremely concerned about climate change (Eco Pulse®, May 2020).

If you’re familiar with Shelton Group’s approach to sustainability marketing, you know we start with insights – and you know that we spend a lot of time surveying Americans to understand what consumers and business decision-makers expect from the companies they buy from when it comes to sustainability and corporate responsibility. And unless you’ve been hiding in a cave (understandable, actually), you know that the presidential race was called for Biden…and that much has been made of how far off the polling seemed to be, yet again.

Now, what Shelton Group does is surveying – meaning we ask lots of questions of different types (open-ended, multiple choice, slider scales, etc.). What we typically hear about during election season is polling, which is a small, simple set of closed-ended questions (and sometimes only one question) like, “Who are you going to vote for?” with a list of the people actually running. That said, some of the challenges we see with surveying and survey data are similar to the challenges we’re seeing with polling. This article aims to both help you make sense of “what went wrong” with the polling data AND help you learn a thing or two to make your own survey work as reliable as it can be. For a thorough walk-through of what happened with the 2016 presidential polling (it’s too early for the full autopsy of the 2020 data), here’s a good piece from Pew Research Center and another good piece from Harvard Business Review.

Here are some universal truths about surveying, some of which also apply to polling:

Truth 1: Your survey data is only as good as your answer set.

In any quantitative survey, the creator of the survey is making up an answer set for respondents to choose from. In other words, the person creating the survey is imagining all the answers you might have to their questions and putting those options down on paper for you to choose from. But have you ever taken a survey and thought, “Well, I wish they had an E, ‘none of the above,’ because none of these answers actually fits what I think?” That’s a fundamental problem with surveying.

The best practice for good quantitative surveying is to first do qualitative work – ethnographies, interviews, focus groups – to probe on issues and get a deep understanding of feelings, beliefs and considerations. That work makes it a heck of a lot easier to craft a survey with the right, comprehensive answer set.

Truth 2: Your survey data is only as good as your interpretations of it.

It’s really easy to oversimplify people’s mental models…or assume that people think like you. For instance, a political survey could ask a question like, “Should women be treated as equals to men?” Overwhelmingly, women are going to answer yes. It would be easy, then, to make the assumption that a politician who speaks disparagingly and degradingly about women would not be at all appealing to the women who answered yes to that question. In fact, though, some women may bracket these facts out, and focus their decision more on a candidate’s party affiliation and policies.

Similarly, a question like, “Should companies do their part to fight climate change?” would likely also get an overwhelming yes. It would be easy to then assume that Americans want companies to do that. But what if that meant the price of those companies’ products would go up? What if it meant they laid off employees? What if it meant none of that, but did mean that a company’s work on climate change wasn’t a primary consideration for purchase? Or, more like what we’ve seen with presidential polling, what if it was really important to a set of consumers to buy from companies doing good in the world, but a “bad actor” also had a product with other, really important benefits that the consumer couldn’t get from the other choices? That’s when consumers will make choices that seem to fly in the face of their values. As more and more companies move toward sustainability and social responsibility (which is clearly happening), it will become easier for consumers to have it all: products that deliver all the core benefits AND come from companies known to be doing good in the world. At that point, the “bad actors” won’t win their vote, so to speak.

Back to the qualitative step, though: it’s really critical to deeply understand people, their beliefs, what really matters to them, and their mental constructs BEFORE conducting a survey and interpreting its answers.

Truth 3: You should always assume a “social desirability” factor.

We see this a lot in our surveying. When we’re asking a barrage of questions about protecting people and the planet, about what companies and brands should do in that regard, and about what individuals should do in that regard, it’s easy for the respondent to see that there’s a socially desirable answer to most of the questions. Of course, I’m a good person who is taking steps to reduce my environmental footprint! Of course, I want to buy eco-friendly products! You have to come up with some “trick” questions, like one of my favorites we’ve been asking for about 10 years now:

  • Part 1: Can you think of a time when you’ve purchased or not purchased a product or brand because of the company’s environmental record?
  • Part 2 (asked of everyone who gives the socially desirable answer to part 1): Name the product or brand.
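The two-part check above can be sketched as a simple data-cleaning step: a respondent who claims an environmentally motivated purchase but can’t actually name a brand gets flagged as a likely socially desirable answer. This is an illustrative sketch only – the field names are hypothetical, not from any actual Shelton Group instrument.

```python
# Illustrative sketch: flag respondents whose claimed eco-purchase
# can't be backed up with a specific brand name (Part 2 of the
# two-part question). Field names are hypothetical.

def flag_social_desirability(responses):
    """Mark likely inflated answers in a list of response dicts.

    Each response dict has:
      'claims_eco_purchase' - bool answer to Part 1
      'brand_named'         - free-text answer to Part 2 ('' if none)
    """
    for r in responses:
        claims = r.get("claims_eco_purchase", False)
        brand = r.get("brand_named", "").strip()
        # Claimed the socially desirable behavior but couldn't name a brand.
        r["likely_inflated"] = claims and not brand
    return responses

sample = [
    {"claims_eco_purchase": True,  "brand_named": "Patagonia"},
    {"claims_eco_purchase": True,  "brand_named": ""},
    {"claims_eco_purchase": False, "brand_named": ""},
]
result = flag_social_desirability(sample)
# Only the second respondent is flagged: they claimed the behavior
# but couldn't name a brand.
```

In practice you wouldn’t throw flagged responses away; you’d report the claimed rate and the “can actually name a brand” rate side by side, since the gap between them is itself the insight.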

The other thing you have to do is not take the quantitative data as gospel. Yes, the presidential polling was off by several points, but directionally it was right – Biden won. While there are likely several reasons for the gap, the social desirability factor (that it’s socially undesirable in some circles to say you’re going to vote for Trump) is likely one of them.

Truth 4: You’ve got to cover rural areas, too.

This is a big deal. Rural America loves Trump. And a lot of polling is conducted in urban centers or via online platforms that many rural Americans may not even have access to. (Pew makes an excellent point about the very real rural/urban digital divide in this piece.) When it comes to accurately gauging opinions about climate change and about how a company’s actions for the environment will engender brand affinity and spur purchases, we have to get representative samples of rural Americans.
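When the sample does skew urban, one standard correction is post-stratification weighting: each group gets a weight equal to its share of the population divided by its share of the sample, so under-sampled rural respondents count for more. A minimal sketch, with made-up (not census) population shares:

```python
# Minimal post-stratification sketch: reweight an urban-heavy sample
# so rural respondents count in proportion to the population.
# The 70/30 population split below is illustrative, not a census figure.

from collections import Counter

def post_stratify(sample_groups, population_shares):
    """Return weight per group: population share / sample share."""
    counts = Counter(sample_groups)
    n = len(sample_groups)
    return {g: population_shares[g] / (counts[g] / n)
            for g in population_shares}

# An 80/20 urban/rural sample, against an assumed 70/30 population.
sample = ["urban"] * 80 + ["rural"] * 20
weights = post_stratify(sample, {"urban": 0.70, "rural": 0.30})
# Each rural respondent now counts 1.5x; each urban one 0.875x.
```

Weighting only goes so far, though: if a group barely appears in the sample at all, no weight can recover the opinions you never collected – which is why the fix starts at recruitment, not analysis.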

So, do the qual before the quant, apply the right mental models in your interpretations of the data, factor in social desirability, and carefully curate your sample. Easier said than done…but that is indeed the formula for getting good insights you can use, whether you’re trying to predict brand affinity or a presidential election.

News of the Week

Presidential pollsters got it wrong – what are the implications for consumer research? (Search Engine Land)

Most of the major presidential polls were off, and while Joe Biden did win, it was by margins considerably smaller than expected. For some organizations, the methods used to conduct online political polling are basically the same as those typically used for consumer market research. However, as an article in Search Engine Land reminds us, “Consumer surveys are a useful tool, but they must be reality-checked with behavioral data for a complete and more accurate picture of consumer activity. Going forward, we should receive any individual piece of data or survey with healthy skepticism and a clear understanding of the underlying methodology.” Read more…


Opinion poll ‘failure’ at Australian federal election systematically overrepresented Labor voters (The Guardian)

The United States isn’t the only country to get election results that its pollsters didn’t expect. A report just issued by Australia’s Association of Market and Social Research Organizations concluded that opinion pollsters in Australia experienced a “polling failure” after they got the results of last year’s federal election badly wrong. The report says surveys were skewed towards more politically engaged voters, resulting in over-representation of the country’s Labor Party supporters. The report further highlights the importance of using proper sampling techniques. Inquiry chair Darren Pennay says, “Pollsters should seek to better understand the biases in their samples and to develop more effective sample balancing and/or weighting strategies to improve representativeness, by looking at education or other variables.” Read more…


Engaging Middle America In Recycling Solutions

Before COVID-19, 41% of Americans wanted to be seen as someone who buys green products, and many could cite an example of a brand they’d purchased (or not purchased) because of the environmental record of the manufacturer. Now, in the middle of the pandemic, the numbers have dropped dramatically. The big question is, what does this mean for engaging Americans in their number one green activity: recycling? Another question is, what does it mean for companies’ sustainability brands?

Our latest report answers these questions by digging into current consumer attitudes, how they impact consumer behavior, and how organizations should respond to ensure recycling – and other green behaviors – keep happening.


About the Author

Suzanne Shelton

Where Suzanne sees opportunity, you can bet results will follow. Drawing on her extensive knowledge of both the advertising world and the energy and environment arena, Suzanne provides unparalleled strategic insights to our clients and to audiences around North America. Suzanne is a guest columnist in multiple publications and websites, such as GreenBiz, and she speaks at around 20 conferences a year, including Sustainable Brands, Fortune Brainstorm E and Green Build.
