Being a CX professional is hard enough; misinformation just makes our work more challenging.

Misinformation or confusing information from a person with a ton of followers and a ton of influence makes our work even more challenging.

Over the weekend, Seth Godin published a post on his site titled "Sneaky Surveys (and Push Polls)." I'm a big Seth Godin fan, but this post made me pause. As of today, it's already received almost 4,000 likes (stars) on his site.

In his post, he makes six points about surveys. I’ll address each one.

Open Access Online Surveys

His comment on this topic is: All open access online surveys are essentially inaccurate, because the group that takes the time to answer the survey is usually different from the general public.

I’ve seen very few truly open access online surveys. I have to assume he’s talking about polls posted on social media or on media sites (given some of his comments on the other five items). And polls, as you know, are different from surveys. If someone posts a poll on social media, I’m not sure they’re looking for feedback from the general public anyway; they are looking for (or will get) responses from people who care, negatively or positively, about the topic in question. If you’re not vested in the topic or if it’s not relevant to you, the likelihood that you’ll respond to this type of survey – or any survey – is pretty slim. The critical thing here is that, as the author of said survey, you must know/realize this.

Now, if you’ve got a site intercept or a static survey on your site (not behind an account login), these are open access because there are no restrictions to access; anyone can come to your site to respond. But no one (general public) goes to your site just to take a survey. And you’re not looking for feedback from the general public, either; you’re looking for feedback from people who have come to your site to search, research, purchase, get support, etc. And those are the only people who will respond. Does that make the survey inaccurate? Um, no.

This is a good reminder to always define your objectives, know your audience, ask questions relevant to your audience, and present the survey in a way that gets you feedback from that audience.

Survey vs. Census

Seth states: Don’t confuse a survey with a census. A survey asks a randomized but representative group some questions and then seeks to extend the answers to the entire group as a whole. A census seeks to ask everyone in the group, so that no generalization is required. He goes on for a couple more paragraphs about this one.

A survey is not always asked of a randomized group; sometimes it is asked of your entire population. Hence, a census is a type of survey (e.g., the U.S. Census), but the term also refers to the sample to which the survey goes: your entire population. For example, a point-of-sale survey goes to your entire population, unless you only serve it up on every nth receipt, to every nth customer.

He goes on to say: The huge mistake is believing that you need to survey more and more people. You don’t. And your work to reach more people actually makes your survey less accurate not more (see the first thing).

I’m not sure how the survey becomes less accurate with more and more responses. He seems to be linking that to “open access,” but very few surveys are truly open access. No well-designed market research or VoC program will just lob a survey over the fence for anyone to respond. You have a sampling plan and want specific people, e.g., prospects and customers, to respond, not the general population. Otherwise, that’s a waste of time and money. I do agree with him on this: What you need is a correctly representational group, which can be dramatically smaller than the entire population.

As customer experience professionals, we prefer to hear from more customers rather than fewer. The feedback we get is generally spread out over time, not captured at a single point in time, so we will likely have more rather than fewer responses. He makes a generalization about surveys, but there are ongoing transactional and relationship surveys, for example, for which you will get more and more feedback over time. And that does not make them inaccurate.

Survey Anonymity

He starts off this point with: You might believe the survey someone just emailed you to fill out is anonymous. It probably isn’t. He is correct. He cites information from SurveyMonkey’s site about tracking IP addresses and email addresses, i.e., they are built-in features. And then says: If you get a survey link by email or even as you browse a site, it’s a safe guess to imagine that your answers are tied in some way to your other interactions with the organization that posted the survey. Respondent beware.

Ouch. Respondent beware. Let’s scare the public. Guess what? You’re being tracked in your everyday life, in everything you do. If people want personalized experiences, they’re going to have to give up data. That doesn’t mean surveys are sneaky. What’s sneaky is Amazon or Apple recording your conversations (without you knowing) and then using that data to present you with offers for the products you were talking about. Yeah, that’s happened to me. But then – all expectations of privacy are pretty much gone at this point.

Back to surveys. I don’t know of many customer experience surveys that are anonymous. If you follow some of the rules of voice of the customer surveys, you’re going to personalize your email invitation and reminder. That’s the first sign that the survey isn’t anonymous. Not promising anonymity allows you to tie the customer response to the customer’s data so that you can close the loop with the customer and so that the analysis is far more robust and insightful.

Push Polls

Well, here’s a big problem. You can’t put surveys and push polls into the same bucket. They are very different and have very different purposes. His comment here is: Asking someone a question can change the way they feel. Done crudely, this is called a push poll (“Did you know that Bob was indicted last year?”) but even asking someone a thoughtful question about their satisfaction can increase it.

Push polls are most commonly used in political campaigns to sway the respondent/voter. Political polls. Not your surveys. As you know, this approach is actually a huge no-no in surveys.

I’ve got two thoughts top of mind here, two rules you must adhere to: don’t ask leading questions, and don’t try to sell with your surveys! Well-designed surveys do neither of these.

Open-Ended Question

His next point is a painful one: At the conclusion of the endless surveys when they ask you if you have anything else to add, don’t bother. It’s not like the CEO is busy reading your comments.

While I can’t disagree 100% because there are still companies out there who do nothing with their feedback, this is such a broad, inaccurate, blanket statement that it ought to either piss you off or motivate you to act. Don’t throw out the baby with the bath water! In many cases, the CEO is not reading the comments (yet, some do!), but there are other people in the organization – the ones who are actually going to act on them – who are reading (or using text analytics to glean insights). Insights from customer feedback are socialized throughout the organization in a variety of ways. In customer-centric organizations, the CEO is in the loop, getting a summary of findings in her dashboard, briefings, etc.

Focus-Group Survey

Oh dear. What’s a focus-group survey? He says: The single best way to figure out how people feel isn’t to ask them with some focus-group survey. It’s to watch what they do when given the choice. “This or that?” is a great way to get to the truth of our preferences.

Yes. Ethnographic research and other approaches for observing customers in their natural habitats are a great way to learn about customers and their preferences. As a matter of fact, focus groups are a useful tool for this, as well!

But observations don’t get at the why, and surveys aren’t just about preferences. Here’s the rub: unless you ask, you won’t/don’t understand. Surveys are about understanding. Yes, understanding customer preferences, but also expectations, reasons, how they felt about the experience and how it made them feel, pain points, problems to solve, what went well and what didn’t, what they liked and didn’t like about the experience, and more.


Taking a broad brushstroke to surveys and putting a negative spin on them does no one any good. And mixing topics doesn’t either. If you buy into – or aren’t sure about – any of the things Seth wrote about, please be sure to revisit the following posts about VoC programs, surveys, survey invitations, closing the loop, socializing insights, and more.

22 Tips for Proper Survey Design
14 Tips for Creating Your Best Survey Emails
10 Ways to Socialize Customer Insights
10 All-Too-Common VoC Program Mistakes – Part 1
10 More All-Too-Common VoC Program Mistakes – Part 2
Surveys Don’t Sell!
Successful Survey Series
Tips for Designing a Closed-Loop Feedback Process
5 Fails to Avoid with Your VoC Program

It’s also a good time to remind you that the survey is a touchpoint, so you must consider the respondent experience – and make it a good one!

Improving the Respondent Experience
How’s the Customer Experience of Your VOC Program?

And remember that you can listen to customers in other ways, not just via surveys.

Is Seth’s post a kick in the ass for customer experience professionals and market researchers to get their ducks in a row and do things better? Or has he simply confused a bunch of topics (that address a bunch of different audiences) that shouldn’t be tied together – or even written about at all? As always, I’d love to get your thoughts on this topic!

Annette Franz is an internationally recognized customer experience thought leader, coach, consultant, and speaker. She’s on the verge of publishing her first book about putting the “customer” in customer experience. Stay tuned for that! In the meantime, sign up for our newsletter for updates, insights, and other great content that you can use to up your CX game.

Image courtesy of Pixabay.