Customer Surveys – Do They Really Work?
I am behind on reading my blogs and just caught up today with a great post from last week on grokdotcom that demonstrates how hard it is to get action-oriented information from customer surveys. Follow the link above to read the post—I’ll wait right here.
Scary, isn’t it, when you think about how many questions you’ve asked customers and how little information you might be getting back?
Surveys are seductive. They are easy to construct, easy to implement, they provide statistical data, and everyone understands exactly how they work. They are persuasive.
But the simple question that Bryan Eisenberg asked shows how flawed a survey can be.
Down deep, we all know that surveys are flawed, but we’re accustomed to them. We are familiar with them, and we overlook the flaws because getting the information seems so important. And when you consider that there is no perfect way to get information, your brain hurts.
When you think about how you need to watch customers use your product or your Web site, that you need to do interviews, mine your phone logs and support e-mail queue, track opinions in the blogosphere, and watch every mouse click on your Web site, geez, it’s overwhelming.
We’d all rather retreat to our surveys. They are simple, we know how to do them, and everyone is so used to them that they don’t question the results—we just point our business in whatever direction the survey says.
The problem is that they are often wrong, just like any single method of collecting customer feedback.
What about your business? Do you listen to what customers say out on the Web? Do you watch what they do? Or are your product development and marketing campaigns driven only by customer survey results?