
Commentary March 05, 2006 at 02:00 PM

It is amusing to hear political pundits ponder why certain elections do not pan out as the polls had projected.

Most thinking people just assume that survey and poll results are not exact. After all, it is well known that some people don't answer poll questions truthfully, that moods and thinking shift quickly, that poll respondents may not fully represent the voting public, and that not all poll reports are delivered accurately or completely.

But pundits who stick too close to poll results seem to ignore those qualifiers. They float around in their own orbit of poll-based predictions. Then, if proven wrong, down to earth they fall. It's amusing, because they were so obviously off track.

Things are not so amusing, though, when insurance people are similarly undone by a product's failure to match projections based on surveys and polls.

We take this topic up now, not only because the nation is being deluged with election polls, but also because the insurance and financial services industry is being deluged with polls and surveys of its own, covering financial products, services and planning.

These industry surveys are coming from just about everywhere: national research firms, financial research shops, company research units, consulting firms, retail and trade publishers, and more. Several arrived here just this week, with more on the way.

Truth be told, I read the results with great interest. Perhaps you do too. It's interesting to learn what other people are thinking, right?

I not only read them, I save them, compare them, assess them, talk with experts about them and, yes, write about them. This has led me to reach several conclusions about using poll and survey results in the development and marketing of insurance and financial products.

A few of these conclusions follow.

Some polls zero in on what customers want, but others focus on what a company wants to build. You need to know which kind of survey you're looking at, and the orientation of the company that commissioned it, before weighing the strength of its findings. The former could be useful; the latter could be disastrous if applied to your own product efforts.

Poll results often conflict. One may say that boomers are not saving enough for retirement, while another says boomers are saving more than you think. Which is it? If the poll samples differed sharply, it could be that both are right. The devil is in the details, so you had better know those details if you're planning to use the results to support a product initiative.

Poll result analyses are of uneven quality. Some state the most positive information first and downplay or even omit the negative, while others give the pros and cons fair play. Sometimes you can't tell which is which unless you see the original report, which is hard to do if you don't work for the company that commissioned it or don't want to pay for it. Fortunately, some research firms do provide the original to those who ask for it (such as news outlets). Even so, if you only see a summary from someone who read the original, you still don't know the whole story. If you're making product decisions based on the findings, get the original.

Some poll reports do not show the research parameters. Who was sampled? How? When? Where? What was the margin of error? And so on. Reputable research firms routinely spell this out, but poll result summaries don't always include it. Also, some reports bury this information in the footnotes or conclusion, making you hunt for it, and making you wonder what else you should be hunting for.

Poll analyses often include commentary from the sponsoring organization but not from independent outsiders. The commentary can be interesting, but may lack balance. Look for outside opinion too.

Not everyone who sees the results understands them the same way. When the results are swept into product discussions, this can make for healthy debate or destructive disputes. To avoid the latter, use reports that define not only the parameters but also the mission, terminology and other clarifying details.

Poll results get misquoted, distorted and incorrectly sourced. This often happens after the data gets some buzz. When someone cites "a survey I read" to support a product point, it's best to ask for the source of the survey so you can check it yourself.

Despite all this, polls and surveys are mission-critical to informing decision makers in product development, rollout and sales. When well executed and reported, they provide telling insights into consumer needs, competitor strength, target markets and more. They spark thoughts and ideas. However, not all that glitters in poll and survey reports is gold, so … handle with care.
