Do you feel confused by all the polls being thrown about these days? Don't feel bad; not many people really understand polls, and we can usually say the same for surveys or censuses. Heck, even marketing experts get them wrong a lot of the time.

How do we really know what people think? In trying to find out, a lot of confusion gets created. Some of it is intentional, some stems from unawareness of fundamental math, some from not understanding what the data is telling us, and some from simply not knowing how to collect the information. Let's try to clear things up a bit.

A census is what you get if every single person is asked what they think about something. Essentially, there are just two reasons to have a census. Either you want each person to feel personally involved or you are keeping track of each person's response(s). As an example, if you're ordering T-shirts for the basketball team, you ought to do a census of the team to find out what size each player needs in order to ensure they get the shirt that will fit properly.

A survey is merely a collection of answers from whoever cares enough to answer the survey questions. A survey can be a useful tool for brainstorming, but it shouldn't be confused with what the intended audience actually thinks or feels. The survey questions are critical to the outcomes and related conclusions. An old adage is that one's lack of rigor in setting up the survey is repaid in a lack of precision in the data. Of course, survey questions can be skewed to purposefully bias the results, but that's beyond the purpose of this overview.

A poll, on the other hand, is a smart shortcut. Essentially, it's a statistical method for replacing a census (where everyone is asked) with a very close approximation achieved by asking a minimum number of people (a statistical sample) required to get a useful answer. If properly done, a statistical poll should get an answer nearly as useful as an accurate census, but with less expense of resources or time.

It rarely makes sense to ask all of your customers how they feel. You're wasting everybody's time (theirs and yours) by adding more entries to the database without making the database any more accurate. Part of the problem plaguing polls, as well as surveys, is that the only people who typically answer are either annoyed or have nothing better to do with their time. One fundamental truism is that if a poll isn't done statistically correctly, simply making the sample size bigger won't make it any better.

Have you ever received a survey after talking to a customer service representative? Well, they're not actually taking a survey. What they're really doing is checking on their customer service people, and your answers are connected directly back to each representative so that person can be scolded (or worse) if they didn't do what was expected. In many cases the real problem is a system breakdown, not a failure of the customer service representative, but it's easier to blame someone else for a bad report.

A poll, like a survey, won't accurately predict the future. The media has almost completely missed this point, over and over (or maybe they didn't miss it). If, on the day the iPhone was announced several years ago, a well-designed poll of adults had asked, "Do you intend to ever buy a smartphone?" the yesses would likely have been only about 5 percent of the respondents.

Of course, a decade or so later, that poll turned out to be completely inaccurate. Does this mean the poll was in error? Well, not really.

An accurate poll is just a snapshot of how the respondents feel and think at that very minute. That's all it is. If outcomes end up being different at a later date, it may not be the poll's fault. It may be that the poll was biased, or it may simply be a mistake to believe the future can be predicted.

Furthermore, the question that gets asked is as important as the answer. When people are asked a question, they rarely give the straight-up truth in their answer, especially when social factors are involved. The very best polls combine not only the right math (for independence, etc.) but, more importantly, the right question structure.

The math is critical, so the sample size is important. Let's say you have a bag of Skittles candy. You know they come in various colors, and you want to figure out the percentage of each color in any given bag. As long as the candies are well mixed within the bag, it turns out that no matter how many are in the bag, whether it's a 2-pound bag or a 2,000-pound bag, all you need to do is randomly pull out 300 to 400 individual Skittles. That's plenty for a representative sample. More samples won't dramatically increase the quality of this poll.
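A quick way to see why 300 to 400 candies is enough: the standard margin-of-error formula for a proportion estimated from a simple random sample shrinks only with the square root of the sample size, and (for a well-mixed bag) hardly depends on the size of the bag at all. Here's a minimal sketch in Python; the function name is my own, but the formula itself is the standard one.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated
    from a simple random sample of size n. Using p = 0.5 gives the
    worst (largest) case."""
    return z * math.sqrt(p * (1 - p) / n)

# The error shrinks like 1/sqrt(n): quadrupling the sample
# only halves the margin of error.
for n in (100, 400, 1600):
    print(n, margin_of_error(n))
```

At n = 400 the worst-case margin of error is already about 5 percentage points; pulling 1,600 candies instead only gets you to about 2.5, which is why bigger samples stop paying off quickly.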

Sample size works this way for any group that's consistent in its makeup. The purpose of the sample is to pick a random selection from a homogeneous group. As soon as you can divide the group into buckets, segments or sub-groups, you benefit by drawing multiple samples.

Most of the well-devised polls you hear about in public do not have a sample size problem. Sample size is just something that misleads or distracts from more relevant issues.

For instance, there is power in bucketing data. What happens if you realize that there is more than one kind of Skittle, and that different kinds have different color distributions? Well, you could take this into account and run much larger sample groups, or you could get smart about sample size.

It turns out, for example, that women who ride Harley Davidson motorcycles want different things from this product than men do. It also turns out that perhaps 10 percent of the people who buy a Harley are women (I've heard this from Harley owners in the past and am not sure it's still accurate, but the exact figure isn't essential to the point).

Given that, you could poll 300 women (the easy minimum) and then 2,700 men (to get the balance right). Or, you could get smart and poll 300 women and 300 men (because every person you add to the mix can get really expensive). "But wait," you might say, "that's not right, because women are over-represented in the sample."

Well, that's true. However, after you figure out how women think, and then figure out how men think, you can weight the men's results in the final results. If, for example, you discovered that women intend to buy a new Harley every two years, but men intend to buy a new bike every six years, you could then report that the average Harley customer intends to buy a new bike every five years or so. (Certainly it's been said that it's dangerous to average averages, but in this case, it's not far wrong.)
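The weighting step above can be sketched in a few lines of Python, using the article's illustrative numbers (10 percent women, a new bike every two years for women and every six for men). The caveat about averaging averages shows up concretely: weighting the intervals directly gives one answer, while weighting the purchase *rates* (bikes per year) and inverting gives a slightly different one, and both land near the "five years or so" in the text.

```python
# Sketch: weighting an oversampled stratum back to population shares.
# All numbers are the article's hypothetical Harley figures.

def weighted_mean(values, weights):
    """Weighted arithmetic mean of values."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

women_interval, men_interval = 2.0, 6.0  # years between purchases
women_share, men_share = 0.10, 0.90      # population shares, NOT the
                                         # 50/50 shares in the sample

# Option 1: average the intervals, weighted by population share.
avg_interval = weighted_mean([women_interval, men_interval],
                             [women_share, men_share])

# Option 2: average the purchase rates (bikes/year), then invert --
# a more defensible way to combine averages of this kind.
avg_rate = weighted_mean([1 / women_interval, 1 / men_interval],
                         [women_share, men_share])

print(avg_interval)      # years, intervals averaged directly
print(1 / avg_rate)      # years, rates averaged then inverted
```

The point is that the 300/300 sample is fine as long as you re-weight each group by its real share of the population before combining, rather than by its share of the sample.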

Confusion about polls is easy. And the more you try to make decisions using polls, the more careful you need to be about the intent, structure and motivation of the poll itself. That is, if you're concerned about not biasing the results.

But finding an accurate poll is usually fairly easy. Most pollsters (in private and in public sectors) are transparent about their methods, and the magic of statistics is that the math of how the poll is structured can be checked by others. 

Too often, marketing experts conduct poorly constructed surveys, not polls, or bother everyone with a census. Worse, they present these results as an accurate prediction of the future instead of a reliable snapshot of current thinking.

It's the surveys that are so often wrong, deceptive and confusing. It's surveys that feel like they're accurate but rarely are.

And if we're going to challenge a poll, it's far wiser to challenge the questions (which may be designed to get the respondent to "lie") or the flaws in sampling (which often requires polled individuals to have a home phone, though of course an entire generation of young people doesn't have one).

It makes no sense, however, to throw out the results of polls we disagree with. The quality of the cars and trucks we drive or the effectiveness of the medicines we take are all directly related to the same statistical techniques that are used to conduct a poll. Ask the right questions to the right people and your snapshot in time is going to be helpful.

With all of the above said, does this make you feel wiser, better informed, or more frustrated?