TEN TIPS TO MAKE LIFE EASIER
Companies are taking a long, hard look at costs, determining where to make investments and where to make cuts. Within some organizations, the market research function has come under scrutiny and, in some cases, has landed directly on the chopping block.

That means more and more research is being funneled to independent firms that handle all or part of a research project. If market research is moving to external resources in your organization, here are ten tips, each framed as a pitfall to avoid, for making life easier for both you and your supplier. Avoiding these pitfalls will help create a smoother research process for everyone involved.

1. Setting fuzzy objectives. Successful studies begin with clear and specific objectives. Don't stop with "to know more about our customers." That vague objective leaves too much unknown. Research pioneer Sherwood Dodge said, "No research technique is so sharp that it can answer a fuzzy problem."

If, at the beginning of the project, you can look ahead and envision how you'll use the results, you're on the right track.

2. Not telling the researchers what you plan to do with the results. Simply telling researchers what questions to ask may not meet your study's objectives. Researchers do a better job of designing a project for you if they understand what you hope to accomplish.

Resist the temptation to simply hand a researcher a list of questions because "that will make it clear what we're after." Odds are this technique will lead to useless data. Benefit from the research company's expertise; they have probably faced situations similar to yours and can plan accordingly.

3. Excluding stakeholders from the design process. If, for example, a research objective is to develop a picture of your customers, the people who would most directly benefit from this data should be involved in the survey process.

It happens: a survey is completed, and suddenly the people who should have been involved are seeing the results for the first time.

"Why wasn't this asked?" "Why didn't you find out more about that?"

4. Designing the questionnaire by committee. We just suggested involving the key players in framing objectives; even so, it's equally important to avoid creating a "Frankenstein's monster" questionnaire.

To be sure, different stakeholder groups will have different information needs. Still, it's better to have lots of input that you and your research partner can distill than to leave out something that could be important. The net effect might be an increase in design time, but a good questionnaire designer will help you dial in on what needs to be in the questionnaire, based on your objectives, and what can be eliminated.

A caution, however: keep an eye on the overall length of the survey, as longer surveys can have a detrimental impact on response.

5. Assuming respondents are as interested in your survey as you are. Respondents are doing you a huge favor by answering (think about having to pay them at their hourly rate to do it). Remember, they're not nearly as interested in supplying you with information as you are in receiving it.

As a result, keep your survey short, to the point, and easy to complete.

6. Asking "interesting" questions. The interesting question (as in "I don't know what we'll use that data for, but I'll bet it will be interesting") contributes to lengthy questionnaires and low response rates.

Common examples of "interesting" questions include standard demographic queries (age, gender, marital status, etc.) asked when there is no plan for using the data at the back end. Frivolous questions inflate the price of a survey and can depress response rates.

7. Not following sample selection instructions. This pitfall is one of the most costly, yet one of the most easily avoided. The research company should provide you with explicit instructions on how to generate the sample from your files, assuming you own the sample frame. If you are working with an externally supplied sample, the research company should work with the supplier to ensure appropriate sample creation.

Creative alternatives to proper sampling procedures can lead to unusable data. Some of these problems can be caught before fieldwork, but other problems aren't identified until it's too late.

Recently, after the results of one survey were collected, the client realized the sample had been pulled from only one specific group within the entire sample frame. This omitted a significant, important portion of the original target and effectively wasted the survey. The only solution was to resample and go through the expensive exercise of collecting the data a second time.
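For those who do the sample pull themselves, the sketch below shows what drawing a simple random sample from the entire frame, rather than from a single subgroup, might look like. It is an illustration only; the file and column names are hypothetical, and it assumes the pandas library.

import pandas as pd

# Hypothetical sample frame: one row per customer record
frame = pd.read_csv("sample_frame.csv")  # assumed columns: customer_id, segment

# Pitfall: pulling respondents from a single segment omits part of the target
# bad_sample = frame[frame["segment"] == "dealers"].sample(n=500)

# Simple random sample from the entire frame, so every record has an equal
# chance of selection; a fixed seed makes the pull reproducible and auditable
sample = frame.sample(n=500, random_state=42)
sample.to_csv("survey_sample.csv", index=False)

Whatever tool you use, the point is the same: document how the sample was generated so your research partner can verify it against the agreed-upon sample frame before fieldwork begins.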

8. Being penny-wise and pound-foolish. Attempts to reduce out-of-pocket expenditures by eliminating personalization techniques, skipping incentives altogether, or offering an incentive only if a reply is received may result in savings, but they generally cause big drops in response rates.

Maximizing response should be an important aspect of the survey process, and cutting corners can have a detrimental impact on response.

9. Rejecting the results when they don't match your preconceived notions. When research uncovers facts contrary to what clients think or want to hear, their first reaction may be to reject the research. Before tossing the report out the window, it would be prudent to make sure that there was no mistake in research design or execution. After confirming that the data is sound, it's important to realize that the research really does reflect something that you ought to factor in with what you already know.

One project director was told by a client that they were just "going to bury" their survey results even though the data showed there wasn't as much of a market for or interest in a new product as they originally believed. This decision could prove to be costly.

10. Not using the results. For whatever reason (inconclusive results, "negative" results, personnel turnover), a disheartening proportion of research reports end up sitting on the shelf, collecting dust.

Clearly, there is no surer way to squander your research investment than to consign it to oblivion. To help ensure this doesn't happen to your project, confirm why you're conducting the project in the first place and share that knowledge with your research partner; involve those who are most likely to use the results; keep the survey focused; and don't overlook the details such as sample selection.

CONCLUSION
As already suggested, establishing objectives is the most important first step to a survey. Once those are agreed upon, the design process can unfold, including sample, questionnaire, and methodology design.

All of these should be completed by your research partner, subject to your input and final approval. While arguments can be made as to which of these steps is the most important, clearly they are all important, because a weakness in any one area will affect the survey's outcome and quality.

It's important to remember that all your objectives may not be met in a single survey. For example, a long list of diverse objectives may require too many questions for a single survey. There's a careful balance between a comprehensive survey and one that overwhelms respondents. Surveys that are too large or complicated tend to get very low response rates and won't provide you with the results that meet your needs.

In any event, we suggest you strive for the following when designing your questionnaire, no matter the methodology that you ultimately require.

1. Keep the survey as concise and well-focused as possible.

2. Realize that you aren't filling out the survey; your "audience" is. Make it as interesting and visually appealing as possible.

3. Start with a question that is easy for all sample members to answer.

4. Maximize the use of closed or structured questions, and minimize the use of open-ended questions as well as "other, please specify" response options.

5. Do as much as possible to ensure that the questions you pose are answerable by the members of your survey sample. For example, asking the corporate engineer about overall corporate sales, in all locations, may be a stretch.

6. Limit the use of matrix questions as these are somewhat cumbersome for respondents and may, in fact, lead to missed columns and rows.

Once the questionnaire is finalized, take whatever steps are necessary to make sure your study stays on the timeline. Breakdowns in the schedule and fieldwork process can have a negative impact on overall response rates.

Finally, be sure there is clear communication as to what you will receive in the way of final report deliverables. There are a lot of possibilities, all the way from simply sending you a data set so that you may perform your own tabulation and analysis, to full-blown PowerPoint documents, or something similar. Post-fieldwork reporting can be very time consuming and expensive, so be sure everyone involved is on the same page as to what your specific needs are.

Jack Semler is Pres/CEO of Readex Research, Inc. The 60-year-old firm is headquartered in Stillwater, MN. Semler can be reached at 651/439-1554;
e-mail: jsemler@readexresearch.com.

