It’s no secret that surveys are a great way to generate publicity for organizations. Once the province of highly-trained market research professionals who knew how to create a survey that yielded reliable results, surveys used to be so costly that only large marketing organizations could afford them.
Not anymore. Now anyone can create a survey using free or very low-cost online tools and have it “in the field” – posted online for anyone to answer – in minutes. Unfortunately, these tools can seem so simple that people who aren’t trained in writing survey questions can inadvertently fill a survey with leading questions that skew the results, or with other errors and omissions that reduce the results’ value to news organizations.
It’s not hard to create a survey designed for public relations and social media use that avoids common mistakes. Here’s the process for creating a survey-based news release that gets media attention:
- Set goals.
- Design and conduct the survey.
- Analyze the results and create the materials that will be used to report them to your audience.
- Write and distribute the news release about your results.
Why Setting Goals Matters
If you fail to set the goals for your survey before you begin, you’re more likely to make work for yourself. Start by deciding what you want to achieve with your survey.
Many marketers will answer the question of what they want to achieve with a survey-based news release campaign by saying, “Getting my company’s product into the media.” That’s only a partial answer, however.
When you are creating a survey and the campaign materials that go along with reporting the results, what you want to achieve matters a lot. The kinds of surveys used for brand awareness are different from those used for building thought leadership or securing coverage around a special event or holiday. Here are some examples of successful surveys:
- Thought leadership: The giant PR agency Edelman publishes an annual “Trust Barometer” that analyzes how consumers feel about brands, professions, and industries. Cartus Corporation, a relocation consulting firm, publishes annual surveys of executive attitudes about relocation, with a list of the biggest challenges they face when asked to move for a job or promotion.
- Brand awareness: Bankrate, a consumer financial services company, publishes the monthly Financial Security Index. Macy’s conducts “Dress for Success” surveys that show photos of job candidates’ wardrobe choices to college students and hiring managers, and compare the differences and similarities.
- Holidays & special events: The American Automobile Association (AAA) publishes holiday auto travel projections and gas price surveys. VISA publishes spending reports around milestones like high school proms, Christmas, Mother’s Day, and other holidays. 1800Accountant surveys small business owners about tax issues and its own customer service staff about common tax questions posed by small business owners.
You don’t have to be a giant company to create and publish a survey that gets noticed. For example, here at MyPRGenie, we frequently use surveys to identify trends, preferences, and opinions on PR, social media, and digital marketing, and then use them in our PR and marketing efforts.
Once you’ve decided what you hope to achieve with your survey, the second step in setting appropriate goals is to define who will be surveyed, and who will want the data gathered from survey participants. One of the challenges with online surveys is that participants self-select, so the sample is not as random as in a well-done telephone survey. So it’s important to clearly identify who you want to participate in your survey, and discard responses that fall outside your pre-determined group.
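Screening out responses from outside the target group is usually a simple filter on a screening question. A minimal Python sketch, assuming hypothetical response records with an `age_group` screening field (all names and values here are illustrative, not from any particular survey tool):

```python
# Hypothetical survey responses; "age_group" is an assumed screening field.
responses = [
    {"id": 1, "age_group": "18-34", "answer": "yes"},
    {"id": 2, "age_group": "55+",   "answer": "no"},
    {"id": 3, "age_group": "18-34", "answer": "no"},
]

# Keep only respondents who fall inside the pre-determined target group.
target_groups = {"18-34", "35-54"}
kept = [r for r in responses if r["age_group"] in target_groups]
print(f"{len(kept)} of {len(responses)} responses kept")
```

The key design point is deciding the target group *before* fielding the survey, so the filter is applied consistently rather than chosen after seeing the results.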
How to Write an Effective Survey
Writing a survey questionnaire that is intended to secure media coverage is different from creating a market research survey. The goal of a PR survey is to create newsworthy results, and that often means posing questions with an attitude you won’t find in a more traditional survey.
For instance, in a survey on online privacy from a software company, the questions included standard questions about online privacy as well as these three questions:
- Would you walk around naked in the locker room at the gym?
- Would you pose for sexy photos with your significant other and then post them on Facebook?
- Would your mom approve of all of the photos you’ve posted of yourself on Facebook?
The headline of the resulting press release was “From Walking Around Naked to Updating Facebook Privacy Settings, Younger Generation’s Views on Privacy Are Changing.” The PR team at the software company clearly kept their end goal in mind when they wrote the survey. However, they followed best practices and avoided leading questions or questions that encouraged participants to answer a certain way because they thought it was “expected.”
No matter who writes your survey, or what survey tool you use to develop it, it’s a good idea to have a professional market researcher review it to make sure you aren’t building in credibility problems before you post it for responses. A local university or college is often a good place to find a market research expert who will review your survey at a reasonable cost, and many market research firms offer this service, too.
If you create your questionnaire correctly, it can provide material for several news releases. The survey news releases that get the best results report the findings on only a few questions. Some releases report on just one question from a longer survey, but about half cover 3-5 questions in the body of the press release.
The most common way to get more content out of a question is to discuss differences in the responses of different groups. For instance, the online privacy survey compared responses by age group. You could also break down your results by gender, income, geographic location, education or any other subgroup within your survey results, as long as you have enough results from each group to be statistically significant.
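A quick way to sanity-check whether a subgroup is large enough to report on is the standard margin-of-error formula for a proportion. This is a rough sketch, not a substitute for a proper statistical review; the 95% z-value of 1.96 and the sample sizes used below are illustrative assumptions:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a reported proportion p
    from a simple random sample of size n (p=0.5 is the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative sizes: a full survey vs. one demographic subgroup.
full = margin_of_error(1000)    # about +/- 3.1 percentage points
subgroup = margin_of_error(50)  # about +/- 13.9 percentage points
print(f"Full sample: +/-{full:.1%}, subgroup: +/-{subgroup:.1%}")
```

The point the numbers make: a subgroup of 50 responses carries a margin of error more than four times that of the full sample, so differences between small subgroups may not be meaningful.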
The more responses, the greater your credibility with reporters. Based on the results for surveys promoted through MyPRGenie, we find that those with fewer than 1,000 responses get significantly less coverage than larger surveys. This is especially true if the behavior you’re reporting on is likely to be specific to a small group of respondents.
Reporters remain skeptical of some online surveys. Here are some kinds of online surveys that generally get worse results because reporters don’t lend them much credence.
- Surveys conducted only on the sponsor’s website. At best, such a survey is representative of the company’s customers – and there is no way to know whether those customers are typical. Employees and vendors are also considered more likely to skew the results by participating in surveys on the company’s website. So few reporters will cover them.
- Surveys conducted by the organization itself, rather than those done by multiple organizations or a third-party research firm. While journalists don’t automatically assume that a paid market-research firm’s results are completely unbiased, they’re more likely to give them the benefit of the doubt. Surveys from trade associations (such as LIMRA, the life insurance industry association) and non-profits (such as AAA) are viewed more positively than those from corporations.
- Surveys that report unweighted results. Most journalists aren’t statistics experts, but they look for an explanation of the methods used to create the results, and “hard numbers” that they can rely on before reporting.
That last point – reporting unweighted results – is important if you are trying to build long-term thought leadership. If you’re merely creating a fun promotion for a specific event, it may be less important. Here’s how weighted data works: it assigns a value to the response based on how representative the person answering the question is.
Frequently, the people who answer a poll are not representative of the geographic region you are polling. Weighting adjusts answers to account for over- and under-represented groups. For example, suppose the population in an area is 55% female, but 70% of the survey responses come from women. When reporting the results, a greater weight may be assigned to men’s answers, adjusting the data to better represent the region.