A Research Study Is Only as Good as Its Design
Not every question or issue requires a Coleman Insights-level research study. We encourage our clients to conduct smaller-scale surveys to solicit feedback from their consumers. We’ve all seen self-serve questionnaires, hosted on platforms like Google Forms or SurveyMonkey, from companies whose mailing lists we’ve landed on. As you might expect, the quality of surveys like these varies widely. Our SVP/Research Operations David Baird tackled this issue in the Tuesdays With Coleman blog “Three Best Practices of Questionnaire Development.”
Recently, I was served a Facebook ad from a city government. The hook was “We want to hear from you! We are polling the community to learn about priorities for next year’s city budget.”
A city government posted a survey via Facebook to solicit opinions from the community on budget priorities.
I was a little concerned that the city was using a Facebook poll for budget priorities, but I respected the community outreach and was game. Sure, why not?
Question 1 asked how important it is that the city “provide each of the following services.”
There were 33 choices:

- Providing a safe reliable bus system
- Quality downtown parking
- Reducing traffic accidents and congestion
- Providing parking enforcement
- Quality sidewalks
- Maintaining city streets
- Adequate EV charging stations
- Providing bike lanes
- High quality spaces where people live, work, and relax
- Well maintained infrastructure (water, sewer, electric/gas)
- Affordable housing options
- Preventing fires through public education and safety inspections
- Maintaining a clean downtown
- Responding to community needs (fire, police, 911)
- Preparing for disasters (earthquakes, hurricanes, pandemics)
- Quality community centers
- Environment that promotes diversity and inclusion
- Quality city entertainment venues
- Opportunities to celebrate, connect, and contribute to creative and cultural ecosystem
- Preserving local history
- Providing job opportunities
- Supporting small business
- Supporting equity and minority and women-owned businesses
- Youth skill development opportunities
- Protecting natural resources
- Engaging with and reaching out to the community
- Making it easy to report issues and make service requests
- Making it easy to pay bills and fees
- Supporting actions that may reduce energy bills
- Reducing energy consumption and supporting renewable energy use throughout the community
- Giving residents a chance to express their views before making budget decisions
- Using emerging technology and data to improve city services
- Offering quality garbage and recycling collection
- Giving residents the information they need about recycling in your neighborhood
Did you really read all 33 choices? Don’t feel guilty if you didn’t. Most of the survey respondents didn’t read them either.
The next question asked about my satisfaction with each of these 33 services, followed by an open-ended verbatim question: “Anything else you’d like included in the budget?”
Oh, my goodness.
The first question didn’t ask me to rank the services by importance; it only asked how important I felt each one was. But as I read them, I couldn’t help comparing and contrasting. What does “quality” mean to the city? It could mean something entirely different to me. And are they really asking whether I think pandemic preparation and the fire department are important?
Call me crazy, but I rated disaster preparation as “Extremely Important.”
I chuckled when I got to the verbatim question, because I had to scroll and scroll and scroll to see what had already been covered.
There are several ways the survey could have accomplished what the city was after: wording the questions differently, reducing the number of answer choices, grouping the answers into categories (transportation, public safety, economy, and so on), and actually asking respondents to rank their priorities. Ranking seems to be the intention of the study, but unfortunately, the survey’s design may not provide the clarity the city seeks.
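To make that concrete, here is a minimal sketch in Python, purely illustrative, of how the 33 items might be grouped into a handful of broader categories that respondents could then rank. The category names and groupings are my own assumptions, not anything from the city’s survey:

```python
# Illustrative sketch only: one way to group the survey's many items into
# a few broad categories that respondents could rank. Category names and
# groupings are hypothetical, not taken from the city's actual survey.
CATEGORIES = {
    "Transportation": [
        "Providing a safe reliable bus system",
        "Maintaining city streets",
        "Providing bike lanes",
    ],
    "Public Safety": [
        "Responding to community needs (fire, police, 911)",
        "Preparing for disasters (earthquakes, hurricanes, pandemics)",
    ],
    "Economy & Housing": [
        "Affordable housing options",
        "Supporting small business",
    ],
    # ...remaining items grouped similarly...
}

def rank_categories(responses: list[str]) -> list[str]:
    """Validate that a respondent ranked every category exactly once."""
    if sorted(responses) != sorted(CATEGORIES):
        raise ValueError("Each category must appear exactly once in the ranking.")
    return responses

# Example: one respondent's ranking, from most to least important.
print(rank_categories(["Public Safety", "Transportation", "Economy & Housing"]))
```

Grouping first keeps the list digestible; asking for a ranking then forces the trade-offs the city presumably wants to understand.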
When designing a questionnaire, always consider it from the respondent’s vantage point. Is every question clear? Is every answer choice necessary? Will the design actually answer the main questions that inspired the survey in the first place?
If not, go back to the drawing board. The results are only as good as the design.