President's Corner

To Ask or Not to Ask? It Depends on the Question

1 July 2017

Barry D. Nussbaum

It is no secret that response rates for all types of surveys have been plummeting. There have been many useful articles concerning ways to increase response rates, but I tend to focus on the actual questions themselves. I know focus groups are supposed to ferret out concerns about wording that might lead respondents to interpret a question differently from what the survey-taker had in mind or, even more concerning, to dump the survey altogether, thereby adding to the nonresponse rate. But I just don’t think enough emphasis is placed on the basic questions.

In my many years of serving as chief statistician of the U.S. Environmental Protection Agency, I had plenty of opportunities to interact with staff developing questionnaires for surveys. My favorite question for them was, “Can you answer your own question?” You should have seen the looks on their faces as they answered my query with some lengthy paragraph. I interpreted that as a “no.”

But, first, one of Nussbaum’s life observations: “Smart people cannot fill out forms.” I think smart folks overthink the forms, trying to figure out “what is really meant,” rather than just answering the question. Let me give two examples.

I am always fascinated by the U.S. Customs form to be filled out as your international flight is approaching the United States. It asks you to fill out “number of family members traveling with you.” It goes to great lengths to define who a family member is, but it leaves it totally up to you to determine whether the number of family members traveling with you includes you or excludes you. While the plane is maneuvering toward a landing, I have seen many well-dressed, prominent-looking folks lean over to ask others what they think. (Yes, I know I am assuming well-dressed folks are educated and smart. One strike against me for profiling.) To me, it doesn’t matter too much. When you get in front of the Customs and Immigration officer, he or she can surely count how many people are in your party. They will not toss you out for getting that one wrong.

However, my all-time favorite involved my late wife, Debbie. She was quite well educated—a PhD in psychology—and quite smart and clever. (No comments about how smart one can possibly be to have married me!) Anyhow, it was the time of JSM 1994 in Toronto. I arrived in Toronto earlier than Debbie to take some courses. Debbie flew with the kids to Chicago, where they could be spoiled by their aunt and uncle while we were in Canada. As a dutiful husband (with a rental car), I drove to the Toronto Airport to pick Debbie up once she arrived. Waiting outside the customs area, I saw many other passengers from the Chicago flight … but no Debbie.

Finally, a very red-faced wife appeared. When she was handed the customs card on the plane, she noticed it was in French. Being smart and knowing Spanish, she assumed she could work her way through the French. She was also helped by the fact that her birthday was May 5, so it would be 5/5 under either the month-first or the day-first convention. (What is the probability of such good fortune?) So she filled out the form.

Upon arrival in Toronto, you have a choice to go to either the English-speaking or French-speaking officials. Naturally, she joined the English-speaking line. When she got to the officer, he took one look at the form and in his best French-accented English said, “Madame, the other side!”

Now, I have some common concerns about surveys I would routinely express. In fact, I would quiz the staff on the following before we launched a survey:

  • Is there a good reason the respondent would want to answer the survey?
  • Is there something in it for the respondent, or is the respondent just doing you a favor?
  • Is the survey too long?
  • Have you asked only the questions you really need answered?

These queries helped, but somehow my main concerns always returned to the questions themselves.

What do other experts say about all this? I started with that survey-taker of great prominence: SurveyMonkey. To their credit, they list tips to enhance survey respondent participation, addressing such topics as survey design, analysis considerations, and email invitations. Within the email invitation advice, there is a section about message content. But rather than focusing on the questions, they discuss the invitation message, offering tips on avoiding spam, personalizing the message, using a professional reply email address, and so on.

Another piece of advice comes from Ross Beard of Client Heartbeat. Truthfully, I know nothing about Client Heartbeat, but it does come up when you search “low response rates” on Google. One of Mr. Beard’s reasons for low response rates is, “You are not asking the right questions.” Ah, a person who thinks as I do.

However, Mr. Beard immediately says this doesn’t happen too often since the majority of (Client Heartbeat’s) customers use the crowd-sourced recommended questions related to their industries. He adds, “The reason why we recommend certain questions is because we’ve done research into what questions bring about the best responses.” Whoops, I guess I shouldn’t suggest my own questions.

Finally, I found a kindred spirit. In the May 17, 2017, edition of The Wall Street Journal, I noticed Alexandra Samuel’s article, “Nine Survey Questions for People Who Create Survey Questions.” Now that caught my eye. She began her article with, “I have a message for the survey creators of the world: I’m tired of answering your questions. In particular, I’m tired of online surveys that are too complicated, too cumbersome, and too annoying.” Her questions for survey designers include the following:

  • Why are you sending me a survey?
  • Don’t you know this already?
  • Would you be able to answer this question?

(For that one alone, I will renew my subscription to the WSJ.)

Then there is the problem of asking questions when you don’t really use the results at all. An interesting example of this is the Federal Employee Viewpoint Survey, administered by the U.S. Office of Personnel Management. The survey has a whopping 84 questions, yet most attention appears centered on the “global satisfaction” score derived from only four of these questions, each equally weighted (a sketch of the arithmetic follows the list):

  • “I recommend my organization as a good place to work.”
  • “Considering everything, how satisfied are you with your job?”
  • “Considering everything, how satisfied are you with your organization?”
  • “Considering everything, how satisfied are you with your pay?”

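For concreteness, here is a minimal sketch of that equal weighting in Python. The function name and the scores are my own illustrative assumptions, not OPM’s actual methodology:

    # Minimal sketch of an equally weighted "global satisfaction" score.
    # Assumption (illustrative only): each of the four questions has
    # already been summarized as a single numeric score.
    def global_satisfaction(recommend, job, organization, pay):
        # Equal-weighted mean of the four global questions
        return (recommend + job + organization + pay) / 4

    # Hypothetical scores for the four questions
    print(global_satisfaction(63.0, 68.0, 58.0, 55.0))  # prints 61.0
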
A non-government group, the Partnership for Public Service, publishes an annual “Best Places to Work in the Federal Government” list. They use a proprietary weighting of only three of the measures (omitting the pay satisfaction question).

To give the Office of Personnel Management proper credit, analyses are done on all 84 questions, but the headlines always come from the four global measures. My question is why they don’t split it into a long survey and a short survey, as the Census Bureau did years ago: give the four questions to all employees and the full 84 questions to just a subset of the employees (a sketch of such a split follows). I am sure this would increase the response rate for the short survey and probably not change any conclusions for the long one.
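
Here is a minimal sketch of such a split, again in Python; the 20 percent long-form fraction and the function name are illustrative assumptions on my part:

    import random

    # Sketch of the suggested split: every employee answers the 4 global
    # questions; a random subset also receives the full 84-question form.
    def assign_forms(employee_ids, long_form_fraction=0.2, seed=1):
        ids = list(employee_ids)
        random.Random(seed).shuffle(ids)
        k = int(len(ids) * long_form_fraction)
        return ids[k:], ids[:k]  # (short-form only, long-form subset)

    short_only, long_subset = assign_forms(range(1000))
    print(len(short_only), len(long_subset))  # prints 800 200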

So I have questions about questions. Some survey questions are simply hard, or even misleading, to answer.

This column will be published as many of you are preparing to attend the Joint Statistical Meetings in Baltimore. You may have questions about the ASA. Please feel free to introduce yourself to me and ask me your questions. In that context, there are no stupid questions. I look forward to meeting you.

Significantly forward,

Barry
