Tuesday, 10 September 2013

More on polling

I stumbled across a rather timely piece on Kings of War entitled Polls, Proles and Plato:
There’s an interesting report out today from the House of Commons Public Administration Select Committee on engaging the public when defining the ‘national interest’. The actual report is wafer thin, but the YouGov polling is rather interesting. Naturally, a report on a poll seeking to establish the value of polling concludes that:
The polling we commissioned demonstrates the value of engaging the public in intelligent conversations about complex national strategic issues. The responses provided to the questions asked are nuanced and subject to subtle shifts depending on the information provided by the pollsters. This shows that such polling can provide a powerful insight into the values and attitudes that underlie the views held by the public on national strategic issues. This insight is a hitherto untapped resource for the Government, and one that could meaningfully be used in the formation of national strategic goals and priorities. We recommend that the Government begin to use iterative polling as a means of supporting the development of National Strategy.
I have a few concerns with the polling beyond its conclusion, chiefly that the desired outputs included (my emphasis):
(2)...we wanted to demonstrate that it was possible to engage the public in a meaningful dialogue about the way in which it perceives the UK’s national interests. We wanted to show the Government that it could use insights gleaned from such a dialogue in order to develop and improve National Strategy.
(3) We wanted our poll to prove that presenting the public with a series of reasoned choices would yield insights that would assist the Government in the formation of a coherent National Strategy that had broad-based and informed public support.
If you go in wanting to prove something with polling, it's remarkable how often the polling proves what you want. The desire to prove a point can have unintended side effects, putting pressure on researchers to shape questionnaires and develop their methodology in line with the required output.

There are some signs in the questionnaire that this may have occurred, with subtle emotional prompting around some of the questions. For example:
In your view, how important or unimportant are the following activities in serving the United Kingdom's national interests?
Being a leading voice on the United Nations Security Council, as one of the Big Five Permanent Members (78% important)
Having aircraft carriers to send our Armed Forces anywhere in the world (69% important)
Having our own nuclear weapons (54% important)
Using the phrase "Big Five" acts as a prompt in itself, implying that being part of it makes the nations involved "big". A more sensible phrasing would have removed the "Big Five" reference, which is unnecessary to the question. In the second and third items, "our" invites the respondent to collaborate with the researcher, and is also implicitly nationalistic, again driving importance. Describing the role of aircraft carriers in this way also makes them sound necessary; if need be, other means could transport "our" (again!) Armed Forces overseas, assuming sending them overseas is necessary at all.
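If wording like "Big Five" or "our" really does drive the headline figures, the standard way to check is a split-sample test: show half the respondents the loaded wording and half a neutral version, then test whether the shares answering "important" differ. The sketch below (in Python, with invented counts rather than anything from the YouGov data) shows the arithmetic of such a comparison using a two-proportion z-test.

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1 - p2, z, p_value

# Hypothetical split sample: 780 of 1,000 respondents say "important" under
# the "Big Five" framing versus 700 of 1,000 under a neutral wording.
diff, z, p = two_proportion_z_test(780, 1000, 700, 1000)
print(f"difference in 'important' share: {diff:.1%}, z = {z:.2f}, p = {p:.4g}")
```

Nothing in the published report suggests this kind of comparison was run; the point is simply that a framing effect is testable rather than a matter of assertion.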

Another question suggests that the following, among other things, are "current or possible future threats to the British way of life":
More countries, such as Iran and North Korea, developing nuclear weapons
Organized crime, including drug- and people trafficking across borders
Weak and broken states, such as Somalia, Yemen and Pakistan
I'm not sure what purpose this question is intended to serve, apart from proving that, when prompted, some subset of the population will agree that almost anything could be a future threat. Again, it is an emotive question that serves no meaningful purpose.

The other problem is that the study introduces no negative outcomes or costing of any sort. Respondents are never asked to make trade-offs of the form "you can have X, but if you do you won't get Y", so it becomes easy for them to say that everything is necessary and desirable.
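For illustration only, here is a minimal sketch of the sort of costed trade-off the survey omits: a constant-sum question in which respondents allocate a fixed budget across options, so prioritising one necessarily means less for the others. The option names and the 100-point budget are invented, not taken from the questionnaire.

```python
# Hypothetical constant-sum question: respondents split a fixed budget across
# options, so giving more to one necessarily takes points from the others.
BUDGET = 100  # notional points to allocate

OPTIONS = [
    "Aircraft carriers",
    "Nuclear deterrent",
    "UN Security Council diplomacy",
    "Overseas aid",
]

def validate_allocation(allocation):
    """Reject responses that don't make a genuine trade-off within the budget."""
    if set(allocation) != set(OPTIONS):
        raise ValueError("Every option must receive an explicit allocation.")
    if any(points < 0 for points in allocation.values()):
        raise ValueError("Allocations cannot be negative.")
    if sum(allocation.values()) != BUDGET:
        raise ValueError(f"Allocations must sum to exactly {BUDGET} points.")

# Example response: prioritising carriers forces smaller numbers elsewhere.
validate_allocation({
    "Aircraft carriers": 40,
    "Nuclear deterrent": 20,
    "UN Security Council diplomacy": 30,
    "Overseas aid": 10,
})
```

A design along these lines takes the "everything is important" answer off the table, which is exactly the choice architecture the polling lacks.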

It is also not clear how the questions varied: although the research refers to questions being asked in different ways, I can't immediately see any difference in how they were asked or in the supplementary information provided. The published data appears to cover only the final round of fieldwork, which is a shame, given that the main value of this type of research would be comparing the different framings for their effectiveness.

Overall, this research as presented doesn't demonstrate what it is intended to demonstrate, but it does demonstrate that when research is designed to give a certain output, it usually does.

I recommend Jack McDonald's piece for a wider analysis of some of the interesting figures that emerge from the research. He makes some good points about the difficult strategic choices that would follow if the contradictory information contained in the research were actually used to make decisions.
