Monday, 20 August 2012

Content bubbles and political content

Much has been made of the dangers of so-called content or filter bubbles. In essence, this is the subtle process by which companies like Facebook and Google tailor what you see on screen to try to match your preferences. It's subtle, but it means that what you individually see online will be different from what someone else with different preferences sees.
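
To make the mechanism concrete, here is a deliberately simplistic sketch of preference-based filtering: score each item against a profile of interest weights and only surface the top few. The item names, topics and weights are made up for illustration, and this is not how Facebook or Google actually rank content, just the general shape of the idea.

```python
# Toy "filter bubble": rank a shared pool of items against a user's
# interest weights and surface only the top-scoring ones, so two users
# with different profiles see different slices of the same content.
from dataclasses import dataclass


@dataclass
class Item:
    title: str
    topics: set[str]


def score(item: Item, interests: dict[str, float]) -> float:
    """Sum the user's interest weights over the topics the item covers."""
    return sum(interests.get(topic, 0.0) for topic in item.topics)


def personalised_feed(items: list[Item], interests: dict[str, float], k: int = 2) -> list[Item]:
    """Return only the k best-matching items; everything else is never shown."""
    return sorted(items, key=lambda it: score(it, interests), reverse=True)[:k]


pool = [
    Item("Tax policy analysis", {"politics", "economics"}),
    Item("New phone review", {"technology"}),
    Item("Climate policy debate", {"politics", "environment"}),
    Item("Football results", {"sport"}),
]

policy_wonk = {"politics": 1.0, "environment": 0.8}
gadget_fan = {"technology": 1.0, "sport": 0.5}

print([i.title for i in personalised_feed(pool, policy_wonk)])  # political items only
print([i.title for i in personalised_feed(pool, gadget_fan)])   # tech and sport only
```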

Day to day, this isn't too disastrous, for a given value of disastrous. It means that you'll be exposed to less diverse information, but a sensible person can work around a filter bubble to seek out the types of content they want.

TechDirt, however, points out one danger of this technology when it comes to politics: what happens if you are never really exposed to both sides? What happens when the filter bubble only helps to reinforce your existing preferences? They cite some interesting statistics:
  • 86% of Americans say they do not want "political advertising tailored to your interests." Somewhat smaller majorities also said they don't want ads for products and services (61%) or news (56%) tailored to their interests.
  • 85% agreed "If I found out that Facebook was sending me ads for political candidates based on my profile information that I had set to private, I would be angry."
  • More than 3/4 said they wouldn't return to a website if they knew it was sharing information about them with political advertisers.
  • 70% say they would be less likely to vote for a candidate they support if they found out that the candidate's campaign was using Facebook to send ads to their friends saying that they "like" the candidate's Facebook page.
  • And two-thirds said their likelihood of voting for a candidate would decrease if they found out the campaign was tailoring messages to them and their neighbors by purchasing information about their online activities, and then sending them different messages based on what might appeal to each.
These statistics assume that voters will be aware of the activities of data aggregators and advertisers, when all the evidence suggests that people are woefully under-informed about how their information is used and shared online. I suspect you would get similar numbers for almost any comparable scenario, but it does raise the difficult question of whether it's okay to filter on the basis of political beliefs.

The entire point of the political system is to ensure that people are able to vote on their preferences and that, if they so desire, those preferences can change over time. Anything else begins to undermine the democratic process, regardless of the benevolent intent of the people doing it. Candidates should be careful about how they use filter bubbles to their advantage, not just on the practical level of alienating voters, but also on the moral level (which is why voters are so enraged by the idea of it happening).

Of course the technology will be used this way, because it's a tool, and a useful one in many situations. But anyone who does use it should do so in the knowledge that it could backfire that badly. One to watch.