How to explore dissent and controversy around the upcoming elections, or why and how we should build a glass house for Facebook. Facebook Tracking Exposed is among the projects that provide tools to test your own filter bubble, to compare how others have perceived the same topic differently, and ultimately to pop the bubble.

After alarm bells rang when the Cambridge Analytica scandal broke in global media, EU Commissioner for Justice Věra Jourová spoke of a “clear warning sign” that the upcoming elections in Europe might suffer the same “disinformation and manipulation by private and foreign interests” that had affected the US elections.

A large part of the success of the current upsurge of right-wing populists is thought to be a result of their social media campaigns. The underlying algorithms are in fact able to determine the success or failure of a specific campaign, dividing candidates between those who know how to make this process more effective and those who don’t. And beneath the surface, we can expect the mechanics of Facebook’s algorithm to continue to increase the polarisation of opinions.

Eli Pariser, the author of The Filter Bubble, gives a clear example of how this polarisation works: two of his friends googled ‘BP’. “One of them got a set of links about investment opportunities in British Petroleum. The other one got information about the oil spill”. We must acknowledge that algorithms significantly influence our perception of the world and, consequently, the entire decision-making process of individuals. Often, they prevent individuals from seeing the complete spectrum not just of opinions, but also of facts.

Citizens should have an informed and democratic choice on algorithms

Even the priorities and topics of an electoral campaign are affected: they used to be set by the competing parties or by whatever attracted the most attention in public opinion. Today, however, algorithms themselves are defining the main topics, showing us only what they consider worthy of attention and hiding what they define as less important.

As shown by analysis from the Web Foundation, “Facebook appears to define users’ information diets based on criteria that are not visible to users”. Users of social networks – including researchers or journalists – may think that they are “disintermediated” when accessing information, but normally they don’t have the means to understand just how their content, or their information diet, is being produced.


While Facebook has promised more transparency, it has done so only upon clear requests from institutions, never proactively. The best outcome of Facebook’s latest transparency efforts is an API endpoint (i.e. a machine-readable data feed), which researchers can use to acquire data on political advertising. But it is still the company that decides what counts as politically relevant and what doesn’t; moreover, researchers must sign a non-disclosure agreement, and the arrangement lacks any third-party accountability.
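To make the idea of a machine-readable data feed concrete, here is a minimal sketch of how a researcher might assemble a query for such a political-ads endpoint. The endpoint URL, field names, and token handling below are illustrative assumptions modelled on Facebook’s Ad Library API, not a specification given in this article:

```python
# Sketch: building a request for a hypothetical political-ads API endpoint.
# The URL, parameter names, and fields are assumptions for illustration,
# not Facebook's actual contract.
from urllib.parse import urlencode

API_URL = "https://graph.facebook.com/v3.2/ads_archive"  # hypothetical

def build_ads_query(search_terms, countries, access_token, limit=25):
    """Assemble the query URL for a machine-readable ad-archive request."""
    params = {
        "search_terms": search_terms,
        "ad_reached_countries": ",".join(countries),
        "fields": "ad_creative_body,page_name,ad_delivery_start_time",
        "limit": limit,
        "access_token": access_token,
    }
    return API_URL + "?" + urlencode(params)

url = build_ads_query("election", ["DE", "FR"], "YOUR_TOKEN")
```

Even in this simplified form, the asymmetry the article describes is visible: the platform, not the researcher, defines which ads the `search_terms` filter can surface in the first place.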

One of the core principles of the EU’s General Data Protection Regulation (GDPR) is control over one’s own data. But to get out of the bubble, the best way is to acquire total control over one’s own algorithm, over one’s information stream. To do that, Facebook Tracking Exposed built a simple browser extension, which runs in Firefox and Chrome and allows users to collect the public posts and the advertising Facebook picks for them. Once your data have been copied, they can be used to test your own filter bubble or to compare how the same topic has been perceived differently (as in the British Petroleum example). The team is now working on an RSS service, which will let citizens access Facebook’s political content in a horizontal way.

It’s time to make use of our data in order to better understand possible risks for democracy

The goal of this project is not to release a report ex post, but to be vocal before the European election. For this reason, we provide tools and support to institutions and analysts who want to observe the algorithm’s influence during the campaign.

Too often regulation, just like the owl of Minerva, the ancient symbol of wisdom and perspicacity, spreads its wings only with the falling of dusk. Therefore, new projects are trying to empower citizens with techno-political tools, so that they can act before the elections, and not post factum. Citizens should have an informed and democratic choice on algorithms. Facebook Tracking Exposed not only empowers users by giving them the possibility to check their own informational diet, but it also draws attention to a dynamic that gives the public a completely passive role and removes the possibility for people to shape and spread the topics they care about. It’s time to make use of our data in order to better understand possible risks for democracy.