Psychographic advertising: the global battle to sway minds

Photo: Adit Jani (Public domain).

It is nothing new. Much was already known or suspected: specifically, that social media data make it possible to send personalised messages, true and otherwise, to many millions of individual users (who, even if pseudonymous, are real) or to groups of them. This is what the consultancy, or rather political advertising agency, Cambridge Analytica did with Facebook data to support Donald Trump’s victory in 2016. Some of the details of its role in the Brexit referendum still need to be clarified.

Drawing on psychological and sociological research, analysis of users’ “friends”, “likes” and “walls” reveals a great deal about those who spend time on Facebook and on social media in general. It is often said that data is the new oil, and this is true in politics too, mainly but not exclusively in democracies. In this case such data were used to send personalised messages to wavering voters whose inclinations had been successfully divined.
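To make the mechanism concrete, here is a minimal, entirely hypothetical sketch of trait inference from “like” data, in the spirit of the published academic work on predicting personality from Facebook likes; the data are synthetic, and this is not Cambridge Analytica’s actual, undisclosed pipeline.

```python
# Hypothetical sketch: inferring a psychographic trait from binary "like"
# signals. All data are synthetic; this illustrates the published research
# idea, not any company's real system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 300
likes = rng.integers(0, 2, size=(n_users, n_pages))  # 1 = user liked the page

# Pretend a survey measured a trait (say, "high openness") for these users;
# we synthesise it so that a handful of pages carry genuine signal.
signal_pages = rng.choice(n_pages, size=10, replace=False)
score = likes[:, signal_pages].sum(axis=1) + rng.normal(0, 1, n_users)
trait = (score > np.median(score)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(likes, trait, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# A campaign could then rank unseen users by predicted trait probability
# and serve each segment a differently framed message.
probs = model.predict_proba(X_test)[:, 1]
print("highest-probability targets:", np.argsort(probs)[-5:])
```

The point is not the particular model (any classifier would serve) but that ordinary behavioural traces, aggregated at scale, become a targeting instrument.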

Russian interference in this and other elections has been laid bare by the chilling indictment filed by the US special counsel, Robert Mueller. Western intelligence services are increasingly worried by the vulnerability of this type of connectivity to manipulation by groups, governments and foreign agencies, even though they exploit it themselves. There may also be a Russian connection in the Cambridge Analytica case (Aleksandr Kogan, the Russian-American researcher at Cambridge University with links to the company, is under suspicion). We are witnessing a global battle to influence minds, one that is throwing up strange relationships and alliances. This manipulative use of data mining is a very broad phenomenon.

It is clear that Facebook is guilty of misconduct in this instance and will be held to account. The British, European and US legislatures have summoned its founder and CEO, Mark Zuckerberg, who has already promised to strengthen privacy on his network. But Facebook and Zuckerberg have suffered a significant loss not only of stock market value but also of reputation and trust.

Cambridge Analytica is not alone in this type of activity. Many other companies (and services) behave, or could behave, in the same way, only more discreetly. What this case shows is that almost anyone (private companies, governments, non-state organisations) with access to users’ data and a certain degree of technical sophistication can carry out what McKenzie Funk, an Open Society Foundations collaborator and member of the Deca journalism cooperative, calls ‘psychographic advertising’. Anyone can do it, given enough know-how, such as the know-how Cambridge Analytica gleaned from Christopher Wylie, the person who decided to leak this information to The Guardian/The Observer and The New York Times. And lest it be forgotten, the pioneer of these psychographic electoral techniques was none other than Barack Obama.

How can we combat these technological possibilities, with their potential to stifle democracy, or to subvert it by exploiting its openness, as George Soros and many others have warned? What we are facing is manipulation as a means of interference. The report drawn up by 39 experts for the European Commission prefers the term ‘disinformation’, because the problem goes far beyond the concept of fake news, the expression Trump is so fond of using. It calls for commitment to, and support for, quality journalism. Google News (which does not operate in Spain, because the intellectual property law would oblige it to pay a levy) is now proposing to do a better job of spreading truthful news worldwide. Good.

The experts also advocate groups of volunteers, which are already emerging, to report such manipulative activities. The companies that own the networks must take their share of responsibility for the custodianship and use of this personal data, and a code of good practice for the platforms has been demanded. The proposals have drawn criticism, however, for failing to understand and accept the business model of some of the platforms concerned, or the usage model of these networks and other services, in which users hand over their data in exchange for free connectivity. Facebook, for instance, derives income from advertising and from users’ profiles and attention, attention being another scarce and valuable resource. The critics also call for more transparency in the functioning of the algorithms used by social media and other services.

A study by MIT has found that fake news spreads wider, faster and deeper than real news. This is due not only to the role played by bots, or automated programs, but also to our own fondness for novelty. In other words, as consumers and users we are also partly responsible for what is happening. Hence the urgency of teaching people to protect themselves from such manipulation. ‘Citizens have to be equipped with the tools needed to be able to discriminate between truth and falsehood’, in the words of José María Lassalle, the Spanish Secretary of State for the Information Society and the Digital Agenda. The public can and should learn to manage the information they receive and send. Technological and media literacy, which is not merely a matter of technical capability, needs to be taught to adults, but also to young people, in families and in schools, through civic education or its equivalent in other countries.
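As a purely illustrative toy model (every parameter here is invented), a simple branching-process simulation shows how even a modest edge in per-exposure sharing probability, of the kind novelty might confer, produces cascades that are both wider and deeper, the metrics the MIT study compared:

```python
# Toy branching-process sketch (all numbers invented) of why a higher
# per-exposure share rate makes a cascade wider (more total shares) and
# deeper (more generations of resharing).
import random

random.seed(1)

def cascade(share_prob: float, fanout: int = 8, max_depth: int = 12) -> tuple[int, int]:
    """Simulate one sharing cascade; return (total shares, depth reached)."""
    size, depth, frontier = 1, 0, 1
    while frontier and depth < max_depth:
        # Each current sharer exposes `fanout` followers; each exposed
        # follower reshares independently with probability `share_prob`.
        frontier = sum(1 for _ in range(frontier * fanout) if random.random() < share_prob)
        size += frontier
        if frontier:
            depth += 1
    return size, depth

for label, p in [("true-like item", 0.10), ("novel false item", 0.14)]:
    sizes, depths = zip(*(cascade(p) for _ in range(500)))
    print(f"{label}: mean size {sum(sizes)/500:.0f}, mean depth {sum(depths)/500:.1f}")
```

The simulation is deliberately crude, but it captures the core intuition: a small advantage in how shareable each exposure is compounds generation after generation.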

The EU can also play its part with its protection measures. The new General Data Protection Regulation (GDPR) finally comes into force on 25 May. It is a major, albeit insufficient, step, and the platforms and digital services will end up having to apply it outside the EU too.
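By way of illustration only (a minimal sketch, not legal guidance), pseudonymisation is one of the safeguards the GDPR explicitly encourages (Articles 4(5) and 32): a raw identifier is replaced by a keyed hash, with the key held separately, so that downstream analytics never handle the identifier itself. The key handling and record fields below are hypothetical.

```python
# Minimal sketch of pseudonymisation under the GDPR. A keyed hash replaces
# the raw identifier; without the separately stored key, re-identification
# is computationally infeasible, and destroying the key severs the link.
import hmac
import hashlib

SECRET_KEY = b"rotate-and-store-separately"  # hypothetical key management

def pseudonymise(user_id: str) -> str:
    """Deterministic keyed hash: same input -> same token, but the raw
    identifier cannot be recovered without the key."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

record = {"user_id": "alice@example.com", "clicked_ad": True}
safe_record = {**record, "user_id": pseudonymise(record["user_id"])}
print(safe_record)
```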

That said, social media also play a very positive role in the defence of freedoms, democracy and citizen participation. The fight against disinformation is the excuse many dictators use to restrict communication and freedom of expression, and one need not go as far as China to confirm this. As Yarik Turianskyi of the South African Institute of International Affairs points out, at least 10 African countries (Burundi, Cameroon, Chad, the Democratic Republic of Congo, Ethiopia, Gabon, Gambia, Mali, Uganda and Zimbabwe) shut down social media sites and/or messaging applications during or after elections, or in the wake of protests, in 2016. Some had to backtrack: in Ghana, for example, popular pressure obliged the government to re-establish these services, and the opposition ended up winning.

Technology on its own will not enable us to fight the excesses of that self-same technology. Also needed is a sense of protection against such attacks and manipulation. This is only the beginning, but little by little societies and institutions are reacting. We are going to see changes and new regulations, even if technology often advances faster than the regulators themselves.