Data mining for profit and election results – how predictable are we?
Did Donald Trump become president because he hired the data mining firm Cambridge Analytica, which uses profiling and micro-targeting in political elections? Some say yes, many say no. What we do know is that we are subjected to extensive personalised commercial and political messaging on the basis of data, including metadata, collected and used without our awareness or consent. This can change our behaviour, at least to some extent.
As much as we would like to think we make decisions that are impossible to predict, we are creatures of habit, with our routines and patterns, submerged in the filter bubbles of the like-minded. In short, it is fairly easy to learn about our activities, preferences, habits and relationships just by getting a glimpse of our digital footprint, be it our browsing history, our social network, or our location data. This comes in very handy when marketing companies, data brokers or campaign strategists try to understand and predict our shopping preferences or our vote in the next elections. Our data is turned into profit and power.
Data mining refers to seeking useful insights from collected data. In other words, it’s a method to examine existing large data sets to generate new information. Profiling on the basis of data mining is problematic from at least two perspectives.
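In practice, "seeking useful insights" often amounts to simple pattern counting at scale. The Python sketch below is a deliberately naive illustration of this idea (all URLs, topic labels and keywords are invented for the example): even a crude keyword count over a browsing history yields an interest profile that an advertiser could act on.

```python
from collections import Counter

# Hypothetical browsing history -- every entry here is invented for illustration.
history = [
    "travel-deals.example/romantic-getaways",
    "news.example/politics/election-coverage",
    "shop.example/baby-strollers",
    "news.example/politics/polling-data",
    "travel-deals.example/city-breaks",
]

# Naive topic dictionary: a topic "matches" a visit if any keyword appears in the URL.
TOPICS = {
    "travel": ["travel", "getaway"],
    "politics": ["politics", "election", "polling"],
    "parenting": ["baby", "stroller"],
}

def profile(urls):
    """Count, per topic, how many visited URLs match at least one keyword."""
    counts = Counter()
    for url in urls:
        for topic, keywords in TOPICS.items():
            if any(k in url for k in keywords):
                counts[topic] += 1
    return counts

print(profile(history).most_common())
```

Real profiling systems are of course far more sophisticated, combining many data sources and statistical models, but the principle is the same: observed behaviour in, inferred interests out.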
The first is that our public data are used to infer additional personal and intimate information about us – information we would not willingly disclose to the world, or of whose very existence we are unaware. It is, for example, possible to infer a person's sexual orientation even if they do not disclose it publicly.
The second is that one can never get a full picture of a person from these data, so the predictions can never be fully accurate – from a business perspective, they only need to be accurate enough to be profitable. This might seem rather harmless; you might only be slightly upset at being targeted with advertisements for a romantic getaway just after breaking up with your partner. But inaccurate profiling can easily turn into a nightmare. Even when relying heavily on algorithms, data mining processes are still subject to human subjectivity, with the potential for all sorts of biases reflecting human prejudice. Decisions based on prejudicial profiles can have real-life consequences, affecting access to welfare, employment, credit, or even education.
It may not be possible to single-handedly manipulate the outcome of an election through data mining. Nonetheless, it raises serious concerns for privacy and democracy. Profiling on the basis of information about our interests, personalities, activities and affiliations can serve political and commercial marketing, which aims to change people's behaviour by exposing their vulnerable spots to manipulation. This is why it is crucial to address profiling in legislation.
The General Data Protection Regulation (GDPR), intended to strengthen data protection within the EU, was adopted in April 2016 and applies from May 2018. It gives citizens more rights to information and to object, and contains more explicit requirements for consent than existing legislation. A proposal for an e-Privacy Regulation (ePR) to complement the GDPR was published in January 2017. It seeks to add clarity and legal certainty for individuals and businesses by providing specific rules related to our freedoms in the online environment.
Data protection is about privacy, security, autonomy and, ultimately, about how our society functions.
Cambridge Analytica Explained: Data and Elections (13 April 2017)
Everything you need to know about the Data Protection Regulation
New e-Privacy rules need improvements to help build trust (9 March 2017)
(Contribution Zarja Protner, EDRi intern)