In recent years, technological advancements have massively increased our capacity to store, process and compute on data. The internet of things, smartphones and other sensors all contribute to the gathering of billions of data points. An estimated 2.5 quintillion bytes of data were generated daily in 2020. To put that into perspective, that is the equivalent of the data stored on 100 million Blu-ray discs. Much of this data is considered sensitive personal data, such as travel records, medical histories and political opinions.
History shows that this sensitive personal data can be exploited with huge implications for society. In March 2018, the Cambridge Analytica scandal showed that elections can be swayed through the use of machine learning algorithms and sensitive personal data. Data privacy could be the key to preventing such scandals from happening in the future.
In May 2018, the European Parliament's General Data Protection Regulation (GDPR) came into effect as a means to provide greater protection and rights to the general public. The GDPR requires organizations to legally justify any data processing, limits the amount of data gathered to a minimum and grants individuals the right to object to profiling. Furthermore, organizations can be fined up to 4% of their annual global turnover for violating the GDPR.
All in all, the GDPR is a step in the right direction, but the new regulations do leave something to be desired. For instance, they have resulted in 'consent' boxes in nearly all services and applications, which exclude any users who decline. These boxes are therefore more akin to blackmail than to a genuine request for consent. Furthermore, many Artificial Intelligence applications do not require explicit personal data to target specific individuals. This gives reason to believe that data privacy protections should extend beyond sensitive personal data.
Because of these shortcomings, this article argues that the current general data protection regulations offer insufficient protection to the general public, and that new solutions are essential. It aims to highlight further deficiencies of the current regulations and to propose possible solutions.
‘Consent’ boxes or blackmail?
Many applications and websites these days simply do not allow access without at least partly agreeing to their privacy policies. A recent study found that 86% of consent forms on the most viewed websites in Europe offer only a confirmation button, leaving users no real choice. Furthermore, most of these consent forms (57%) actively attempt to nudge users towards consenting, for example by highlighting only the 'I accept' button.
To further illustrate the manipulative nature of these consent forms, the same research concluded that only 0.1% of site visitors would actively enable all consent options if they were disabled by default. This result suggests that the vast majority of European internet users want to keep their personal data private.
The GDPR protects sensitive personal data, but does not protect the general public from targeted advertising
In 2017, The Economist published an article claiming that oil was no longer the world's most valuable resource; data was the new king. Companies such as Google, Facebook and Amazon use data to target users with specific advertisements and allow third parties to do the same. Many users enjoy targeted advertising; after all, what is the point of receiving an advertisement for clothes designed for the opposite sex? However, the Cambridge Analytica fiasco has shown that considerable risks are at stake if these big tech companies fail to protect user data. Moreover, targeted advertising has the potential to influence the decisions made by the general public, even when people are aware that they are being targeted. As a result, the autonomy of the general public might decrease while big tech companies gain power.
The introduction of the GDPR has made targeted advertising a more arduous task due to the stricter data protection rules. Yet even though organizations can no longer freely use personal data, targeted advertisements remain about 2.7 times more effective than traditional advertising and are still widely used today. Organizations have devised creative ways to build personality profiles without explicitly using personal data: such profiles can be constructed from anything ranging from Facebook likes to musical preferences. Gideon Nave, a marketing professor, said the following on the matter:
'The effects of one "like" are not big. But with 300 "likes" you can predict one's personality as well as his or her spouse.'
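To make this concrete, the sketch below shows how a personality trait could, in principle, be predicted from nothing but "like" data. Everything here is hypothetical: the data is synthetic, the pages and the trait are invented, and a plain logistic regression stands in for whatever proprietary models advertisers actually use. The point is only that a simple linear model, given enough binary likes, recovers a trait far better than chance.

```python
# Hypothetical illustration: predicting a binary personality trait from
# Facebook-style "likes" using a hand-rolled logistic regression.
# The data is synthetic; no real model or dataset is implied.
import math
import random

random.seed(0)

PAGES = 8  # number of pages a user can "like"

def make_user():
    """Generate a synthetic user: liking the first half of the pages
    correlates with the trait, liking the second half does not."""
    likes = [random.randint(0, 1) for _ in range(PAGES)]
    score = sum(likes[: PAGES // 2]) - sum(likes[PAGES // 2 :])
    trait = 1 if score > 0 else 0
    return likes, trait

train = [make_user() for _ in range(400)]

# Train a plain logistic regression with stochastic gradient descent.
w = [0.0] * PAGES
b = 0.0
lr = 0.1
for _ in range(200):
    for likes, trait in train:
        z = b + sum(wi * xi for wi, xi in zip(w, likes))
        p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of the trait
        err = p - trait
        b -= lr * err
        w = [wi - lr * err * xi for wi, xi in zip(w, likes)]

def predict(likes):
    z = b + sum(wi * xi for wi, xi in zip(w, likes))
    return 1 if z > 0 else 0

test = [make_user() for _ in range(100)]
accuracy = sum(predict(l) == t for l, t in test) / len(test)
print(f"accuracy: {accuracy:.2f}")  # well above the 50% chance level
```

The unsettling part is not the model, which is decades old, but the input: none of the individual likes is "sensitive personal data" under the GDPR, yet in aggregate they reveal a trait the user never disclosed.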
Artificial Intelligence is a rapidly evolving field thanks to increases in computational power and data storage. With even more technological advances on the way, personality profiles could become still more sophisticated. Such profiles have already been shown to be capable of swaying votes in the US presidential election and the Brexit referendum. It is therefore prudent to discuss new legislation now, before further technological advances enable a repeat of these events.
Social media users are not the consumers, but the product.
Bruce Schneier, a computer security and privacy specialist, was one of the first to popularize the phrase 'You are the product' when talking about social media. The phrase has become commonplace in any criticism of social media. It implies that platforms such as Facebook, Instagram and Google mainly profit from online advertising that uses our data. While it may be a stretch to claim that the users themselves are the product, these big tech companies at the very least treat their users as such.
Alphabet, Google's parent company, reported an annual revenue of 183 billion US dollars in 2020; over 80% of this came from Google ads.
The relationship between organizations such as Facebook, Instagram and Google and their users can have some nasty side effects. The recent film 'The Social Dilemma' paints an, admittedly slightly dystopian, picture of these side effects. Machine learning algorithms feed users content based on personality profiles and previous behavior to keep them glued to their screens for hours. Users come to rely on validation from others in the form of likes, which creates serious self-esteem issues and a range of other negative emotions. A study from 2017 estimated that approximately 210 million people worldwide suffer from internet addiction. The problem will only be exacerbated without new legislation, says former Google design ethicist Tristan Harris.
Shouldn't we take our own responsibility?
One could also ask: does internet and social media usage not simply come with consequences? And if you are cautious about what information is collected about you, and what is done with it, shouldn't you take responsibility for limiting the amount of data collected yourself, for example by changing settings or avoiding the internet as much as possible?
But is this actually possible?
As mentioned before, a website or application already asks whether a user agrees with its data practices. But on many websites, functionality is limited when a user does not agree. This means that someone who refuses to be tracked by a website often cannot actually use it. Moreover, almost every website tracks the data of its visitors. There may be ways to avoid data tracking while still being able to use a website, but they require users to put extra effort into their internet usage.
Is it ethical to leave this responsibility to people?
It is easy to say that people should take their own responsibility if they want to preserve their data privacy rights. However, many people may not be aware of the importance of data privacy, or of the consequences of neglecting it. They may also be unable to understand what consequences their actions, such as accepting cookies, have for their data privacy. Children, the elderly, or people with little education, for example, will most likely not understand how data tracking works; even many highly educated people who know about this issue do not fully understand it. To simply say that these people should take their own responsibility, when they are unable to understand how data tracking works or what they could do against it, is really just taking advantage of vulnerable people.
It has now become clear that we need new regulations for general data protection. There are several ways in which new regulations can offer better protection to people. First, new regulations should cover data privacy with regard to personalized advertisements on the internet. Second, privacy policies should be clarified, so that users can actually understand what they are consenting to.
Third, not tracking data should become the default. Right now, people get a cookie pop-up asking for permission, and users are often manipulated into selecting the 'accept all' option: denying all cookies, or customizing the cookie setup, requires extra clicks and decisions. Because accepting all tracking is the easiest option, most people choose it. They are not really aware of the implications of their decision; they simply want to get rid of the pop-up and thus opt for the quickest way to do so. Experiments show that if the default option is to deny all data tracking, most people keep this option, and only a small number actively decide to allow tracking.
It has become clear that new General Data Protection Regulations are necessary. If a website only functions with cookies or other tracking, then users have no fair opportunity to refuse them. Currently, it is also permitted to use non-personal data for targeted advertisements, even though not every user agrees with organizations tracking them for commercial purposes. The responsibility for better data privacy could be given to organizations, but this is unlikely to yield significant results: organizations track their users for their own (commercial) purposes, and companies such as Facebook and Instagram have no interest in better data privacy themselves, as it would earn them less money. Nor would it be fair to leave all responsibility to users, as preserving their privacy while still being able to use all websites would cost too much effort, or might not even be possible. For vulnerable people, such as the elderly or children, leaving them to their own devices is even less reasonable. New data privacy regulations should therefore be introduced to prevent targeted advertising, clarify privacy policies, and make opting out of cookies the standard default.