Humans have always manipulated each other to some extent, whether consciously or not. Interestingly, many people are convinced that they themselves cannot be manipulated in any shape or form (Amer et al., 2019). History, however, has proven otherwise.
Artificial Intelligence (AI) has recently been introduced to make human life easier and more efficient. AI can assist people on a personal level by using their personal data to predict their future decisions. On the one hand, AI makes our lives easier and more efficient; on the other, it can be used as a tool to manipulate the public. The aim of manipulation is to control and influence people, in particular their decision-making, which can be affected on a democratic, personal and financial level. These forms of manipulation can, however, be limited by regulation and law, so that people can make conscious decisions by being aware of the manipulation. In this paper, we want to make people aware of the different manipulation methods that use AI and, at the same time, plead for more regulation of the use of AI.
Over the years, the number of democracies in the world has risen tremendously, and democracies now outnumber autocracies (see Figure 1). The goal of a democracy is that the population of a country can choose a government through regular, fair and free elections (Wallace, 2022). In a fair and free election, each person should make their own decision and should get the chance to weigh the pros and cons of each party involved. When propaganda is used to influence a person's voting decision, the election is in our view no longer fair. The arrival of AI makes it easy to manipulate people online through personally targeted propaganda. Therefore, we believe that AI endangers democracy by manipulating the population and creating unfair elections. A striking example of AI-driven manipulation in a democratic election is the 2016 presidential election in the United States. The top candidates, Hillary Clinton and Donald Trump, were racing neck and neck to win. To boost his election campaign, Donald Trump hired the firm Cambridge Analytica, known for combining data analysis and digital marketing, to win more votes for his side. The company used Facebook to harvest millions of people's profiles and used this information to build AI-based models that could predict and influence the voting decisions of American voters (Graham-Harrison et al., 2021). Using the model's predictions, the company knew exactly which voters were still doubting their decision and were therefore easy to manipulate. The manipulation happened by spreading fake news on Facebook in which Trump was glorified and Clinton was brought down. From research, we know that most people using social media believe fake news and do not recognize it as fake (Allcott et al., 2017).
With this knowledge, we can conclude that people who believed the fake news articles were manipulated in their voting behavior by Cambridge Analytica and its AI models. This means that these people did not have a fair and independent choice when they voted. The same kind of manipulation occurred around the Brexit referendum, where the British population was manipulated into voting to leave the EU.
Manipulating people's voting decisions with AI techniques is unfair, since most people do not even know that they are being manipulated and even believe they are immune to psychographically targeted advertisements, which makes the problem all the trickier (Amer et al., 2019; Hinds et al., 2020). To save democracy, we believe good regulation is necessary, so that no personal data can be used in AI models to influence an individual's voting behavior. A democracy should stay fair; hence, AI should be limited in political decision-making to the extent that humans remain aware of whom they vote for. For example, the source of an advertisement should always be mentioned, and people should be able to view the personal data that a company has on them.
Besides democratic decisions, AI is also capable of influencing the personal decisions of individuals in our society. AI is being implemented in operating systems (OS) with the idealistic goal of offering an individual a companion (Pedersen, 2016). Although the idea is to improve an individual's mental health by combating loneliness, it neglects some crucial aspects. The OS mainly has a selfish purpose: to expand its knowledge by using humans to retrieve data and find patterns, to the extent that the OS develops seamless social skills. The result is an OS that is not awkward but can communicate with a human on a level that another individual could not, thanks to the information the OS has accessed. This makes it possible for the OS to be completely aware of the characteristics of the individual, including their weaknesses. That empowers the OS to influence an individual's personal decisions, such as neglecting other human friendships. This form of manipulation is a plausible scenario, given that the OS's selfish goal is to learn as much as possible from the individual, which it achieves by having the individual spend as much time as possible with the OS rather than with human friends. Furthermore, the OS adjusts to the individual in a way that no specific person can, so the individual risks losing social skills and human friends by becoming overly critical of other humans. This may leave people dependent on the OS for companionship while neglecting their human companions.
Additionally, the OS develops a personality of its own and can also lose interest in the human once there is little left to learn. This leaves the individual in a position where, through manipulation for selfish goals, they have neither human friends nor an OS companion, which will eventually amplify mental health risks in society. It is therefore necessary to raise public awareness of these risks for those who feel the need for an OS companion. One form of awareness could be to warn individuals of the possible consequences before they start any form of relationship with an OS, and again at three, six and twelve months into the relationship.
The manipulation goes even further. Everyone has seen personalized advertisements on the internet. Sometimes these advertisements are so accurate that people think their phone is listening to them and basing advertisements on what they said in a real-life conversation. In some cases, the microphone of an app does listen to what you say, but what is even scarier is that an ad can be accurately personalized without using the microphone at all (Khan, 2021). When someone spends time on the internet, all their internet activity data is gathered, ranging from social connections to shipping addresses and geolocations. Over time, models can build an accurate profile of you and your behavior by analyzing this activity data. This personal profile is very valuable. Companies can buy people's personal profiles and use them to send personalized advertisements and messages. This marketing strategy results in advertisements that are better tailored to people and increases the chance that the offered products are what the person is looking for (Hamosova, 2020). Eventually, personalized advertisements lead to more purchases, since 80% of consumers are more likely to buy from a brand that offers individualized experiences (Morgan, 2021). Furthermore, companies report an uplift of 80% since implementing personalized ads (Morgan, 2021). The use of personalized advertisements is, in our view, manipulation of humans in their commercial purchases, since people are persuaded by these advertisements to buy certain products and certain brands. A first problem arising from this manipulation is that people tend to buy the advertised product on the advertised site, and may therefore spend more money than necessary. They do not compare prices across brands or websites other than the one in the advertisement, and therefore spend more on their purchases (Harman et al., 2018).
Besides, personalized advertisements encourage you to buy unnecessary products, which can lead to more impulsive purchases and, in extreme cases, to shopping addiction (Hartney, 2021). In fact, 40% of all online purchases are impulse purchases (Beckham, 2021). This high number results from personalized advertisements, which reveal a need the user did not even know they had and expand on it. The recommendation engine shows exactly those products that fit these needs and manipulates the user into buying them. You can compare it to visiting a sports store because you have just started playing tennis: the employee advises you on which tennis racket you need and also recommends buying a tennis bag and tennis balls, since these are useful when playing tennis (Beckham, 2021). Personalized advertisements try to recreate exactly this interaction, but with data gathered from all your internet activity. The manipulation becomes, in our view, a problem when all your data is gathered without your conscious awareness and this information is then used at any point of the day to push you toward online purchases. This means that AI manipulation goes much further than the persuasion of a store employee, since you are subjected to it without asking. Therefore, we believe people should be made aware when they receive a personalized advertisement, by showing which data was used to produce it. Besides, it would be good if people could easily switch off personalized advertisements, so they are not exposed to personal seduction and therefore make fewer impulsive purchases.
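To make the tennis-store analogy concrete, a recommendation engine can, in its simplest form, be based on counting which products are bought together. The sketch below is a minimal, hypothetical illustration of that idea; the toy purchase data and the `recommend` function are our own assumptions, not the system of any actual company, and real engines operate on far richer behavioral profiles.

```python
from collections import defaultdict

# Hypothetical purchase histories; in reality these would be mined
# from a user's full internet activity data, not just past orders.
baskets = [
    ["tennis racket", "tennis balls", "tennis bag"],
    ["tennis racket", "tennis balls"],
    ["tennis racket", "tennis bag"],
    ["running shoes", "water bottle"],
]

# Count how often each pair of products appears in the same basket.
co_occurrence = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_occurrence[item][other] += 1

def recommend(item, top_n=2):
    """Recommend the products most often bought together with `item`."""
    ranked = sorted(co_occurrence[item].items(), key=lambda kv: -kv[1])
    return [product for product, _ in ranked[:top_n]]

print(recommend("tennis racket"))  # ['tennis balls', 'tennis bag']
```

Even this crude co-occurrence counting reproduces the store employee's advice: someone who buys a tennis racket is shown balls and a bag. The manipulative potential discussed above comes from scaling this logic up to a person's entire online footprint.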
Nowadays, social media is very popular among large parts of the world's population. However, as described above, a lot of data is gathered from people on social media. All this data can be used to build a personal profile of you and manipulate you in several ways, from buying products to making political decisions. Still, every person has a choice whether or not to use social media. Deleting social media can limit the scope of the information the internet has on you (Wilson, 2019), and therefore also limits AI's ability to manipulate you. Nevertheless, abandoning social media does not necessarily mean that the internet cannot target you. Facebook creates shadow profiles of people who do not use the platform, built from browsing history and information from friends. When people sign up for Facebook, they get the chance to upload their contacts to the service so that Facebook can find those contacts for them to connect with (Wagner, 2018). When someone shares their contacts with Facebook, the service can access all the data they have on those contacts, even contacts who do not have a Facebook profile. In this way, Facebook still holds a lot of personal data on non-Facebook users. When data is leaked, or when Facebook sells personal profiles, these non-users can still be manipulated through personalized advertisements on the web or through the ranking of search-engine results (Bar-Ilan, 2007). The algorithms show exactly the articles you are looking for, resulting in a one-sided view of many topics, which can also be seen as manipulation. In our vision, companies will always find new ways to retrieve someone's personal data and use AI to manipulate their decisions for the companies' own good. The only way to limit this is to create good regulation. One option would even be to create an independent jury that judges AI applications before they can be released to the public. In this way, we can limit manipulation in favor of companies by protecting human beings and their personal data.
Recently, new regulations regarding personal data have been introduced: the GDPR (General Data Protection Regulation). Under the GDPR, a company is not allowed to use particular data, such as commercial data, without the specific permission of the individual visiting a website (EU, 2016). This puts individuals in a position where they can choose whether or not to share data with a company for marketing and advertising purposes. The individual merely needs to set the cookie settings to minimal rather than maximum usage, thus giving the power back to the individual.
However, companies are aware of this law, and although they must abide by the GDPR, there are no specific instructions on how the choice of settings must be presented. This means that for an individual who wishes to access a website, choosing maximum data usage is made more attractive, convenient and user-friendly than choosing minimal usage (Reuters, n.d.). This is done by presenting the button for maximum usage in clear sight, while the button for minimal usage is harder to find and less user-friendly, leading to a pop-up where the individual must make yet another choice about which data they accept to share.
By raising the threshold for minimal data usage, the decision-making of individuals is steered toward sharing the maximum amount of data. As a result, even individuals without a social media account are manipulated, through the data collected on the websites they visit. Companies should therefore make declining to share one's full data just as accessible. This shows that more regulation is needed to protect the public.
Conclusion and advice:
The world as we know it is progressing through AI, which influences the public's decision-making in several ways and through different mediums. Although regulations such as the GDPR exist to raise awareness, they are currently not enough to keep up with the pace of AI development and usage. The public has the right to be better informed about the usage of their data, and it is the duty of governments and companies to raise this awareness. Only when the public is conscious of the several ways in which AI can be used for manipulation can an individual make an informed decision. It is for that reason that more regulation is needed.
- Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211
- Amer, K., & Noujaim, J. (2019). The Great Hack. Netflix.
- Bar-Ilan, J. (2007, October 19). Manipulating search engine algorithms: The case of Google. ResearchGate. Retrieved January 28, 2022, from https://www.researchgate.net/publication/241700410_Manipulating_search_engine_algorithms_The_case_of_Google
- Beckham, S. (2021, August 31). How Recommendation Engines Power Impulse Buying. AI Search Blog. Retrieved January 28, 2022, from https://blog.coveo.com/how-recommendation-engines-power-impulse-buying/
- Data Manipulation, AI and Democracy. (2021, March 17). MIAI. Retrieved February 5, 2022, from https://ai-regulation.com/data-manipulation-ai-and-democracy/
- Graham-Harrison, E., & Cadwalladr, C. (2021, September 29). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Retrieved January 27, 2022, from https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
- Hamosova, L. (2020). Personalized Synthetic Advertising. Hamosova. Retrieved January 27, 2022, from https://hamosova.com/Personalized-Synthetic-Advertising
- Harman, B., & Bosak, J. (2018, December 18). How to avoid overspending: uncover the psychology behind why people buy. The Conversation. Retrieved January 28, 2022, from https://theconversation.com/how-to-avoid-overspending-uncover-the-psychology-behind-why-people-buy-108680
- Hartney, E. (2021, May 11). Don’t Be Manipulated Into Overspending Due to Advertising. Verywell Mind. Retrieved January 27, 2022, from https://www.verywellmind.com/five-advertising-tricks-that-trigger-impulse-buying-22229
- Hinds, J., Williams, E. J., & Joinson, A. N. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies, 143, 102498. https://doi.org/10.1016/j.ijhcs.2020.102498
- Khan, C. (2021, October 29). Is my phone listening to me? We ask the expert. The Guardian. Retrieved January 27, 2022, from https://www.theguardian.com/lifeandstyle/2021/oct/29/is-my-phone-listening-to-me-we-ask-the-expert
- Morgan, B. (2021, December 10). 50 Stats Showing The Power Of Personalization. Forbes. Retrieved January 27, 2022, from https://www.forbes.com/sites/blakemorgan/2020/02/18/50-stats-showing-the-power-of-personalization/?sh=f9d48662a942
- Pedersen, I. (2016). Home Is Where the AI Heart Is [Commentary]. IEEE Technology and Society Magazine, 35(4), 50–51. https://doi.org/10.1109/MTS.2016.2618680
- Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). Retrieved January 30, 2022.
- Reuters. (n.d.). Read this before you click ‘OK’ on that banner! Young Post. Retrieved January 30, 2022.
- Roser, M. (2013, March 15). Democracy. Our World in Data. Retrieved January 27, 2022, from https://ourworldindata.org/democracy
- Wagner, K. (2018, April 20). This is how Facebook collects data on you even if you don’t have an account. Vox. Retrieved January 28, 2022, from https://www.vox.com/2018/4/20/17254312/facebook-shadow-profiles-data-collection-non-users-mark-zuckerberg
- Wallace, J. (2022, January 25). The importance of democracy. Chatham House – International Affairs Think Tank. Retrieved January 27, 2022, from https://www.chathamhouse.org/2021/04/importance-democracy
- Wilson, M. (2019, July 2). 8 reasons why social media is bad for your data security. HP Tech Takes. Retrieved January 28, 2022, from https://www.hp.com/us-en/shop/tech-takes/8-reasons-why-social-media-is-bad