You should not have to trade in your privacy for security: a discussion on the issues surrounding Smart Cities

The Digital Perimeter

In May of 2021, the municipality of Amsterdam was found guilty of ‘blue-washing’ its technological experiments surrounding the Johan Cruijff ArenA, a project also called the Digital Perimeter. This conclusion was drawn by Bits of Freedom, an independent Dutch digital rights foundation that researched the ethics of this ‘digital fencing’ project. Considering the recent developments to make Amsterdam smarter, accompanied by the ambition to become a true ‘smart city’, this lack of transparency from the government regarding smaller-scale experiments is worrisome, to say the least.

Johan Cruijff ArenA boulevard

What is a smart city?

According to the European Commission, a smart city is a place where traditional networks and services are made more efficient through digital solutions, for the benefit of its inhabitants and businesses. Benefits include, for example, improved traffic management, energy efficiency, and resident safety. Amsterdam’s smart project ‘Amsterdam Smart City’ claims to use data and technology to increase the quality of life. As good as this may sound, installing the necessary technologies for these purposes in public spaces can pose a threat to citizens’ privacy. Moreover, it raises a number of ethical issues regarding inequality, economic disparities, and conflicts of interest. It is important to discuss these issues to ensure that the benefits are for everyone, and not just those in power.

Municipalities need to be more thorough in safeguarding citizen privacy

When handling and working with sensitive citizen data, Dutch municipalities are required to comply with laws such as the General Data Protection Regulation (Algemene Verordening Gegevensbescherming) and the Municipal Personal Records Database Act (Wet Basisregistratie Personen). Such regulations exist to ensure that personal data is used, processed, and stored lawfully.

Despite the existence of such data protection and privacy laws, however, Bits of Freedom warns that there is a risk that municipalities may interpret these laws in a more ‘creative’ manner, which would open up new possibilities with regard to, for example, facial recognition tools in smart cities. This concern is not without reason: the government has already misused citizens’ data numerous times before. The most recent fine imposed by the Dutch Data Protection Authority (DDPA, Autoriteit Persoonsgegevens) was on the Dutch Tax Administration, for unlawfully processing and using citizens’ nationality for various purposes, including risk assessment.

Additionally, there is a risk of function creep, which occurs when data is used for purposes other than those originally intended. Although municipalities are required to re-evaluate whether data is suitable for a different purpose through Data Protection Impact Assessments (DPIAs), function creep still happens. One example involved the Dutch national police, who were found to be using the cameras of their Automatic Number Plate Recognition (ANPR) system to take photos of thousands of drivers every day, without any basis in the law to do so. It should be noted here that the Dutch national police are involved with the Digital Perimeter, which is noteworthy given this previous wrongdoing.

ANPR camera

In July of 2021, the DDPA called on municipalities to be more thorough in guaranteeing citizen privacy. According to its research, municipalities do not always give sufficient consideration to privacy legislation, despite the fact that this is essential for smart city technologies that process citizens’ personal data. In fact, the DDPA argues that poorly developed applications can be detrimental to the freedom of residents and visitors of a municipality. It is worrying that such a call was still necessary so recently, while technological experiments and innovations continue every day.
In addition, the DDPA found that although the overall number of breach reports had declined by 11%, there were 13% more data breach notifications concerning governmental practices in 2020 compared to 2019. According to its research, this was mostly due to personal data being issued or sent to the wrong recipient. Moreover, its report shows a 30% increase in reports following hacking, malware, or phishing incidents. This influx of hacking reports is concerning: although it is mandatory to ensure that personal data is not traceable, ordinary citizens do not know how their data has been anonymized, and therefore cannot say with certainty that they cannot be identified in the event of a hack or data leak.
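To make the anonymity concern concrete, here is a minimal sketch of why ‘anonymized’ data can remain identifiable. It assumes a hypothetical scheme in which a 9-digit citizen service number (BSN) is replaced by its SHA-256 hash; this is an illustration, not a description of any real system’s method. Because the identifier space is small, the hash can be inverted by brute force:

```python
import hashlib

# Hypothetical pseudonymization: replace each 9-digit citizen service
# number (BSN) with its SHA-256 hash before publishing the dataset.
def pseudonymize(bsn: str) -> str:
    return hashlib.sha256(bsn.encode()).hexdigest()

# An attacker who obtains a leaked hash can simply hash every candidate
# number and look for a match, undoing the "anonymization".
def reidentify(target_hash, candidates):
    for bsn in candidates:
        if pseudonymize(bsn) == target_hash:
            return bsn
    return None

leaked = pseudonymize("123456782")  # value found in a breached dataset
recovered = reidentify(leaked, [f"{n:09d}" for n in range(123456000, 123457000)])
print(recovered)  # 123456782 — the hash alone did not protect the identity
```

This is exactly why the method of anonymization matters: without knowing it, a citizen cannot judge whether a leak makes them identifiable.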

The potential negative consequences of smart city realisation

According to Bits of Freedom, the municipality of Amsterdam seemed genuinely concerned with complying with the Tada manifesto, a manifesto that contains principles for responsible data usage and technology design. However, the translation of these values into practice was not found to be flawless, most prominently with regard to the installation of facial recognition tools. Although the software being used appears advantageous with regard to data protection, little attention has been paid to the other civil rights that will be put under pressure by the use of facial recognition tools. According to Bits of Freedom, these include the right to freedom of assembly, association, and demonstration, freedom of expression, the right to equal treatment, the privacy of citizens, and the ability to move freely and anonymously. Moreover, research has found that many facial recognition tools are prone to algorithmic bias against people of colour. Such biases can have major consequences for the lives of citizens, especially if false positives lead to wrongful arrests, as happened to Robert Julian-Borchak Williams in 2020.
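The mechanism behind such bias can be illustrated with a toy calculation of false-positive rates per demographic group. The confusion counts below are invented for illustration only and are not drawn from any real evaluation of facial recognition systems:

```python
# Illustrative only: invented counts showing how a face-matching system's
# false-positive rate can differ per demographic group.
def false_positive_rate(false_pos: int, true_neg: int) -> float:
    """Fraction of innocent (non-matching) faces wrongly flagged as matches."""
    return false_pos / (false_pos + true_neg)

# Two groups, each with 1,000 innocent passers-by scanned.
rate_a = false_positive_rate(2, 998)    # 2 wrongful flags per 1,000 scans
rate_b = false_positive_rate(20, 980)   # 20 wrongful flags per 1,000 scans
print(f"group A: {rate_a:.3f}, group B: {rate_b:.3f}")
```

Even when both rates look small in isolation, a tenfold gap like this means members of one group face ten times as many wrongful matches, and therefore ten times the risk of an encounter like Williams’s.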

Robert Julian-Borchak Williams

Then there is also the question of who is able to access all of this biometric data. Just recently, in January of 2021, it was found that employees of the Dutch Public Health Service (GGD) were illegally trading large datasets from two coronavirus systems. These datasets included highly sensitive data, such as citizen service numbers and home addresses. Such data can be misused for identity fraud, phishing, and stalking. Smart cities require a lot of data to function, so it is not far-fetched to wonder what could happen to the biometric data collected by the smart technologies installed throughout the municipality. Knowing that ANPR cameras have the ability to scan faces, what impact could this have on identity fraud in the event of a hack? And what about the developments in deepfake technologies, many of which are open source?

Moreover, if companies get involved in the realisation of smart cities, then this rightfully raises concerns about the privatisation of public services, as was the case with Google’s Sidewalk Labs in Toronto. The plans for this project opened up a debate on whether the acts of a private company as large and wealthy as Google would always be in the best interest of the citizens of Toronto, and not exclusively in favour of the company’s profit.

Sidewalk Labs’ smart city project

And then there is also the risk of smart cities increasing economic pressure: smart developments can influence the economic value of a city, which could lead municipalities to compete with one another. This creates a risk of some cities losing out on investment, development, and progress. Furthermore, research suggests that the implementation of such systems may disadvantage poorer citizens, as their target audience is the middle-to-upper class.

The trade-off between individual privacy and public security

Aafke Fraaije, junior researcher and teacher of public communication on science and technology at the Vrije Universiteit Amsterdam, conducted research into citizens’ opinions on projects such as the Digital Perimeter. To this end, pedestrians around the ArenA were interviewed. One interviewee argued that people “are already being filmed regardless”, making it seem as though he felt he had no say in the matter. At the same time, however, he believed that there must be a clear reason for filming individuals, as “it is at the expense of our privacy” and “we need to be careful not to end up with a social credit system, as in China”.

The Chinese scoring system assigns scores to citizens based on the actions they take. One’s overall score influences things such as access to public transport and even visa applications. Amnesty International states that the system is intended to stimulate behaviour desired by the government. People with low scores are actively shamed by the government, for instance by displaying them on billboards along highways. This is a clear example of mass surveillance, whereby citizens surrender their privacy with very serious consequences for individual freedom.

Still, there were respondents who thought the Dutch administration could take an example from the Chinese system. Having smart surveillance cameras around was said to increase one’s feeling of safety. Interviewees argued that “if you don’t mean any harm, then you have nothing to hide”. This view is shared by many others, as was found in research conducted by GetApp Nederland, which found that 80% of respondents were positive about using facial recognition tools for police surveillance.

The concept of having nothing to hide brings up an important quote from Edward Snowden, who said:

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say”. – Edward Snowden


There should always be a clear distinction between what you do in the private sphere and in the public sphere. Moreover, this ‘nothing to hide’ view completely disregards the risks that algorithmic bias in facial recognition tools poses to minorities. Additionally, many privacy rights organisations argue that, for example, the use of ANPR cameras (a form of mass surveillance) is a disproportionately intrusive means, given the results it achieves. Privacy First, an independent foundation that aims to preserve and promote the right to privacy, argues that although these cameras can be valuable for tracking and tracing criminals in real time, their use leads to the mass surveillance of millions of innocent citizens. Supervision of the use of ANPR cameras, moreover, lies with the public prosecutor’s office, and not with an independent party such as the DDPA.

Although many citizens seem willing to be surveilled for safety, research shows that many people still worry about their general data privacy. In 2019, the DDPA conducted a survey among Dutch citizens regarding their privacy concerns. It found that 94% of respondents worry about the protection of their personal data. The research also shows that many people are unaware of how to exercise their privacy rights: only 12% of respondents had exercised these rights before.

Perhaps not knowing one’s rights, or how to exercise them, makes citizens feel as though they have no choice but to surrender their privacy in order for the government to grant them security. This highlights what is arguably the largest discussion surrounding the topic of smart cities: the trade-off between individual privacy and public security. Fraaije’s research finds the same discussion within her conversations: interviewed citizens feel as though they have to choose. This idea of having to choose between the two values creates the impression that they are mutually exclusive, which need not be the case.


As early as 2007, the DDPA advocated a new balance between the two. According to the DDPA, there must be a balance between privacy and security in a democratic state under the rule of law. Security without privacy and privacy without security are meaningless; both are necessary for citizens and institutions in a functioning society. It is therefore not appropriate to have to choose between security and privacy: it is the task of the government and society to safeguard both values. It seems as though this task is currently not being executed well by either party: citizens are not exercising their rights, and the government is violating them.

What can be done to preserve the balance between privacy and security when new smart technologies are introduced within a municipality?

Privacy can be central in the design of smart technologies. In their report of July 2021, the DDPA asks municipalities to let privacy be the starting point of innovation, not the closing point. Administrators and officials need to think about the rights and freedoms of their citizens, and actually include them in every step of the development towards a smart city. This includes ensuring that citizens are informed, such that they can exercise their rights. 

An example would be to regularly hold citizens’ councils to discuss developments. Most important, however, is that transparency between the municipality and its citizens also exists outside of these councils: it would therefore be necessary to conduct a campaign that makes the average citizen aware of existing risks, as well as how and when they can and should exercise their privacy rights. Such a campaign could include, for example, television advertisements, social media posts, posters, or people on the street engaging citizens in conversation. There needs to be more transparency regarding the experiments being conducted, beyond, for example, only a small sign near the area.

The DDPA also argues that smart city applications that are still in their pilot phase (such as the Digital Perimeter) should fully comply with the GDPR. Moreover, prior to starting projects for any such application, it should be determined whether personal data will be processed and whether such processing is lawful. The purpose of these applications should be specific, and municipalities must examine whether there are alternative solutions that are less intrusive. Additionally, DPIAs must be performed at regular intervals, and the results should be published for more transparency.

Another essential call from the DDPA is to either assign a specific councillor to deal with digitalisation or determine who within the organisation can be addressed on digitisation issues. To the authors, it is important that this is an independent third party, to ensure that the steps taken are in the best interest of the citizens, and not of the council itself.

It is important that these changes in awareness, regulation, and supervision happen as soon as possible. Function creep is called function ‘creep’ for a reason: these privacy violations develop slowly. Smart technologies implemented in smart cities may pose a threat to individual privacy, and the consequences can be severe. More transparency is needed, and privacy-focused design should be central, to ensure the fundamental rights of citizens are respected. Citizens should not have to choose between feeling safe and having their privacy respected: the government should ensure both. Only then can smart cities truly benefit everyone, including the average citizen.
