Marie Eneman, associate professor in informatics, Department of Applied IT.
Photo: Peter Larsson

What does increased surveillance mean for our society?

Marie Eneman, associate professor in informatics, has studied surveillance and privacy in the digital society for a long time. She has been involved in a number of research projects focusing on law enforcement surveillance. Of particular interest is how emerging surveillance practices are enabled and constrained by new digital technologies and AI, how these practices are organized, governed and regulated, and what the implications are for individuals' privacy.

In today's digital society, we are monitored around the clock. We are surrounded by stationary cameras, and sometimes also body cameras and drones, in public environments. We leave traces when we shop and pay with our credit cards, and we use mobile phones and fitness watches where various information about us is logged and collected. On social media such as Facebook and Instagram, we generously provide the platforms with personal data, sometimes also sensitive data, about ourselves, Marie Eneman says.

We are usually not aware of when and to what extent data about us is collected, or what the data may be used for, now or in the future.

Increased possibilities for surveillance

Surveillance camera in New York.
Photo: Enrique Alarcon, Unsplash

In recent years, both governmental and private actors have been given an extended mandate to use new surveillance technologies in new ways. From a political point of view, there are high expectations that increased surveillance will increase security in society. At the same time, increased surveillance is associated with serious risks to fundamental democratic rights such as freedom of expression and privacy.

Today's surveillance systems have been refined and become more powerful in ways that enable large-scale collection, analysis, sharing and storage of data. In addition, they are subtly embedded in almost all areas of our lives, Marie Eneman says.

New complex interplay between the state and private commercial actors

To be able to study these complex questions, Marie Eneman works together with an interdisciplinary research group that includes researchers from fields such as informatics, design, law, organization, and sociology.

Marie argues that the new ways of conducting surveillance can be seen as part of a growing and complicated development of infrastructures, practices and services that are all part of a complex interplay between state and private actors.

The boundary between state and private actors is becoming increasingly blurred, which gives rise to new challenges, not least in relation to issues of transparency, control, accountability and individuals’ privacy, Marie Eneman says.

In our research, we try to understand how these emerging data-intensive surveillance practices are organized, governed and regulated, and what meaning privacy takes on in that context. Privacy should not be understood as something constant; it is context-dependent and constantly renegotiated.

Legislative changes with major impact

During 2020, several important changes to Swedish legislation expanded the mandate of law enforcement authorities to use surveillance.

The Police Authority can now, for example, decide for itself whether to introduce and use surveillance technologies. Previously, the police had to apply to the Swedish Authority for Privacy Protection for a permit, justify their interest and show the possible consequences for individuals' privacy. Another change is the introduction of the new law on secret data interception, which we are currently investigating in an ongoing project.

Secret Data Interception creates new opportunities – and new risks for individuals' privacy

Secret data interception is a law that recently came into force in Sweden. It gives law enforcement authorities legal support to 'hack' into a suspect's computers and phones by exploiting vulnerabilities in the systems when certain crimes are suspected. Swedish authorities were previously not allowed to intercept encrypted data, but with the new legislation they can now intercept information from messages and conversations in encrypted applications and programs. They can also activate a camera or a microphone in a digital device to capture sound or images of a suspect.

The new law was motivated as an important tool in the fight against organized crime, and a majority in Sweden's Riksdag voted in favour. But the law has also been criticized for its far-reaching risks to individuals' privacy, Marie Eneman says.

The law is time-limited to five years and will be evaluated before a decision is made on whether it should become permanent. Our research is well timed for that evaluation, and there is great interest among legislators in our results. In addition, it has recently been proposed that law enforcement authorities be given an extended mandate to use secret data interception preventively, that is, without any suspicion of crime, on the grounds that such a change is important for fighting organized crime. Such a change would constitute a paradigm shift for legal certainty in Sweden, with tangible risks that the use becomes arbitrary and unpredictable. We follow these developments in our research.

Research on body-worn cameras

Marie Eneman and her colleagues began researching the Swedish Police Authority's introduction and use of body-worn cameras several years ago. The results show that the police see value in using the cameras: they experience increased safety in their work, and they also see the footage as valuable evidence. All the police officers interviewed in the latest study stated that they felt comfortable using the body camera as a work tool, as long as the camera was not remotely controlled and the individual officer could decide when it should be on.

The body cameras are small mobile digital surveillance devices that record both sound and image. According to the Swedish Authority for Privacy Protection, the fact that the camera records sound entails additional risks for individuals' privacy, especially when the police are in a private environment. There is a need for clear regulation so that too much responsibility does not fall on the individual police officer's discretion to make well-balanced decisions about when the camera should be turned on or off.

Surveillance based on AI

Marie Eneman and her colleagues are also looking at the use of artificial intelligence, AI, and machine learning for surveillance.

AI and machine learning lay the foundation for powerful new surveillance possibilities that we have not seen before, says Marie Eneman. Right now, biometric technologies such as facial recognition, digital fingerprints and DNA are topical. In a large Nordic research project, we are investigating the expanding surveillance at the borders of Sweden, Norway and Denmark. By the end of the year, all of the EU's external borders are planned to be equipped with biometric technologies as part of border control, which means that Sweden is preparing for the European Union's Entry/Exit System (EES). It will be a gigantic system where data is shared between a number of different systems in the member countries. Our research group will study the EES implementation and the associated biometric technologies.

The expectations of what AI can enable are high from the perspective of law enforcement authorities. AI in the context of surveillance is, among other things, about automating the analysis of large amounts of data. By letting AI handle face, movement and object recognition, for example, the work of identifying a person or an object can be made more efficient, Marie Eneman says.

It is important to emphasize that the use of AI for surveillance is at the same time very controversial. There are serious concerns about the risks to civil liberties such as freedom of expression and privacy. The EU therefore defines facial recognition as a 'high-risk technology'.

Given the risks that biometric identification of individuals in public settings may entail, a proposal has been made at the European Union level to ban the use of AI for facial recognition in public places. Regulation and legislation will play an important role in the area, including the European Union's proposal for harmonized rules on artificial intelligence. The aim is to create a regulatory framework in which the potential of AI can be used while ensuring the fundamental rights of individuals.

Research on a controversial facial recognition application

Marie has also studied the Swedish police's use of AI technology that was not procured by the Police Authority.

It is about the controversial facial recognition application called Clearview AI, which goes far beyond traditional facial recognition technologies. The Clearview AI company uses an automated image scraper to scrape facial images from the open Internet, for example from social media platforms such as Facebook, Instagram and Twitter, says Marie.

The images are used to create a huge biometric database of facial photos. Clearview then sells access to the database to law enforcement agencies and private security companies. The application can be used free of charge during a trial period, and a number of Swedish police officers downloaded Clearview's application and used it for facial recognition in investigative work.

The European Union has pointed out that Clearview AI's ability to protect data is highly questionable and that its security level has not yet been tested by an independent party. There is a considerable risk that millions of EU citizens who have shared personal photos on social media platforms now have their portraits in the company's database.

In Sweden, a formal inspection of the police's use of Clearview AI was carried out, which concluded that the use was illegal. The case of Clearview is an interesting example of how authorities can be enticed to use powerful and easily available technologies that have not been sanctioned by the authority, but where expectations of more efficient police work become the driving force, says Marie.

Citizens' views on surveillance and privacy in the digital society

How do citizens view surveillance and privacy in the digital society? Marie Eneman and her colleague Professor Jan Ljungberg recently investigated this by contributing questions to the SOM Institute's national survey. The results have been published by the SOM Institute:

To the Eneman & Ljungberg chapter in the SOM Institute anthology (in Swedish)


Text: Catharina Jerkbrant, 2023