
AI Surveillance and Data Privacy at the Games


Digital Speaks - Paris 2024 / Summer of Sports Tech Series

Sep 03, 2024

Summary

As the Paris 2024 Summer Olympic and Paralympic Games (the “Games”) turn onto the final straight, they have yet again captured widespread global attention, on and off the track. With over 15.3 million visitors in Paris this summer for the Games, data security has emerged as a critical concern. To enhance the safety of athletes, spectators and residents, the French government implemented specific measures, including a law relating to the Games (the “OG law”), passed on 19 May 2023, to bolster security during the Games[1].

The OG law introduces advanced security measures, notably the experimental use of algorithmic video surveillance systems. This article focuses on the deployment of these augmented surveillance systems during the Games and examines the compliance and privacy dilemmas that arise under the EU General Data Protection Regulation (Regulation (EU) 2016/679) (the “GDPR”).

Experimental use of "augmented cameras" to ensure security during the Games

The OG law permits the use of AI-powered “augmented cameras” for algorithmic video surveillance, an unprecedented measure intended to monitor and secure major events. Processing video feeds in real time with AI, these systems can detect abnormal crowd movements, abandoned objects and suspicious behaviour, and more generally identify potential security risks.
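To give a flavour of what non-biometric “event detection” can look like in practice, the sketch below flags frames of a video feed in which aggregate motion spikes, without identifying anyone. It is a simplified, hypothetical illustration only: the video source, threshold and frame-differencing method are assumptions made for this article and bear no relation to the systems actually deployed under the OG law.

```python
# Hypothetical illustration of non-biometric event detection: flag frames where
# overall crowd motion spikes. Not the system used under the OG law; the video
# source, threshold and method are assumptions made purely for illustration.
import cv2  # OpenCV, a widely used computer-vision library

MOTION_THRESHOLD = 18.0          # assumed mean pixel-difference level treated as "abnormal"
VIDEO_SOURCE = "crowd_feed.mp4"  # hypothetical camera feed

def flag_abnormal_motion(source: str) -> list[int]:
    """Return indices of frames whose aggregate motion exceeds the threshold."""
    cap = cv2.VideoCapture(source)
    flagged, prev_gray, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
        if prev_gray is not None:
            # Mean absolute difference between consecutive frames: a crude,
            # identity-free proxy for how much movement there is in the scene.
            if cv2.absdiff(prev_gray, gray).mean() > MOTION_THRESHOLD:
                flagged.append(idx)  # frames a human operator would then review
        prev_gray, idx = gray, idx + 1
    cap.release()
    return flagged

if __name__ == "__main__":
    print(flag_abnormal_motion(VIDEO_SOURCE))
```

Consistent with the human-oversight requirements discussed below, flags of this kind would only ever prompt review by a human operator; they would not constitute automated decisions.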

While these measures aim to enhance security, they also raise significant GDPR compliance concerns. The French data protection authority, the Commission Nationale de l'Informatique et des Libertés (the “CNIL”), was consulted on the OG law and made recommendations, subsequently considered in the Constitutional Council’s review, to ensure a fair balance between security requirements and respect for the rights and freedoms of individuals (the “Opinion”)[2]. The Opinion specifically raised concerns about the OG law’s compliance with the GDPR, particularly regarding the use of algorithmic image processing of drone-captured footage. The CNIL concluded in its Opinion that the safeguards in place, such as limiting the duration and scope of the experiment, banning the use of biometric data (including facial recognition), preventing linkage with other data files and avoiding automated decisions, were adequate to ensure the protection of personal data.

The Constitutional Council also examined the experiment’s duration, initially set to run until 31 March 2025, despite the Games ending in early September 2024. The Council ruled that the Paris police commissioner must terminate the authorisation immediately if the conditions justifying it are no longer met[3].

Additionally, the Commission Nationale Consultative des Droits de l'Homme (the “CNCDH”) raised concerns about the privacy risks posed by algorithmic video surveillance, highlighting the challenges of clearly informing the individuals concerned that their data is being processed and of enabling them to exercise their rights in relation to that processing[4].

To address the transparency requirements under the GDPR, a decree issued on 27 November 2023 set out the conditions under which the public must be informed about the processing of personal data by video surveillance systems under the OG law[5]. The decree mandates that this information be displayed via signs or pictograms representing a camera. Where it is not possible to provide all of the mandatory information, the signs must at least identify the system operator, the purposes of the processing and the rights of the persons concerned, with any remaining information to be provided by “any other means”.

In practice, the presence of cameras monitoring public areas must be indicated. However, the CNCDH notes that such signs are often rare, small and easily missed. Article 10 of the OG law also introduces two exceptions that justify not informing the public: when circumstances prohibit it, or when providing the information would run counter to the security objectives pursued.

The CNCDH has also expressed concerns about the decision-making processes surrounding video surveillance. While the OG law specifies that the systems must remain under human control and be regularly monitored, the CNCDH warns of automation bias, whereby agents trust and rely on the software’s outputs without challenge, check or verification.

In response to these concerns, the CNCDH has recommended:

  • Ensuring video surveillance systems are both necessary and proportionate;
  • Providing proper data protection training for agents in charge of video analysis; and
  • Increasing resources for the CNIL to strengthen its ability to monitor video protection systems.

These recommendations reflect the principles set out in the recently adopted EU AI Act, which establishes a strict framework for AI systems and automated decision-making while emphasising the need for human oversight and transparency. The AI Act requires clear explanations of AI’s role in decision-making, enhancing transparency and enabling individuals to understand and contest decisions that affect them, and it further requires human oversight of high-risk AI systems to protect individuals’ fundamental rights.

As Paris showcases an epic Games, it is worth understanding some of the security measures operating behind what we see on our screens to keep the Games running smoothly and the city safe. The deployment of advanced security measures such as AI-powered augmented cameras underscores the delicate balance between safety and privacy. While these technologies help provide a safer environment, they also demand rigorous adherence to the GDPR and to the principles of transparency and human oversight enshrined in the EU AI Act. The measures and safeguards in place at the Games may well serve as a benchmark for future global events, highlighting the need to protect individual rights in the age of advanced technologies.

Related Practice Areas

  • Data Privacy & Security

  • Technology Transactions

  • Sports & Entertainment

This material is not comprehensive, is for informational purposes only, and is not legal advice. Your use or receipt of this material does not create an attorney-client relationship between us. If you require legal advice, you should consult an attorney regarding your particular circumstances. The choice of a lawyer is an important decision and should not be based solely upon advertisements. This material may be “Attorney Advertising” under the ethics and professional rules of certain jurisdictions. For advertising purposes, St. Louis, Missouri, is designated BCLP’s principal office and Kathrine Dixon (kathrine.dixon@bclplaw.com) as the responsible attorney.