Opponents are fuming at the French government’s plan to trial AI video surveillance cameras at the 2024 Paris Olympics, calling it an unnecessary and dangerous security overreach.
The government claims that such systems are necessary to manage the crowds expected during the Olympics and to identify potential dangers. However, critics argue that the proposed law prioritizes French industry over fundamental civil liberties.
Last week, around 40 mostly left-leaning members of the European Parliament wrote an open letter to French lawmakers warning that the plan “creates a surveillance precedent never before seen in Europe,” according to Le Monde. Debates began on Monday in the National Assembly, France’s lower parliamentary chamber, and are set to continue on Friday.
Even before the debates began, MPs had already submitted 770 amendments to the government’s broad Olympics security bill, many of which were focused on Article Seven. This section allows for video footage captured by existing surveillance systems or new ones, including drone-mounted cameras, to be “processed by algorithms.”
Artificial intelligence software would then be used to “detect in real time predetermined events likely to pose or reveal a risk” of “terrorist acts or serious breaches of security,” such as unusual crowd movements or abandoned bags. The systems would alert police or other security services, who would then decide on a response.
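In principle, a “predetermined event” of this kind can be expressed as a simple rule over tracked objects rather than anything resembling identification. The sketch below is a purely hypothetical illustration — the `Detection` structure, function name and thresholds are invented for this example and are not taken from any real deployment — showing how “abandoned bag” detection might reduce to flagging an object that has not moved for some time:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One tracked object at one video frame (hypothetical, simplified)."""
    object_id: int   # identity assigned by an upstream object tracker
    timestamp: float # seconds since start of footage
    x: float         # position within the frame, in pixels
    y: float

def flag_stationary_objects(detections, max_drift=5.0, min_secs=60.0):
    """Flag object IDs that stay within `max_drift` pixels of a fixed spot
    for at least `min_secs` seconds — a crude stand-in for the kind of
    'abandoned bag' event the bill describes."""
    anchor = {}    # object_id -> (timestamp, x, y) where it last settled
    flagged = set()
    for d in sorted(detections, key=lambda d: d.timestamp):
        if d.object_id not in anchor:
            anchor[d.object_id] = (d.timestamp, d.x, d.y)
            continue
        t0, x0, y0 = anchor[d.object_id]
        drift = ((d.x - x0) ** 2 + (d.y - y0) ** 2) ** 0.5
        if drift > max_drift:
            # The object moved: restart the stationary clock from here.
            anchor[d.object_id] = (d.timestamp, d.x, d.y)
        elif d.timestamp - t0 >= min_secs:
            flagged.add(d.object_id)
    return flagged
```

A real system would first run a detector and tracker over the video frames to produce such tracks, and, as the bill requires, a flagged event would only ever prompt a human operator to assess the situation, not an automatic response.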
The government has sought to reassure the public that the smart camera tests will not process biometric data or resort to facial recognition, technologies the French public is wary of seeing applied too broadly. Sports Minister Amelie Oudea-Castera told MPs that the experiment has a very precise time limit, and that the algorithm does not substitute for human judgment, which remains decisive.
The interior ministry highlighted a February survey for the Figaro daily, which suggests that large majorities of the public back using the cameras in public spaces, particularly in stadiums. However, opponents argue that the plans overstep the bounds of the French constitution and European law.
Digital rights group La Quadrature du Net (QDN) wrote in a report sent to lawmakers that the systems would, in fact, handle sensitive “biometric” data under a broad 2022 definition from France’s rights ombudsman. QDN argues that, as biometric data, this information would be shielded by the European Union’s powerful General Data Protection Regulation (GDPR).
An interior ministry spokesman rejected that finding, insisting that the planned processing did not use any biometric data or facial recognition techniques.
‘State of emergency’
The camera test period is slated by the bill to run to the end of 2024 — well after the end of the games and covering other major events including the Rugby World Cup later this year.
Once the law is passed, public authorities such as the emergency services and the bodies responsible for transport security in the Paris region will be able to request its use.
The interior ministry said it “should cover a significant number of large events” for “the most complete and relevant evaluation”.
But QDN activist Naomi Levain told AFP: “It’s classic for the Olympic Games to be used to pass things that wouldn’t pass in normal times”.
“It’s understandable for there to be exceptional measures for an exceptional event, but we’re going beyond a text aimed at securing the Olympic Games,” Socialist MP Roger Vicot told the chamber on Monday.
Elise Martin, an MP following the process for hard-left opposition party France Unbowed (LFI), told AFP that the bill was just the latest of a slew of additional security powers introduced under President Emmanuel Macron since 2017.
“The way this law is thought out is as if we live in a permanent state of emergency,” she said.
Meanwhile QDN’s Levain highlighted that “many of the leaders in this market are French businesses”, calling the bill’s provisions a “favour to industry”.
The size of the video surveillance market in France alone was estimated at 1.7 billion euros ($1.8 billion) in a 2022 article published by industry body AN2V, with the global business many times larger.
If passed, the law would make the 2024 Olympics “a shop window and a laboratory for security”, handing firms an opportunity to test systems and gather training data for their algorithms, Levain said.
Some cities in France, such as Mediterranean port Marseille, are already using “augmented” surveillance in what is at present a legal grey area.
Such data is needed to train computer programmes on what kinds of behaviour to flag as suspect, learning to recognise patterns in moving images — just as text AIs such as ChatGPT are trained on large bodies of writing before they can generate written output of their own.
But opponents say that there is little or no evidence that augmented surveillance — or even more traditional CCTV systems — can prevent crimes or other incidents around the large sporting and cultural events targeted by the draft law.
Smart cameras “wouldn’t have changed anything at the Stade de France” last year, when huge crowds of Liverpool supporters were crammed into tiny spaces as they waited to enter the stadium for the Champions League final, Levain said.
“That was bad human management, there’s know-how to managing a crowd, calculations to be made about placing barriers and directing flows… no camera can do that,” she added.