Adopted for the 2024 Paris Olympics, algorithmic surveillance has never proven its effectiveness
On January 24, the Senate adopted, after heated debate, article 7 of the Olympic Games bill, which authorizes, on an experimental basis, the deployment of cameras coupled with algorithmic detection systems. These are tools capable, according to their promoters, of detecting crowd movements, abandoned luggage or suspicious behavior. The heart of the debate focused, quite rightly, on the major risks that the normalization of surveillance technologies poses to privacy. But another element, however crucial, has been little discussed: the effectiveness of these tools presented as “intelligent”.
Experimentation: the term suggests a supervised, time-limited, scientific exercise. A life-size test, whose results would be scrutinized in complete transparency by experts, to determine whether the technology is reliable, useful, respectful of privacy and worth the allocated budget.
In practice, the decade of “experiments” – already – in augmented video surveillance shows that the opposite systematically occurs. In 2016, the SNCF tested “intelligent” cameras designed to detect attacks. The results of the experiment were never made public.
In 2019, the town hall of Nice claimed to have carried out tests of facial recognition cameras that succeeded in 100% of the identification trials. Six months later, the National Commission for Computing and Liberties (CNIL) strongly criticized this announced “success”, the details of which were never made public – which, according to the institution, did not allow “an objective vision of this experiment [nor] an opinion on its effectiveness”. Since then, the city has turned to another technology, without facial recognition.
In 2020, the RATP “experimented” for a few months with the automatic detection of mask-wearing in the metro. It now explains to Le Monde that it did not pursue the project, because an “average detection rate of 89%” remained “lower than observations made in the field”.
The promise of an ultra-efficient tool
Abroad, where large-scale tests have been conducted in the United States and the United Kingdom, more detailed data have sometimes been published. They paint a very unconvincing picture of the usefulness of these technologies. In 2017, a facial recognition experiment at the Notting Hill Carnival in London ended in near-total failure, with a very large number of “false positives” – people wrongly identified. In 2021, a government audit in Utah, in the United States, delivered an extremely critical report on a “smart” CCTV system that the state’s police force had purchased from the company Banjo two years earlier.
The audit showed that the company, which had in the meantime lost its contract after the press revealed its founder’s links with the Ku Klux Klan, had grossly exaggerated what its real-time incident detection system could do. Blinded by the promise of a high-performance tool, the Utah police had in fact bought the security equivalent of an oil “sniffer plane”: it never detected a single crime.
These precedents do not seem to have discouraged public decision-makers, either across the Atlantic or in France. In Chicago as in Toulouse, Metz, Valenciennes, and in smaller towns, “experiments” with “intelligent” video surveillance technologies have multiplied – with big differences between types of tools. For fire detection, for example, the technology is well established: thermal cameras, like software that detects smoke in images, work. But the more cameras promise to detect and analyze human behavior, the less reliable they become.
The fantasy of predictive policing
Yet it is precisely on the latter that the system envisaged for the Paris Olympics focuses: it excludes facial recognition, but centers on the detection of “abnormal events, crowd movements, abandoned objects or situations that presume the commission of offences”.
A predictive-policing fantasy that has also been the subject of a number of “experiments” in recent years. At Amsterdam’s Schiphol airport, in particular, where the results of a “test” launched in 2014, and since discreetly abandoned, were never communicated. In Paris, at the Châtelet station, the RATP deployed in 2017 a system for the automatic detection of “abnormal events”, again halted without any public report.
These devices promising to detect an attack in the making all use a mix of known technologies: analysis of images or data, coupled with machine learning, the technique that allows a program to analyze large bodies of data to infer links between different elements. The limitations and questionable scientific assumptions of these tools are well documented. Even the most advanced software is unable to tell the difference between forgotten and abandoned luggage, or between someone waiting and someone lying in wait.
More worryingly, the technologies promising to identify criminals or terrorists about to act all rely on stress detection – that is, the same operating principle as polygraphs, or “lie detectors”, whose reliability is now widely discredited.
None of these technologies has been able to demonstrate its effectiveness, still less in real time. How could they, when there is still no consensus on the usefulness of “classic” video surveillance, which has been deployed on a large scale for twenty years?
The embarrassment of Lyon’s environmentalists over video surveillance
The prefect of the Auvergne-Rhône-Alpes region, Pascal Mailhos, has for several months been requesting an agreement providing for the transfer of the city of Lyon’s video surveillance feeds to the command room of the national police. Set up in the neighboring town of Villeurbanne (Rhône), this arrangement would allow the police to take control of municipal cameras in the event of serious incidents. The elected environmentalists refuse, demanding an audit to “assess the question of uses”.
The city of Lyon has 571 surveillance cameras and a municipal supervision center. The environmentalist majority has added six mobile cameras and announces that it will acquire six more in the coming months, but it does not intend to go further. “The Court of Auditors asks us to evaluate public policies; we follow in the footsteps of the Court of Auditors,” justified Grégory Doucet, mayor of the city, on July 6, in the preamble to a municipal council session devoted to upgrading the municipal police.
Seen from the prefecture, the audit on camera use looks like a pretext, intended to postpone the expansion of video surveillance. “We have a very clear and pragmatic vision in terms of security. I make public tranquility a priority,” Grégory Doucet nevertheless assured during the municipal council. The environmentalist mayor says he is doing his part, while recalling that security is above all the sovereign mission of the State. “There is a question about guaranteeing the balance between security and respect for public freedoms. We don’t want to replace human resources with cameras,” acknowledged Mohamed Chihi, deputy mayor in charge of security for the city of Lyon, in a conversation on the sidelines of the council. The municipal executive remains in an uncomfortable ambivalence on the question of video surveillance cameras, torn between its official line on sound management of the equipment and its reluctance, as a matter of principle, toward generalized surveillance – a reluctance it does not openly assume.