
At the Olympics, AI Is Watching You

Jul 25, 2024 7:25 AM


A controversial new surveillance system in Paris foreshadows a future where there are too many CCTV cameras for humans to physically watch.


Police officers walk along the Eiffel Tower ahead of the Summer Olympics. Photograph: picture alliance/Getty Images

On the eve of the Olympics opening ceremony, Paris is a city swamped in security. Forty thousand barriers divide the French capital. Packs of police officers wearing stab vests patrol pretty, cobbled streets. The river Seine is out of bounds to anyone who has not already been vetted and issued a personal QR code. Khaki-clad soldiers, present since the 2015 terrorist attacks, linger near a canal-side boulangerie, wearing berets and clutching large guns to their chests.

French interior minister Gérald Darmanin has spent the past week justifying these measures as vigilance—not overkill. France is facing the “biggest security challenge any country has ever had to organize in a time of peace,” he told reporters on Tuesday. In an interview with weekly newspaper Le Journal du Dimanche, he explained that “potentially dangerous individuals” have been caught applying to work or volunteer at the Olympics, including 257 radical Islamists, 181 members of the far left, and 95 from the far right. Yesterday, he told French news broadcaster BFM that a Russian citizen had been arrested on suspicion of plotting “large scale” acts of “destabilization” during the Games.

Parisians are still grumbling about road closures and bike lanes that abruptly end without warning, while human rights groups are denouncing “unacceptable risks to fundamental rights.” For the Games, this is nothing new. Complaints about dystopian security are almost an Olympics tradition. Previous iterations have been characterized as Lockdown London, Fortress Tokyo, and the “arms race” in Rio. This time, it is the least visible measures that have emerged as some of the most controversial: security in Paris has been turbocharged by a new type of AI, as the city enables controversial algorithms to crawl CCTV footage of transport stations looking for threats. The system was first tested in Paris back in March at two Depeche Mode concerts.

For critics and supporters alike, algorithmic oversight of CCTV footage offers a glimpse of the security systems of the future, where there is simply too much surveillance footage for human operators to physically watch. “The software is an extension of the police,” says Noémie Levain, a member of the activist group La Quadrature du Net, which opposes AI surveillance. “It's the eyes of the police multiplied.”

Near the entrance of the Porte de Pantin metro station, surveillance cameras are bolted to the ceiling, encased in an easily overlooked gray metal box. A small sign is pinned to the wall above the bin, informing anyone willing to stop and read that they are part of a “video surveillance analysis experiment.” RATP, the company that runs the Paris metro, “is likely” to use “automated analysis in real time” of the CCTV images “in which you can appear,” the sign explains to the oblivious passengers rushing past. The experiment, it says, runs until March 2025.

Porte de Pantin is on the edge of the park La Villette, home to the Olympics’ Park of Nations, where fans can eat or drink in pavilions dedicated to 15 different countries. The metro stop is also one of 46 train and metro stations where the CCTV algorithms will be deployed during the Olympics, according to an announcement by the Préfecture de Paris, a unit of the interior ministry. City representatives did not reply to WIRED’s questions on whether there are plans to use AI surveillance outside the transport network. Under a March 2023 law, algorithms are allowed to search CCTV footage in real time for eight “events,” including crowd surges, abnormally large groups of people, abandoned objects, weapons, or a person falling to the ground.

“What we're doing is transforming CCTV cameras into a powerful monitoring tool,” says Matthias Houllier, cofounder of Wintics, one of four French companies that won contracts to have their algorithms deployed at the Olympics. “With thousands of cameras, it's impossible for police officers [to react to every camera].”

Wintics won its first public contract in Paris in 2020, gathering data on the number of cyclists in different parts of the city to help Paris transport officials as they planned to build more bike lanes. By connecting its algorithms to 200 existing traffic cameras, Wintics’ system—which is still in operation—is able to first identify and then count cyclists in the middle of busy streets. When France announced it was looking for companies that could build algorithms to help improve security at this summer’s Olympics, Houllier considered this a natural evolution. “The technology is the same,” he says. “It's analyzing anonymous shapes in public spaces.”

After training its algorithms on both open source and synthetic data, Wintics’ systems have been adapted to, for example, count the number of people in a crowd or the number of people falling to the floor—alerting operators once the number exceeds a certain threshold.

“That's it. There is no automatic decision,” explains Houllier. His team trained interior ministry officials how to use the company’s software and they decide how they want to deploy it, he says. “The idea is to raise the attention of the operator, so they can double check and decide what should be done.”

Houllier argues that his algorithms are a privacy-friendly alternative to controversial facial recognition systems used by past global sporting events, such as the 2022 Qatar World Cup. “Here we are trying to find another way,” he says. To him, letting the algorithms crawl CCTV footage is a way to ensure the event is safe without jeopardizing personal freedoms. “We are not analyzing any personal data. We are just looking at shapes, no face, no license plate recognition, no behavioral analytics.”

However, privacy activists reject the idea that this technology protects people’s personal freedoms. In the 20th arrondissement, Noémie Levain has just received a delivery of 6,000 posters, which the group plans to distribute, warning her fellow Parisians about the “algorithmic surveillance” taking over their city and urging them to refuse the “authoritarian capture of public spaces.” She dismisses the idea that the algorithms are not processing personal data. “When you have images of people, you have to analyze all the data on the image, which is personal data, which is biometric data,” she says. “It's exactly the same technology as facial recognition. It's exactly the same principle.”

Levain is concerned the AI surveillance systems will remain in France long after the athletes leave. To her, these algorithms enable the police and security services to impose surveillance on wider stretches of the city. “This technology will reproduce the stereotypes of the police,” she says. “We know that they discriminate. We know that they always go in the same area. They always go and harass the same people. And this technology, as with every surveillance technology, will help them do that.”

As motorists rage in the city center at the security barriers blocking the streets, Levain is one of many Parisians planning to decamp to the south of France while the Olympics takes over. Yet she worries about the city that will greet her on her return. “The Olympics is an excuse,” she says. “They—the government, companies, the police—are already thinking about after.”

Morgan Meaker is a senior writer at WIRED, covering Europe and European business from London. She won the top prize at the BSME Awards in 2023 and was part of the team that worked on WIRED’s award-winning investigation series “Inside the Suspicion Machine.”

