Last week, the Australian state of New South Wales announced a plan to crack down on drivers using their phones on the road. The state’s transport agency said it had integrated machine vision into roadside cameras to spot offenders. The AI automatically flags suspects, humans confirm what’s going on, and a warning letter is sent out to the driver.
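To make that pipeline concrete, here is a minimal sketch, in Python, of how a "flag, then human review" loop might be wired up. The detector, threshold values, and data below are invented for illustration; they don't describe the actual NSW system.

```python
# Minimal sketch of a "flag, then human review" pipeline, assuming a
# hypothetical detector that scores each camera frame for phone use.
# None of these names or numbers come from the NSW system.
from dataclasses import dataclass

@dataclass
class Detection:
    plate: str        # registration read from the frame
    score: float      # detector's confidence that a phone is in use

def review_queue(detections, threshold=0.8):
    """Only high-confidence detections are sent on to a human reviewer."""
    return [d for d in detections if d.score >= threshold]

def human_confirms(detection: Detection) -> bool:
    # Stand-in for the manual verification step; a real system would show
    # the reviewer the image and record their decision.
    return detection.score >= 0.95

if __name__ == "__main__":
    detections = [Detection("ABC123", 0.97), Detection("XYZ789", 0.62)]
    for d in review_queue(detections):
        if human_confirms(d):
            print(f"Send warning letter to registered owner of {d.plate}")
```

The point of the structure is that the automated score only ever queues an image for review; a person makes the final call before any letter goes out.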

“It’s a system to change the culture,” the assistant police commissioner of New South Wales, Michael Corboy, told Australian media, noting that police hoped the technology would cut fatalities on the road by a third over two years.

It seems an admirable scheme, top to bottom. The offense is clear, the desired outcome is unimpeachable, and there’s even a human in the loop to stop the machines from making mistakes.

But it also demonstrates the slow creep of artificial intelligence into state and corporate surveillance — a trend that experts say could lead to some dark places: chilling civil rights, automating prejudices and biases, and pushing society slowly towards authoritarianism.

Right now, AI is primarily leveraged in the world of surveillance to identify people. The horror stories from Xinjiang in China are well-known, with networks of facial recognition cameras used to track the region’s repressed Uighur minority. In the UK, where there’s one surveillance camera for every ten citizens, private companies have begun using the tech on a much smaller scale to automate watch lists, spotting both known troublemakers and VIP customers when they walk into a store. And in the US, the Amazon team behind the company’s popular Ring surveillance camera planned to create similar watch lists for individual neighborhoods, although they ultimately scrapped the plans.
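Under the hood, these watch lists typically come down to comparing face embeddings — numerical fingerprints produced by a recognition model — against a stored gallery. The sketch below assumes such a model already exists; the labels, vectors, and threshold are fabricated for illustration.

```python
# Minimal sketch of watch-list matching with face embeddings, assuming some
# upstream model has already turned each face crop into a vector.
# The labels, vectors, and threshold here are made up for illustration.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Watch list: label -> stored embedding (e.g. a flagged shopper, a VIP customer)
watch_list = {
    "flagged: prior incident": np.array([0.9, 0.1, 0.3]),
    "VIP customer":            np.array([0.2, 0.8, 0.5]),
}

def match(face_embedding: np.ndarray, threshold: float = 0.95):
    """Return the watch-list label whose embedding is most similar, if any."""
    best_label, best_score = None, threshold
    for label, stored in watch_list.items():
        score = cosine_similarity(face_embedding, stored)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(match(np.array([0.88, 0.12, 0.31])))  # matches the "flagged" entry
```

Anyone whose similarity score clears the threshold gets a label attached automatically, which is exactly why a false match can be so consequential.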

But as the roadside cameras of New South Wales show, identifying people is just the start of AI surveillance: the real power — and threat — is identifying actions. This means creating cameras that don’t just tell you who people are, but what they’re doing. Is that person moving things about? Could they be stealing something? Are they just loitering in a way you don’t like?

These sorts of features are not yet widespread, but are beginning to percolate into everyday use.

[Image: Footage from an AI surveillance camera in Japan trained to spot shoplifters. Credit: Earth Eyes Corp]

In Japan, for example, you can buy an AI surveillance camera that, its maker claims, can automatically spot shoplifters. (A feature that’s functionally no different from Amazon’s Go stores, which use machine vision to automatically charge customers for items they grab off the shelves.) In the US, one firm is building a “Google for CCTV,” which leverages machine learning to let users search surveillance footage for specific types of clothes or cars. In India, researchers say they’ve even built software that can be loaded onto drones to automatically spot fights in the street.
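The “Google for CCTV” idea is easier to picture as a two-step system: a vision model tags each frame with what it detects, and a search layer filters those tags. Here’s a toy version of the second step; the tags and timestamps are made up, and the upstream model is simply assumed.

```python
# Toy sketch of searchable CCTV: assume an upstream vision model has already
# tagged each frame with detected attributes. Tags and timestamps are invented.
from datetime import datetime

frames = [
    {"time": datetime(2019, 12, 1, 9, 14), "tags": {"person", "red jacket"}},
    {"time": datetime(2019, 12, 1, 9, 15), "tags": {"car", "white sedan"}},
    {"time": datetime(2019, 12, 1, 9, 20), "tags": {"person", "red jacket", "backpack"}},
]

def search(query_tags: set):
    """Return timestamps of frames whose detected attributes include every query tag."""
    return [f["time"] for f in frames if query_tags <= f["tags"]]

print(search({"red jacket"}))   # both sightings of the red jacket
print(search({"white sedan"}))  # the one matching vehicle frame
```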

Such applications are often treated with suspicion by AI experts. With the drone system, for example, researchers have pointed out that the software has questionable accuracy and is likely to falsely flag incidents. As with other examples of algorithmic bias, experts say that if systems are fed data that skews toward certain groups of people or ethnicities, that skew will show up in the software’s results.
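A toy example shows how that skew propagates. If the examples labelled “suspicious” in the training data come disproportionately from one group, a naive model ends up treating membership in the group itself as a risk signal, even when behaviour is identical. All of the data below is fabricated to illustrate the mechanism.

```python
# Toy illustration of skewed training data: group A is over-represented among
# flagged examples, so a naive model estimates it as far riskier than group B
# even though the recorded behaviour is the same. The data is fabricated.
training = [("A", "loitering", 1)] * 80 + [("A", "loitering", 0)] * 20 \
         + [("B", "loitering", 1)] * 20 + [("B", "loitering", 0)] * 80

def flag_rate(group: str) -> float:
    """P(flagged | group) as a naive model would estimate it from this data."""
    labels = [label for g, _, label in training if g == group]
    return sum(labels) / len(labels)

# Identical behaviour, very different predicted risk — the skew is baked in.
print(f"Group A flag rate: {flag_rate('A'):.0%}")  # 80%
print(f"Group B flag rate: {flag_rate('B'):.0%}")  # 20%
```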

Problems like these, though, aren’t likely to stop private companies and governments from adopting the technology. Biased algorithmic systems are already in use in sectors like healthcare and criminal justice, and surveillance would likely be no different. The end effect could be a more tightly controlled and repressive society.

As Jay Stanley, a senior policy analyst at the ACLU, told The Verge last year: “We want people to not just be free, but to feel free. And that means that they don’t have to worry about how an unknown, unseen audience may be interpreting or misinterpreting their every movement and utterance ... The concern is that people will begin to monitor themselves constantly, worrying that everything they do will be misinterpreted and bring down negative consequences on their life.”