Police surveillance on steroids: Drones and AI systems MERGING to create flying spy robots
06/13/2018 / By JD Heyes

We’re fans of local police and we know how difficult it is getting for the thin blue line in cities and towns all across America to maintain order in an increasingly chaotic society. But sometimes technology can turn good things — and good people — into things that aren’t so good.

As they’ve gotten cheaper, commercial drones have proliferated in the private sector, but in the past few years, more police departments have begun to experiment with and use them for surveillance operations. 

At the same time, artificial intelligence (AI) software has also become more affordable and more widely utilized. 

Now, the two technologies are merging under a partnership formed by two companies that seek to market the end product as a sort of “thinking drone” that can identify human activity in public that could become problematic for police.

And problematic for our civil liberties.

As reported by NextGov, drone-maker DJI has inked a deal with Axon, the maker of Taser weapons and police body cameras, to sell AI-capable drones to police agencies.

The news site noted:

Now, not only do local police have access to drones, but footage from those flying cameras will be automatically analyzed by AI systems not disclosed to the public.

So what, you might ask. Police have been using helicopters for years to conduct visual surveillance. True, but the difference between that kind of surveillance and the kind engaged in by drones is as big as the difference between the size of the average police chopper and a commercial drone. Helicopters are large and loud; it’s impossible to conduct close-in surveillance without tipping off the target. That’s not true for drones. (Related: “Big Data” police-state surveillance the new norm: Information from multiple sources (on any citizen) now being combined to generate a “criminal risk assessment algorithm”)


As for the AI-powered drones, here’s how it will work: Footage will be uploaded or streamed live to Axon’s cloud platform for police camera footage, the same system that handles its body cams, where it will then be analyzed by an AI program. The drones will be used for many things, including monitoring crowds and search-and-rescue operations, according to the company’s website.
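To make that pipeline concrete, here is a minimal sketch of the drone-to-cloud loop the companies describe, written in Python. Every name in it (CloudEvidenceStore, ViolenceClassifier, surveillance_loop) is a hypothetical stand-in; neither Axon nor DJI has published an API, so treat this as an illustration of the architecture, not their actual code.

# Hypothetical sketch of the drone-to-cloud analysis loop described above.
# No class or method here corresponds to a real Axon or DJI API.
import time

class CloudEvidenceStore:
    """Stand-in for a cloud evidence platform like the one used for body cams."""
    def upload(self, frame, metadata):
        print(f"archived frame captured at {metadata['timestamp']}")

class ViolenceClassifier:
    """Stand-in for the undisclosed AI model that scores each frame."""
    def score(self, frame):
        return 0.0  # placeholder; a real model would return a probability

def surveillance_loop(drone_feed, store, model, alert_threshold=0.8):
    for frame in drone_feed:
        store.upload(frame, {"timestamp": time.time()})  # every frame goes to the cloud
        if model.score(frame) > alert_threshold:  # AI flags "problematic" activity
            print("ALERT: frame flagged for human review")

frames = ["frame-1", "frame-2"]  # stand-in for a live video feed
surveillance_loop(frames, CloudEvidenceStore(), ViolenceClassifier())

The point of the sketch is the data flow: every frame is archived whether or not it is flagged, which is exactly what raises the civil liberties concern.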

While that description is short on details, a paper published just days before the companies announced their joint project discussed how such drones could be deployed and used in real time.

What could go wrong?

“Drone systems have been deployed by various law enforcement agencies to monitor hostiles, spy on foreign drug cartels, conduct border control operations, etc. This paper introduces a real-time drone surveillance system to identify violent individuals in public areas,” says the research, titled, “Eye in the Sky.”

Drone developers flew one of the devices to snap 2,000 pictures of people pretending to fight one another. But experts say that approach is problematic because the AI software wasn’t trained on real data: the subjects were photographed during carefully staged scenes of violence, not genuine incidents.

That means there’s no guarantee that the software will be able to ‘learn’ the difference between violence and normal human motion and behavior in the real world, according to David Sumpter, author of Outnumbered: Exploring the Algorithms that Control Our Lives.

Sumpter is highly skeptical that the AI will perform as expected, meaning that police will be relying on faulty technology when the drones are deployed in real cities.

What could go wrong there?

“What the algorithm has done is classify individuals with their hands or legs in the air as violent,” Sumpter wrote. 
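A toy example shows how that failure mode works. Suppose the model reduces each person to pose keypoints and, as Sumpter suggests, has effectively learned the rule “raised limbs mean violence.” The rule below is a deliberately crude invention (the paper’s actual features are not public), but it flags a cheering sports fan just as readily as an assailant:

# Crude stand-in for the pose rule Sumpter describes; all thresholds invented.
# Keypoints are (x, y) pairs with y increasing upward.
def looks_violent(pose):
    head_height = pose["head"][1]
    return (pose["left_wrist"][1] > head_height or
            pose["right_wrist"][1] > head_height)  # hands above head => "violent"

cheering_fan = {"head": (0, 170), "left_wrist": (-5, 195), "right_wrist": (5, 192)}
print(looks_violent(cheering_fan))  # True: an ordinary fan becomes a false positive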

Not only that, but others who are concerned about the software, like Google researcher and AI ethics advocate Meredith Whittaker, note that the research was flawed and that “AI is facing a mounting ethical crisis.”

Researchers who put together “Eye in the Sky” don’t even discuss the issue of false positives, so how can their conclusions be remotely accurate?
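The base-rate arithmetic shows why that omission matters. Assume, generously, a detector that catches 95 percent of violent people and wrongly flags only 5 percent of peaceful ones, scanning a crowd where 1 person in 1,000 is actually violent (all three numbers are illustrative assumptions, not figures from the paper):

# Illustrative base-rate calculation; the rates below are assumptions,
# not figures from the "Eye in the Sky" paper.
sensitivity = 0.95          # chance a violent person is flagged
false_positive_rate = 0.05  # chance a peaceful person is flagged
prevalence = 0.001          # 1 in 1,000 people in the crowd is violent

true_alarms = sensitivity * prevalence
false_alarms = false_positive_rate * (1 - prevalence)
precision = true_alarms / (true_alarms + false_alarms)
print(f"{precision:.1%} of flagged people are actually violent")  # about 1.9%

In other words, even a strong classifier would hand police a queue of alerts in which roughly 98 percent of the flagged people did nothing wrong.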

Nevertheless, it looks like these things are going to wind up at local police departments anyway, and soon. Perhaps at one near you.

Read more at PoliceState.news.

J.D. Heyes is a senior writer for NaturalNews.com and NewsTarget.com, as well as editor of The National Sentinel.

Sources include:

NextGov.com

NaturalNews.com
