Saying you don’t care about your right to privacy because you have nothing to hide is as clueless as saying you don’t care about freedom of speech because you have nothing to say.
Under a surveillance state (body cams, social media, cell phones, even toys), you don't get protection. You get a thin illusion of security, and then an above-the-law elite who can and will do anything they like, including falsifying and tampering with evidence. It starts innocently. Most cops are good, honest people, but what about politicians? Fear makes people stupid.
The surveillance state eventually creates a system where innocent people are treated with varying degrees of cruelty based on the whims, misperceptions, and personal prejudices of those in power. You wake up in another country. The truth no longer matters. The maw of the giant reptile opens. The masses line up at the teeth to walk the red carpet.
Half of all US adults are already on an unregulated law enforcement facial recognition database without even knowing it, and soon, police will be able to search through thousands of hours of footage in minutes to find anyone or anything.
Following its acquisition of tech start-ups Misfit and Dextro, Taser announced last week that it was launching an AI division named Axon AI. The division's aim is to manage and improve evidence.com, the company's cloud storage site housing US law enforcement body cam footage, and to develop the Axon AI platform.
Taser now sells more body cams than it does its eponymous personal defense weapon.
Dextro created the first computer vision and deep learning system that makes visual content in video searchable in real time, while researchers at Misfit were working to improve the accuracy, efficiency, and speed of image and video processing.
“To clarify, Dextro’s system offers computer vision techniques to identify faces and other objects for improving the efficiency of the redaction workflow. AI enables you to become more targeted when needed,” Steve Tuttle, Taser’s vice president of communications, told Vocativ.
Now under one roof at Axon, the combined team will develop machine learning algorithms that parse footage both in real time and retrospectively, discerning objects, individual people, and actions such as a traffic stop or a foot chase. All footage can be tagged, classified, and then searched when needed.
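The tag-then-search workflow described above can be illustrated with a simple inverted index: each clip receives a set of tags (which, in a real system, would come from the vision models), and a query returns the clips matching every requested tag. This is a hypothetical, stdlib-only sketch, not Axon's actual design; the name `FootageIndex` and the sample clip IDs are invented for illustration.

```python
from collections import defaultdict

class FootageIndex:
    """Toy inverted index over footage tags (illustrative only)."""

    def __init__(self):
        # Maps each tag to the set of clip ids carrying it.
        self._by_tag = defaultdict(set)

    def tag(self, clip_id, tags):
        """Record tags (e.g. emitted by a vision model) for one clip."""
        for t in tags:
            self._by_tag[t].add(clip_id)

    def search(self, *tags):
        """Return the clip ids carrying every requested tag."""
        if not tags:
            return set()
        return set.intersection(*(self._by_tag[t] for t in tags))

idx = FootageIndex()
idx.tag("clip-001", ["traffic stop", "vehicle"])
idx.tag("clip-002", ["foot chase", "vehicle"])
idx.tag("clip-003", ["traffic stop"])
```

Searching `idx.search("traffic stop", "vehicle")` would then narrow thousands of clips down to the few matching both tags; production systems add time ranges, geolocation, and ranking on top of the same basic idea.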
At present, it takes eight hours to redact faces from one hour of footage, but Axon AI’s future tech aims to reduce that to 90 minutes.
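To make the redaction step concrete: automated redaction typically runs a face detector over each frame and then pixelates the detected regions so identities cannot be recovered. The following is a minimal, stdlib-only sketch of the pixelation step alone, assuming a detector has already supplied the bounding box; it is not Axon's implementation, and real pipelines operate on image arrays via libraries such as OpenCV rather than lists.

```python
def pixelate_region(frame, box, block=4):
    """Pixelate a rectangular region of a frame.

    frame: 2D list of grayscale pixel values (rows of ints).
    box: (top, left, bottom, right) bounds, assumed to come from
         a face detector (detection itself is out of scope here).
    block: mosaic block size; larger blocks redact more coarsely.
    """
    top, left, bottom, right = box
    for y in range(top, bottom, block):
        for x in range(left, right, block):
            ys = range(y, min(y + block, bottom))
            xs = range(x, min(x + block, right))
            # Average the pixels in this block...
            vals = [frame[yy][xx] for yy in ys for xx in xs]
            avg = sum(vals) // len(vals)
            # ...and overwrite the block with that single value,
            # destroying the identifying detail.
            for yy in ys:
                for xx in xs:
                    frame[yy][xx] = avg
    return frame

# A tiny 8x8 synthetic frame; redact the top-left 4x4 region.
frame = [[r * 8 + c for c in range(8)] for r in range(8)]
redacted = pixelate_region(frame, (0, 0, 4, 4), block=4)
```

The cost that makes manual redaction take eight hours per hour of footage is doing this by hand, frame by frame; automating the detection plus this kind of blur is what the announced tooling aims to speed up.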
A study by the Johns Hopkins University Applied Physics Laboratory, sponsored by the Department of Justice’s National Institute of Justice, found that facial recognition software was built in, or at least available as an option, in nine of the body cams on the market in the United States.
Another study, carried out by Georgetown University’s Center on Privacy & Technology, found that 117 million American adults have been added to a law enforcement facial recognition database. Evidence.com boasts over 5.2 petabytes of information stored on its cloud servers (all US academic research libraries combined amount to 2 petabytes of data).
Police can search through this database and find anyone who has a state-issued photo ID, such as a driver’s license, and identify them in photos taken in public spaces or shared online, with no warrant.
Taser wants to fully automate the documentation of each and every police encounter with live transcription and image analysis.
“Police officers are spending most of their time entering information into computers. We want to automate all of that,” Taser CEO Rick Smith said Wednesday.
The extent of this passive surveillance has generated concern among privacy advocates, especially in light of the NSA’s Prism program revelations leaked by Edward Snowden in 2013.
“We support body cams on the condition that they serve as an effective police oversight tool and not as yet another set of government surveillance cameras,” Jay Stanley, a senior policy analyst at the American Civil Liberties Union told Forbes. “The storing of video and running analytics on it does not strike the right balance between privacy, oversight, and usefulness to the police.”
Supposedly private channels on Facebook are monitored:
… Here’s the original version of Zuckerberg’s comment on AI (emphasis added):
“The long term promise of AI is that in addition to identifying risks more quickly and accurately than would have already happened, it may also identify risks that nobody would have flagged at all — including terrorists planning attacks using private channels, people bullying someone too afraid to report it themselves, and other issues both local and global. It will take many years to develop these systems.”
The Associated Press originally published the paragraph that included the mention of monitoring private channels, but its story has since been updated “to substitute a quote on artificial intelligence to reflect what was actually in the manifesto.”
While it’s common and expected for social networks to try to keep terrorists off their platforms, suggesting that Facebook plans to listen in on seemingly “private” conversations raises questions about the company’s stance on privacy.