Facial recognition has been in use in retail stores such as Walmart since around 2013–2014. Now it is becoming more and more pervasive, and, combined with machine learning and AI, it can deliver very detailed analysis of how you behave and move around in public spaces.
This article gives an overview of this secretive practice, which is being sold to the public as an effective tool to fight shoplifting:
This is not the best video I have seen, but it gives an idea of how far this can go (I have seen a video that describes in real time, in text, how the customer moves and the emotion that comes with it):
PS This is all software-based and can be installed and used with any camera feed, no matter where the camera is placed or what kind of camera is used.
In China, facial recognition has been "successfully" employed to identify petty criminals while they were attending concerts.
The next step: even without a smartphone to track your every move, facial recognition will allow governments or private interests to follow you everywhere and record your actions.
Ha ha, yes, you should read the Hacker News thread. There is a suggestion of wearing a cap with infrared LED lights. And people responding "Geez, we are in a time where we really are required to wear a tinfoil hat" (well, comments of that nature).
PS2. There are also drone developments with gigapixel cameras (basically hundreds of smartphone cameras combined into one image) that can glide above a city all day, recording every move. There is no facial recognition from above, but the ultra-high resolution allows zooming in on anyone and doing so-called "gait detection" to identify that person.
A recent article in The Intercept talks about a new development: affect recognition. Here is an excerpt that mentions AI Now, a research institute at New York University.
AI Now's 2018 report is a 56-page record of how "artificial intelligence" – an umbrella term that includes a myriad of both scientific attempts to simulate human judgment and marketing nonsense – continues to spread without oversight, regulation, or meaningful ethical scrutiny. The report covers a wide expanse of uses and abuses, including instances of racial discrimination, police surveillance, and how trade secrecy laws can hide biased code from an AI-surveilled public. But AI Now, which was established last year to grapple with the social implications of artificial intelligence, expresses in the document particular dread over affect recognition, "a subclass of facial recognition that claims to detect things such as personality, inner feelings, mental health, and 'worker engagement' based on images or video of faces." The thought of your boss watching you through a camera that uses machine learning to constantly assess your mental state is bad enough, while the prospect of police using "affect recognition" to deduce your future criminality based on "micro-expressions" is exponentially worse.