Through the efforts of governments in many countries, systems for tracking and identifying citizens are being deployed everywhere, which may soon put an end to public anonymity. It is not surprising, then, that there are always people looking for ways to evade what amounts to total surveillance.
A group of engineers from KU Leuven (Belgium) published a paper last week showing how an AI image-recognition system can be fooled with simple printed patterns.
For example, it is enough to hang a specially printed board around your neck, and from the AI's point of view this is tantamount to wearing an invisibility cloak.
It may seem strange, but this is in fact a well-known phenomenon in the AI world. Such patterns are called adversarial examples, and they exploit weaknesses in how computer vision systems perceive images. The result is a kind of visual deception: the AI sees something that is not there.
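The core idea can be illustrated with the classic fast gradient sign method (FGSM) from the adversarial-examples literature. The sketch below is purely illustrative: instead of a real vision model it uses a toy linear classifier, where the gradient of the score with respect to the input is simply the weight vector, so a small step against the sign of the gradient flips the prediction. All names and numbers are assumptions, not the method from the KU Leuven paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a linear classifier, score = w . x + b
w = rng.normal(size=16)
b = 0.0

def predict(x):
    return 1 if w @ x + b > 0 else 0

# A clean input that the model confidently classifies as class 1
# (constructed so every feature agrees with the sign of its weight).
x = np.where(w > 0, 0.5, -0.5)

# FGSM-style perturbation: for a linear model the gradient of the
# class-1 score w.r.t. x is just w, so stepping against sign(w)
# lowers the score as fast as possible per unit of max-norm change.
eps = 0.6
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # the small perturbation flips the label
```

A printed adversarial patch works on the same principle, except the perturbation is optimized to survive printing, viewing angle, and lighting, and is confined to a small region of the image rather than spread across every pixel.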
In the past, adversarial examples have been used to trick facial recognition systems into misidentifying a person wearing specially crafted glasses as someone else entirely, such as the actress Milla Jovovich.
At the same time, many researchers warn about the dangers of adversarial examples. In particular, one could be used to trick a self-driving car into mistaking an important road sign for an ordinary lamp post.