We're living in a cyberpunk future:
“Fooling automated surveillance cameras: adversarial patches to attack person detection” https://arxiv.org/abs/1904.08653
Demo of generating adversarial patches against YOLOv2 https://youtu.be/MIbFvK2S9g8
William Gibson (@GreatDismal) wrote of the “ugly T-shirt” in his novels: a T-shirt, printed with pieces of faces, so ugly that the facial recognition in CCTV cameras refuses to log the footage, a kind of backdoor.
“Surveillance cameras can all see it, but then they forget they’ve seen it.”
While printing this pattern on a T-shirt might make for a good fashion item, it will be useless against any detection system that doesn’t deploy a variant of YOLOv2. In the future, your adversarial outfits will also need to adapt in real time.
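The core idea behind these patches is simple: optimize the patch pixels by gradient descent to minimize the detector’s “objectness” score for the person. Below is a toy sketch of that loop, assuming a stand-in linear detector (the real attack in the paper backpropagates through the full YOLOv2 network; the detector weights, patch size, and placement here are all hypothetical illustrations):

```python
import numpy as np

# Toy stand-in "person detector": a linear objectness score over an
# image. The real attack differentiates through YOLOv2; this only
# illustrates the optimization loop.
rng = np.random.default_rng(0)
H = W = 16
w = rng.normal(size=(H, W))          # hypothetical detector weights

def objectness(img):
    # Higher score = detector is more confident a person is present.
    return float(np.sum(w * img))

img = rng.uniform(size=(H, W))       # the "scene" containing a person
patch = np.zeros((4, 4))             # adversarial patch, optimized below
y, x = 6, 6                          # where the patch is worn/pasted

def apply_patch(img, patch):
    out = img.copy()
    out[y:y+4, x:x+4] = patch
    return out

score_before = objectness(apply_patch(img, patch))

# For this linear detector, the gradient of the score w.r.t. the patch
# is just the detector weights under the patch region.
lr = 0.5
for _ in range(100):
    grad = w[y:y+4, x:x+4]                        # d objectness / d patch
    patch = np.clip(patch - lr * grad, 0.0, 1.0)  # keep valid pixel range

score_after = objectness(apply_patch(img, patch))
print(score_before, score_after)     # the optimized patch lowers the score
```

Because the optimization targets one specific detector’s gradients, the resulting patch is tied to that model, which is exactly why a YOLOv2 patch transfers poorly to other systems.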
I would've preferred a less attention-grabbing bag for flashing disruptive adversarial images in the wild...
You can follow @hardmaru.