Evolution Strategies can be used to craft adversarial examples that fool image classifiers in a black-box setting, and a recently published paper compares three variants for this task. In contrast to a white-box setting, where the adversary has full access to the neural network and knows its architecture and parameters, a black-box setting is more realistic: the attacker can only query the model and observe its outputs. The two population-less variants, (1+1)-ES and Natural Evolution Strategies, and the population-based variant CMA-ES are tested against three network architectures (VGG-16, Inception-v3 and ResNet-50), all trained on ImageNet. It turned out that CMA-ES dominates the other variants: it achieves higher success rates and requires fewer queries to compute the adversarial examples. The authors attribute this to the use of a population, which may lead to better exploration of the fitness landscape.
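To make the black-box setting concrete, here is a minimal sketch of the simplest variant, a (1+1)-ES attack. It is not the paper's implementation; the `predict` function, the confidence-based fitness, the step size `sigma` and the L-infinity bound `epsilon` are all illustrative assumptions. The only thing the attacker uses is the classifier's output probabilities, which is exactly what "black-box" means here.

```python
import numpy as np

def one_plus_one_es_attack(predict, image, true_label, sigma=0.05,
                           epsilon=0.1, max_queries=1000):
    """Illustrative (1+1)-ES black-box attack (not the paper's exact setup).

    predict: black-box function mapping an image to class probabilities.
    Fitness to minimize: the classifier's confidence in the true label.
    """
    delta = np.zeros_like(image)               # current perturbation (parent)
    best_fitness = predict(image)[true_label]  # confidence we want to lower
    for _ in range(max_queries):
        # One offspring: Gaussian mutation, clipped to the L-infinity ball
        candidate = np.clip(delta + sigma * np.random.randn(*image.shape),
                            -epsilon, epsilon)
        adv = np.clip(image + candidate, 0.0, 1.0)
        fitness = predict(adv)[true_label]
        if fitness < best_fitness:             # greedy (1+1) selection
            delta, best_fitness = candidate, fitness
        if np.argmax(predict(np.clip(image + delta, 0.0, 1.0))) != true_label:
            return np.clip(image + delta, 0.0, 1.0)  # misclassified: success
    return None                                # query budget exhausted
```

NES would replace the greedy accept/reject step with a gradient estimate averaged over many mutations, and CMA-ES would additionally evolve a whole population while adapting the covariance of the mutation distribution, which is what the paper credits for its better exploration.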
