Deep neural networks can be attacked with poisoned training examples that are nevertheless correctly labeled ("clean-label"), causing image classifiers to assign adversary-chosen labels to specific target images. An approach to crafting such poisoned training examples was published in the paper “Poison Frogs! Targeted Clean-Label Poisoning Attacks on Neural Networks”. Here is a summary.
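The paper's core technique is a feature collision: the attacker optimizes a poison image that stays close to a "base" image from the adversary's chosen class in input space, while its feature representation moves close to the target image's. The sketch below uses a toy fixed linear map as a stand-in for a trained network's feature layer; the map `W`, the learning rate, and `beta` are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Toy "feature extractor": a fixed random linear map standing in for the
# penultimate layer of a trained network (an assumption for illustration).
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16)) / 4
f = lambda x: W @ x

target = rng.normal(size=16)   # target image the attacker wants misclassified
base = rng.normal(size=16)     # base image from the adversary's chosen class
beta = 0.1                     # keeps the poison close to the base in input space

x = base.copy()
lr = 0.05
for _ in range(500):
    # gradient of ||f(x) - f(target)||^2 + beta * ||x - base||^2
    grad = 2 * W.T @ (f(x) - f(target)) + 2 * beta * (x - base)
    x -= lr * grad
# x now looks similar to `base` but is much closer to `target` in feature space.
```

Because the poison is derived from a base image of the adversary's class and only perturbed slightly, a human labeler would still assign it that class, which is what makes the attack "clean-label".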
Data Poisoning Attacks
In data poisoning attacks, an adversary injects malicious training examples into the training set to manipulate the behavior of the model at test time. For instance, an adversary might try to degrade the model's overall performance, or inject a backdoor…
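A minimal sketch of the idea, using a toy 1-D dataset and a trivial "model" (both illustrative assumptions): the adversary injects mislabeled points near one class's cluster, which shifts the learned decision threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

# Clean 1-D training data: class 0 clustered around -2, class 1 around +2.
X_clean = np.concatenate([rng.normal(-2, 0.5, 50), rng.normal(2, 0.5, 50)])
y_clean = np.array([0] * 50 + [1] * 50)

# Poison: points near +2 that the adversary labels as class 0.
X_poison = rng.normal(2, 0.5, 30)
y_poison = np.zeros(30, dtype=int)

X = np.concatenate([X_clean, X_poison])
y = np.concatenate([y_clean, y_poison])

# Toy "model": a threshold at the midpoint between the two class means.
def fit_threshold(X, y):
    return (X[y == 0].mean() + X[y == 1].mean()) / 2

clean_t = fit_threshold(X_clean, y_clean)   # near 0
poisoned_t = fit_threshold(X, y)            # shifted toward the class-1 cluster
```

After poisoning, some genuine class-1 inputs fall on the class-0 side of the threshold, i.e. the model's test-time behavior has been manipulated without touching the model itself.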
Recently, the paper “Data Poisoning Attacks and Defenses to Crowdsourcing Systems” was published on arXiv; in it, the authors analyze data poisoning attacks on crowdsourced data labeling. Here is a summary.
For classification tasks such as image classification, large, high-quality labeled datasets are required to build machine learning models that achieve state-of-the-art performance. However, creating these datasets is often challenging: in many situations only unlabeled data is available, and labeling millions or even billions of items would require enormous manual effort involving thousands of people.
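Crowdsourcing systems address this by having several workers label each item and aggregating their answers, commonly by majority vote. The item names and labels below are hypothetical, purely to show the aggregation step:

```python
from collections import Counter

# Hypothetical crowdsourced labels: each item receives votes from several workers.
answers = {
    "img1": ["cat", "cat", "dog"],
    "img2": ["dog", "dog", "dog"],
}

def majority_vote(labels):
    # The most frequent label wins.
    return Counter(labels).most_common(1)[0][0]

aggregated = {item: majority_vote(votes) for item, votes in answers.items()}
```

A data poisoning attack in this setting works by injecting malicious workers whose votes flip the majority on targeted items, which is exactly the threat model the paper analyzes.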
Let’s assume you want to send a message and ensure that a) the receiver can detect whether the message was modified (integrity) and b) the receiver can verify that you are its author (message authentication). In that case you typically use a digital signature to sign the message.
Actually, digital signatures also provide non-repudiation, i.e. the sender cannot later deny having signed the message. However, integrity and message authentication are the two most common use cases for signatures.
Initially, you create a public key and a private key. You have…
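To illustrate the roles the two keys play, here is textbook RSA signing with tiny hard-coded primes. This is deliberately insecure and only shows the mechanics; real signatures use vetted libraries, much larger keys, and proper padding schemes.

```python
import hashlib

# Toy textbook-RSA key pair (tiny primes, purely for illustration).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent: e*d ≡ 1 mod (p-1)(q-1)

def digest(msg: bytes) -> int:
    # Reduce the hash mod n only because the toy modulus is so small.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    # Signing uses the private exponent d.
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # Verification uses only the public key (e, n).
    return pow(sig, e, n) == digest(msg)

sig = sign(b"hello")
```

The receiver needs only your public key to check the signature; any change to the message or the signature makes verification fail, which is where integrity and message authentication come from.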
IT Security Architect and Software Engineer, interested in machine learning vulnerabilities and in using machine learning to improve security.