I had a client train an AI on images I created without an extended-usage license, so the next time around I added adversarial noise to the images. The models I tested misclassified the images, and image generation seemed broken, so I'm curious how it impacted their attempts, if they even attempted it again; I don't know. I don't expect them to come to me and ask why their model is so interested in ducks...
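For anyone curious what "adversarial noise" means here: the misclassification effect described above is the classic adversarial-example behavior. Below is a minimal FGSM-style sketch on a toy linear classifier; the weights and inputs are made up for illustration, and this is not the commenter's actual tool, just the general idea that a small, targeted perturbation flips a model's prediction:

```python
import numpy as np

# Toy linear classifier: predicts class 1 if w @ x + b > 0.
# (Weights chosen arbitrarily for the example.)
w = np.array([0.5, -0.3, 0.8])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

x = np.array([0.2, -0.1, 0.3])  # clean input, score = 0.47 -> class 1

# FGSM-style perturbation: step against the sign of the gradient of the
# score with respect to the input. For a linear model that gradient is w.
eps = 0.7
x_adv = x - eps * np.sign(w)    # new score = 0.47 - 0.7 * (0.5 + 0.3 + 0.8) = -0.65

print(predict(x))      # 1
print(predict(x_adv))  # 0 -- a small structured perturbation flips the label
```

Real image-cloaking tools work against deep networks rather than a linear model, but the mechanism is the same: perturb in the direction that most damages the model's internal representation while keeping the image visually similar.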
Companies put measures in place to make sure that their software only functions correctly for paying customers. Some video games include intentional problems when the game believes IP laws have been violated.
If the client is sophisticated enough to realize what's going on, I don't believe it would be difficult for them to prefilter the images to remove said noise.
For sure. It was a test to see what would happen. I heard nothing, but I also didn't see my work appear again, so who knows...