As generative AI becomes more mainstream, many have raised concerns about its impact on artists: some believe image-generating AI may eventually replace human artists, while others object to companies training AI models on images and data they didn't have the legal rights to use. To combat both of these issues, some artists are editing their art to turn it into poisoned training data for AI. Computers fundamentally process images differently from humans, meaning that with enough effort, one can edit an image so that it appears unchanged to the human eye but looks wildly different to a computer. AI researchers have long theorized that maliciously crafted inputs could render an AI model useless, and the wanton scraping of images from the internet increases the likelihood of this happening. Some artists hope that these altered images will ruin AI models, thereby forcing companies to obtain their training data ethically. As AI development progresses, this method may become obsolete, but until then, companies may need to put more work into curating their AI training data.
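To make the idea of "looks the same to a human, different to a computer" concrete, here is a minimal sketch of one well-known technique, the fast gradient sign method, which nudges each pixel by an amount too small for people to notice while shifting a classifier's prediction. This is an illustrative assumption, not the actual tools artists use (those are more sophisticated); it assumes PyTorch, torchvision, and a hypothetical input file `artwork.png`.

```python
# Sketch of a tiny, nearly invisible perturbation changing a model's prediction.
# Assumes PyTorch + torchvision; "artwork.png" is a hypothetical example file.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
normalize = T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

# Load the image as a tensor in [0, 1] and track gradients with respect to its pixels.
image = preprocess(Image.open("artwork.png").convert("RGB")).unsqueeze(0)
image.requires_grad_(True)

# Original prediction.
logits = model(normalize(image))
label = logits.argmax(dim=1)

# Fast gradient sign method: push every pixel slightly in the direction that
# most increases the model's loss for its current prediction.
loss = torch.nn.functional.cross_entropy(logits, label)
loss.backward()
epsilon = 2.0 / 255.0  # about 2 intensity levels per channel -- imperceptible to humans
adversarial = (image + epsilon * image.grad.sign()).clamp(0.0, 1.0)

with torch.no_grad():
    new_label = model(normalize(adversarial)).argmax(dim=1)
print("original class:", label.item(), "-> perturbed class:", new_label.item())
```

The key point of the sketch is the size of `epsilon`: the edit is on the order of a couple of intensity levels per pixel, far below what a viewer would notice, yet it is chosen specifically to exploit how the model processes the image.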