Slop is low-quality media—including writing and images—made using generative artificial intelligence technology. Coined in the 2020s, the term has a derogatory connotation akin to ‘spam.’ It has been variously defined as ‘digital clutter,’ ‘filler content produced by AI tools that prioritize speed and quantity over substance and quality,’ and ‘shoddy or unwanted AI content in social media, art, books and, increasingly, in search results.’ Jonathan Gilmore, a professor of philosophy at the City University of New York, describes the ‘incredibly banal, realistic style’ of AI slop as being ‘very easy to process.’
After Hurricane Helene, an AI-generated image of a girl holding a puppy while sitting in a boat on flooded waters circulated among Republicans, who used it as evidence of the Biden administration’s failure to respond to the disaster. U.S. Senator Mike Lee posted the image on social media before later deleting it. The image apparently originated on the Trump-centered Internet forum Patriots.win.
As large language models accelerated the creation of high-volume but low-quality written content and images, discussion began over an appropriate term for such material. Proposed terms included ‘AI garbage,’ ‘AI pollution,’ and ‘AI-generated dross.’ Early uses of ‘slop’ as a descriptor for low-grade AI material apparently came in reaction to the release of AI art generators in 2022, and its early use has been noted in the comments sections of 4chan, Hacker News, and YouTube as a form of in-group slang. British developer Simon Willison is credited with being an early champion of the term ‘slop’ in the mainstream.
The term gained popularity in the spring of 2024, in part because of Google’s use of its Gemini AI model to generate responses to search queries, and was widely used in media headlines by the fall of 2024. AI image and video slop proliferated on social media in part because it generated revenue for its creators on Facebook and TikTok, which especially incentivized people in lower-income countries to create images appealing to U.S. audiences, since those audiences attract higher advertising rates. The Atlantic noted that AI slop was becoming associated with the political right in the United States, which was using it for shitposting and engagement farming on social media, the technology offering ‘cheap, fast, on-demand fodder for content.’
Journalist Jason Koebler speculated that the bizarreness of some of the content may stem from creators writing prompts in Hindi, Urdu, or Vietnamese (languages underrepresented in the models’ training data), or from using erratic speech-to-text methods to translate their intentions into English. Speaking to New York magazine, a Kenyan creator of slop images described giving ChatGPT a prompt such as ‘WRITE ME 10 PROMPT picture OF JESUS WHICH WILLING BRING HIGH ENGAGEMENT ON FACEBOOK,’ and then feeding the resulting prompts into a text-to-image AI service such as Midjourney. The many AI-generated images of ‘Shrimp Jesus’ are a commonly cited example of slop. Sympathy-inducing images were created using prompts designed to appeal to a U.S. audience, such as one from a Hindi-language seminar: ‘american soldier veteran holding cardboard sign that says ‘today’s my birthday, please like’ injured in battle veteran war american flag.’


