You’ve probably encountered pictures in your social media feeds that look like a cross between photographs and computer-generated graphics. Some are fantastical, like Shrimp Jesus, and some are plausible at a quick glance, like the little girl clutching a puppy in a boat during a flood.
These are examples of AI slop: low- to mid-quality content (video, images, audio, text, or a mix) created with AI tools, often with little regard for accuracy. It’s quick, easy, and cheap to produce. AI slop producers typically post it on social media to exploit the attention economy of the internet, displacing higher-quality material that could be more useful.
AI slop has been spreading over the past few years. As the term “slop” suggests, that’s generally not good for people who use the internet.
AI slop’s many forms
The Guardian published an analysis in July 2025 examining how AI slop is taking over YouTube’s fastest-growing channels. The journalists found that nine of the top 100 fastest-growing channels feature AI-generated content like zombie soccer and cat soap operas.
The song “Let it Burn,” allegedly recorded by a band called The Velvet Sundown, was AI-generated.
Listening to Spotify? Be skeptical of that new band, The Velvet Sundown, which appeared on the streaming service with an inventive backstory and derivative tracks. It’s AI-generated.
In many cases, people post AI slop that’s just good enough to attract and hold users’ attention, allowing the poster to profit from platforms that monetize streaming and view-based content.
The ease of generating content with AI allows people to submit low-quality articles to publications. Clarkesworld, an online science fiction magazine that accepts user submissions and pays contributors, stopped taking new submissions in 2024 because of the flood of AI-generated writing it was receiving.
These aren’t the only places where this happens. Even Wikipedia is dealing with AI-generated low-quality content that strains its entire community moderation system. If the community isn’t successful in removing it, a key information resource people rely on is at risk.
Last Week Tonight with John Oliver delves into AI slop.
Harms of AI slop
AI-driven slop is making its way upstream into people’s media diets as well. During Hurricane Helene, opponents of President Joe Biden cited AI-generated images of a displaced child clutching a puppy as evidence of the administration’s purported mishandling of the disaster response. Even when it’s obvious that content is AI-generated, it can still be used to spread misinformation by fooling people who glance at it only briefly.
AI slop also harms artists by causing job and financial losses and crowding out content made by real creators. The algorithms that drive social media consumption often don’t distinguish this lower-quality AI-generated content from authentic work, and it displaces whole classes of creators who previously made their livelihood from online content.
Wherever the feature is available, you can flag content that’s harmful or problematic. On some platforms, you can add community notes to give the content context. For harmful content, you can try reporting it.
Along with forcing us to stay on guard for deepfakes and “inauthentic” social media accounts, AI is now producing piles of dreck that degrade our media environment. At least there’s a catchy name for it.
Adam Nemeroff is an assistant provost for innovations in learning, teaching, and technology at Quinnipiac University.
This article is republished from The Conversation under a Creative Commons license. Read the original article.