
With ‘AI slop’ distorting our reality, the world is sleepwalking into disaster


Opinion | Artificial intelligence (AI)

Nesrine Malik

A perverse information ecosystem is being mined by big tech for profit, fooling the unwary and sending algorithms crazy

Mon 21 Apr 2025 07.00 CEST

Illustration: Nate Kitch/The Guardian

There are two parallel image channels that dominate our daily visual consumption. In one, there are real pictures and footage of the world as it is: politics, sport, news and entertainment. In the other is AI slop, low-quality content with minimal human input. Some of it is banal and pointless – cartoonish images of celebrities, fantasy landscapes, anthropomorphised animals. And some is a sort of pornified display of women just simply … being, like a virtual girlfriend you cannot truly interact with. The range and scale of the content is staggering, and infiltrates everything from social media timelines to messages circulated on WhatsApp. The result is not just a blurring of reality, but a distortion of it.

A new genre of AI slop is rightwing political fantasy. There are entire YouTube videos of made-up scenarios in which Trump officials prevail against liberal forces. The White House account on X jumped on a trend of creating images in Studio Ghibli style and posted an image of a Dominican woman in tears as she is arrested by Immigration and Customs Enforcement (Ice). AI political memefare has, in fact, gone global. Chinese AI videos mocking overweight US workers on assembly lines after the tariff announcement raised a question for, and a response from, the White House spokesperson last week. The videos, she said, were made by those who “do not see the potential of the American worker”. And to prove how pervasive AI slop is, I had to triple-check that even that response was not itself quickly cobbled-together AI content fabricating another dunk on Trump’s enemies.

The impulse behind this politicisation of AI is not new; it is simply an extension of traditional propaganda. What is new is how democratised and ubiquitous it has become, and how it involves no real people or the physical constraints of real life, therefore providing an infinite number of fictional scenarios.

The fact that AI content is also spread through huge and ubiquitous chat channels such as WhatsApp means that there are no replies or comments to challenge its veracity. Whatever you receive is imbued with the authority and reliability of the person who has sent it. I am in a constant struggle with an otherwise online-savvy elderly relative who receives and believes a deluge of AI content on WhatsApp about Sudan’s war. The images and videos look real to her and are sent by people she trusts. Even absorbing that technology is capable of producing content with such verisimilitude is difficult. Combine this with the fact that the content chimes with her political desires and you have a degree of stickiness, even when some doubt is cast on the content.

What is emerging, amid all the landfill of giant balls of cats, is the use of AI to create, idealise and sanitise political scenarios by rendering them in triumphant or nostalgic visual language. Prof Roland Meyer, a scholar of media and visual culture, notes one particular “recent wave of AI-generated images of white, blond families presented by neofascist online accounts as models of a desirable future”. He attributes this not just to the political moment, but to the fact that “generative AI is structurally conservative, even nostalgic”. Generative AI is trained on pre-existing data, which research has shown is inherently biased against ethnic diversity, progressive gender roles and sexual orientations, therefore concentrating those norms in the output.

The same can be seen in “trad wife” content, which summons not only beautiful supplicant homemakers, but an entire throwback world in which men can immerse themselves. X timelines are awash with a sort of clothed nonsexual pornography, as AI images of women described as comely, fertile and submissive glimmer on the screen. White supremacy, autocracy, and fetishisation of natural hierarchies in race and gender are packaged as nostalgia for an imagined past. AI is already being described as the new aesthetic of fascism.

But it isn’t always as coherent as that. Most of the time, AI slop is just content-farming chaos. Exaggerated or sensationalised online material boosts engagement, giving creators the chance to make money based on shares, comments and so on. Journalist Max Read found that Facebook AI slop – the sloppiest of them all – is, “as far as Facebook is concerned”, not “junk”, but “precisely what the company wants: highly engaging content”. To social media giants, content is content; the cheaper it is, the less human labour it involves, the better. The outcome is an internet of robots, tickling human users into whatever feelings and passions keep them engaged.

But whatever the intent of its creators, this torrent of AI content leads to the desensitisation and overwhelming of visual palates. The overall effect of being exposed to AI images all the time, from the nonsensical to the soothing to the ideological, is that everything begins to land in a different way. In the real world, US politicians pose outside prison cages of deportees. Students at US universities are ambushed in the street and spirited away. People in Gaza burn alive. These pictures and videos join an infinite stream of others that violate physical and moral laws. The result is profound disorientation. You can’t believe your eyes, but also what can you believe if not your eyes? Everything starts to feel both too real and entirely unreal.

Combine that with the necessary trivialisation and provocative brevity of the attention economy and you have a grand circus of excess. Even when content is deeply serious, it is presented as entertainment or, as an intermission, in a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is an AI rendering of Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – a cabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesy studio apartments with a variation of “this is all I need” captions.

And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. The result is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Here, there is a new way of sleepwalking into disaster. Not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual show.

Nesrine Malik is a Guardian columnist

Even when content is deeply serious, it is presented as entertainment or, as an intermission, in a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is anAI renderingof Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – acabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesystudio apartmentswith a variation of “this is all I need” captions.And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. The result is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Here, there is a new way of sleepwalking into disaster. Not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual and show.Nesrine Malik is a Guardian columnistExplore more on these topicsArtificial intelligence (AI)OpinionSocial mediaWhatsAppXFacebookYouTubecommentShareReuse this content

There are two parallel image channels that dominate our daily visual consumption. In one, there are real pictures and footage of the world as it is: politics, sport, news and entertainment. In the other is AI slop, low-quality content with minimal human input. Some of it is banal and pointless – cartoonish images of celebrities, fantasy landscapes, anthropomorphised animals. And some is a sort of pornified display of women just simply … being, like a virtual girlfriend you cannot truly interact with. The range and scale of the content is staggering, and infiltrates everything from social media timelines to messages circulated onWhatsApp. The result is not just a blurring of reality, but a distortion of it.A new genre of AI slop is rightwing political fantasy. There are entireYouTube videosof made-up scenarios in which Trump officials prevail against liberal forces. The White House account on X jumped on a trend of creating images in Studio Ghibli style and posted animage of a Dominican woman in tearsas she is arrested by Immigration and Customs Enforcement (Ice). AI political memefare has, in fact, gone global. Chinese AI videos mocking overweight US workers on assembly lines after the tariff announcementraised a questionfor, and response from, the White House spokesperson last week. The videos, she said, were made by those who “do not see the potential of the American worker”. And to prove how pervasive AI slop is, I had to triple-check that even that response was not itself quickly cobbled-together AI content fabricating another dunk on Trump’s enemies.The impulse behind this politicisation of AI is not new; it is simply an extension of traditional propaganda. 
What is new is how democratised and ubiquitous it has become, and how it involves no real people or the physical constraints of real life, therefore providing an infinite number of fictional scenarios.The fact that AI content is also spread through huge and ubiquitous chat channels such as WhatsApp means that there are no replies or comments to challenge its veracity. Whatever you receive is imbued with the authority and reliability of the person who has sent it. I am in a constant struggle with an otherwise online-savvy elderly relative who receives and believes a deluge of AI content on WhatsApp about Sudan’s war. The images and videos look real to her and are sent by people she trusts. Even absorbing that technology is capable of producing content with such verisimilitude is difficult. Combine this with the fact that the content chimes with her political desires and you have a degree of stickiness, even when some doubt is cast on the content. What is emerging, amid all the landfill ofgiant balls of cats, is the use of AI to create, idealise and sanitise political scenarios by rendering them in triumphant or nostalgic visual language.Prof Roland Meyer, a scholar of media and visual culture,notes one particular“recent wave of AI-generated images of white, blond families presented by neofascist online accounts as models of a desirable future”. He attributes this not just to the political moment, but to the fact that “generative AI is structurally conservative, even nostalgic”. Generative AI is trained on pre-existing data, which research has shown isinherently biasedagainst ethnic diversity, progressive gender roles and sexual orientations, therefore concentrating those norms in the output.0:54Donald Trump shares bizarre AI-generated video of ‘Trump Gaza’ – videoThe same can be seen in “trad wife” content, which summons not only beautiful supplicant homemakers, but an entire throwback world in which men can immerse themselves. 
X timelines are awash with a sort of clothed nonsexual pornography, as AI images of women described as comely, fertile and submissive glimmer on the screen. White supremacy, autocracy, and fetishisation of natural hierarchies in race and gender are packaged as nostalgia for an imagined past. AI is already being described asthe new aesthetic of fascism.But it isn’t always as coherent as that. Most of the time, AI slop is just content-farming chaos. Exaggerated or sensationalised online material boosts engagement, giving creators the chance to make money based on shares, comments and so on. Journalist Max Readfound that Facebook AI slop– the sloppiest of them all – is, “as far as Facebook is concerned”, not “junk”, but “precisely what the company wants: highly engaging content”. To social media giants, content is content; the cheaper it is, the less human labour it involves, the better. The outcome is an internet of robots, tickling human users into whatever feelings and passions keep them engaged.But whatever the intent of its creators, this torrent of AI content leads to the desensitisation and overwhelming of visual palates. The overall effect of being exposed to AI images all the time, from the nonsensical to the soothing to the ideological, is that everything begins to land in a different way. In the real world, US politicianspose outside prison cages of deportees. Students at US universities areambushedin the street and spirited away. People in Gazaburnalive. These pictures and videos join an infinite stream of others that violate physical and moral laws. The result is profound disorientation. You can’t believe your eyes, but also what can you believe if not your eyes? Everything starts to feel both too real and entirely unreal.I can’t delete WhatsApp’s new AI tool. But I’ll use it over my dead body | Polly HudsonRead moreCombine that with the necessary trivialisation and provocative brevity of the attention economy and you have a grand circus of excess. 
Even when content is deeply serious, it is presented as entertainment or, as an intermission, in a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is anAI renderingof Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – acabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesystudio apartmentswith a variation of “this is all I need” captions.And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. The result is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Here, there is a new way of sleepwalking into disaster. Not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual and show.Nesrine Malik is a Guardian columnist

There are two parallel image channels that dominate our daily visual consumption. In one, there are real pictures and footage of the world as it is: politics, sport, news and entertainment. In the other is AI slop, low-quality content with minimal human input. Some of it is banal and pointless – cartoonish images of celebrities, fantasy landscapes, anthropomorphised animals. And some is a sort of pornified display of women just simply … being, like a virtual girlfriend you cannot truly interact with. The range and scale of the content is staggering, and infiltrates everything from social media timelines to messages circulated onWhatsApp. The result is not just a blurring of reality, but a distortion of it.A new genre of AI slop is rightwing political fantasy. There are entireYouTube videosof made-up scenarios in which Trump officials prevail against liberal forces. The White House account on X jumped on a trend of creating images in Studio Ghibli style and posted animage of a Dominican woman in tearsas she is arrested by Immigration and Customs Enforcement (Ice). AI political memefare has, in fact, gone global. Chinese AI videos mocking overweight US workers on assembly lines after the tariff announcementraised a questionfor, and response from, the White House spokesperson last week. The videos, she said, were made by those who “do not see the potential of the American worker”. And to prove how pervasive AI slop is, I had to triple-check that even that response was not itself quickly cobbled-together AI content fabricating another dunk on Trump’s enemies.The impulse behind this politicisation of AI is not new; it is simply an extension of traditional propaganda. 
What is new is how democratised and ubiquitous it has become, and how it involves no real people or the physical constraints of real life, therefore providing an infinite number of fictional scenarios.The fact that AI content is also spread through huge and ubiquitous chat channels such as WhatsApp means that there are no replies or comments to challenge its veracity. Whatever you receive is imbued with the authority and reliability of the person who has sent it. I am in a constant struggle with an otherwise online-savvy elderly relative who receives and believes a deluge of AI content on WhatsApp about Sudan’s war. The images and videos look real to her and are sent by people she trusts. Even absorbing that technology is capable of producing content with such verisimilitude is difficult. Combine this with the fact that the content chimes with her political desires and you have a degree of stickiness, even when some doubt is cast on the content. What is emerging, amid all the landfill ofgiant balls of cats, is the use of AI to create, idealise and sanitise political scenarios by rendering them in triumphant or nostalgic visual language.Prof Roland Meyer, a scholar of media and visual culture,notes one particular“recent wave of AI-generated images of white, blond families presented by neofascist online accounts as models of a desirable future”. He attributes this not just to the political moment, but to the fact that “generative AI is structurally conservative, even nostalgic”. Generative AI is trained on pre-existing data, which research has shown isinherently biasedagainst ethnic diversity, progressive gender roles and sexual orientations, therefore concentrating those norms in the output.0:54Donald Trump shares bizarre AI-generated video of ‘Trump Gaza’ – videoThe same can be seen in “trad wife” content, which summons not only beautiful supplicant homemakers, but an entire throwback world in which men can immerse themselves. 
X timelines are awash with a sort of clothed nonsexual pornography, as AI images of women described as comely, fertile and submissive glimmer on the screen. White supremacy, autocracy, and fetishisation of natural hierarchies in race and gender are packaged as nostalgia for an imagined past. AI is already being described asthe new aesthetic of fascism.But it isn’t always as coherent as that. Most of the time, AI slop is just content-farming chaos. Exaggerated or sensationalised online material boosts engagement, giving creators the chance to make money based on shares, comments and so on. Journalist Max Readfound that Facebook AI slop– the sloppiest of them all – is, “as far as Facebook is concerned”, not “junk”, but “precisely what the company wants: highly engaging content”. To social media giants, content is content; the cheaper it is, the less human labour it involves, the better. The outcome is an internet of robots, tickling human users into whatever feelings and passions keep them engaged.But whatever the intent of its creators, this torrent of AI content leads to the desensitisation and overwhelming of visual palates. The overall effect of being exposed to AI images all the time, from the nonsensical to the soothing to the ideological, is that everything begins to land in a different way. In the real world, US politicianspose outside prison cages of deportees. Students at US universities areambushedin the street and spirited away. People in Gazaburnalive. These pictures and videos join an infinite stream of others that violate physical and moral laws. The result is profound disorientation. You can’t believe your eyes, but also what can you believe if not your eyes? Everything starts to feel both too real and entirely unreal.I can’t delete WhatsApp’s new AI tool. But I’ll use it over my dead body | Polly HudsonRead moreCombine that with the necessary trivialisation and provocative brevity of the attention economy and you have a grand circus of excess. 
Even when content is deeply serious, it is presented as entertainment or, as an intermission, in a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is anAI renderingof Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – acabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesystudio apartmentswith a variation of “this is all I need” captions.And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. The result is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Here, there is a new way of sleepwalking into disaster. Not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual and show.Nesrine Malik is a Guardian columnist

There are two parallel image channels that dominate our daily visual consumption. In one, there are real pictures and footage of the world as it is: politics, sport, news and entertainment. In the other is AI slop, low-quality content with minimal human input. Some of it is banal and pointless – cartoonish images of celebrities, fantasy landscapes, anthropomorphised animals. And some is a sort of pornified display of women just simply … being, like a virtual girlfriend you cannot truly interact with. The range and scale of the content are staggering, and it infiltrates everything from social media timelines to messages circulated on WhatsApp. The result is not just a blurring of reality, but a distortion of it.

A new genre of AI slop is rightwing political fantasy. There are entire YouTube videos of made-up scenarios in which Trump officials prevail against liberal forces. The White House account on X jumped on a trend of creating images in Studio Ghibli style and posted an image of a Dominican woman in tears as she is arrested by Immigration and Customs Enforcement (Ice). AI political memefare has, in fact, gone global. Chinese AI videos mocking overweight US workers on assembly lines after the tariff announcement raised a question for, and response from, the White House spokesperson last week. The videos, she said, were made by those who “do not see the potential of the American worker”. And to prove how pervasive AI slop is, I had to triple-check that even that response was not itself quickly cobbled-together AI content fabricating another dunk on Trump’s enemies.

The impulse behind this politicisation of AI is not new; it is simply an extension of traditional propaganda. What is new is how democratised and ubiquitous it has become, and how it involves no real people or the physical constraints of real life, therefore providing an infinite number of fictional scenarios.

The fact that AI content is also spread through huge and ubiquitous chat channels such as WhatsApp means that there are no replies or comments to challenge its veracity. Whatever you receive is imbued with the authority and reliability of the person who has sent it. I am in a constant struggle with an otherwise online-savvy elderly relative who receives and believes a deluge of AI content on WhatsApp about Sudan’s war. The images and videos look real to her and are sent by people she trusts. Even absorbing the fact that technology is capable of producing content with such verisimilitude is difficult. Combine this with the fact that the content chimes with her political desires and you have a degree of stickiness, even when some doubt is cast on the content. What is emerging, amid all the landfill of giant balls of cats, is the use of AI to create, idealise and sanitise political scenarios by rendering them in triumphant or nostalgic visual language.

Prof Roland Meyer, a scholar of media and visual culture, notes one particular “recent wave of AI-generated images of white, blond families presented by neofascist online accounts as models of a desirable future”. He attributes this not just to the political moment, but to the fact that “generative AI is structurally conservative, even nostalgic”. Generative AI is trained on pre-existing data, which research has shown is inherently biased against ethnic diversity, progressive gender roles and sexual orientations, therefore concentrating those norms in the output.

[Video: Donald Trump shares bizarre AI-generated video of ‘Trump Gaza’]


The same can be seen in “trad wife” content, which summons not only beautiful supplicant homemakers, but an entire throwback world in which men can immerse themselves. X timelines are awash with a sort of clothed nonsexual pornography, as AI images of women described as comely, fertile and submissive glimmer on the screen. White supremacy, autocracy, and fetishisation of natural hierarchies in race and gender are packaged as nostalgia for an imagined past. AI is already being described as the new aesthetic of fascism.

But it isn’t always as coherent as that. Most of the time, AI slop is just content-farming chaos. Exaggerated or sensationalised online material boosts engagement, giving creators the chance to make money based on shares, comments and so on. Journalist Max Read found that Facebook AI slop – the sloppiest of them all – is, “as far as Facebook is concerned”, not “junk”, but “precisely what the company wants: highly engaging content”. To social media giants, content is content; the cheaper it is, the less human labour it involves, the better. The outcome is an internet of robots, tickling human users into whatever feelings and passions keep them engaged.

But whatever the intent of its creators, this torrent of AI content leads to the desensitisation and overwhelming of visual palates. The overall effect of being exposed to AI images all the time, from the nonsensical to the soothing to the ideological, is that everything begins to land in a different way. In the real world, US politicians pose outside prison cages of deportees. Students at US universities are ambushed in the street and spirited away. People in Gaza burn alive. These pictures and videos join an infinite stream of others that violate physical and moral laws. The result is profound disorientation. You can’t believe your eyes, but also what can you believe if not your eyes? Everything starts to feel both too real and entirely unreal.


Combine that with the necessary trivialisation and provocative brevity of the attention economy and you have a grand circus of excess. Even when content is deeply serious, it is presented as entertainment or, as an intermission, a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is an AI rendering of Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – a cabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesy studio apartments with a variation of “this is all I need” captions.

And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. The result is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Here there is a new way of sleepwalking into disaster: not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual show.

Nesrine Malik is a Guardian columnist


Explore more on these topics: Artificial intelligence (AI), Opinion, Social media, WhatsApp, X, Facebook, YouTube
