The student news site of The Archer School for Girls

The Oracle


Column: AI deepfakes denote deep consequences for all

Photo credit: Allie Yang
Celebrity names are connected by arrows to 4chan and Telegram logos. These are just some of the public figures who have been targeted with explicit AI-generated imagery on social platforms. (Graphic Illustration by Allie Yang)

Content Warning: This column contains mention of mature and sexual themes that may not be suitable for younger readers.

Even though she made Grammys history and was named Time’s Person of the Year, Taylor Swift just can’t catch a break.

In late January, a horrifying trend emerged that reduced her to an objectified pawn. The message board 4chan, known for spreading conspiratorial hate speech, hosted a chatroom where participants eagerly used text-to-image AI to create and share explicit images of the singer-songwriter. Compliments flowed within 4chan itself, and in some cases, the images spread and went viral in a matter of weeks. For example, one photo accrued 24,000 reposts, over 100,000 likes and 45 million views in a mere 17 hours before being taken down, while others, all marked with the hashtag “Taylor Swift AI,” trended across multiple X accounts.

The rise and spread of explicit AI-generated deepfakes are tainted with a game-like, albeit sickening, nature because, according to Graphika senior analyst Cristina López, “These images originated from a community of people motivated by the ‘challenge’ of circumventing the safeguards of generative A.I. products, and new restrictions are seen as just another obstacle to ‘defeat.’” This pseudo-competition wouldn’t even exist if it weren’t for the abuse of AI image generation with real lives at stake.

This deepfake pattern is further fueled by collaboration. As users of sites like 4chan and Telegram share tips on how to evade restrictions, the abuse of this technology persists, and public figures are continuously defaced in the public eye. White House Press Secretary Karine Jean-Pierre said it best: “[Social media has] an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people.”

Increased user freedom within the internet and access to its tools cannot come at the sacrifice of basic human decency.

These deepfakes do not only affect celebrities; they also reach demographics as vulnerable as female minors. Prior to the Taylor Swift sweep, boys in southern Spain between the ages of 12 and 14 used an AI-powered platform to create digitally undressed depictions of real girls. Over 20 female adolescents, some as young as 11 years old, had their dignity and privacy stolen by AI programs they were completely unaware of, sparking well-deserved outrage around the world.

Child sexual abuse material is already a societal epidemic: Between 2008 and 2023, reports of such material rose by 15,000%. In 2021, the National Center for Missing and Exploited Children received 85 million pieces of content containing child sexual exploitation. The fact that this crisis has reached a new, digital plane further reinforces the need for a collective moral check-in: Is it right to undress, deface or create a lewd portrait of someone from the comfort of your anonymity?

If your answer is not an immediate and resounding “no,” I encourage you to reflect on why and what your answer would mean for societal power dynamics: Whose individual rights are at risk if objectifying and disparaging photos can be created by an accessible technology for malicious gains in an instant? The truth of the matter is simple: Whether it is their bodies or their identities, no human is a chess piece — policies protecting their rights are not puzzles to be solved or games to be won.

About the Contributor
Allie Yang, Columnist
After serving on Archer's yearbook, Hestia's Flame, for a year as a staffer, Allie Yang joined The Oracle as a senior reporter in 2022. She became a columnist in 2023. Her column discusses aspects of rising technology, such as AI and social media.
