Taylor Swift No Longer Searchable on Twitter After Deepfake AI Images Go Viral

Jan 27, 2024 15:17



Taylor Swift's Name No Longer Searchable on X After AI-Generated Explicit Photos Go Viral: https://t.co/Ir07ds3hf5
- TheWrap (@TheWrap) January 27, 2024
Searching “Taylor Swift” on Twitter is yielding no results, as the social media platform is no longer allowing certain searches of the singer after AI nudes flooded the service earlier this ( Read more... )

celebrity social media, computers and technology, taylor swift, viral

Leave a comment

Comments 90

firstknivesclub January 27 2024, 23:20:46 UTC
Musk is a fucking idiot lmao

Reply


musicnkisses January 27 2024, 23:22:06 UTC
I wish other celebs who have complained about this got the same energy. Hopefully something good comes out of it

Reply

iznanassi January 27 2024, 23:44:28 UTC
you just know it's gonna be something super vague with an insane evidence threshold that only a certain type of person can hope to even begin to seek justice for, if anything

Reply

genbu_no_miko24 January 27 2024, 23:47:19 UTC
Her fans can complain but I swear I've never seen a more powerful white woman (in the industry) than Taylor. Even though I don't believe in forcing people to speak up, I can see why people would want her to, because she clearly wields a certain level of power and attention. Like obvs she won't change the world at the drop of a hat, but there's no denying she has some sway.

Reply

helliosx January 27 2024, 23:50:08 UTC
Henry Cavill got straight up blocked on bing create too

Reply


genbu_no_miko24 January 27 2024, 23:22:47 UTC
On one hand good, but on the other hand, I gotta admit: where is this action for a lot of the other stuff that pops up on twitter/social media?

Reply


pyroyale January 27 2024, 23:23:29 UTC
Well, that's the bare minimum. How about banning the accounts sharing them? I don't think Twitter has the responsibility or reach to ban AI images but it can ban the people posting them. It could also stop taking advertising money from accounts trying to hawk AI but as Twitter needs every cent it can get, I doubt that will happen.

Reply

insomniachobs January 28 2024, 00:23:26 UTC
If they hadn't gutted the moderation and made it a free-for-all for racists, misogynists and Nazis in the name of free speech, they could at least just be dealing with pornographic images when they come up (AI or no), but this is all happening by Mollusc's design.

I actually went to report those photos when I saw them come up and they have massively weakened the category for inappropriate images. I took one look at it and the language was so weak I just knew it was pointless. Same way I reported an account that explicitly stated it existed to target black women, had a timeline full of the most overt racism, and I got told it wasn't against TOS.

Reply


tilney January 27 2024, 23:24:39 UTC
I don't know how, but this whole AI image generation thing using photos of people needs to be shut the fuck down NOW. Absolutely nothing good will ever come of this being more and more advanced and lifelike (I know it's probably too late and we're screwed but still).

Reply

the_wicker_man January 28 2024, 07:43:38 UTC
I think it kind of shows how ill-equipped our current system is for rapid advancements in technology. The answer right now would be that everybody's likeness is trademarked, so everybody is a commodity whether we like it or not, and we protect that commodity.

Edited for brevity

Reply

tilney January 28 2024, 19:57:37 UTC
It's absolutely the case that the legislation is woefully underprepared. Yes, that might be the answer!

Reply

ahnaaah January 28 2024, 11:39:27 UTC
Last year there was a massive case at a private (and very expensive) school where I live of boys making pornographic deepfakes of the girls they go to school with and sharing them with each other. The girls only learned about it because one boy who received or saw them told them. When they took it to admin, the school was complacent about it, and the boys' parents were assholes who tried to treat it as a joke instead of the sexual harassment that it was, so the girls' parents had to go the cops/public media/judicial route so they could at least get answers.

Reply

