Grok’s ability to sexualize women - are we really surprised?

Sex sells.

Any publicity is good publicity.

Whether it’s right or wrong, it doesn’t matter to many people.

It’s ironic: I go out of my way to blur women in my photos, yet here we are, where all kinds of women, even hijabis, are being exposed with AI.

From Reddit:

There has been an increase in people misusing AI tools to turn photos of hijabi women into harmful images and spread them online without consent.

You have a bunch of guys asking Grok things like "put her in a bikini", "turn her around", "make her touch her toes while turned around".

Muslim women, if you are reading this and share your images online, you might want to check out what's happening on X. Images shared online are no longer in your possession, and people can do with them as they like.

One video of a well-known hijabi went viral; it's extremely NSFW and was published on corn sites.

Please be cautious with public photos. May Allah protect us all. 😧

Unfortunate, of course, but not surprising. I mentioned this over a year ago in my other blog when discussing “undressing” websites:

This is another reason why women and girls in general shouldn’t post their photos online. The fitna is already there even if the photos aren’t sexualized, but this is a whole other level of just destroying a girl’s reputation.

Imagine if this became rampant in the Muslim community. It would be a huge mess, with families’ reputations tarnished and girls slandered left and right. Imagine a high school or middle school boy liking a Muslim girl at school and trying this feature on her. She may not even post photos online or be on social media at all, but anyone can take your photo these days and do whatever they want with it.
