KittyBNK
Tech News

The mother of one of Elon Musk’s children is suing xAI over nonconsensual deepfake images

January 16, 2026

Although X removed Grok’s ability to create nonconsensual digitally undressed images on the social platform, the standalone Grok app is another story. It reportedly continues to produce “nudified” deepfakes of real people. And now, Ashley St. Clair, a conservative political strategist and the mother of one of Elon Musk’s 14 children, has sued xAI over nonconsensual sexualized images of her that Grok allegedly produced.

In the court filing, St. Clair accused xAI’s Grok chatbot of creating and disseminating deepfakes of her “as a child stripped down to a string bikini, and as an adult in sexually explicit poses, covered in semen, or wearing only bikini floss.” In some cases, the chatbot allegedly produced bikini-clad deepfakes of St. Clair based on a photo of her as a 14-year-old. “People took pictures of me as a child and undressed me. There’s one where they undressed me and bent me over, and in the background is my child’s backpack that he’s wearing right now,” she said.

“I am also seeing images where they add bruises to women, beat them up, tie them up, mutilated,” St. Clair told The Guardian. “These sickos used to have to go to the dark depths of the internet, and now it is on a mainstream social media app.”

St. Clair said that, after she reported the images to X, the social platform replied that the content didn’t violate any policies. In addition, she claims that X left the images posted for up to seven days after she reported them. St. Clair said xAI then retaliated against her by creating more digitally undressed deepfakes of her, therefore “making [St. Clair] the laughingstock of the social media platform.”

She accused the company of then revoking her X Premium subscription, verification checkmark and ability to monetize content on the platform. “xAI further banned [her] from repurchasing Premium,” St. Clair’s court filing states.

On Wednesday, X said it changed its policies so that Grok would no longer generate sexualized images of children or nonconsensual nudity “in those jurisdictions where it’s illegal.” However, the standalone Grok app reportedly continues to undress and sexualize photos when prompted to do so.

[Image caption: Neither Apple nor Google has removed the Grok app despite explicit policy violations. (Anna Moneymaker via Getty Images)]

Apple and Google have thus far done, well, absolutely nothing. Despite the multi-week outrage over the deepfakes — and an open letter from 28 advocacy groups — neither company has removed the X or Grok apps from their app stores. Both the App Store and Play Store have policies that explicitly prohibit apps that generate such content.

Neither Apple nor Google has responded to multiple requests for comment from Engadget, including a follow-up email sent on Friday regarding the Grok app continuing to “nudify” photos of real women and other people.

While Apple and Google fail to act, many governments have done the opposite. On Monday, Malaysia and Indonesia banned Grok. The same day, UK regulator Ofcom opened a formal investigation into X. California opened one on Wednesday. The US Senate even passed the Defiance Act for a second time in the wake of the blowback.

“If you are a woman, you can’t post a picture, and you can’t speak, or you risk this abuse,” St. Clair told The Guardian. “It’s dangerous, and I believe this is by design. You are supposed to feed AI humanity and thoughts, and when you are doing things that particularly impact women, and they don’t want to participate in it because they are being targeted, it means the AI is inherently going to be biased.”

Speaking about Musk and his team, she added that “these people believe they are above the law, because they are. They don’t think they are going to get in trouble, they think they have no consequences.”

