Former child actor Mara Wilson has opened up about the distressing experience of her image being misused in pornography before she had even entered high school.
Wilson sounded the alarm amid the uproar surrounding X’s AI feature Grok, which has been used to create sexualized images of real people, warning that ‘the worst may be yet to come’.
Famous for her role as the lead character in the 1996 adaptation of Roald Dahl’s Matilda at the age of nine, Wilson has recounted the ‘nightmare’ she endured upon discovering that her image was being exploited by predators to produce child sexual abuse material (CSAM).
Now 38 years old, Wilson is alerting the public to the dangers presented by generative AI, which is being increasingly utilized to produce sexualized images of women and children without their consent.
In a piece for the Guardian, Wilson recounted her own challenging childhood experiences to emphasize the severity and reach of this issue.
She pointedly remarked that it wasn’t the Hollywood studios that misused her image sexually, but rather the public.

In her column, Wilson conveyed a lesson she has learned over the three decades since acting alongside Danny DeVito and Rhea Perlman in the hit family film.
Wilson stated: “Hollywood throws you into the pool… but it’s the public that holds your head underwater.”
This is because Wilson encountered doctored sexualized images of herself even before reaching high school.
“I’d been featured on fetish websites and Photoshopped into pornography. Grown men sent me creepy letters,” she disclosed.
Despite appearing only in family-friendly films and not being – in her words – a ‘beautiful girl’, Wilson explained that predators targeted her image to create repugnant sexual content for a specific reason.
She clarified: “But I was a public figure, so I was accessible. That’s what child sexual predators look for: access. And nothing made me more accessible than the internet.
“It didn’t matter that those images ‘weren’t me’, or that the fetish sites were ‘technically’ legal. It was a painful, violating experience; a living nightmare I hoped no other child would have to go through.”

This situation left Wilson ‘fearing the worst’ with the rise of advanced generative AI in recent years, as more women have become victims of this evolving technology.
Wilson said she decided to leave X after years on the platform, at a time when users were prompting the AI feature Grok to alter women’s photos so that they appeared to be wearing bikinis. This disturbing trend extended to images of children as well.
Elon Musk subsequently curtailed the feature, saying geoblocks would stop Grok users from creating digital ‘undressings’ of real people in jurisdictions where it is illegal.
However, Wilson expressed concern that this might be just the beginning, as some companies are considering releasing open-source AI models, which would allow ‘anyone to access the code behind it’.
According to her, this would pose a ‘disaster for children’s safety’, stressing the need for regulation to prevent the widespread exploitation she experienced.
She urged: “We need to be the ones demanding companies that allow the creation of CSAM be held accountable.
“We need to be demanding legislation and technological safeguards. We also need to examine our own actions: nobody wants to think that if they share photos of their child, those images could end up in CSAM.
“But it is a risk, one that parents need to protect their young children from, and warn their older children about.”