Mara Wilson, the star of Danny DeVito's 1996 movie Matilda, has revealed how she worried about the cast of Stranger Things getting caught in the "deepfake apocalypse" after her own image was used to create child sexual abuse material.
In an op-ed for The Guardian newspaper, the actress and writer wrote that, if unchecked by safeguards and regulation, artificial intelligence will lead to child stars and other young people being exploited on the internet. She said her own experience of being “used for child sexual abuse material” led to anxiety about other actors, including the stars of Disney films and Stranger Things.
“Before I was even in high school, my image had been used for child sexual abuse material (CSAM). I’d been featured on fetish websites and Photoshopped into pornography,” Wilson wrote. “Grown men sent me creepy letters. I wasn’t a beautiful girl – my awkward age lasted from about age 10 to about 25 – and I acted almost exclusively in family-friendly movies. But I was a public figure, so I was accessible. That’s what child sexual predators look for: access. And nothing made me more accessible than the internet.”
She continued: "It was a painful, violating experience; a living nightmare I hoped no other child would have to go through. Once I was an adult, I worried about the other kids who had followed after me. Were similar things happening to the Disney stars, the Stranger Things cast, the preteens making TikTok dances and smiling in family vlogger YouTube channels?"
Wilson's fears became a reality when X/Twitter users asked AI tool Grok to remove the clothing from images of 14-year-old actress Nell Fisher, who played Holly Wheeler in Stranger Things. Elon Musk's company later said it would stop the Grok account from "allowing the editing of images of real people in revealing clothing such as bikinis."
“Generative AI has reinvented stranger danger,” Wilson said. “And this time, the fear is justified. It is now infinitely easier for any child whose face has been posted on the internet to be sexually exploited. Millions of children could be forced to live my same nightmare.”
The Miracle on 34th Street star added: “Boycotts aren’t enough. We need to be the ones demanding companies that allow the creation of CSAM be held accountable. We need to be demanding legislation and technological safeguards. We also need to examine our own actions: nobody wants to think that if they share photos of their child, those images could end up in CSAM. But it is a risk, one that parents need to protect their young children from, and warn their older children about.”