
They say caution is necessary, especially since in their case, the authorities were unable to take action because of “insufficient evidence”.
The first victim, who only wanted to be known as Jane, said she discovered that “nudes” of her were being circulated after her mutual friends received pictures sent from her email.
The 16-year-old claims her email account was hacked and used to disseminate the deepfake nudes.
Studies have shown that victims of sexual offences often know their perpetrators. According to Jane’s brother, a mutual friend – an 18-year-old – had allegedly paid others to create the deepfake nudes.
When Jane confronted the friend, he admitted to producing the images but pleaded with her not to expose him, claiming he was addicted to pornography.
Jane’s friends, meanwhile, fear that her nude images have been shared on other platforms.
She lodged a police report in Sarawak and also took to social media to highlight her plight.
Jane’s posting caught the attention of another teenager, who only wants to be known as Nancy.
Nancy also had deepfake nudes of her created by the same 18-year-old. She only found out about them earlier this month, through a social media post highlighting Jane’s plight.
“But I wasn’t surprised it was him, though I was very angry,” she told FMT.
“He was already notorious back in school for being a pervert and talking dirty,” she said, adding that she had previously blocked him after learning that he had allegedly been saving girls’ photos for sexual purposes.
Nancy, who is also 18, subsequently lodged a report at the Bintulu police station.
Two arrests in Johor
In April, police remanded two teenage boys to assist in an investigation into the circulation and sale of artificial intelligence-generated nude images of women in Johor.
It was reported that one of the suspects had sourced photos from social media before using AI tools to superimpose the victims’ faces onto nude bodies. He then allegedly sold the images online for RM2 each.