
He said that, according to initial information from TikTok, the accounts were blocked over their coverage of a recent sexual assault case involving a young girl at a mosque in Batang Kali, Selangor.
“The problem is TikTok’s artificial intelligence (AI) itself. Here’s a little warning: AI can sometimes go too far and fail to understand that reporting by media organisations is different from content produced by ordinary people.
“So I have asked for a discussion to be held in the near future to refine the functions of TikTok accounts owned by the media so that such action is not taken in the future,” he said.
Elaborating, Fahmi said reports on such cases are routinely carried by media organisations and should not become an issue.
“These reports should not become a problem. Here, I see an AI issue that TikTok needs to explain to us, and then also to the media companies,” he said.
Fahmi said TikTok has increased its use of AI-based content moderation, which sometimes causes misunderstandings regarding content uploaded to the platform.
“I see an opportunity here for us to discuss with TikTok giving more flexibility, or perhaps a different status, to media companies, because they are producing news reports and we also have our own guidelines and code of ethics. That’s what TikTok needs to understand,” he said.