MCMC summons Meta over profits from scam ads

MCMC commissioner Derek Fernandez says the commission will be initiating an investigation into the ‘very serious and disturbing’ allegation.

MCMC said it requested Meta to remove 157,208 illegal online advertisements and 44,922 scam advertisements from Jan 1 to Nov 4. (MCMC pic)
PETALING JAYA: The Malaysian Communications and Multimedia Commission (MCMC) will be summoning Meta to explain allegations that the firm internally projected in 2024 that it would earn about 10% of its revenue from running advertisements for scams and banned goods.

MCMC commissioner Derek Fernandez said the commission viewed the claims, revealed in a Reuters report on Thursday, with “grave concern”.

“MCMC will be summoning Meta to seek clarification on the veracity of these claims.

“The commission will also be initiating an investigation into the matter, as the allegations are very serious and disturbing,” he told FMT.

Derek Fernandez.

Reuters reported that the social media giant projected late last year it would earn about 10% of its overall revenue – or US$16 billion – from running advertising for scams and banned products.

It cited Meta’s internal documents showing that the company had failed to stop a flood of such ads over the past three years.

This exposed billions of Facebook, Instagram and WhatsApp users to fraudulent e-commerce and investment schemes, illegal online casinos and banned medical products.

One December 2024 document stated the company showed its platform users an estimated 15 billion “higher risk” scam advertisements – those that show clear signs of being fraudulent – every day.

Meta earns about US$7 billion in annualised revenue from this category of scam ads, another late-2024 document stated.

Meta spokesman Andy Stone was reported as saying that the documents seen by Reuters “present a selective view that distorts Meta’s approach to fraud and scams” and that the 10.1% revenue estimate was “rough and overly inclusive”.

Stone said the company had later determined that the true number was lower because the estimate included “many” legitimate ads as well. He, however, declined to provide an updated figure.

Fernandez said MCMC had, before the Reuters report, engaged with several social media platform providers, including Meta, on similar issues.

“In the case of Meta, MCMC had highlighted its concerns regarding Facebook due to the very high number of takedown requests that had to be issued in relation to scams and online gambling content, compared with other platforms,” he said.

He said MCMC had requested Meta to remove 157,208 illegal online advertisements and 44,922 scam advertisements between Jan 1 and Nov 4.

“These figures are unusually high compared to other platforms.

“For example, for online gambling advertisements, the number of takedown requests stood at 3,956 for TikTok, 269 for Telegram, 11 for X (formerly Twitter), and 45,448 for YouTube,” he said.

Fernandez also said that to effectively combat scams and illegal gambling, social media platforms should be licensed and required to demonstrate the measures they have in place to curb such content and identify criminals operating on their platforms.

“They must also be able to explain why their technology cannot deliver stronger preventive outcomes,” he added.

Stricter measures?

Fernandez said if the situation does not improve, it may be necessary to introduce statutory rights for victims to pursue legal recourse against platforms that allow criminal activities to take place or fail to identify offenders who repeatedly exploit their platforms to harm others.

“These activities have placed a significant strain on law enforcement resources, which continue to be diverted to monitor and police platforms that profit from the traffic generated by such harmful content.

“Where evidence shows that any platform has knowingly aided or abetted such offences, MCMC will not hesitate to take appropriate action in accordance with national laws,” he said.

He also said it may be timely to consider the introduction of a public safety and online harm rating system for social media and messaging platforms, accompanied by the regular publication of transparency statistics.

“This will enhance accountability, support more informed public awareness and encourage stronger self-regulatory practices within the industry,” he said.
