
Indonesia’s communications ministry on Thursday issued a strong warning to Meta Platforms Inc. over what it described as inadequate efforts to curb the spread of disinformation and online gambling content across the company’s platforms, citing concerns about its compliance with local digital regulations.
The warning followed an unscheduled visit by Indonesia’s Communications and Digital Affairs Minister, Meutya Hafid, to Meta’s operational office in Jakarta on Wednesday. During the visit, officials raised concerns about the company’s handling of harmful and illegal content circulating on services such as Facebook, Instagram, and WhatsApp.
According to the ministry, Meta’s compliance with Indonesian regulations on the removal of disinformation, online gambling material, defamation, and hate speech remains insufficient. Authorities said the company had acted on only 28.47% of flagged content linked to online gambling and disinformation.
“Disinformation, defamation, and hate content threaten lives in Indonesia, yet Meta has allowed them to persist,” Hafid said while addressing the issue.
The ministry urged Meta to strengthen its content moderation systems and accelerate the removal of harmful or illegal material from its platforms. Officials emphasized that stronger enforcement measures are necessary to ensure that digital platforms operating in the country comply with national laws designed to protect public safety and information integrity.
Meta did not immediately respond to requests for comment regarding the warning.
This is not the first time Indonesian authorities have raised concerns about content moderation on major social media platforms. Last year, the ministry summoned representatives from Meta and other social media companies and instructed them to strengthen their monitoring and removal processes in response to the growing spread of misleading or harmful information online.
The latest warning highlights increasing regulatory pressure on global technology companies to improve oversight of user-generated content, particularly in countries where governments are tightening rules around online safety and misinformation.