The online platforms now have three months to report on the measures they have taken against illegal content.
The decision follows several removal orders that EU authorities issued against these providers over the past year under the EU's Terrorist Content Online Regulation.
According to the Irish decision, TikTok, X and Meta’s Instagram will have to take specific measures to protect their services from being used to disseminate terrorist content – for example, content glorifying terrorist acts – and report back to Coimisiún na Meán within three months.
In the event of non-compliance, the Irish supervisory authority may impose an administrative fine of up to four percent of a company's global turnover.
In addition to the rules on combating terrorist content online, the regulator also oversees compliance with the Digital Services Act (DSA) and the Online Safety and Media Regulation Act in Ireland.
Under the DSA, some platforms have already been the subject of formal questions and investigations by the European Commission. In December last year, the Commission launched an investigation into X on suspicion that the platform was disseminating illegal content and disinformation, including terrorist and violent content in the context of the war between Israel and Hamas.
Last month, Ireland adopted a new online safety code for video-sharing platforms such as TikTok and Facebook to protect people from harmful online content. It sets out binding rules for platforms with European headquarters in Ireland and will come into force this month.