The investigations will probe whether the platforms make it easy to report illegal content.

Ireland’s media regulator, Coimisiún na Meán (CnaM), is investigating LinkedIn and TikTok under the EU Digital Services Act (DSA) over suspicions that the platforms’ content reporting mechanisms are not up to code.

This is not TikTok’s first brush with the DSA. A preliminary finding earlier this year concluded that the platform had broken the law.

The new investigations, announced yesterday (2 December) evening, will examine whether the reporting mechanisms TikTok and LinkedIn have implemented for illegal content are easy to access and user friendly, and whether they allow users to anonymously report suspected child sexual abuse material (CSAM). The investigations will also probe whether the content reporting mechanisms “deceive” users, discouraging them from reporting potentially illegal material.

The investigations materialised after CnaM began reviewing a number of online platforms last year to check their compliance with Article 16 of the DSA. Article 16 sets out the ‘Notice and Action’ mechanisms that service providers are required to have in place so that people can report content they suspect is illegal. Part of the article also covers the reporting of CSAM.

During its review, concerns












