The European Commission has intensified its scrutiny of major social media platforms, including YouTube, TikTok, and Snapchat, by demanding detailed information about how their content algorithms function. This request, made on Wednesday, is part of the EU’s broader efforts to address potential risks associated with the content these platforms recommend to users. The commission is particularly concerned with how these algorithms may contribute to the amplification of harmful or dangerous material, especially regarding electoral processes, mental health, and the protection of minors.
Under the Digital Services Act (DSA), these platforms are required to explain the steps they are taking to mitigate the risks posed by their recommendation systems. The commission is looking specifically into how these companies manage the spread of illegal content, such as the promotion of illicit drugs and hate speech. TikTok, in particular, is under additional pressure to reveal how it is safeguarding its platform against bad actors who might manipulate it during elections or disrupt civic discourse.
The tech companies have until November 15 to submit their reports, after which the EU will evaluate their responses. Depending on the findings, the commission may take further action, potentially leading to fines or other penalties. This latest inquiry follows earlier non-compliance proceedings initiated by the EU against Meta’s Facebook and Instagram, AliExpress, and TikTok, as part of the EU’s growing focus on ensuring that tech giants do more to regulate harmful and illegal content under the DSA.