Leading technology companies, including Meta, Snapchat, and TikTok, are pushing back against the Australian government’s decision to exclude YouTube from a forthcoming ban on social media access for children under 16, Reuters reported. The controversial law, passed in November, is one of the most stringent regulations globally, requiring platforms to block children from creating accounts or face penalties reaching A$49.5 million.
While the ban is set to be enforced by the end of the year, YouTube has been granted an exemption due to its role as an educational platform and its ability to operate within a family account system that allows parental oversight. This decision, however, has ignited criticism from competing social media firms, which argue it creates an uneven regulatory landscape.
Meta, the parent company of Facebook and Instagram, argued that YouTube offers the very same features the government cited as reasons for restricting children's access to social media. In a recently published statement, Meta pointed out that YouTube also relies on algorithm-driven content recommendations and interactive features, and hosts potentially harmful material. The company urged Australian authorities to apply the law uniformly across all social media platforms.
TikTok also voiced objections, labeling the exemption as “illogical, anticompetitive, and short-sighted.” In an official submission to the government, the short-form video platform emphasized the need for regulatory consistency, arguing that making exceptions for a single company undermines the effectiveness of the legislation.
Similarly, Snapchat criticized the selective application of the law, stating that no individual service should receive preferential treatment. In a statement released on Friday, Snap Inc. called for all digital platforms to be held to the same standard to ensure fairness in the implementation of the new restrictions.
YouTube, for its part, has publicly defended its content moderation efforts, asserting that its automated detection systems have become more robust. The company claims to have expanded its definitions of harmful content to enhance safety measures, but this has not quelled criticism from industry rivals and child safety advocates alike.