Apple’s struggles with the App Store continue, and the New York Times sheds light on them in a recently published piece.
A website named the App Danger Project aims to put parents at ease about their children using apps freely. The website uses an algorithm to provide resources for parents, flagging apps “with at least some reviews indicating dangerousness”.
The project lists 182 apps that could be considered dangerous by the App Store’s and Google Play Store’s own standards. Filtering shows that 146 of those apps are on the App Store, while the remaining 36 are on the Google Play Store.
On the website, anyone can search and analyze reviews from both app stores.
This information comes from reviews surfaced in the search results that mention child pornography, pedophiles, and other signs that an app might be used to harm children.
Snapchat, for instance, has 23 reviews “that indicate that this app is unsafe for children.” WhatsApp, a popular messaging app, comes back with 16 reviews. Among other popular social media platforms, Facebook has 4 reviews, Instagram has zero, and TikTok is not in the database yet.
These user reviews do not necessarily mean an app is used for child exploitation. However, the New York Times piece stated that Apple removed 10 apps from the App Store after investigating the App Danger Project’s list, though the company declined to share which apps were removed or the reasoning behind the action.
“Our App Review team works 24/7 to carefully review every new app and app update to ensure it meets Apple’s standards,” a spokesman stated.
It is worth noting that any online communication can be risky, especially for children, and many companies face challenges in protecting children on their platforms. Over the past year, for example, the UK fined TikTok for misusing children’s data, the EU fined Instagram for mishandling children’s data, and the FTC fined Epic over children’s data privacy.