Is Facebook's Messenger Kids Bad for Kids?
Facebook has notified millions of parents that its app Messenger Kids isn't as kid-friendly as they thought. Children could have been exposed to some serious threats.
Released more than two years ago, Messenger Kids was designed as a worry-free environment for children. Kids under 13 were supposed to chat only with playground and school friends their parents had approved. However, the idyllic concept proved to be flawed. And possibly dangerous.
Parental control fails
It's been reported that Facebook informed a large number of parents about a "technical error" that could put their kids at risk. As further investigation showed, this "error" allowed children to create group chats with, or be invited by, people their parents had never approved. In other words, a friend of your child could invite their own "approved" friends, who could in turn invite even more people.
Facebook's developers hadn't anticipated this issue when designing Messenger Kids, so no extra layer of checks or approvals was added. As a result, a kid could end up in a chat room full of complete strangers whose intentions might have been far from friendly.
Along with kids, older users (including adults) could have ended up in these chats. That looks especially ironic next to Facebook's own comments from 2017, when spokesman Marcus Davis claimed that many other kids' apps did not filter out suspicious strangers.
And this is Facebook's current position on the topic: "We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats. We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety."
How exactly is it dangerous?
Children who used Messenger Kids might have been exposed to harmful material: nudity, explicit imagery, sexting, scenes depicting cruelty, and so forth. Their private data might also have been leaked and ended up in a stranger's hands.
This isn't the first time Facebook has revealed flaws that cost people their privacy. The social-media giant now faces a $5 billion penalty set by the FTC.
In reality, it's impossible to create a 100% hazard-free online space for kids. Games like Minecraft, PUBG, Fortnite, and Apex Legends have notoriously toxic player communities, where mutual insults, trolling, and cyberbullying are commonplace.
And social networks are full of predators lurking behind profiles that pose as ordinary users or even as the child's peers. The only practical answer is to limit your kid's online time and presence. Many mobile, PC, and console titles offer exciting offline gameplay. And does an elementary school student really need WhatsApp, Instagram, and a 24/7 mobile connection?