A flaw in a Facebook app designed for children under 13 allows kids to chat online with people their parents never approved.
The messaging app is designed to give parents control over who their kids text and video chat with online, but a bug in the software lets a contact approved to chat with one child talk to a second child whose parents never approved that contact.
“We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats,” Facebook said in a statement provided to TechNewsWorld by spokesperson Thomas Richards. “We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety.”
In the message to parents, Facebook included links to the app's FAQ, its parental control center, and a feedback page.
The breakdown in parental control occurs when a child is part of a group chat. Any person chatting one-on-one must be approved by the child’s parents. In a group chat, however, the organizer of the group may invite members who are cleared to communicate with the organizer but not cleared to talk to some other members of the group. The bug in the app allows all group members to chat with each other whether approved by a parent or not.
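The failure is easy to state in access-control terms: adding someone to a group apparently checked only the organizer's approvals, when the safe rule is to check the newcomer against every member's parent-approved contact list. The Python sketch below illustrates the difference; all names and data structures are hypothetical, not drawn from Facebook's code.

```python
# Hypothetical sketch of the pairwise approval check at issue.
# None of these names or structures come from Facebook's code.

# Parent-approved (child, contact) pairs, stored in both directions.
approvals = set()

def approve(child, contact):
    """A parent approves one specific child-contact pair."""
    approvals.add((child, contact))
    approvals.add((contact, child))

def can_join_group(candidate, members):
    """Correct rule: the candidate must be approved to talk with
    EVERY current member, not just the organizer who invited them."""
    return all((candidate, member) in approvals for member in members)

def can_join_group_buggy(candidate, members):
    """Roughly the reported flaw: only the organizer's own
    approval list is consulted."""
    organizer = members[0]
    return (candidate, organizer) in approvals

# Alice's parents approved Bob and Carol; Carol's parents never approved Bob.
approve("alice", "bob")
approve("alice", "carol")

group = ["alice", "carol"]                  # Alice starts a chat with Carol.
print(can_join_group("bob", group))         # False -- Carol's parents said no.
print(can_join_group_buggy("bob", group))   # True  -- the bug admits Bob anyway.
```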
False Sense of Security
The flaw found in Messenger Kids is a symptom of a problem that’s not Facebook’s alone, observed Lorrie Faith Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University in Pittsburgh.
“It’s a problem with access control. Access control can be really hard,” she told TechNewsWorld.
“A lot of companies get it wrong in many ways, and this is just the latest example of how they weren’t careful with their access control,” Cranor said. “We see it all the time in the corporate world where the wrong people have access to something they shouldn’t have access to because it’s hard to do access control correctly.”
When launching Messenger Kids, Facebook made this declaration: “Messenger Kids gives parents more control. Parents fully control the contact list, and kids can’t connect with contacts that their parent does not approve.”
Facebook may have been giving parents a false sense of security with that claim.
“They made a promise they couldn’t deliver on,” Cranor said.
“It is pretty much impossible to shield your kid from everything that’s bad on the Internet,” she continued. “Even if you can do that at home, all they have to do is go over to a friend’s house and be exposed to stuff. Parents need to not assume they’re going to be able to 100 percent shield their kids from stuff they don’t want them to see on the Internet.”
Dire Consequences
Thousands of children were left in chat groups with users unknown to their parents, according to media reports.
Even one child exposed to a stranger in a chat room could turn into a nightmare for Facebook, noted Karen North, director of the Annenberg Program on Online Communities at the University of Southern California in Los Angeles.
“Doing things that violate the privacy of children and allow adults to have access to children is always a big deal,” she told TechNewsWorld. “It only takes one predator for the consequences to become dire.”
This latest Facebook misstep is part of what seems to be an endless stream of bad news about the company. “It’s one thing after another. They’ve had security breaches on days they announced new security policies,” North said.
“If it wants to issue cryptocurrency, it needs to clean up its act so it can be trusted on a daily basis,” she added.
Facebook last month announced that it intends to launch its own cryptocurrency, the Libra, next year.
“The thing about Facebook,” North added, “is that no matter how secure the communication, there’s an assumption that there’s a privacy issue because it’s likely to use the data it’s collecting.”
FTC Complaint Filed
Data collection concerns about Messenger Kids arose just months after the app launched. The Campaign for a Commercial-Free Childhood, based in Boston, filed a complaint in 2018 with the U.S. Federal Trade Commission alleging Messenger Kids violated the federal Children's Online Privacy Protection Act (COPPA).
Facebook’s app collected personal information from children as young as five without obtaining verifiable parental consent and failed to provide parents with clear and complete disclosures of Facebook’s data practices, the group maintained in its complaint.
Facebook’s parental consent mechanism does not meet the requirements of COPPA because it’s not reasonably calculated to ensure that the person providing consent is actually a child’s parent, the complaint states.
Any adult user can approve any Messenger Kids account, and testing confirmed that even a fictional "parent" holding a brand-new Facebook account could immediately approve a child's account without proof of identity, the group noted in a statement.
Facebook Messenger Kids’ privacy policy is incomplete and vague, the complaint also asserts. It allows Facebook to disclose data to unnamed third parties and the “Facebook Family of Companies” for broad, undefined business purposes.
The policy does not specify what companies are in the “Facebook Family,” the complaint maintains. COPPA requires that privacy policies list the name and contact information of any third parties who have access to children’s data.
Impact on Poor
Problems with Messenger Kids are likely to have a disproportionate impact on children from lower-income households, North added.
"In the United States, the lower a family's income, the more likely they will be using Facebook," she noted, "so this flaw in Messenger Kids is potentially disproportionately affecting underserved communities."
Why Facebook is especially popular in low-income communities isn't fully understood, but one possible explanation is that it's easy to use on shared devices, North said.
“If you’re poor and using the school or library computer, Snapchat and Instagram don’t work so well,” she explained. “What does work is Facebook Messenger or messaging via Facebook.”