If Facebook Messenger proves popular with children, Facebook may reap many benefits. The company could see increased messaging activity and more engaged, regularly returning users, not to mention insights and a wealth of data on how families interact with one another on Messenger.
Yet Facebook is bracing for what will most likely be a skeptical — if not outright hostile — response to its creation of a product specifically for children. Many parents are already concerned about the amount of screen time children spend with smartphones and other mobile gadgets, as well as how tech companies may be building up a trove of data on their children’s online habits.
The company said it had spent months talking to parenting groups, child behavioral experts and safety organizations to aid in developing the app, and spent thousands of hours interviewing families across the country, probing the ways they currently communicate with one another. Facebook said that Messenger Kids was compliant with the Children’s Online Privacy Protection Act and that it had worked closely with online watchdog organizations.
Messenger Kids is built so that children do not sign up for new Facebook accounts themselves; Facebook’s terms of service require that users be over the age of 13. The app requires an adult with a Facebook account to set it up for his or her child. After adults enter their Facebook account information into the app, they are asked to create the child’s profile and choose which friends or family members the child will be allowed to connect with on Messenger. Every additional friend request must be approved by the parent.
The app is fairly limited in scope, allowing for text and video chat, as well as sending photos. As on Instagram, Facebook or Snapchat, children can add filters or playful drawings to the photos they send.
Facebook can ill afford more controversies. The Silicon Valley company has already been in the cross hairs of Congress for months over the role it played in the 2016 election, with the rampant spread of fake news and divisive content on all of its platforms. The company has said more than 150 million people across Facebook and Instagram could have seen content linked to Russian agencies.
Still, the company said that issue was largely separate from Messenger. Facebook said its overall mission was still centered on bringing the world closer together, however divisive the activity on its many platforms may be.
“We can’t let the current state of things prevent us from doing our job, which is to solve real problems in people’s lives,” Mr. Marcus, the head of Messenger at Facebook, said.