Facebook Messenger Kids—Good for Whom?

Posted in: Juvenile Law

As a parent and a consumer, I have mixed emotions when I think about Facebook’s new initiative. Just last week it announced that it was previewing a new messenger app for kids. While I like that my child can reach me when she needs a ride from school or has arrived safely somewhere, I am not sure I want her reaching peers via a new app when she is just learning how to communicate in person, let alone digitally. And I am also worried about what will happen to the data that Facebook collects about my kid and her relationships with others.

Many ask, isn’t it happening anyway? Kids today often use their parents’ accounts to log on to Netflix, buy music or apps from the Apple iTunes store, shop on Amazon, and, if they are over 13 (or can masquerade as a teen), log on to play video games or conduct other activities via their own accounts. This, to me, is concern enough. I don’t want my kids leaving digital fingerprints everywhere—before they are adults and have fully contemplated the consequences of accessing content in exchange for their data. Facebook’s messenger, however, directly targets kids and their parents. I therefore worry about what new data about my kid, our family life, and our relationships will be collected as part of this new tool.

It’s not surprising that Facebook, along with other companies, wants to market to the next generation. They want to attract a new generation of users who will mature into adults with purchasing power, familiarizing them with their platforms and apps to get them networked and hooked. After all, companies donate computers and software to schools in the hope that students will grow up preferring proprietary systems that look and feel like what they used in school.

But parents should think long and hard about when and how they want to unleash the power of messaging onto the younger generation. Do we want our kids to have easy access to such services, and what, if any, safeguards would make the service more palatable? In this column, I will outline the key features of Facebook’s new program and discuss similar initiatives that have preceded it. I will also explore the privacy and safety concerns that arise with this business model. Finally, I call upon policymakers and advocacy groups to weigh in on this business practice, for surely it won’t be the last.

Facebook Messenger Kids: Touted as a New Model

Last week Facebook noted that it was rolling out a preview of “Messenger Kids.” This new app will allow kids “to safely video chat and message with family and friends when they can’t be together in person.” Facebook notes that it has talked to thousands of parents and the National Parent Teacher Association. The result, Facebook announced: “[W]e found that there’s a need for a messaging app that lets kids connect with people they love but also has the level of control parents want.” The app is targeted toward children aged 6-12, who are still too young to access Facebook.

Messenger Kids is meant to be a standalone app that can be installed on a child’s smartphone or tablet but can be controlled from a parent’s Facebook account. It will work a bit like Apple’s FaceTime and will allow kids to talk to their parents, siblings, grandparents, and other friends. Facebook makes it sound compelling: “Whether it’s using video chat to talk to grandparents, staying in touch with cousins who live far away, or sending mom a decorated photo while she’s working late to say hi… Messenger Kids opens up a new world of online communication to families.” The preview version is available in the United States via the Apple App Store for iPad, iPod touch and iPhone.

Of course, parents may wonder: do we really want to allow kids to talk to lots of folks rather than do their homework? Might a stranger be able to connect to my kid via the app if I am not vigilant in controlling it? Facebook reassures us that it has engaged online safety experts to design the app, and markets it as “more fun for kids, more control for parents.” It even notes: “Playful masks, emojis and sound effects bring conversations to life.”

The key to safety is that parents approve every contact, so the home screen shows kids exactly who has permission to talk to them. Kids can send photos, videos, or text messages to their parent-approved friends and adult relatives, who will then receive the messages via their regular Messenger app. An account may only be set up and changed by a parent, who must also approve and add (or remove) any contacts for their child. Facebook says it will not advertise to children within the app or sell any data it collects to advertisers. There are also, at present, no in-app purchases. But there is no promise that the no-advertising or no-in-app-purchases policies are permanent. I also wonder whether in-app affinity or reward points might be offered in the future—although free, they could keep kids online and in the app for longer.

Will Parent-Approved Contacts Eliminate Privacy and Safety Concerns?

Messenger Kids may seem like a great alternative to the other messaging apps that teens and tweens like to use including WhatsApp (owned by Facebook), Kik Messenger and Snapchat. Those apps allow kids to communicate with anyone—and are meant to be restricted to teens 13 years or older. On the other hand, parents can already text kids via regular (unsmart) cell phones.

Safety and well-being may still be a concern. Parents may feel relieved that they must approve contacts before their child can interact. It’s a little like approving who can come over for a play date or sleepover. But it’s not that easy to control a kid’s social network. Parents may readily approve a contact if it’s another kid, but how much due diligence can a parent do before approving someone? Will Messenger Kids allow older teens into a space that is meant for a younger generation? Will younger siblings be able to borrow a brother’s or sister’s phone and message others?

More importantly, do we want younger kids texting and messaging other kids online at all? Social readiness to communicate digitally is something with which parents need to grapple. If a child is in her first stages of learning to interact with peers, is it wise to inject a medium that is available at all hours and allows kids to express emotions from a distance? The biggest concern may be cyberbullying among younger kids. Unless a parent is online with his or her child at all times, a bully who becomes part of an approved circle can do devastating harm.

More generally, other concerns arise about the targeting of kids for this product. Is Facebook using this as a way to lure kids into its other products and services? Will it bind both parents and kids to Facebook? Facebook may fear competition from other social media platforms like Snapchat. And it may also want a way to drive kids toward its other programs, such as Instagram, which it owns. While this may not be illegal, parents need to consider the ways in which we may be nudged toward, and locked into, the Facebook ecosystem.

And the biggest unknown: what will Facebook do with the data it collects about its kid users? Facebook promises no advertising now—will that be the case in the future? Will any parents’ data be collected, and will kids’ use lead to marketing to adults based on what their kids like? Will I start seeing targeted ads for toys when I use Facebook? Will the data be used to develop new products and services further targeted at youth? Will it be kept and transferred into a teen or adult user profile once a kid reaches the age of 13 and moves to Facebook’s regular platform? These questions about user-generated data need to be clarified to give parents a better sense of what they are unleashing by signing up for Messenger Kids.

The Policy Context

On Thursday, Democratic Senators Ed Markey (Mass.) and Richard Blumenthal (Conn.) sent a letter to Facebook CEO Mark Zuckerberg, asking for more information about Messenger Kids: “We remain concerned about where sensitive information collected through this app could end up and for what purpose it could be used,” they stated. They have asked for “assurances that this ‘walled garden’ service they describe is fully protective of children.”

The lawmakers, both on the Senate Commerce Committee, worry that Facebook will still collect, analyze, and share the information gathered through the app with other Facebook companies such as WhatsApp and Instagram. While everything may be apparent to Facebook, the business model remains ambiguous to others.

The senators have asked a series of pointed questions. For example, they ask “Will the company commit that it will never change that policy and keep all its applications and services for children 12 and under advertisement free?” They also want to know about the sharing of information within the Facebook family. As they note, “The Messenger Kids privacy policy says information collected by the app—including kids’ content and communications, activity, and device information—is shared with ‘Facebook’s family of companies.’” They also ask an important question about cybersecurity. And they probe into other questions about what can be shared via the app—will kids be able to send live links or other content? What if it is objectionable or harmful?

The senators’ letter raises a host of issues. I hope that Congress will persist in examining this new business model. When we provide tools to kids, we should ask hard questions—before we put a smartphone in a six-year-old’s hand and allow her to start chatting with others, when she is only beginning to learn how to read, write, and communicate.