
Meta on defense: Social media company supports cohesive legislation for youth safety

Rebecca Barnabi
(Photo: © Aleksei – stock.adobe.com)

In a press call with Virginia media Wednesday afternoon, Meta, the owner of Facebook, expressed support for federal legislation to protect youth on social media, provided that the legislation requires parental consent.

Meta wants industry standards for all state- and federal-level legislation regarding safety of youth on social media.

So far, however, different states are passing different laws with different requirements, and none of them fit together cohesively.

Different requirements in each state mean that parents will have to share sensitive information with each new application they approve for their teenager, then monitor each application separately.

Meta wants parental-consent requirements handled at the application store level. Aside from a teen's age at setup, Facebook would then collect less private information from parents. The social media platform wants to avoid a patchwork of legislation, and it considers the work critical. Meta is trying to make compliance easier for parents and wants state laws built on principles consistent with federal law.

The social media platform is hoping lawmakers will create and pass legislation with a holistic, 360-degree approach to social media applications, and bring in parents to better understand their needs.

Meta cites a report from the National Academy of Sciences that found no evidence that social media causes a mental health crisis among youth. Instead, according to Meta, social media helps teens build relationships.

According to a Pew Research Center study Meta cited, parental involvement online is beneficial. Eighty-one percent of U.S. adults support parental consent for teens when they begin to have a social media presence.

Meta tracks what content teens see, what interactions they have with others and whether they have parental supervision. Ninety percent of the time, content that violates Meta's policies is removed before anyone sees it.

Meta understands that teenagers develop differently than adults and absorb digital content differently. With sensitive content controls, certain terms, such as suicide, are hidden from teens, who do not see some content that adults are able to view. For example, if content about an eating disorder violates Meta's policies, neither adults nor teens will see it. And even if a teen's friend posts content that mentions an eating disorder without violating policies, the teen still will not see it.

Unwanted interactions between certain individuals and teens are prevented. If a teen is under age 16, their account is automatically private, not public. Suspicious adults on Facebook are prohibited from contacting teens, and certain words are restricted to prevent bullying.

When a parent gives consent for a teen to set up a social media account, the parent gains insight into the teen's online activity, including whom the teen follows and who follows them.

Meta also works to support teens' wellbeing by encouraging intentionally positive experiences online. The platform offers tools, such as "take a break" and "quiet mode," to encourage teens to limit their time online.


Rebecca J. Barnabi is the national editor of Augusta Free Press. A graduate of the University of Mary Washington, she began her journalism career at The Fredericksburg Free-Lance Star. In 2013, she was awarded first place for feature writing in the Maryland, Delaware, District of Columbia Awards Program, and was honored by the Virginia School Boards Association’s 2019 Media Honor Roll Program for her coverage of Waynesboro Schools. Her background in newspapers includes writing about features, local government, education and the arts.