Meta launches ‘teen accounts’ with new protections after accusations of harm
Sep 17, 2024, 12:17 PM | Updated: 1:12 pm
SALT LAKE CITY — Meta on Tuesday announced new changes to accounts created by teens, in the wake of multiple lawsuits and scrutiny from lawmakers accusing Instagram and other social platforms of endangering young people.
According to Meta, the changes target “parents’ biggest concerns” by making teen accounts private by default. In addition, Meta said it developed multiple functions in the app specifically for these “teen accounts,” which took effect immediately.
The full list of features, according to Meta, is as follows:
- Private accounts: Users who do not follow a teen account cannot see its content, and teen users must screen and accept or deny all follow requests.
- Messaging restrictions: Teen accounts can only be messaged by people they follow or are already connected to in the app.
- Sensitive content restrictions: A new content policy for teen accounts will limit sensitive content, such as depictions of violence or artificial weight loss, in the “explore” tab and while scrolling through reels in the app.
- Limited interactions: Teen accounts can only be tagged or mentioned in captions or comments by people they follow. “Hidden words” will also be applied, so “offensive words and phrases” will be filtered out of comments and direct message requests.
- Time limit reminders: Teens will receive notifications telling them to leave the app after they’ve used it for 60 minutes in a day.
- Sleep mode: Sleep mode will be enabled by default and will turn on between 10 p.m. and 7 a.m.
Teens under 16 will need their parent’s permission to change any of the default settings, but parents can also activate parental supervision if they’d like more oversight for their older teens. Through parental supervision, parents can approve or deny any changes to settings after their teen requests them in the app.
Instagram created a landing page for parents to learn more about teen accounts and how they can manage them. Further, parents will be able to: see insights on who their teen is messaging, set more specific daily time limits, block teens from using the app for specific time periods and view topics their teen is browsing.
Meta additionally published an extensive guide for parents and guardians on how best to operate the app, how to speak with teens about it and tips to enforce rules.
Age verification
Meta acknowledged the possibility of some users lying about their age. The platform said it will require teen users to verify their age in “more places, like if they attempt to use a new account with an adult birthday.”
Meta ran age verification tests in 2022 that involved uploading video selfies or “social vouching,” which allowed teen users to ask mutual followers to confirm their age.
It was unclear whether teens would have any way besides uploading an ID to verify their age in the new release of teen accounts.
“We’re also building technology to proactively find accounts belonging to teens, even if the account lists an adult birthday,” the company said. “This technology will allow us to proactively find these teens and place them in the same protections offered by teen account settings.”
The company said tests for the proactive age verification feature would begin in the U.S. in early 2025.
‘They do not go far enough’
Utah Gov. Spencer Cox made a statement early Tuesday after the rollout of teen accounts, commending the work but stating the measures Meta took were “not sufficient to ensure the comprehensive safety of Utah’s youth online.”
Just six days before Meta’s announcement, a federal judge in Utah blocked a set of social media access laws that aimed to replace the 2023 Utah Minor Protection in Social Media Act with language lawmakers believed would stand up in court.
The laws would have required social media companies like TikTok and Meta to verify the ages of all users and to apply certain default privacy settings for minors.
The laws also aimed to keep protections in place while the state pursued the multiple lawsuits it has filed against TikTok and Meta.
“Utah has always been at the forefront of protecting our children in the digital age, and we appreciate Meta for taking a step in the right direction with the announcement of teen accounts. Many of these new features mirror our recently passed laws, demonstrating a growing awareness of the responsibility that social media companies have toward their younger users,” Cox said Tuesday. “However, while these are positive steps, we believe they do not go far enough to ensure the safety and well-being of Utah kids online. We encourage Meta, and all social media platforms, to continue to innovate and implement even stronger protections for minors.”
Utah Attorney General Sean D. Reyes and Utah’s Department of Commerce Executive Director Margaret Busse echoed Cox’s comments, calling for better “quality age assurance” and removal of “addictive features.”
‘Step in the right direction’
Former police officer and internet safety expert Sean Blair said he believes Meta is making an important effort to make the platform safer. Blair, who previously investigated sex crimes in Arizona and recently worked with various tech companies, said the move to make all teen accounts private helps.
“There’s technology now that can grab (their) images and use them in bad ways. And so at a minimum, the people are going to have a much harder time getting access to content that they want to use for bad purposes,” he said.
According to Blair, the protections set to filter some of the content teens see are also a “step in the right direction.”
“Pictures and content that is not necessarily good for their developing brains,” he said. “This will be one of those things that will eliminate a lot of that.”
Contributing: Mike Anderson, KSL TV