For years, parents, lawmakers, and health experts alike have been sounding the alarm on the potential dangers of social media for kids and teens. A growing body of research links social media use to mental health concerns including depression, anxiety, eating disorders, and body image issues, as well as poor sleep hygiene and attention problems.
To date, few guardrails have been put in place to protect young people on most platforms. Earlier this year, Meta—the parent company of Facebook, Instagram, Threads, and WhatsApp—announced more than 30 tools and resources to “support teens and their families,” including hiding or updating content geared towards teens. Now, the company is taking it one step further, automatically making teen Instagram accounts private by default in an effort to limit contact from strangers and age-inappropriate content, such as violence and the promotion of cosmetic procedures. But are these Instagram teen restrictions really going as far as they can to keep kids safe online?
According to Meta, this “brand new, reimagined Instagram experience for teens” will offer “built-in protections which limit who can contact teens and the content they see, and help better support parents.” The change is effective immediately for all new accounts in the U.S., U.K., Canada and Australia, with existing accounts due to change within 60 days. Teens in Europe will see the changes “later this year,” per the company’s announcement.
Additional changes will include:
- Messaging restrictions and content limits
- A new Sleep Mode feature that mutes notifications at night
- Additional parental supervision tools allowing parents to see who their teen is messaging
- Parental settings for limits on when and for how long their teens can use the app
- Teen-specific Explore content so they can “focus on the fun, positive content they love”
Currently, anyone 13 and over can sign up for an Instagram account. Under the new guidelines, teens under 16 can only change these protective settings with a parent’s permission.
But is this enough? After all, kids lying about their age to access social media apps is nothing new, and experts believe they will simply find new ways to circumvent the rules if the desire is strong enough.
Nicole Gill, the co-founder and executive director of Accountable Tech, called the move “yet another hollow policy announcement” by the company, adding, “Meta has known since at least 2019 that teens reported an increase in body image issues from using Instagram, along with increases in the rates of anxiety and depression, yet the company executives including Mark Zuckerberg and Adam Mosseri did nothing. … Today’s PR exercise falls short of the safety by design and accountability that young people and their parents deserve and only meaningful policy action can guarantee.”
New York Attorney General Letitia James told the Associated Press that Meta’s plans mark “an important first step, but much more needs to be done to ensure our kids are protected from the harms of social media.” Her office is working with other New York officials on implementing laws to help curb children’s access to addictive social media apps like Instagram.
As of 2023, more than 40 states, along with school districts nationwide, have filed lawsuits against Meta for ignoring the detrimental impacts of its platforms on young people. Only time will tell whether the new measures actually help to reduce the risks, but either way, it’s abundantly clear that much more must be done.