Meta Platforms has announced major changes to Instagram, enhancing privacy settings and parental controls for users under 18. The move addresses growing concerns about the negative effects of social media on young users.
Highlights
- Instagram accounts for users under 18 will automatically switch to private “Teen Accounts.”
- New privacy controls limit messaging and tagging from strangers.
- Users under 16 must get parental permission to change the default settings in their accounts.
Teens’ Instagram Accounts Get a Privacy Boost
Starting Tuesday, Meta will automatically migrate all Instagram accounts of users under 18 to “Teen Accounts.” These accounts will have the highest privacy settings by default, making them private. Consequently, only users the teen follows or is already connected to can message or tag them, adding an extra layer of safety.
For younger teens, the most restrictive sensitive-content settings will be applied. Users under 16 will need parental permission to change these settings, giving parents more control over what their children see and how they interact on the platform.
Enhanced Parental Controls
To further protect young users, Meta is offering parents a suite of tools to monitor and limit their children’s Instagram use. Parents can now oversee who their teens engage with on the platform and control the amount of time spent online.
The company also introduced a “sleep mode,” which automatically silences notifications for users under 18 from 10 p.m. to 7 a.m. 16- and 17-year-olds can turn this feature off, but users under 16 need parental consent to do so.
Addressing Safety Concerns and Legal Pressure
Meta’s decision comes amid growing concerns about social media’s impact on youth mental health. Several studies have linked excessive social media use to issues like depression, anxiety, and learning disabilities, particularly among young users.
Meta, along with other social media giants like TikTok and YouTube, currently faces multiple lawsuits accusing the platforms of fostering addictive behaviors in children. In 2023, 33 U.S. states, including California and New York, filed suits claiming Meta misled the public about the dangers of its platforms.
Stricter Age Verification and Content Restrictions
Aware that some teenagers might lie about their age, Meta is building technology to identify accounts falsely claiming to be adult-run. As a result, these accounts will automatically switch to the restricted teen account settings.
The “Teen Accounts” will be private by default, and private messages will only be allowed from people the user follows. Additionally, Meta will limit “sensitive content,” such as violent videos or those promoting cosmetic procedures, to create a safer online environment.
In response to parental concerns about excessive screen time, Instagram will send notifications to teens after they’ve used the app for 60 minutes in a day. While older teens (16- and 17-year-olds) can override this reminder, younger users will need a parent’s permission to bypass it.
Parents can also use the “parental supervision” mode to set specific daily limits, such as 15 minutes, ensuring that teens don’t spend too much time on the platform.
The Road Ahead
Meta’s enhanced privacy settings for teens are part of a larger overhaul aimed at creating a safer online experience for younger users. Although critics have greeted past efforts with skepticism, these changes give parents more options to protect their children.
As the company faces mounting pressure from both legal authorities and public concerns, these new controls represent Meta’s attempt to address the complex issue of youth safety in the digital age.