Meta, the parent company of Instagram, has introduced a new account type for users under 18 in an effort to improve safety.
Starting Tuesday, teens in the US, UK and Canada will be automatically placed in a “teen account” that has enhanced parental supervision options.
Over the next 60 days, all accounts belonging to users under 18 will be converted to the new setting.
Meta plans to introduce similar changes for teen accounts in the European Union by the end of this year.
This move comes amid a growing public backlash against social media’s impact on young people’s health. Parents, lawmakers, and advocacy groups have criticized tech companies for failing in their duty to protect children from harmful online content and predators.
In January, Mark Zuckerberg’s social media giant announced it would implement new content guidelines in order to ensure that teenagers who use the platform have a safe and age-appropriate environment.
In June, however, a Wall Street Journal report revealed that the platform continued to recommend adult content to underage users.
New features: parental controls, content restrictions
Enhanced parental controls are one of the most important features of the new teen accounts.
Parents can now monitor their children’s Instagram use by setting time limits, blocking access to the app at night, and monitoring whom their teens are messaging.
Teens under 16 will need parental consent to change their account settings, while 16- and 17-year-olds can modify certain restrictions on their own.
Parents are concerned that their teens may be seeing content they disapprove of, being contacted by people they don't want contacting them, or spending too much time on the app.
According to Meta, teen accounts "really focus on addressing these three concerns."
These accounts will also limit “sensitive content” such as videos of violent behaviour or cosmetic procedures.
Meta will also introduce a reminder for teens who have been on Instagram for longer than 60 minutes, as well as a "sleep" mode that disables notifications and automatically replies to messages between 10 p.m. and 7 a.m.
This feature helps teens to manage their time and avoid excessive usage at night.
These restrictions are enabled by default for all teens; those aged 16 and 17 can turn them off, while younger users need parental consent to change the settings.
Mental health and legal issues
The introduction of teen accounts coincides with ongoing litigation against Meta. Dozens of US states have sued the company, accusing it of designing addictive Instagram and Facebook features that harm young users.
The lawsuits claim Meta's platforms are worsening the youth mental health crisis by exposing teens to unhealthy screen time, harmful content, and online bullying.
Last year, US Surgeon General Vivek M. Murthy expressed concern about the pressures placed on parents to manage children’s online experience without adequate support from technology companies.
In a May 2023 statement, he said:
“We are asking parents to manage technology that is rapidly evolving and that fundamentally changes the way their children think about themselves, build friendships, or experience the world.”
Meta's latest effort to improve online safety for teens follows a number of previous attempts, which were criticised for not going far enough.
Parents can enable "parental supervision mode" to tighten controls further. Teens will, for example, be able to bypass the 60-minute notification if they want to keep scrolling.
Nick Clegg, Meta's president of global affairs, acknowledged last week that parental controls are underused: "One thing we find is that parents don't even use these controls when we build them."
Teen accounts are part of a global plan
Unlike some of Meta's recent moves, such as allowing EU users to opt out of having their data used to train AI models (an option not yet available in any other region), teen accounts are part of a global rollout.
Meta plans to implement the changes in the European Union, as well as the US, UK and Canada, by the end of this year.
Antigone Davis, Meta's director for global safety, stressed that the feature was inspired by parents rather than government mandates. She told Guardian Australia that parents everywhere are thinking about this issue: they are concerned about the technology available today, and from a youth-safety perspective it makes sense to consider these questions globally.
Countries look at social media regulation for teens
The timing of Meta’s announcement is in line with broader government efforts to regulate children’s access to social media platforms.
Just a few days earlier, the Australian government had proposed legislation that would raise the minimum age for accessing social media platforms from 14 to 16.
If the law passes, Australia would be among the first nations to enforce such a ban, and other countries, such as the UK, will be watching its progress closely.
Meta's new teen accounts reflect a growing global awareness of the need to protect young users online, with countries such as Australia and the UK exploring further restrictions on social media for teens.
Meta's new features aim to strike a balance between empowering parents and keeping Instagram a safe place for teens.