SALT LAKE CITY — Good news for parents fighting the social media battle with their teens: Instagram is adding a range of new features specifically designed to make the platform safer for its youngest, most vulnerable users.
The social media giant outlined the updates to its safety features on the company blog Tuesday, noting that "protecting young people on Instagram is important to us." The company worked alongside youth tech safety experts to develop the new tools.
First things first: Instagram is using new artificial intelligence and machine learning to verify that new users are actually age 13 or older when they sign up.
"We ask for age when you first sign on, and we also rely on reporting tools – so if you see someone that's not 13, you can report them and we'll take that account down," Instagram's Head of Global Policy Programs Carolyn Merrell told CBS News. "And we're using machine learning to look at behavioral signals of how old someone is to make sure we're also getting ahead of that."
That verification process will come in handy when enforcing another drastic change: adult users will no longer be able to direct message teenagers who don't already follow them on the platform. Direct message attempts will be met with a notification informing the adult that this type of contact is prohibited.
"There are cases where it is appropriate for adults and teens to interact on Instagram, but it's important that teens be protected against unwanted contact from adults," Larry Magid, CEO of ConnectSafely.org wrote on Instagram's blog. "Requiring that the teen – not the adult – establish the connection empowers teens to protect themselves."
Additionally, Instagram will soon send "safety notices" to teens when it detects potentially suspicious behavior in adults they're connected to on the platform.
"If an adult is sending a large amount of friend or message requests to people under 18, we'll use this tool to alert the recipients within their DMs," the company statement reads.
From there, the teen user will be given the option to end the conversation and block the adult in question.
The company also launched a tool that encourages new users under the age of 18 to make their account private, giving the teen ultimate control over who can view their content.
Instagram collaborated with the Child Mind Institute and ConnectSafely to publish an updated Parents Guide that outlines the new safety measures, along with insight from experts and recommended conversation topics to help both parent and child develop healthy social media habits.
"Instagram can provide young people the opportunity to strengthen connections, practice social skills and find supportive communities," Dr. Dave Anderson, a clinical psychologist with the Child Mind Institute, wrote on the Instagram blog. "It's important that teens and parents are equipped with information on how to manage their time on the platform so that it's thoughtful, safe and intentional."
The new safety features will start popping up on Instagram as early as this month.
Some updates on what we're doing to keep teens safe on Instagram ❤️ Including:
🔒 Restricting DMs between teens and adults they don't follow
🤔 Prompting teens to be careful in DMs even with adults they're connected to
🙋 Encouraging teens to make their accounts private
— Instagram (@instagram) March 16, 2021