Instagram is introducing new policies limiting interactions between teenagers and adults to make its platform safer for young users. The app has banned adults from direct messaging teenagers who don’t follow them and is introducing “safety prompts” that will be shown to teens when they DM adults who have been “exhibiting potentially suspicious behavior.”
Safety prompts will give teenage users the option to report or block adults who are messaging them. The prompts will remind young users not to feel pressured to respond to messages and to “be careful sharing photos, videos, or information with someone you don’t know.”
Notices will appear when Instagram’s moderation systems spot suspicious behavior from adult users. The company isn’t sharing details on how these systems operate but says suspicious behavior could include sending “a large amount of friend or message requests to people under 18.” Facebook-owned Instagram says this feature will be available in some countries this month (it didn’t specify which) and globally “soon.”
Instagram also says it’s developing new “artificial intelligence and machine learning technology” to try to detect a user’s age when they sign up for an account. Officially, the app requires users to be 13 or older, but it’s easy to lie about one’s age. The company said it wants to do “more to stop this from happening” but didn’t go into any detail about how new machine learning systems might help with this problem.
New teenage users who sign up for Instagram will also now be encouraged to make their profiles private. If they choose to create a public account anyway, Instagram will send them a notification later “highlighting the benefits of a private account and reminding them to check their settings.”