
Instagram has introduced safety features for young ones

Instagram is introducing new policies limiting interactions between teenagers and adults to make its platform safer for young users. The app has banned adults from direct messaging teenagers who don’t follow them and is introducing “safety prompts” that will be shown to teens when they DM adults who have been “exhibiting potentially suspicious behavior.”

Safety prompts will give teenage users the option to report or block adults who are messaging them. The prompts will remind young users not to feel pressured to respond to messages and to “be careful sharing photos, videos, or information with someone you don’t know.”

Notices will appear when Instagram's moderation systems spot suspicious behaviour from adult users. The company is not sharing details on how these systems operate, but says such suspicious behaviour could include sending "a large amount of friend or message requests to people under 18." Facebook-owned Instagram says this feature will be available in some countries this month (it did not specify which) and available globally "soon."

Instagram also says it's developing new "artificial intelligence and machine learning technology" to try to detect someone's age when they sign up for an account. Officially, the app requires that users are aged 13 and above, but it's easy to lie about one's age. The company said it wants to do "more to stop this from happening" but did not go into any detail about how new machine learning systems might help with this problem.

New teenage users who sign up to Instagram will also now be encouraged to make their profile private. If they choose to create a public account anyway, Instagram will send them a notification later “highlighting the benefits of a private account and reminding them to check their settings.”

TikTok has parental controls coming VERY SOON..

TikTok is now bringing in parental controls, managed under a feature called 'Family Pairing'. For its teen users it covers three settings: screen time management, Direct Messages, and Restricted Mode.

If a child under 16 is using the platform, parents will be able to disable direct messages.

Similar settings, under the name 'Family Safety Mode', are already enabled in the UK for younger users, in keeping with European laws and regulations.

Parental controls will be rolling out worldwide over the coming weeks.

To use the new controls, the parent of a teenage user aged 13 or older links their account to their child's, which requires the parent to set up their own TikTok account. The parent can then set limits on how long their child is able to use the TikTok app, control whether and with whom the teen can exchange direct messages, and opt to turn on TikTok's Restricted Mode for the child's account in order to limit inappropriate content.

TikTok has not explained Restricted Mode in much detail, but for an app of TikTok's scale it is likely based in large part on users flagging inappropriate videos they come across. Parents should be aware, then, that this is not equivalent to setting parental controls on a video streaming app like Netflix, or restricting what a child can download from the App Store on their phone. In other words, some inappropriate or more adult content could slip through.


Both Screen Time Management and Restricted Mode are existing controls that TikTok users can set for themselves via the app’s Digital Wellbeing section. But with Family Pairing, the parent will be able to set these controls for their child, instead of relying on the teen to do it for themselves.

TikTok also already offered a number of Direct Messaging controls before today, which allow users to restrict messages to approved followers only, restrict the audience, or disable direct messages altogether. TikTok also blocks images and videos in messages to cut down on other issues.

But with Family Pairing, parents can choose to what extent teens can message privately on the platform, if at all.

TikTok has now decided to automatically disable Direct Messages for all registered accounts belonging to users under the age of 16. (Prepare to see a lot more activity and private conversations taking place in the TikTok comments section!) This change goes live on April 30.

The changes give parents far more control over their child’s use of TikTok compared with any other social media app on the market today, outside of those designed exclusively with families and children in mind. However, the parental controls are only a subset of the controls users can opt to set for themselves. For example, users can choose to make their accounts private, turn off comments and control who can duet with them, among other things.

But the options may relieve some parents’ stress about how addictive the TikTok app has become. Teen users are spending significant amounts of time on the short video app — so much that TikTok itself even launched its own in-app PSA that encourages users to “take a break” from their phone.

TikTok offers other resources for parents, as well, including educational safety videos and parental guides. 

It’s an interesting decision on TikTok’s part to launch screen time-limiting features and other restrictions amid a global pandemic, when teens are stuck at home with nothing much to do but watch videos, chat and play games. But with families at home together, there may be no better time than now to have a conversation about how much social media is too much.

“More than ever, families are turning to internet platforms like TikTok to stay entertained, informed, and connected. That was, of course, happening before COVID-19, but it has only accelerated since the outbreak began and social distancing brought families closer together,” writes TikTok director of Trust & Safety, Jeff Collins, in an announcement. “The embrace of platforms like ours is providing families with joint tools to express their creativity, share their stories, and show support for their communities. At the same time, they are often learning to navigate the digital landscape together and focused on ensuring a safe experience,” he said.

MY THOUGHTS:
I personally think this is a great idea, as there is so much risqué content on TikTok that if I had a child, I wouldn't want them to see it, let alone want them to use the app. Plus, many parents aren't actually aware of how toxic the TikTok environment can be. So if you have a child on TikTok, I urge you to set up these controls when they go live in New Zealand.
