Business Standard

Snapchat's new features to protect teens from potential online risks

The platform requires teens to be Snapchat friends or have phone book contacts with another user before they can begin communicating

Snapchat | Photo: Bloomberg

Sourabh Lele New Delhi


Instant messaging platform Snapchat on Thursday announced new features to protect its teenage users, aged 13 to 17, from potential online risks.

The new features, which will begin to roll out in the coming weeks, are designed to protect teenagers from being contacted by people they may not know in real life; provide a more age-appropriate viewing experience on the content platform; and enable more effective removal of accounts that may be trying to market and promote age-inappropriate content through a new strike system and new detection technologies.

The platform requires teens to be Snapchat friends or have phone book contacts with another user before they can begin communicating. This is part of Snapchat's policy of limiting teenagers' messaging access to people they know in real life – such as a friend, family member or other trusted person.
 

Snapchat is also launching a feature that shows a pop-up warning to a teen if someone tries to add them as a friend who shares no mutual contacts and isn't in their phone book. The message urges the teen to carefully consider whether they want to be in contact with this person, and not to connect if it isn't someone they trust.

Additionally, the platform will raise the minimum number of mutual friends a teenager's account must share with another user before it can appear in that user's search results. A 13-to-17-year-old is already required to have several mutual friends with another user before showing up in search results. This threshold will now scale with the number of friends a user has, with the goal of further reducing the chance that teens connect with people they are not already friends with.

Alarming incidents involving young users – including cyberbullying, predators, scams and exposure to inappropriate content – have previously drawn scrutiny to platforms such as Snapchat, Telegram and Instagram.

Snapchat's default disappearing messages have heightened parents' concerns about inappropriate or bullying content being shared on the platform.

"Snapchat is designed to have fun and communicate openly with your closest friends. At Snap, nothing is more important than the safety of our users and we believe that design plays a powerful role in ensuring this. Our latest features are thoughtful in-app features that are designed to empower teens to make smarter choices, and talk openly about staying safe online," said Uthara Ganesh, Head Public Policy-South Asia at Snap Inc.

As per Snapchat's policy, illegal and harmful content – such as sexual exploitation, pornography, violence, self-harm and misinformation – is prohibited. To enforce its policies and act quickly to protect the community, Snap has long taken a "zero-tolerance approach" to users who attempt severe harms. Accounts found engaging in such activity are immediately banned, with measures to prevent the account holder from returning to Snapchat, and the account holder may be reported to law enforcement.

"We're committed to making sure Snapchat is a place where you can be creative and stay safe and above all, the safety and well-being of our community in India, which includes over 200 million users, is our top priority," Ganesh said.


First Published: Sep 07 2023 | 7:10 PM IST
