Parents in Japan to Get Instagram Notifications When Teens Repeatedly Search for Suicide Content


The Japanese arm of U.S.-based Meta Platforms, Inc., which runs Instagram, said Tuesday that it would introduce a feature in Japan this year to notify parents if children ages 13-17 repeatedly try to search for content related to suicide or self-harm on the photo-sharing app.

To further protect children, it will also soon introduce a feature that restricts access to posts about drugs and dangerous behavior.

Users ages 13-17 are allowed on Instagram under the app's terms of use. For them, the "Teen Accounts" feature, which limits certain functions, will notify parents via the app or by email if children repeatedly try to search for suicide-related content. For the notifications to work, parents must link their account to their child's.

While this feature is already available in the United States and Britain, it has not yet been introduced in Japan.

Instagram will also soon introduce a feature to restrict teens up to age 17 from viewing posts containing drug-related content, extreme language such as threats, and dangerous acts like shooting guns. The platform already limits the display of posts that contain sexual imagery or relate to alcohol or tobacco.

While social media allows for easy communication with friends and others, it has raised concerns worldwide that it can contribute to bullying and suicide.

In the United States, lawsuits have been filed against social media operators. In Australia, a law banning social media use by those under 16 took effect in December last year.