Parents in Japan to Get Instagram Notifications When Teens Repeatedly Search for Suicide Content

A new feature will be added to Instagram in Japan this year that notifies parents when children ages 13 to 17 repeatedly search for content related to suicide or self-harm on the app, the Japanese arm of Meta Platforms, Inc. has announced.

To further protect children, Instagram will also soon add a feature that restricts access to posts about drugs and dangerous behavior.

The "Teen Accounts" feature, which limits certain functions for users ages 13 to 17 (the youngest users permitted under the app's terms of use), will notify parents via the app or by email when a child repeatedly tries to search for suicide-related content. For the notifications to work, parents must link their account to their child's.

While this feature is already available in the United States and Britain, it has not yet been introduced in Japan.

Instagram will also soon restrict users up to age 17 from viewing posts containing drug-related content, extreme language such as threats, and dangerous acts such as shooting guns. The platform already limits the display of posts that contain sexual imagery or relate to alcohol or tobacco.

While social media makes it easy to communicate with friends and others, concerns have grown worldwide that it can lead to bullying and suicide.

In the United States, lawsuits have been filed against the companies that operate social media platforms. In Australia, a law banning social media use by those under 16 took effect in December last year.