
MANILA — Instagram will soon notify parents when their teenage children repeatedly search for content related to suicide or self-harm, platform owner Meta announced Thursday, amid growing legal scrutiny over its impact on young users.
The feature will roll out in the coming weeks in the United States, Britain, Australia and Canada, and will expand to other regions later in 2026. Alerts will be triggered when a teen conducts multiple searches for suicide- or self-harm-related terms within a short period.
Parents enrolled in Instagram’s parental supervision tools will receive notifications through email, text message or WhatsApp, as well as via the app. The alerts will also include expert resources intended to guide parents in discussing sensitive topics with their children.
Instagram currently blocks searches tied to suicide and self-harm, redirecting users to helplines and support organizations. The new measure is aimed at identifying cases where teenagers persistently attempt to access such content despite those safeguards.
Meta said it consulted its Suicide and Self-Harm Advisory Group in determining the threshold for sending alerts, noting that it chose to take a cautious approach even if some notifications may ultimately prove unnecessary.
The move comes as Meta faces increasing legal and regulatory pressure over youth use of social media. Chief Executive Officer Mark Zuckerberg recently testified in a California trial in which plaintiffs accused his company and others of deliberately fostering addiction among minors. The case marks the first time such allegations have been presented to a jury.
Meta is also confronting broader international efforts to limit children’s access to social media platforms. Australia banned users under 16 from social media in December, while countries including France, Denmark, Spain and the United Kingdom are advancing similar restrictions.