Instagram will alert parents when teens search for suicide or self-harm
Summary
Meta says Instagram will notify parents who use supervision tools if a teen repeatedly searches for terms related to suicide or self-harm; the alerts begin next week and will first roll out in the U.S., the U.K., Australia and Canada.
Content
Meta-owned Instagram said it will begin notifying parents when a teenage account repeatedly searches for terms tied to suicide or self-harm. The company says the notices will be sent to parents who activate Instagram’s supervision tools, and they will include resources intended to help with sensitive conversations about mental health. Meta framed the change as part of broader safety steps and noted the move comes amid public scrutiny of social platforms’ effects on young users. The company also said it already blocks some search results for teens and uses age-based content restrictions.
Key details:
- Notifications will be sent by email, text, WhatsApp or as an in-app message to parents using Instagram supervision tools.
- The message will report that a teen repeatedly searched for suicide or self-harm content and will offer resources on discussing mental health.
- Meta said it set a threshold of "a few searches within a short period of time" but did not specify an exact number that triggers alerts.
- The feature will start next week in the U.S., the United Kingdom, Australia and Canada, with broader rollout planned later this year.
Summary:
Instagram’s announced change is intended to connect parents with information when a supervised teen repeatedly searches for suicide or self-harm terms. The company has not disclosed the precise number of searches that will prompt an alert. The rollout begins next week in four countries and is scheduled to expand to other regions later in the year. Meta cited ongoing public and legal scrutiny of social platforms as part of the context for these and other safety measures.
