Instagram will notify parents if teens repeatedly search self-harm terms.
Summary
Instagram says it will alert parents enrolled in its parental supervision program when teens repeatedly search terms tied to suicide or self-harm, and the company already blocks such content from teen search results.
Content
Instagram announced a new measure to notify parents who are enrolled in its parental supervision program if their teenage children repeatedly search terms clearly associated with suicide or self-harm. The company says it already prevents such content from appearing in teen account search results and directs people to helplines. Meta also said it is working on similar notifications for certain types of teen interactions with its artificial intelligence.
What is known:
- Alerts will be sent only to parents who have joined Instagram’s parental supervision program and may arrive by email, text, WhatsApp or a notification through the parent's Instagram account.
- Instagram already blocks search results for terms linked to suicide and self-harm on teen accounts and provides links to helplines instead.
- The announcement comes while Meta is facing legal trials in Los Angeles and New Mexico over alleged harms to children; company executives have disputed that social media has been proven to cause addiction.
Summary:
The policy adds parental alerts for repeated searches tied to suicide or self-harm, and the company says it will extend the approach to some AI interactions in the coming months. Meta is currently on trial in Los Angeles and New Mexico over alleged harms to children; the next procedural steps in those cases are undetermined.
