Starting next week, Instagram will roll out a new feature in the United States, the United Kingdom, Australia, and Canada. If a teen's account repeatedly searches for content related to self-harm or suicide within a short window, the platform will alert the teen's parents via email, text message, or WhatsApp. Meta says it plans to extend this warning system to its AI chatbot products later this year.
