Meta to restrict teen Instagram and Facebook accounts from seeing content about self-harm and eating disorders

Technology | Jan 09, 2024

Meta to Restrict Teen Access to Harmful Content on Instagram and Facebook

Meta, the parent company of Instagram and Facebook, has announced a significant update aimed at protecting teenage users from harmful content on its platforms.

Automatic Restrictions for Teen Accounts

Teen Instagram and Facebook accounts will now be automatically restricted from viewing content related to self-harm, graphic violence, and eating disorders. The change is part of Meta’s efforts to address concerns about the potential negative impact of its platforms on young users.

Impact of the Changes

While Meta has previously limited access to sensitive content in certain areas of its platforms, the new changes extend the restrictions to the Feed and Stories sections, ensuring that such content is no longer visible to teens, even when shared by someone they follow.

Consultation with Experts

Meta has emphasized its commitment to consulting experts in adolescent development, psychology, and mental health. This engagement allows the company to make informed decisions about what types of content are appropriate and safe for young users to access on its platforms.

Policy on Sharing Personal Struggles

While Instagram and Facebook permit users to share their personal struggles with issues like self-harm and eating disorders, Meta’s policy is not to actively recommend such content, and to make it less accessible. The company acknowledges the importance of these discussions but recognizes the complexity and sensitivity of the topics.

Promoting Expert Resources

When users search for terms related to sensitive topics, Meta will hide certain results and instead display expert resources offering support and guidance. This approach aims to direct individuals seeking help toward appropriate resources.

Adjusting Content Control Settings

All teen accounts on Instagram and Facebook will now be automatically placed in the platforms’ most restrictive content control setting, making potentially sensitive content and accounts harder for teens to encounter in areas such as Search and Explore.

Notifications Encouraging Privacy Settings

Meta also plans to introduce notifications prompting teens to update their privacy settings. These notifications will appear when a teen interacts with an account they are not friends with, guiding them toward a more secure online experience.

Rollout and Congressional Testimony

Meta has announced that these measures will roll out in the coming weeks. The timing coincides with Meta’s scheduled testimony before the Senate on child safety, where the company’s executives will face scrutiny over their platforms’ impact on young users.

Legal Challenges and Allegations

The move also comes amid legal action: more than 40 states are suing Meta over allegations that its services have harmed the mental health of young users. The lawsuit accuses Meta of disregarding the potential dangers of its platforms in pursuit of profit.


Meta’s decision to automatically restrict teen access to harmful content on Instagram and Facebook reflects the company’s stated commitment to prioritizing the safety and well-being of young users. By consulting with experts, adjusting content control settings, and promoting privacy measures, Meta aims to create a safer online environment for adolescents.

Source: TechCrunch
