
Instagram will start hiding more "potentially harmful" content from your feed

Instagram has announced a new moderation feature that it hopes will clean up user feeds. The platform will now take more action against "potentially harmful" content, as well as against posts similar to those you regularly report.

In a blog post on the company's website yesterday, the platform announced its new moderation changes. It will not delete any additional posts as part of this change, but it will start showing potentially harmful content much lower in users' feeds.

Of course, the platform already has moderation practices to deal with content that goes against its Community Guidelines. Posts that contain bullying or hate speech, or that promote violence, are automatically removed by the platform.

This new policy deals with posts that the platform thinks may contain potentially harmful content. Posts that fall into this "maybe" category will now appear lower in users' feeds.

Image: KnowTechie

In addition to downgrading these "maybe" posts, the platform will also look at each user's reporting history to identify the kind of content that user tends to flag. Posts containing similar content will likewise be displayed lower in that individual user's feed.

Now your reports will (hopefully) keep other posts with similar content from appearing so prominently in your feed.

This is not an extension of Instagram's Community Guidelines. Instead, it's an effort to curb harmful content that comes close to breaking the platform's rules. It should also give users more control over the type of content they see on the platform, which is always a good thing.

Do you have any thoughts on this? Let us know down below in the comments, or carry the discussion over to our Twitter or Facebook.

Editors' recommendations: