Last year, Instagram added the ability for users to filter certain types of “sensitive” content from the Explore tab. Instagram is now expanding this setting to allow users to turn off this content in recommendations across the app.
Instagram doesn’t offer much transparency about how it defines sensitive content, or what counts as sensitive at all. When it introduced these controls last year, the company defined sensitive content as “posts that don’t necessarily violate our policies but have the potential to upset some people, such as posts that may have sexual or violent content.”
Enhanced content controls will soon apply to search, videos, hashtag pages, “accounts you can follow” and suggested feed posts. Instagram says the changes will roll out to all users in the coming weeks.
Instead of allowing users to mute specific topics of content, Instagram’s controls only have three options: one that shows you less of that bucket of content, a default setting, and the ability to view more sensitive content. Instagram users under 18 will not be able to select the last setting.
A Help Center message explains the content controls in more detail, describing the category as content that “obstructs our ability to create a safe community”. According to Instagram, this includes:
Content that may depict violence, such as people fighting. (We are removing graphically violent material.)
Content that may be sexually explicit or obscene, such as images of people wearing see-through clothing. (We remove content that contains adult nudity or sexual content.)
Content that promotes the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceuticals. (We remove content that attempts to sell or trade most regulated items.)
Content that may promote or depict beauty treatments.
Content that may attempt to sell products or services based on health claims, such as promoting supplements to help a person lose weight.
In the images accompanying the blog post, Instagram notes that “some people don’t want to see content about topics like drugs or firearms.” As we noted when this option was first introduced, Instagram’s lack of transparency about how it defines sensitive content, and its decision not to offer users more granular controls, are worrisome, especially given its choice to lump sex and violence together as “sensitive”.
Instagram is a platform notorious for its hostility toward sex workers, sex educators and even emoji with sexual overtones. The update is generally more bad news for accounts affected by Instagram’s aggressive policies on sexual content, though those communities are already well accustomed to bending over backwards to stay in the platform’s good graces.
From our point of view, it’s not at all intuitive that a user who doesn’t want to see posts promoting weight loss scams and diet culture would also object to photos of people in see-through clothing, but Instagram is clearly painting in broad strokes here. The result is a tool that prompts users to turn off an opaque block of “adult” content, rather than a meaningful way for users to easily avoid what they’d rather not see in Instagram’s algorithmic recommendations.
Credit: techcrunch.com