The Guardian - UK
Technology
Dan Milmo, Global technology editor

Facebook tests tool to allow users to manage content they see

Whistleblower Frances Haugen gives testimony before the US Senate. Photograph: REX/Shutterstock

Facebook is testing a feature allowing users to control how much content they view from friends, groups and public figures, amid accusations that the social media platform is not doing enough to curb inflammatory posts.

Users will be able to reduce or increase the amount of content that they see from specific categories on their news feed, a customised scroll of content from friends, pages and groups that is a central plank of users’ experience of the platform.

Facebook’s menu for managing the contents of users’ news feed will offer options to control the amount of content from friends and family, groups, and pages and public figures. People in the test can keep the volume of posts from those categories at “normal”, or have it reduced or increased. Users will also be able to do the same with topics.

“This is part of our ongoing work to give people more control over [their] news feed, so they see more of what they want and less of what they don’t,” said Facebook’s owner, Meta, in a statement.

Facebook’s news feed, and the algorithms that control it, are a regular source of controversy and the subject of repeated adjustments by its owner.

Documents released in recent months by the whistleblower Frances Haugen showed that a change to Facebook’s news feed algorithm in 2018, intended to promote reshared posts, had the effect of pushing toxic and violent content as well as misinformation.

Haugen has also accused Meta of not doing enough to curb harmful content in general, with some of the documents she has released pointing to Facebook’s struggles to contain misinformation before the 6 January riots in Washington.

Meta has denied that Facebook pushes divisive content. Speaking at the annual Web Summit this month, Meta’s vice-president of global affairs, Nick Clegg, said Facebook’s content was largely “babies, barbecues and bar mitzvahs” and that promoting toxic content went against the platform’s commercial self-interest.

As well as announcing the content test, Meta said it was launching an experiment allowing some advertisers on Facebook to avoid displaying their adverts to users who have recently engaged with posts related to news and politics, social issues, and crime and tragedy.

“When an advertiser selects one or more topics, their ad will not be delivered to people recently engaging with those topics in their news feed,” said Meta.
