There was a downside to ‘meaningful social interaction’
Facebook reportedly fielded complaints from political parties saying that a major News Feed change pushed them toward negative, polarizing posts. The Wall Street Journal today published a report, based on leaked documents, about the aftermath of Facebook's promotion of "meaningful social interaction" on the platform. While Facebook framed the move as helping people connect with friends, internal reports said it had "unhealthy side effects on important slices of public content, such as politics and news," calling these effects an "increasing liability."
The news is part of a larger Wall Street Journal series based on internal Facebook research. Today's report highlights the repercussions of the 2018 decision to prioritize posts with lots of comments and reactions. Facebook reportedly made the change after noting a drop in comments, likes, and re-shares during 2017 – something partly attributable to people watching more professionally produced video. Publicly, CEO Mark Zuckerberg described the change as a way to increase "time well spent" with friends and family instead of passive video consumption.
After the change, internal research yielded mixed results. Daily active users increased and users found content shared by close connections more "meaningful," but re-shared content – which the change rewarded – had "inordinate" levels of "misinformation, toxicity, and violent content." People tended to comment on and share controversial content, and in the process they apparently made Facebook an angrier place overall.
One report flagged concerns from unnamed political parties in the European Union, including one in Poland. "Research conducted in the EU shows that political parties 'feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook, with the downstream effect of leading them into more extreme policy positions,'" it says. Facebook apparently heard similar concerns from parties in Taiwan and India.
In Poland, "one party's social media management team estimates that they have shifted the ratio of their posts from 50/50 positive/negative to 80 percent negative, explicitly as a function of the change to the algorithm." And "many parties, including those that have shifted strongly to the negative, worry about the long-term effects on democracy."
News publishers – frequent victims of Facebook algorithm shifts – were unsurprisingly unhappy with the change as well. Facebook flagged that BuzzFeed CEO Jonah Peretti complained that the change promoted things like "junky science" and racially divisive content.
Facebook frequently makes changes to the News Feed to promote different types of content, often explicitly responding to public concerns as well as financial considerations. (For example, the "time well spent" movement harshly criticized "mindless scrolling" on social media.) Facebook engineering VP Lars Backstrom told the Journal that "like any optimization, there's going to be some ways that it gets exploited or taken advantage of."
But the Journal writes that when Facebook researchers proposed fixes, Zuckerberg was reluctant to implement them if they threatened to reduce user engagement. Ultimately, though, Facebook did reduce the weight of comments and shares in the News Feed algorithm – relying more heavily on what people actually say they want to see.