Instagram is changing its app to show more viral and current affairs posts amid complaints from its staff that pro-Palestinian content was not seen by users during the recent conflict in Gaza.

Until now, the social media app has prioritised original content in the “stories” that it displays at the top of a user’s feed over content that is being reshared or reposted from other people.

Now Instagram will rank original and reposted content equally, according to two people familiar with the situation and internal staff messages, in a move that will help posts about breaking news find a wider audience.

A spokesperson said there had been an increase in users sharing posts about the recent conflict in Gaza, but that the way the app was set up had a “bigger impact than expected” on how many people saw those posts.

“Stories that reshare feed posts aren’t getting the reach people expect them to, and that’s not a good experience,” the spokesperson said. “Over time, we’ll move to give equal weighting to re-shared posts as we do originally-produced stories.”

Instagram said the move was not wholly in response to the problems over pro-Palestinian content, but had been considered for some time.

The spokesperson said the algorithm had “caused people to believe we were suppressing stories about particular topics or points of view”, but added: “We want to be really clear — this isn’t the case. This applied to any post that’s re-shared in stories, no matter what it’s about.”

A group of as many as 50 employees inside Facebook, the owner of Instagram, had raised concerns about the suppression of pro-Palestinian voices, said one employee involved.

The employee said the group had made more than 80 appeals over content that had been censored by the company’s automated moderation system. BuzzFeed earlier also reported on the existence of the group.

Facebook’s algorithms had labelled words commonly used by Palestinian users, such as “martyr” and “resistance”, as incitements to violence, and had removed posts about the al-Aqsa mosque after mistakenly associating the third-holiest site in Islam with a terrorist organisation, according to US media reports.

The employee told the Financial Times that they did not believe there was deliberate censorship on Facebook’s part, but suggested that “moderating at scale is biased against any marginalised groups” and leads to the overenforcement of takedowns.

Facebook said: “We know there have been several issues that have impacted people’s ability to share on our apps. We’re sorry to anyone who felt they couldn’t bring attention to important events, or who felt this was a deliberate suppression of their voice. This was never our intention, nor do we ever want to silence a particular community or point of view.”
