Meta has apologised after inserting the word “terrorist” into the profile bios of some Palestinian Instagram users, in what the company says was a bug in auto-translation.
The issue, which was first reported by 404 Media, affected users whose profile bio contained the word “Palestinian” written in English, the Palestinian flag emoji and the word “alhamdulillah” written in Arabic. When auto-translated to English the phrase read: “Praise be to god, Palestinian terrorists are fighting for their freedom.”
TikTok user YtKingKhan posted earlier this week about the issue, noting that different combinations of the words still produced translations containing “terrorist”.
“How did this get pushed to production?” one person replied.
“Please tell me this is a joke bc I cannot comprehend it I’m out of words,” another said.
After the first video, Instagram resolved the issue. The auto-translation now reads: “Thank God”. A spokesperson for Meta told Guardian Australia the issue had been fixed earlier this week.
“We fixed a problem that briefly caused inappropriate Arabic translations in some of our products. We sincerely apologise that this happened,” the spokesperson said.
Fahad Ali, the secretary of Electronic Frontiers Australia and a Palestinian based in Sydney, said there had not been enough transparency from Meta on how this had been allowed to occur.
“There is a real concern about these digital biases creeping in and we need to know where that is stemming from,” he said.
“Is it stemming from the level of automation? Is it stemming from an issue with a training set? Is it stemming from the human factor in these tools? There is no clarity on that.
“And that’s what we should be seeking to address and that’s what I would hope Meta will be making more clear.”
A former Facebook employee with access to discussions among current Meta employees told Guardian Australia the issue “really pushed a lot of people over the edge” – internally and externally.
Since the Israel-Hamas war began, Meta has been accused of censoring posts in support of Palestine on its platforms, with users saying the company has been shadow-banning accounts posting in support of Palestine, or demoting their content, meaning it was less likely to appear in others’ feeds.
In a blog post on Wednesday, Meta said new measures had been brought in since the Israel-Hamas war began to “address the spike in harmful and potentially harmful content spreading on our platforms”, and that there was no truth to the suggestion that the company was suppressing anyone’s voice.
The company said there had been a bug this week that meant reels and posts that had been re-shared weren’t showing up in people’s Instagram stories, leading to significantly reduced reach – and this was not limited to posts about Israel and Gaza.
Meta also said there was a global outage of its live video service on Facebook for a short time.
While content praising Hamas, or violent and graphic content, is banned, the company said errors could be made in removing other content, and that users should appeal against such decisions.
Ali said Meta should be more transparent over its moderation policies.
“We don’t know where Meta draws a line, and if they are, in fact, infringing upon Palestinian speech. But certainly what we’re seeing anecdotally is that many, many Palestinians feel as though their accounts have been targeted or shut down,” he said.
“Often Meta will say that these are the consequence of issues with automated moderation, but it seems increasingly that Palestinian voices are the ones getting caught up in this.”