During last year's war between Israel and Palestine in Gaza, Instagram, owned by the Facebook company, began blocking all posts carrying the hashtag #AlAqsa, a reference to the historic Al-Aqsa Mosque in Jerusalem, angering Arabic-speaking users.
Facebook later apologized, explaining that its algorithm had mistakenly treated references to the third-holiest site in Islam as references to the Al-Aqsa Martyrs' Brigades, a militant group linked to the Fatah party.
The episode came to light in documents recently released by Frances Haugen, a former Facebook employee. On the one hand, it exposes the flaws in Facebook's monitoring and automated systems for languages other than English; on the other, Arabic-speaking users see it as a restriction on political speech and freedom of expression by this giant social media platform.
Facebook has a large number of Arabic-speaking users, and it has become routine for the company to issue apologies for repeating such mistakes.
But the documents Frances Haugen submitted to the US Securities and Exchange Commission and to Congress show that these were not mistakes born of ignorance. Facebook has long been aware of the serious flaws in its systems, yet it has taken no significant steps to address them.
Nor are these flaws limited to Arabic. A closer look at the files shows how a shortage of moderators who understand local languages and cultural contexts has allowed hateful and extremist content to flourish on the platform in some of the most volatile parts of the world.
Meanwhile, the artificial intelligence programs running on Facebook's platforms have also failed to catch harmful content in different languages in a timely manner.
In countries like Afghanistan and Myanmar, malicious posts and false news continued to spread unchecked on Facebook. In places like Palestine and Syria, by contrast, blanket bans on a handful of words suppressed freedom of expression and kept people from publishing even ordinary posts.
The files make clear that Facebook itself acknowledges that hateful material, provocative posts, and false news targeting Myanmar's Rohingya Muslim minority spread unabated on the platform. After the genocide against the Rohingya drew worldwide condemnation, Facebook announced it would hire hundreds of moderators familiar with Myanmar's local culture, politics, and religious nuances to make the platform safer, but the company never disclosed how many such people it actually recruited.
According to experts, the root of the problem is that the platform was never built with the expectation that it would one day become a vehicle for political expression around the world. But now that Facebook has acquired such political importance, and commands such vast resources, the lack of oversight raises serious questions.
In a statement responding to a query from the news agency AP, a Facebook spokesperson said that over the past two years the company has added people who understand local languages, dialects, and local contexts to its teams, and has invested in better monitoring of activity on the platform.
However, the company acknowledged that much work remains to be done on Arabic language content.