In the recent past, Facebook and other tech giants have come to be seen as the villains that have given a platform to bullies, trolls, bots and extremists.
The popular social network has been blamed for not doing enough to prevent genocide against the Rohingya in Myanmar and for enabling Russian manipulation of the United States election, while in Africa, hate speech on the platform occasionally flares up during periods of political struggle.
In November last year, Facebook’s founder, Mark Zuckerberg, shared his vision for how content should be governed and enforced on the platform:
“We want Facebook to be a place where people can express themselves and freely discuss different points of view, whilst ensuring that it remains safe for everyone.”
The vision has set in motion several strategies, including setting up community standards and publishing detailed internal guidelines, working with civil society organisations to establish an independent oversight board for content decisions, and opening content review centers.
Africa’s first content review center
Facebook is set to open its first content review center in Sub-Saharan Africa, aimed at improving the safety and security of users across the continent.
The global tech giant says it will employ approximately 100 reviewers by the end of the year, supporting a number of languages, including Somali, Afaan Oromo, Swahili and Hausa.
Facebook has partnered with Samasource, which it describes as ‘one of the largest digital employers in East Africa’, to run the center, which will be located in Kenya’s capital, Nairobi.
Explaining the decision to set up the content review center, Fadzai Madzingira, Public Policy Associate for Content, said:
“Over the years, we have made significant investments globally, and locally in ensuring that people see the content they want to see, and are aware of what is and isn’t allowed on the platform. We want Facebook to be a place where people can express themselves and freely discuss different points of view, whilst ensuring that it remains safe for everyone.”
We spoke to Daniel Mwesigwa, a technology analyst in Uganda, for his take on Facebook’s latest attempts to secure its platform.
How significant is the safety issue for Facebook users in Africa?
I think safety is a big issue because, despite Facebook having dismal numbers in sub-Saharan Africa, about 150 million users, the general public believes that anything that circulates on Facebook or its sister platforms like Messenger and WhatsApp can be very significant, not only in amplifying discord online but also in what could translate offline. So, safety is a very important issue.
How exactly will the Content Review Center work to improve safety?
So, the content review center works in a very simple way. They hire people who know the nuances of a particular language, for example Swahili, which is spoken by over 90 million people in East Africa. The content reviewer then filters the content, flagging it either to be deleted or to be hidden from the timeline.
As a Facebook user and a technology analyst, do you have faith in Facebook’s safety strategy? Why or why not?
On one hand, I applaud all of these decisions to enable Facebook to properly moderate content, to uphold freedom of expression, to preserve the sanity of the platform, and to rid it of trolls, incitement to genocide and any other vitriolic content. I think it’s important. However, on the other hand, this body, in particular the one being led by Facebook and comprising different civil society organisations all over the world, may not have the urgency or the capability to react in real time. What do I mean? Content on Facebook must be vetted in real time, yet these bodies will only sit and verify content after a period of time.

@danmumbere