
Facebook’s role in Myanmar and Ethiopia under new scrutiny

Whistleblower Frances Haugen adds to long-held concerns that social media site is fuelling violence and instability

The Guardian | 09.20.21 | 22:01

Whistleblower Frances Haugen’s testimony to US senators on Tuesday shone a light on violence and instability in Myanmar and Ethiopia in recent years and long-held concerns about links with activity on Facebook.

“What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it,” Haugen said in her striking testimony. Haugen warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside the US.

About half of Myanmar’s population of 53 million use Facebook, with many relying on the site as their primary source of news. In June this year, an investigation by the rights group Global Witness found that Facebook’s algorithm was promoting posts in breach of its own policies that incited violence against protesters marching against the coup launched by the military in February.

Researchers began by liking a Myanmar military fan page, which was not seen to be violating Facebook’s terms. They found that Facebook then suggested several pro-military pages that did contain abusive content.

“We didn’t have to look hard to find this content; FB’s algorithm led us to it,” said Rosie Sharpe, a digital researcher who worked on the report. “Of the first five pages they recommended, three of them contained content that broke FB’s rules by, for example, inciting or glorifying violence.”

The link between social media posts and offline violence in Myanmar had already been widely documented. In 2018 a Guardian analysis revealed that hate speech exploded on Facebook at the start of the Rohingya crisis the year before, when attacks by armed groups and ordinary communities on people from the Muslim minority erupted.

Thousands of posts by nationalist, anti-Rohingya supporters gained traction online, including posts which falsely claimed mosques were stockpiling weapons. An independent investigation commissioned by Facebook later agreed with assessments that the site had been used to incite offline violence.

“What happens on Facebook matters,” Sharpe said. “Promotion of violence online leads to real-world harms. That’s particularly true in Myanmar, where Facebook has admitted that it played a role in inciting violence during the military’s genocidal campaign against the Rohingya.”

Facebook has faced similar criticism in Ethiopia, which has been engulfed in an armed conflict between the federal government and the Tigray People’s Liberation Front (TPLF). In 2019, for instance, the retired Ethiopian runner Haile Gebrselassie blamed “fake news” being shared on Facebook for violence that left 81 people dead in Oromia region.

After another outbreak of ethnic violence in 2020 – sparked by the killing of a popular singer from the Oromo ethnic group – an investigation by Vice claimed that the violence had been “supercharged by the almost-instant and widespread sharing of hate speech and incitement to violence on Facebook, which whipped up people’s anger”.

In her testimony Haugen blamed engagement-based ranking for “literally fanning ethnic violence” in countries like Ethiopia. “Facebook … knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world,” Haugen said. “And that’s what is causing things like ethnic violence in Ethiopia.”

Sharpe said legislators were not doing enough to hold social media companies to account.

“The EU has gone the furthest towards doing this. There’s draft legislation in the EU, the digital services act. If it was passed it would require very large online platforms to have to assess and mitigate the risk of their algorithms spreading content that impacts on our rights. However, the proposed law doesn’t go far enough as it would only give regulators the opportunity to scrutinise how algorithms work when they suspect wrongdoing.”

Facebook has pushed back forcefully against Haugen’s accusations. In a blogpost published on Tuesday evening its chief executive, Mark Zuckerberg, said “it’s just not true” that the company puts profit over safety.

