Facebook, Genocide, and Accountability
It looks like Facebook materially contributed to a genocide:
In a new report, Amnesty claims that Facebook’s algorithms “proactively amplified” anti-Rohingya content. It also alleges that Meta ignored civilians’ and activists’ pleas to curb hate-mongering on the social media platform while profiting from increased engagement.
Worse, Facebook executives knew about the effects their algorithms were having and did nothing to stop them:
In its report, Amnesty concludes that Meta was made aware as early as 2012 of how its engagement-based algorithms were contributing to serious real-world harm in Myanmar. It alleges that the company has over the last 10 years willfully disregarded known human rights risks on its platform and implemented inadequate solutions, prioritizing profit over users’ safety.
Htaike Htaike Aung, a digital researcher who documents the history of the internet in the Southeast Asian country through the Myanmar Internet Project, tells TIME she met with senior Facebook executives about the social media platform’s effects in 2012 and 2013. “It felt like talking to a void,” she says.
Facebook relied on a single Burmese speaker to moderate all the content for the region. Its decision to allow news items to be posted directly to Facebook incentivized hate speech to flourish: its algorithms prioritized engagement, and the clickbait this feature encouraged drove engagement. And Facebook relied on AI to solve the content problem even though AI was demonstrably unable to do so, because AI was cheaper than hiring humans and introduced far less friction, which kept engagement, and thus revenue, higher.
And nothing is going to happen to Facebook. Even though they ignored the issues in favor of profit, even though they designed a product that encouraged genocide and let it run for years, nothing will happen to them. No executives will face trial for their actions, the company will be allowed to continue to exist, and their business model will not be impacted at all. Even Amnesty International can only imagine Facebook paying restitution. A fine, in other words. And a fine is just a fee, a cost of doing business.
If we allow this kind of blatant evil to go unpunished, which we appear to be doing, it will happen again. Facebook is a corporation. If we are not going to hold the officers of the company personally liable for creating a product that assisted genocide despite knowing that it was assisting genocide, then we should destroy the company. Revoke its charter. If we are worried about the people who work there, social media is not a dying industry. I am sure Myspace will be happy to hire them. And if that seems too flip for you:
No one who works at Facebook can pretend, at this stage, to be unaware of what Facebook is or how it prioritizes making money at the expense of lives
Let the government seize its assets and run it, making its algorithms public as it does so to make sure we all know where the product went so horribly wrong, until it can be sold off. And of course, all proceeds of said sale go right into the government coffers once reparations have been paid out. People who bought stock in Facebook did so knowing that they were buying into a company designed to be controlled by Zuckerberg -- they took their chances.
None of this is going to happen, of course. Despite being the creation of governments, corporations have been successful at warping the law to such an extent that they are largely unaccountable to it, even in the most extreme cases such as this. That doesn't make it right.
Facebook knowingly assisted in genocide. We should hound the company and the people who allowed that to happen until the day they die, so that they live with that knowledge. And we should use this example to rededicate ourselves to the struggle to regain democratic control over these malefactors of wealth.
Because "never again" was supposed to mean never again by anyone -- even smarmy little Silicon Valley hotshots.