Sunday, June 25, 2023

The Rohingya and Facebook: A Case Study About Hate

“Power and Progress,” the book recently published by Acemoglu and Johnson (AJ), should not be promoted as a book on technology written by two economists, but as a book about the battle for political and economic equality in times of technological change. It is an important book. I look forward to reading reviews of it by other scholars in the coming months and years. I hope that the debate the authors try to encourage expands, and that it at least raises awareness of the importance of directing technological progress toward the service of humanity rather than of a rich and powerful few.

One of their messages is that the economic model of privately owned social media encourages the transmission of hate. We see that in developed democracies every day. Grievances, ridiculous exaggerations, insults, lies, whataboutisms… find a fast motorway on Facebook and Twitter. I am only on the latter, and the protections against hate there (which I try to use) seem to me like an umbrella against a nuclear attack. Worse: they are part of the problem, pretending to do something while the business model from which the platforms profit is the engine of the problem.

AJ address the case of the abuse against the Rohingya minority in Myanmar on pages 356-359. By 2017 there were 22 million Facebook users in Myanmar out of a population of 53 million. In a “combustible mix of ethnic tension and incendiary propaganda,” Facebook employed only one person to monitor Myanmar, someone who spoke Burmese but not most of the hundred or so languages used in the country. According to Acemoglu and Johnson and the independent sources they quote in their long bibliographic essay, “the platform had become the chief medium of organizing what the US would eventually call a genocide” against the Rohingya Muslim minority.

What these economists say about Facebook’s business model had a dramatic, violent expression in Myanmar, but it also applies in communities where there is currently less (or no) violence: the model was “based on maximizing user engagement (to enable the company to sell more individualized digital ads), and any messages that garnered strong emotions, including of course hate speech and provocative misinformation, were favored by the platform algorithms because they triggered intense engagement from thousands, sometimes hundreds of thousands, of users.”
