We’ve written a lot about WhatsApp’s fake news problem in recent months, and about how the same problem has gripped Facebook in various parts of the world. We’re technology advocates and marvel at the way apps can change our modern lives. We also, however, understand the power that social technology wields on a global scale, which is why we report on the bad, like fake news, as well as the good, like helping you get to work on time.
WhatsApp and Facebook’s fake news scandals have been truly shocking, with WhatsApp at the heart of lynchings and horrific beatings in India, and Facebook helping to facilitate genocide in Myanmar. It is in India where most action has been taken so far, with WhatsApp launching a number of initiatives, including limits on message forwarding and even an investigatory helpline to look into how users interact with false news stories.
It now looks as though Facebook is starting to apply what it has learned from its WhatsApp efforts in India to other parts of the world experiencing similar problems.
Facebook is limiting message forwarding on Messenger in Sri Lanka and clamping down on purveyors of fake stories in Myanmar
Facebook has released a blog post describing its efforts to halt the spread of fake news across its network, and the most telling claim is that Facebook is starting to replicate what it has been doing on WhatsApp. In the post, Samidh Chakrabarti, Facebook’s Director of Civic Integrity, said, “In Sri Lanka, we have explored adding friction to message forwarding so that people can only share a message with a certain number of chat threads on Facebook Messenger. This is similar to a change we made to WhatsApp earlier this year to reduce forwarded messages around the world.”
A screenshot posted to the Facebook blog indicates that the limits Messenger users in Sri Lanka face will be the same as those faced by WhatsApp users in India: if you’re forwarding a message on Facebook Messenger, you’ll only be able to forward it to five people at a time.
In Myanmar, where Facebook’s fake news problem has helped whip up hatred of the local Rohingya Muslim population, Facebook is taking a different approach. Chakrabarti went on to say, “In Myanmar, we have started to reduce the distribution of all content shared by people who have demonstrated a pattern of posting content that violates our Community Standards, an approach that we may roll out in other countries if it proves successful in mitigating harm.”
Facebook is targeting the sources of the problem, which is refreshing, and the company even said it is prepared to ban individuals or organizations promoting violence. This echoes past action taken by the social network against the Myanmar military.
It is good to see Facebook taking affirmative action in its bid to stamp out a true evil that has been able to spread across its network. It is also heartening to hear Chakrabarti recognize that, “This is some of the most important work being done at Facebook, and we fully recognize the gravity of these challenges.”
The damage that has already been done, though, can’t be ignored, and cleaning up the mess is only half the job. If Facebook truly understands the gravity of the situation, it will act more responsibly in the future to ensure this type of thing never happens again. With similar problems growing at home, and with the hugely significant unveiling of Facebook’s planned cryptocurrency, it is more important than ever that the social giant recognizes its responsibilities and moves to fulfill them.