WhatsApp is trying to rein in fake news on its platform. To repair its reputation, the Facebook-owned app is restricting the feature that fuels misinformation: yesterday it announced that users will only be able to forward a message to at most five chats at once, in order to keep viral content to a minimum.
WhatsApp announced the change in an email to users. The statement read:
“Starting today, all users on the latest versions of WhatsApp can now forward to only five chats at once, which will help keep WhatsApp focused on private messaging with close contacts. We’ll continue to listen to user feedback about their experience, and over time, look for new ways of addressing viral content.”
Fake News on WhatsApp
Until now, anyone could craft a message presenting itself as fact and forward it to 20 chats at once. This allowed users to spread racist stories and messages blaming specific people or ethnic groups for incidents, sometimes leading to violence.
The most gruesome incident happened in India last July, when a mob murdered five members of a nomadic tribe passing through a village after a fake message accused them of being child abductors. The members of the lynch mob had seen what they thought was a video of a kidnapping on WhatsApp, with a message blaming the tribe, and spread it to their contacts. The video was actually part of a child-abduction awareness campaign, not footage of a real kidnapping.
Another recent incident happened during the presidential election in Brazil, where the app has 120 million users. Leftist politician Fernando Haddad has accused businessmen associated with his right-wing rival Jair Bolsonaro of spreading conspiracy theories and other bizarre stories about him via the app. Bolsonaro won the October election.
What Has Changed?
It’s clear that WhatsApp is taking its possible role in these incidents seriously. A spokesperson told the BBC:
“The forward limit significantly reduced forwarded messages around the world. [This] will help keep WhatsApp focused on private messaging with close contacts. We’ll continue to listen to user feedback about their experience, and over time, look for new ways of addressing viral content.”
Facebook is under scrutiny from law enforcement agencies all over the world, and this has led to several policy changes. The company recently cracked down on hundreds of accounts under Russian influence and agreed to let a fact-checking service flag content on its platform.
Now, under the new rule, WhatsApp users can forward a message of any kind to at most five chats, a limit the company first applied in India last July. However, a WhatsApp group can still contain 256 people, so one forwarding round can still reach 5 × 256 = 1,280 people from a single account. That is still down from 20 × 256 = 5,120 under the old rule.
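The reach arithmetic above can be sketched as a quick back-of-the-envelope calculation. This is an illustrative estimate only, assuming every forward goes to a full-size group; it is not a methodology WhatsApp has published.

```python
# Rough reach estimate for one round of forwarding on WhatsApp.
# Assumption (illustrative): every forward targets a group at the
# maximum size of 256 members.

MAX_GROUP_SIZE = 256  # maximum members per WhatsApp group

def max_reach(forward_limit: int, group_size: int = MAX_GROUP_SIZE) -> int:
    """People one account can reach in a single forwarding round."""
    return forward_limit * group_size

print(max_reach(5))   # new five-chat limit: 1,280 people
print(max_reach(20))  # old 20-chat limit: 5,120 people
```

The per-round numbers understate true viral reach, since each recipient can forward again, but they show why lowering the limit shrinks the base of the cascade.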
WhatsApp’s end-to-end encryption ensures that only senders and recipients can read messages. This makes it next to impossible for the company to identify the originators of problematic content. After the lynchings, the Indian government sought ways to hold WhatsApp responsible for dangerous content and fake news, but doing so would also mean going after the encryption WhatsApp offers.
Even though WhatsApp considers five a conservative number, the fact that a single account can still forward to groups totaling more than a thousand members remains troubling. Still, the move is a welcome one, and it could prove crucial in stopping false stories that spread like wildfire on WhatsApp.