WhatsApp Moderators Can Read Messages – Report


Concerns raised again about how secure messages are on WhatsApp, after it emerges that moderators can access messages if a complaint is filed

WhatsApp is once again facing questions over its encryption credentials, after it was reported that if a user complains about inappropriate content or someone’s account, the messaging app can then access a person’s recent messages.

A report by ProPublica, the non-profit newsroom that produces investigative journalism in the public interest, stated that WhatsApp “has an extensive monitoring operation and regularly shares personal information with prosecutors.”

Sharing information with law enforcement is nothing new, and this is not the first time that questions have been raised about WhatsApp’s content security.

In 2017 for example, Tobias Boelter, a security researcher at the University of California, Berkeley, discovered that WhatsApp is able to force the generation of new encryption keys for offline users, so that undelivered messages can be re-sent when a recipient changes their SIM card or device, for example.


Gathering metadata

And in May 2019 Russian entrepreneur and Telegram founder Pavel Durov publicly stated that WhatsApp would never be secure, in a blog post which alleged that the messaging app seemed “strangely suitable for surveillance purposes.”

But WhatsApp executives have always publicly asserted that messages are secure and that Facebook cannot access them, because all messages are protected by end-to-end encryption.

Indeed, Mark Zuckerberg in testimony to the US Senate in 2018 said that WhatsApp messages are so secure that nobody else – not even the company – can read a word.

Zuckerberg had said “We don’t see any of the content in WhatsApp.”

But ProPublica has alleged that this may not be the full story, as WhatsApp’s content moderation system allows moderators to examine messages from threads that have been reported by users as possibly abusive.

This, it should be noted, does not break WhatsApp’s end-to-end encryption.
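To see why, consider that the reporting user’s own app already holds the decrypted conversation. The sketch below is a minimal, hypothetical Python illustration of how such a report could be assembled on the reporter’s device; the function names, field names and message count are illustrative assumptions, not WhatsApp’s actual implementation.

```python
# Hypothetical sketch (not WhatsApp's actual code): reporting works by
# having the reporter's client forward messages it has ALREADY decrypted,
# so the end-to-end encrypted transport is never touched.
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    sender: str
    text: str  # plaintext, because the reporter's device already decrypted it

def build_report(thread: List[Message], reported_user: str,
                 recent_count: int = 5) -> dict:
    """Package the most recent plaintext messages from the reporter's
    local copy of the thread. recent_count=5 mirrors ProPublica's claim,
    but is an assumption here, not a confirmed WhatsApp parameter."""
    return {
        "reported_user": reported_user,
        "recent_messages": [
            {"sender": m.sender, "text": m.text}
            for m in thread[-recent_count:]
        ],
    }

# Example: only the reporter's own copy of the chat is sent to moderators.
thread = [Message("contact", f"message {i}") for i in range(10)]
report = build_report(thread, reported_user="contact")
print(len(report["recent_messages"]))  # 5
```

In other words, nothing in the ciphertext is decrypted server-side; the report simply carries what the complaining user could already see on screen.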

Content moderators

It is known that WhatsApp does hand over metadata to law enforcement, and the company does share user data across its app ecosystem.

But ProPublica this week reported that at least 1,000 moderators are employed by Accenture, Facebook’s contract moderation firm, in offices in Austin, Texas, Dublin and Singapore, and their job is to review user-reported content that has been flagged by the company’s machine learning system.

These moderators monitor for, among other things, spam, disinformation, hate speech, potential terrorist threats, child sexual abuse material (CSAM), blackmail, and sexually oriented businesses.

And rightly so.

But unfortunately Facebook’s AI system also allegedly sends moderators an inordinate number of harmless posts, such as photos of children in bathtubs.

And once the flagged content reaches them, ProPublica reports that moderators can see the last five messages in a thread.

And from there, the moderators can decide whether to ban the account, put the user “on watch,” or leave it alone.

WhatsApp discloses, in its terms of service, that when an account is reported, it “receives the most recent messages” from the reported group or user as well as “information on your recent interactions with the reported user.”

But ProPublica says this does not indicate whether WhatsApp can gather phone numbers, profile photos, linked Facebook and Instagram accounts, IP addresses, and mobile phone IDs.

And this is on top of all users’ metadata, no matter their privacy settings.

It pointed out that this contradicts WhatsApp’s public statements earlier this year in a lawsuit against the Indian government, over that country’s IT laws to regulate content on social networking platforms and streaming services.

Broken encryption?

A security expert was quick to point out that this does not mean that WhatsApp’s end-to-end encryption has been broken.

“According to WhatsApp’s Terms and Conditions, if a user complains about inappropriate content or someone’s account, the service then has access to their recent messages. Numerous people have falsely concluded that this regulation annuls end-to-end (E2E) encryption,” said Victor Chebyshev, senior security researcher at Kaspersky.

“We need to distinguish between such terms as end-to-end encryption and the ‘report’ button, because they are completely different algorithms,” said Chebyshev. “End-to-end encryption gives the user and recipient a special key to unlock and read messages. Even if the messenger provides end-to-end encryption, it doesn’t mean that your interlocutor cannot send private messages from your chat to someone else without your knowledge.”

“And vice versa – after hitting the ‘report’ button, WhatsApp moderators do not get access to all of your data and collect it,” said Chebyshev. “They receive information that you provide them with only after you ask. Hence it’s not realistic to claim that WhatsApp gets access to exactly five recent messages, as claimed.”

“We conclude this based only on WhatsApp’s Terms and Conditions, nevertheless, there is no technical proof for this assumption yet,” said Chebyshev.

“Speaking of privacy concerns, it is important to remember that no type of online communication can be absolutely 100% private,” said Chebyshev. “The presence of encryption and trust in an application are completely different things – and trusting the person you are chatting with is a whole different issue. Even the most secret and protected chat can be photographed, and likewise, E2E traffic encryption does not mean that the other person will not send your message to someone else.”
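Chebyshev’s distinction between the encryption itself and what a chat partner chooses to share can be illustrated with a minimal sketch. The example below uses the PyNaCl library’s public-key “boxes” purely for illustration; WhatsApp’s own encryption is based on the far more sophisticated Signal protocol, and the key names here are assumptions, not its real implementation. The point is simply that the relaying server never holds the private keys needed to read a message, while either endpoint can still pass the plaintext on to anyone.

```python
# Minimal illustration of end-to-end encryption using PyNaCl public-key
# boxes. This is a simplified stand-in, NOT WhatsApp's Signal protocol:
# it only demonstrates that a server relaying ciphertext cannot read it,
# because private keys never leave the two endpoints.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys stay on-device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only public keys are exchanged (in practice, via the service's servers).
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts for Bob; the relaying server sees only ciphertext.
ciphertext = Box(alice_private, bob_public).encrypt(b"See you at 6pm")

# Bob decrypts with his private key plus Alice's public key.
plaintext = Box(bob_private, alice_public).decrypt(ciphertext)
assert plaintext == b"See you at 6pm"

# Nothing stops Bob, as an endpoint, from forwarding the plaintext on -
# which is exactly the limitation Chebyshev describes.
```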