According to the investigative journalism nonprofit ProPublica, Facebook, the parent company of WhatsApp, is able to view the content of some user messages. If the report is accurate, Facebook is likely to face another scandal, as the company has repeatedly stated that it does not have access to user messages.
With end-to-end encryption, data is encrypted and decrypted directly on user devices, so Facebook should not have access to the content of messages. According to the report, however, WhatsApp employees view messages that users flag as inappropriate. It also notes that the company collects large amounts of metadata to detect prohibited content without having to view the messages themselves. Citing messenger staff, the report says moderators are able to “check users’ messages, images and videos”.
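The end-to-end principle described above can be illustrated with a toy sketch: only the two devices hold the key, while the relay server sees ciphertext it cannot interpret. This is deliberately simplified (a repeating-key XOR stand-in, not real cryptography; WhatsApp actually uses the Signal protocol with per-session keys):

```python
# Toy illustration (NOT real cryptography) of end-to-end encryption:
# only the two endpoints hold the key; the relay server sees ciphertext.
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeat the key across the data; a stand-in for a real cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The key exists only on the two devices (in reality it is derived via
# the Signal protocol's key agreement, never shared like this).
device_key = os.urandom(32)

plaintext = "meet at noon".encode()
ciphertext = xor_bytes(plaintext, device_key)   # sender's device encrypts

# The server only relays bytes it cannot read.
relayed = ciphertext

decrypted = xor_bytes(relayed, device_key)      # recipient's device decrypts
assert decrypted == plaintext
```

The point of the sketch is where encryption happens: on the devices, not the server, which is why Facebook should in principle have nothing readable to inspect.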
WhatsApp has a huge user base, which is why the platform is often used to spread misinformation and prohibited content. The company is making efforts to combat this, using algorithms that identify suspect messages through metadata analysis, placing restrictions on the number of messages that can be forwarded, and so on. According to ProPublica, however, service moderators still have access to the content of user messages.
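The forwarding restriction mentioned above can be sketched as a simple cap that tightens for messages that have already travelled far. The thresholds below are illustrative assumptions, not WhatsApp's actual values:

```python
# Illustrative sketch of a forward-count limit like the one the article
# mentions; the numbers are assumptions, not WhatsApp's real thresholds.
FORWARD_CAP = 5          # max chats a message may be forwarded to at once
FREQUENT_THRESHOLD = 5   # hops after which a message counts as "frequently forwarded"
FREQUENT_CAP = 1         # stricter cap for frequently forwarded messages

def allowed_targets(forward_hops: int, requested_chats: int) -> int:
    """Return how many chats a forward may actually be delivered to."""
    cap = FREQUENT_CAP if forward_hops >= FREQUENT_THRESHOLD else FORWARD_CAP
    return min(requested_chats, cap)

print(allowed_targets(forward_hops=1, requested_chats=10))  # → 5
print(allowed_targets(forward_hops=6, requested_chats=10))  # → 1
```

Because the client only needs a hop counter carried in message metadata, a limit like this can be enforced without reading the message content, which matches the metadata-based approach the article describes.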
“Before the message is sent, a notification appears on the user’s device screen: ‘No one outside of this chat, not even WhatsApp, can read or listen to the messages.’ These statements are not true. WhatsApp has over 1,000 contract employees in Austin, Texas, Dublin and Singapore examining millions of pieces of user-generated content,” says the ProPublica report. It also notes that moderators use special software to view messages flagged by users as inappropriate content.
According to the report, WhatsApp moderators operate in strict secrecy. Job listings for the “Content Moderation Officer” position make no mention of Facebook or WhatsApp, and new hires must sign a nondisclosure agreement. Since WhatsApp uses encryption, AI algorithms cannot scan all chats on their own. Instead, moderators gain access to a user’s content only when one of their messages is flagged as inappropriate. The allegedly offensive message is sent to a moderator along with the four preceding messages, including images and videos. All of this is transmitted unencrypted and goes into a queue processed by the moderators.
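The report flow described above can be sketched as follows. The structure is an assumption for illustration, not WhatsApp's actual protocol: the reporting device assembles the flagged message plus the four preceding ones and submits them in the clear to a moderation queue.

```python
# Sketch (assumed structure, not WhatsApp's actual protocol) of the report
# flow the article describes: the flagged message and the four preceding
# messages are sent, already decrypted, into a moderation queue.
from collections import deque

CONTEXT_SIZE = 4  # preceding messages included with a report, per the article

def build_report(chat_history: list[str], flagged_index: int) -> dict:
    """Assemble the unencrypted payload sent to the moderation queue."""
    start = max(0, flagged_index - CONTEXT_SIZE)
    return {
        "flagged": chat_history[flagged_index],
        "context": chat_history[start:flagged_index],
    }

moderation_queue: deque = deque()

history = ["msg1", "msg2", "msg3", "msg4", "msg5", "offensive msg"]
moderation_queue.append(build_report(history, flagged_index=5))

report = moderation_queue.popleft()
assert report["flagged"] == "offensive msg"
assert report["context"] == ["msg2", "msg3", "msg4", "msg5"]
```

Note that this does not break end-to-end encryption in the cryptographic sense: the reporting user's own device already holds the plaintext, and it is that device that forwards it to WhatsApp.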
Facebook’s statement on the matter gave no direct answer regarding end-to-end encryption. “We design WhatsApp in a way that limits the data we collect while providing tools to prevent spam, investigate threats and abuse, including based on the user reports we receive. This work takes tremendous effort from security experts,” said a Facebook representative, who also noted that the service has recently gained additional features to protect the confidentiality of user data.