
Facebook* has been accused of concealing facts of child abuse

The New York Times (NYT) has published a report according to which the world’s largest social network deliberately downplays discovered instances of child abuse.


Image source: janeb13 / pixabay.com

Journalists studied the training materials used to prepare Facebook* moderators. The controversial point is the platform administration’s guidance on how moderators should handle sensitive images in which a person’s age is not obvious. In such cases, the documents instruct employees to “err on the side of an adult,” that is, to knowingly treat as adults those individuals whose age is visually difficult to determine and who may or may not have reached the age of majority.

According to the platform’s rules, posting intimate images is prohibited, and moderators can delete such material without warning. However, if the age of the person in the photo is in question, that person is treated as an adult and the relevant authorities are not notified of the incident.

NYT journalists emphasize that there is simply no reliable way to determine a person’s age from a photo; to assess it, Facebook* relies on a method of evaluating “progressive phases of puberty” that is more than 50 years old. To make matters worse, in some cases moderation is performed not by full-time employees of the social network but by third-party contractors, who have only a few seconds to make a decision and are fined for sending false reports to the authorities.

Facebook* reports more child abuse cases to the authorities than any other company, and its administration justifies the “err on the side of an adult” policy by concerns for user privacy. In addition, the company wants to avoid legal liability for false reports, which could distract law enforcement from uncovering real incidents of abuse. However, according to the NYT, other platforms, including Apple, Snapchat and TikTok, take exactly the opposite approach in controversial cases.

* Included in the list of public associations and religious organizations whose activities have been banned by a final court decision under Federal Law No. 114-FZ of July 25, 2002 “On Combating Extremist Activity.”


About the author

Robbie Elmers

Robbie Elmers is a staff writer for Tech News Space, covering software, applications and services.
