CSAM EU Reporting obligations
Background
Yubo is a mobile livestreaming and social discovery application, available to users aged 13 and over, which features an instant messaging function. Ensuring the safety of minors is a primary concern and relies on a combination of human moderators and technology. Among the technical measures in place, we have deployed automated tools to protect young users, including voluntary detection of content that could constitute child sexual abuse material (hereinafter referred to as “CSAM”) or unlawful child solicitation (hereinafter referred to as “grooming”).
1. The type and volume of data processed;
We use automated technologies on a voluntary basis to detect CSAM and grooming in user-generated content on the platform. These technologies spot suspicious content and generate reports that are reviewed by moderators. Such reports may include user account information and user-generated content (including direct messages, where relevant).
2. The specific ground relied on for the processing pursuant to Regulation (EU) 2016/679;
Depending on the case, the technology used to detect CSAM content or grooming may rely on one or more of the following legal grounds:
- the protection of the vital interests of any natural person who may be a victim of child abuse. For this reason, if our technology detects suspicious content, our safety experts review it and report CSAM and grooming to law enforcement authorities.
- the contract between Yubo and the data subjects, including the Terms of use (https://www.yubo.live/legal/terms-of-service) and the Community Guidelines (https://www.yubo.live/community-guidelines). Yubo uses detection tools on a voluntary basis to apply and enforce these documents, and in particular its commitment to the safety and protection of minors.
- Yubo’s legitimate interest in keeping the platform free of dangerous, illegal or fraudulent behavior (including child abuse), which preserves the confidence of users, parents and public authorities.
More information on the legal bases used by Yubo can be found in the Privacy policy available here: https://www.yubo.live/legal/privacy-policy.
3. The ground relied on for transfers of personal data outside the Union pursuant to Chapter V of Regulation (EU) 2016/679, where applicable;
As Yubo uses data processors located outside the European Union, some data may be stored outside EU territory, in particular in the United States of America. In such cases, Yubo relies on appropriate safeguards such as the EU Commission’s Standard Contractual Clauses or the EU-US Data Privacy Framework.
4. The number of cases of online child sexual abuse identified, differentiating between online child sexual abuse material and solicitation of children;
Yubo has suspended 7,720 accounts in the European Union in 2023, which includes 940 CSAM-related cases and 6,780 cases related to grooming or child sexual exploitation.
5. The number of cases in which a user has lodged a complaint with the internal redress mechanism or with a judicial authority and the outcome of such complaints;
In 2023, we received approximately 8,400 appeals worldwide to restore a suspended account. We estimate that around 23% of these cases relate to EU-based accounts and that 60% relate to CSAM or grooming violations. We estimate that approximately 50 EU-based accounts were restored following such appeals.
To our knowledge, no user has lodged a complaint with a judicial authority in the EU.
6. The numbers and ratios of errors (false positives) of the different technologies used;
Images:
- Hash databases from NCMEC and IWF: We took action on 2 cases matching the hash databases.
- We detect nudity using Hive’s technology, which may lead our moderation team to identify CSAM; accuracy rates for moderation in this area averaged 90% in 2023.
Text:
- In-house grooming detection algorithm: In 2023, the accuracy rate reached approximately 80% on average, meaning that in about 20% of cases moderators did not take action on the flagged content. Moderators’ work is itself subject to review and quality assurance. Changes were made to our child enticement policy this year, which may affect accuracy rates until moderators are fully trained on the new policy.
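The ratio reported above can be sketched as a simple calculation. This is an illustrative example only, not Yubo’s actual tooling, and the figures below are hypothetical numbers chosen to match an approximately 80% accuracy rate: a “false positive” here is an automated flag on which moderators took no action.

```python
# Illustrative sketch (hypothetical figures, not actual Yubo data):
# estimating a detector's false-positive ratio from moderator outcomes.

def false_positive_ratio(flagged: int, actioned: int) -> float:
    """Share of automated flags that moderators dismissed (took no action on)."""
    if flagged == 0:
        return 0.0
    return (flagged - actioned) / flagged

flags = 1000      # cases flagged by the grooming-detection algorithm
confirmed = 800   # cases where moderators confirmed and took action

print(false_positive_ratio(flags, confirmed))  # 0.2
```

Under this definition, an 80% accuracy rate corresponds directly to a 20% false-positive ratio.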
7. The measures applied to limit the error rate and the error rate achieved;
The media matching technologies and hash databases are held by reputable child safety organizations (such as NCMEC and the IWF). These databases are fed by industry players. When media published on Yubo matches the hash of previously-identified CSAM in those databases, a human safety specialist reviews the case to rule out a false positive.
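The matching-plus-review step described above can be sketched as follows. This is a minimal illustrative assumption, not Yubo’s actual implementation: real hash databases from organisations such as NCMEC or IWF typically rely on perceptual hashing, while this sketch uses a plain SHA-256 digest over raw bytes for simplicity.

```python
# Illustrative sketch of hash-database matching (an assumption, not
# Yubo's actual system). A match never triggers automatic reporting:
# it only routes the case to a human safety specialist for review.
import hashlib

# Hypothetical database of hashes of previously-identified material.
known_hashes = {
    hashlib.sha256(b"example-known-content").hexdigest(),
}

def needs_human_review(media_bytes: bytes) -> bool:
    """Return True when the media matches the hash database."""
    return hashlib.sha256(media_bytes).hexdigest() in known_hashes

print(needs_human_review(b"example-known-content"))   # True
print(needs_human_review(b"new-unrelated-content"))   # False
```

The key design point mirrored here is that the automated match only queues a case; the decision to confirm and report remains with a human reviewer.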
Our safety algorithms are subject to continuous improvement. First, moderators review the cases flagged by algorithms in dedicated queues; moderators are trained to spot false positives and take no action where appropriate. Second, Quality Assurance (QA) procedures are in place, in which human specialists review the quality of the work carried out by moderators.
8. The retention policy and the data protection safeguards applied pursuant to Regulation (EU) 2016/679;
Yubo has security measures in place such as encryption in transit (using the TLS 1.3 protocol), access controls, logging systems, and data protection governance, including data protection impact assessments and limited retention periods. Data related to content moderation is usually kept for 12 months; retention periods depend on the type of content, the type of violation and the storage conditions.
9. The names of the organisations acting in the public interest against child sexual abuse with which data has been shared pursuant to this Regulation;
National Center for Missing and Exploited Children (NCMEC), for the U.S. market
Internet Watch Foundation (IWF)
PHAROS (Plateforme d'harmonisation, d'analyse, de recoupement et d'orientation des signalements), for Europe