European Union develops new rules to combat child pornography
European Union lawmakers have agreed on rules that would require Alphabet's Google, Meta, and other online services to identify and remove child pornography, according to Reuters.
The proposed rules on Child Sexual Abuse Material (CSAM), put forward by the European Commission last year, have become a point of contention between advocates of stronger online safety measures and privacy activists concerned about mass surveillance.
The EU executive proposed the CSAM rules after the current system of voluntary detection and reporting by companies proved inadequate to protect children. Lawmakers must now work out the final details with EU member states before the proposal becomes law, a process that could be concluded in 2024.
The proposed law would require messaging services, app stores, and internet service providers to report and remove known and new abuse images and videos, as well as cases of grooming. Additionally, the EU would establish a Centre on Child Sexual Abuse, which would forward reports to the police.
"The European Parliament's position removes indiscriminate chat control and allows only for targeted surveillance of specific individuals and groups reasonably suspicious of being linked to child sexual abuse material with a judicial warrant," stated the European Liberal Youth (LYMEC).
The fight against child pornography in Ukraine
Ukraine has increased penalties for the sexual exploitation of children. The law introduces a special mechanism for protecting children affected by sexual violence and also toughens penalties for offences involving pornography.