Can a machine discreetly safeguard your children?
The sexual abuse of minors in digital environments is a concerning and growing reality. Despite efforts by authorities to address it, the scale of the issue exceeds their resources. Artificial intelligence has emerged as a crucial tool to streamline investigations and anticipate risks, although its implementation presents legal and ethical challenges that must be considered.
### A Threat That Grows in Silence
Every year, more than 300 million minors fall victim to online sexual abuse, roughly one in twelve children worldwide. In Spain, between 10% and 20% of minors have experienced some form of online harassment or abuse, yet police resources fall far short of what is needed to investigate every case. Current legislation allows investigators to use publicly available digital information, together with material such as grooming chats, to trace evidence. The sheer volume of data, however, exceeds human capacity. Artificial intelligence can analyze vast amounts of information in seconds, identify patterns, and point to potential suspects, making otherwise unmanageable investigations feasible.
### Can Technology Act Without Invading Rights?
AI is already used in Spain, for example to predict recidivism in gender-violence cases, but its application to online child sexual offenses raises legal concerns. In some countries, automated systems review conversations or detect illegal images, which prompts questions about the balance between efficiency and privacy. European regulation has gradually advanced: the 2016 Directive prohibited judicial decisions based solely on automated analysis; in 2021, the use of AI was permitted under conditions such as transparency, verifiability, and a final human decision; and the 2024 European Regulation on artificial intelligence classified the use of AI in criminal justice as “high-risk,” requiring security guarantees and human oversight. In Spain, this European framework serves as the reference in the absence of specific national regulations.
### A Powerful Ally, but Not Infallible
AI can help filter and prioritize the large caseload handled by the police, but there are clear limits. For instance, using AI to impersonate minors in order to lure offenders is prohibited. A poorly designed algorithm can also lead to serious errors, including unfounded accusations. Preventing these failures requires a deep understanding of the criminal phenomenon and accurate input data. Algorithms are not impartial by nature; they reflect the biases of those who develop them. A robust legal framework that safeguards fundamental rights is therefore essential.
### AI and Justice: A Possible Balance
Used properly, artificial intelligence could revolutionize the protection of minors. It is not a replacement for investigators, but it can be invaluable support, enabling faster and more in-depth analysis and providing stronger evidence for decision-making. Ultimately, we have a powerful tool at our disposal that, employed correctly, could transform how we safeguard the most vulnerable members of society. But that will only happen if we understand thoroughly how, when, and to what extent to deploy it.
