Europe advocates pursuing CSAM, at the cost of privacy

A very tense debate is brewing in Europe and its institutions: the European Commission has presented the final draft of a new regulation that, if approved, would mark a significant turn in European policy on privacy protection. The reason? To strengthen the fight against child sexual abuse material (CSAM), a more than defensible goal, but one that admits multiple interpretations, which both sides will no doubt put forward.

As we can read in the draft, if what it proposes is approved (something that still has to pass several time-consuming phases and revisions), Europe could begin to require technology companies to participate actively in the pursuit of CSAM. Put like that, it does not sound problematic, but when we look at what the Commission is considering asking of these companies, we find measures that, once deployed, could spell the end of end-to-end encryption and of users' absolute control over the content of their devices.

The project aims to create a new EU-level agency, with a legal form yet to be determined, whose main task would be to work, on one side, with entities such as Europol and the police forces of the various European countries, and on the other with the big tech companies, which are the ones able both to deploy systems that analyze their users' files and to build some kind of back door into their communications systems that grants direct access to them.

This is no surprise: Europe has long wanted to legislate in this direction, that is, to establish the technological means needed to circumvent the privacy protections exploited by pedophiles to exchange the kind of criminal material this regulation aims to pursue. A type of content that, as is well known, entered a golden age with the spread of the Internet, and has yet to leave it.

If Europe finally approves what we can read in the draft, or a similar proposal, the participation of technology companies in this kind of action will no longer be voluntary, as it is under the current status quo: they will be obliged by law to hand over to the empowered European authorities all the assets required of them. And unlike today, when the initiative lies with the technology companies, they will answer to the authorities, whether or not they also choose to maintain their own programs in this area.

To put it plainly: if any European citizen is suspected of creating, possessing, exchanging and/or selling CSAM, technology companies in Europe will be obliged to provide the authorities with all of that person's digital assets: their WhatsApp conversations, the files they store in the cloud, and so on. Including, as noted above, assets that are currently protected with end-to-end encryption.

To be fair, the draft lays out a framework with substantial safeguards for citizens' rights, and addresses how this potential new regulation would fit with those already in force in Europe, such as the GDPR. According to the legal text, it would not be unrestricted access to citizens' digital assets: it would always require the endorsement of the regulators, with the necessary protection systems in place to prevent abuse of this privileged access.

Even so, it is undeniable that the mere establishment of the means for the authorities to access citizens' content in Europe would end, de facto, the level of privacy offered today by services with end-to-end encryption, which, at least in their theoretical definition, do provide absolute privacy, making it impossible for a third party to access the content unless they hold the necessary decryption keys.
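To make that last point concrete, here is a minimal toy sketch of the end-to-end principle: only the endpoints holding the shared key can recover the message, and an intermediary sees only ciphertext. This is an illustrative one-time-pad XOR, not a real E2E protocol (real services use authenticated key exchange and schemes such as the Signal protocol); all names here are hypothetical.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: each plaintext byte combined with a key byte.
    assert len(key) >= len(plaintext), "key must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same operation recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to the two endpoints

ciphertext = encrypt(message, key)       # this is all a third party sees
print(decrypt(ciphertext, key))          # prints b'meet at noon'
```

Without the key, the ciphertext is indistinguishable from random bytes, which is exactly the property a mandated back door would have to break.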


The mere existence of this kind of access, however tightly limited, already poses a major security problem, and we can be sure it will attract many attempts to exploit it. From cybercriminals to governments and companies offering services like the increasingly controversial Pegasus, we can bet that the backdoors Europe may require of tech companies would set off a hunt unlike anything we have seen before.

If you remember the case of the San Bernardino iPhone, you may also remember Tim Cook's statements, made at a time when Apple was under intense pressure from the FBI to provide some way to bypass the iPhone's protections and/or the encryption of its contents:

«Weakening security for the sake of promoting security simply doesn’t make sense and weakening encryption or backdooring devices would allow vulnerabilities to be exploited by “bad guys” causing serious harm to our society and our economy.»

And I quote Cook precisely because just a few months ago Apple was embroiled in a major controversy for acting in exactly the way Europe now intends. In August of last year the Cupertino company announced NeuralHash, an AI-based system that would analyze the images and videos users upload to iCloud in search of CSAM. The backlash that announcement generated forced Apple to postpone its rollout indefinitely.
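The general idea behind systems like NeuralHash is hash-list matching: compute a fingerprint of each file and check it against a database of fingerprints of known illegal material. The sketch below illustrates that flow with a plain SHA-256, which is a deliberate simplification (NeuralHash is a perceptual hash, so similar images map to similar hashes, whereas SHA-256 only matches exact byte-for-byte copies); the sample data and function names are hypothetical.

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Stand-in fingerprint; a real system would use a perceptual hash.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known flagged material.
KNOWN_HASHES = {file_hash(b"known-flagged-sample")}

def flag_for_review(data: bytes) -> bool:
    """Return True if the file's fingerprint appears in the known list."""
    return file_hash(data) in KNOWN_HASHES

print(flag_for_review(b"known-flagged-sample"))  # prints True
print(flag_for_review(b"holiday photo"))         # prints False
```

Note that the matching itself is trivial; the controversy is about where it runs (on the user's device, before encryption) and who controls the hash list.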

Thus, if a private initiative such as Apple's already set off so many alarms, this draft, which has already begun its legal journey in Europe, will undoubtedly provoke a particularly virulent debate. On one side, those who advocate pursuing child abusers by all possible means; on the other, those who consider this kind of regulation an unacceptable threat to privacy, and who warn that even if its initial use is strictly what the rule specifies, any later amendment could open access to data and digital assets for other purposes.


The problem is that both positions make sense: the fight against the sexual exploitation of minors and the defense of privacy are both causes many of us instinctively align with. But in cases like the regulation Europe is studying, the two collide. And although at first glance the protection of minors carries far more weight, if we spend a moment thinking about the consequences of illegitimate use of these tools, we see that we could be facing the absolute and definitive loss of our privacy.

The European Commission states that this draft has the backing of all the parties involved in pursuing CSAM, but there is no mention of the other side, of the organizations and experts who watch over privacy. We will surely hear from them soon, and theirs are voices Europe should bring into the discussion and revision of this draft before the approval process begins.

What do you think? Is Europe doing the right thing by prioritizing the protection of minors, even if it means compromising privacy? Or, on the contrary, should other formulas be found to pursue pedophiles and their material, without sacrificing privacy in the process?
