
I object to AI deciding about me

Every year, on January 28, European Data Protection Day is observed, an occasion that reminds us of the importance of personal data and of how it is used by private companies and public bodies.

It is worth pausing on precisely that aspect, the use or processing of data, because protecting personal data is not only about safeguarding it once it has been obtained by lawful means, but also about how it is used to make decisions. Not surprisingly, beyond the procedures and technical means needed to protect it, data processing, particularly through Artificial Intelligence (AI) techniques, arouses extraordinary interest among companies.

Processing of personal information and AI tools

In recent years, the processing of personal data has been taking on a new role due to the emerging Artificial Intelligence tools that reach the market every day.

We are well aware of the profiling that companies carry out with our data in order to send us offers and products of interest or, more relevantly and more disturbingly, to decide whether we get a job or are granted a loan. In fact, in some countries (obviously outside the European Union) data is beginning to be used even to assess a person's "citizenship level".

In all these cases, Artificial Intelligence tools are becoming an increasingly widespread option. Precisely for this reason, the European Union has for some time been considering new legislation on the use of AI in the processing of personal information.

It is clear that the processing of personal data, whatever its level of criticality, plays a fundamental role in the day-to-day operations of those who hold it. That is why it is essential to consider whether we should allow its use for certain purposes. This is a right the GDPR grants us: we may or may not consent to that use, but we must state it explicitly.

Even then, the GDPR also grants us other rights: the right to know what personal information a third party holds about us, the right to access that data, the right to have it rectified if necessary, the right to withdraw consent to its use, the right to be forgotten and the right to have the data transferred to third parties.

Right to object and right to object to automated processing

We have left two other essential rights for last: the right to object and the right to object to automated processing. These are precisely the points where friction with the use of Artificial Intelligence is greatest.

In 2020, the Spanish Data Protection Agency published a guide describing how AI tools should be adapted to the GDPR to avoid discrimination, unfair decisions or the denial of a service or product when a decision is made by a machine. Today, however, the controversy over whether they truly comply remains unresolved.

The debate revolves around Article 22 of the GDPR, whose first paragraph states: "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."

Article 22 of the GDPR and the right to an explanation

Although the article goes on to set out nuances that could legitimize data controllers to carry out such processing, some experts argue that a correct interpretation of this article must take into account one last right we have not yet mentioned: the right to an explanation.

This right could call into question many of the AI tools whose algorithms can hardly be explained by those who use them. Deep learning and big data form part of the technological foundation behind these decisions, and on many occasions they are truly complex (if not impossible) to explain.

To overcome this difficulty, some experts believe that incorporating a human presence into the decision process can solve the problem. In this way, AI tools would serve as aids during the process, with the final decision left to a person.
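As an illustration only, the minimal sketch below (in Python, with entirely hypothetical names and a placeholder scoring rule standing in for a real, opaque model) shows what such a human-in-the-loop flow might look like: the automated system only produces a recommendation and the factors behind it, while a person with authority to overrule it records the final decision and its explanation.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    score: float        # model output (hypothetical)
    suggestion: str     # "approve" or "deny"
    factors: list[str]  # human-readable factors behind the score

def automated_assessment(applicant: dict) -> Recommendation:
    """Placeholder scoring rule standing in for a real (opaque) model."""
    score = 0.6 * applicant["income_ratio"] + 0.4 * applicant["repayment_history"]
    suggestion = "approve" if score >= 0.5 else "deny"
    factors = [f"income_ratio={applicant['income_ratio']}",
               f"repayment_history={applicant['repayment_history']}"]
    return Recommendation(score, suggestion, factors)

def human_final_decision(rec: Recommendation, reviewer: str,
                         decision: str, explanation: str) -> dict:
    """The reviewer, not the model, issues the final, explained decision."""
    return {
        "reviewer": reviewer,
        "model_suggestion": rec.suggestion,
        "model_factors": rec.factors,
        "final_decision": decision,   # may differ from the model's suggestion
        "explanation": explanation,   # recorded so it can be given to the data subject
    }

# Hypothetical usage: the reviewer overrides the automated suggestion.
rec = automated_assessment({"income_ratio": 0.3, "repayment_history": 0.9})
outcome = human_final_decision(
    rec,
    reviewer="loan_officer_01",
    decision="approve",
    explanation="Strong repayment history outweighs the low income ratio.",
)
print(outcome)
```

The point of the design is the one the article goes on to make: the person in the loop must be able to change the outcome and to explain it, otherwise the human step is merely symbolic.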

The role of the human decision maker

However, it cannot be ignored that human participation in this process must be meaningful, not just a symbolic gesture. It should be carried out by people with the authority and competence to change the decision. And not only that: they should also be able to explain it. That is not so evident when, for example, an AI-based video surveillance system decides that a person is suspicious and suggests a search or arrest to the police officer, or when a credit system proposes granting or denying a line of credit, or a system recommends dismissing an employee. The possible cases are numerous, and the question we should always ask ourselves is: why have the decision makers reached a particular conclusion about us? Part of the decision's potential unlawfulness may lie precisely in that explanation.

Whatever the case, it is worth remembering, precisely today, that the GDPR includes rights that we as citizens should not forget. Rights that we should exercise not only because our privacy is at stake, but also because the processing by third parties of that often undervalued personal information can become unlawful, all the more so when the decisions made with it are difficult to explain.

Signed: Juanjo Galán, Business Strategy at All4Sec
