The word “transparency”, from the medieval Latin transparentia, is thought to have emerged in the late 16th century as a general term for a transparent object. In essence, it denotes the property of allowing light to pass through so that objects behind can be clearly seen. But in the 21st century, transparency has taken on a different and broader meaning.
The Spanish Data Protection Agency (Agencia Española de Protección de Datos, or AEPD) published an article in September 2023 on transparency in the context of the proposed Artificial Intelligence Act (AI Act) and the General Data Protection Regulation (GDPR), clarifying that different actors, different information and different recipients are involved, depending on the regulation.
Transparency in the context of the AIA proposal refers to the information that developers of AI systems must make available to users, understood as the entities that deploy these systems. Where AI systems are involved in the processing of personal data (or are a means of processing personal data), data controllers must obtain information about them that is sufficient to meet their various obligations under the GDPR. These include providing transparency to enable the exercise of data subjects’ rights, complying with the principle of accountability, and meeting the requirements of GDPR supervisory authorities exercising their investigatory powers.
The transparency principle is set out in Article 5(1)(a), developed in recitals 39 and 58 to 62, and elaborated in Article 12 et seq. of the GDPR. The application of this transparency principle (hereinafter, “GDPR transparency”) is an obligation imposed on controllers of personal data processing to inform data subjects about the processing of their personal data.
In line with the above, the AEPD concluded that the obligations regarding the transparency of AI systems apply from the design of the AI system and throughout its life cycle, regardless of whether it processes personal data. Therefore, the term “transparency” is used in the AIA (Transparency-AIA) with a different meaning than in the GDPR (Transparency-GDPR).
Different Actors, Status and Nature
Transparency-AIA imposes obligations on designers, developers, vendors and users/entities deploying AI systems.
Transparency-GDPR obliges data controllers.
Designers and developers may be controllers or processors if they use personal data in the design or development of the AI system.
Vendors may be controllers or processors if the AI systems store or process data of identified or identifiable data subjects.
Users/entities using an AI system may be controllers or processors if they use such a system as part of their processing.
Within the above, only those acting as controllers must comply with the GDPR’s transparency obligations.
However, under Transparency-AIA, both users/entities deploying AI systems and natural persons or groups of persons affected by an AI system must be informed. These natural persons may be affected even if they are not data subjects (as defined in the GDPR), for example, where natural persons are the recipients of multimedia content generated by the AI system.
Transparency-GDPR establishes an obligation for data controllers to inform data subjects about processing operations involving automated decisions and profiling. Not all automated decisions, however, are implemented by AI systems; conversely, not all AI systems used in the processing of personal data involve automated decision-making or profiling within the meaning of Articles 22(1) and 22(4) of the GDPR.
Different Types of Information
Transparency-GDPR is defined by the obligations contained in Articles 13 and 14 and Recitals 39 and 58 to 62 of the GDPR, which explain that the information to be provided to data subjects must make them “aware of the risks”, “of the existence of profiling and of the consequences of such profiling”, “of the controllers”, “of the purposes”, “of the rights”, “of the safeguards” and “of any other information necessary to ensure fair and transparent processing”, “taking into account the specific circumstances and context in which the personal data are processed”, in an “easily understandable” manner, using “clear and plain language”, and where there is automated decision-making, including profiling, at least providing meaningful information about the logic involved, as well as the significance and expected consequences.
Transparency-AIA refers to the information provided to users/entities deploying AI systems and relates to accountability (Article 13, Article 4a of the Parliament’s version and Recital 38 of the AIA), documentation, record-keeping and the provision of information on how to use such an AI system (Recital 43 of the proposed AIA). It should enable users/entities deploying the AI system to comply with their legal obligations. Transparency-AIA for natural persons is contained in Article 52.1 of the proposed AIA and relates to the obligation to warn natural persons that they are interacting with an AI system.
Conclusion
The AEPD concludes that Transparency-GDPR and Transparency-AIA have different meanings, establish obligations for different actors, refer to different categories of information, both in terms of content and wording, and are addressed to different recipients. Therefore, providing data subjects with the same information as drafted from the perspective of Transparency-AIA would not be consistent with the obligations of the GDPR.
If the user/entity deploying the AI system assumes the role of data controller or data processor, it is obliged to comply with the GDPR principle of accountability, which means that the systems used to carry out the processing activity must themselves comply with this principle. If the means of processing, i.e., AI systems, cloud systems, mobile systems, communication systems and others, are not properly documented and do not provide evidence of meeting the necessary performance, privacy and security requirements, the controller should not use them. Implicitly, the information available under the Transparency-AIA framework should be sufficiently comprehensive to enable controllers and processors to comply with their various obligations under the GDPR.
Finally, the AEPD clarifies that we should not confuse transparency obligations with the investigative powers of the supervisory authorities, the assessment of certification bodies or the oversight activities of code of conduct bodies. The information to be provided to them must be sufficiently complete and detailed to enable them to carry out their responsibilities, but this information goes well beyond what is necessary to comply with the transparency obligations under the GDPR.
Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only, and is not intended to constitute or be relied upon as legal advice.