The General Data Protection Regulation (GDPR), applicable in the EU Member States from 25 May 2018, is a comprehensive piece of legislation that aims to adequately protect the personal data of data subjects while taking into account the latest technological trends and developments. Still, some may argue that certain novel technologies such as blockchain were not addressed: take, for example, the inherent non-compliance of an append-only, immutable blockchain with “the right to be forgotten”, or, in the case of a public blockchain globally distributed across a myriad of nodes, the GDPR requirements for the transfer of personal data to third countries.
Machine learning, the subset of AI on which most AI applications are based, is addressed by the GDPR in the context of “automated decision-making”. This term is defined by Article 22 GDPR as “[making] a decision based solely on automated processing, including profiling, which produces legal effects concerning [the data subject] or similarly significantly affects [the data subject]”. Automated decision-making is generally prohibited unless one of the exceptions in Article 22 applies, such as when the data subject has given their explicit consent, provided that there are suitable measures in place to safeguard the data subject’s rights and freedoms and legitimate interests, at least “the right to obtain human intervention […], to express [their] point of view and to contest the decision”.
One of the GDPR requirements that sparked a debate in the context of machine learning is a concept that has, by some, been called the “right to explanation”. Recital 71 GDPR states that “[automated decision-making] should be subject to suitable safeguards, which should include […] [the data subject’s right] to obtain an explanation of the decision reached”, while Articles 13 (2) (f), 14 (2) (g) GDPR oblige the data controller to provide, where automated decision-making exists, “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (our emphases). Where the personal data have been collected from the data subject, such information must be provided by the controller “at the time when personal data are obtained [from the data subject]”, i.e. prior to the processing of the personal data. Where the personal data have not been obtained from the data subject, the general requirement is that such information must be provided within a “reasonable period” after obtaining the personal data and no later than one month, “having regard to the specific circumstances in which the personal data are processed”. In addition, the individual has the right to request this information ex-post as part of the data subject’s “right of access” prescribed by Article 15 GDPR.
The debate has focused on whether and to what extent the above provisions mandate the data subject’s “right to explanation”. Some legal scholars went so far as to argue that such a right does not exist, thus attracting criticism. Still, semantic intricacies and ambiguities aside, the GDPR is clear that, where an exception to the general prohibition of automated decision-making applies, (i) the data controller must provide the data subject with a meaningful explanation of an automated decision (where the personal data have been obtained from the data subject, such an explanation must be provided before the personal data are processed for making that decision), and (ii) the data subject has the right to obtain this information from the data controller. This means that a data controller that utilizes machine learning, e.g. a vendor providing an AI-based job-matching web application that automatically shares candidates’ profiles with companies based on certain criteria, without human involvement in the process, must ensure that the decisions reached by the app’s machine learning model are, to a reasonable extent, transparent to the user and reviewable. Specifically, the data controller must provide “meaningful information” on how the automated decision regarding the individual is arrived at, thus preventing a “black box” scenario.
Some AI vendors could argue that the “right to explanation” puts their trade secrets and intellectual property at risk. Recital 63 GDPR states that “[the data subject’s right of access] should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property […]”. At the same time, WP29 notes in the Guidelines that “controllers cannot rely on the protection of their trade secrets as an excuse to deny access or refuse to provide information to the data subject”. This means that the information on automated decision-making provided to the data subject does not necessarily have to contain specific descriptions of proprietary algorithms or the “know-how” that underpins the process. What matters most is that the information gives the data subject a clear understanding of how the decision is arrived at. This can be achieved by a non-technical description specifying, in general terms, what personal data are used and how the system processes them when making an automated decision. As WP29 states in the Guidelines, “[t]he controller should find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision. The GDPR requires the controller to provide meaningful information about the logic involved, not necessarily a complex explanation of the algorithms used or disclosure of the full algorithm. The information provided should, however, be sufficiently comprehensive for the data subject to understand the reasons for the decision”.
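To make this concrete, the kind of plain-language rationale WP29 describes could be generated alongside the model’s output itself. The following is a minimal, hypothetical sketch (the function name, factor labels and contribution values are all invented for illustration, not drawn from any real system) of how a job-matching application might translate a model’s per-factor contributions into “meaningful information” for the candidate, without disclosing the underlying algorithm or its weights:

```python
# Hypothetical sketch: turning a model's internal decision factors into
# a non-technical rationale for the data subject. The factor names and
# contribution values below are illustrative assumptions only.

def explain_decision(contributions, top_n=3):
    """Return the top factors behind an automated decision,
    phrased in plain language (no weights or algorithm details)."""
    # Rank factors by the magnitude of their contribution.
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = []
    for factor, weight in ranked[:top_n]:
        direction = "supported" if weight > 0 else "weighed against"
        lines.append(f"- Your {factor} {direction} the match.")
    return "\n".join(lines)

# Illustrative (fabricated) contributions for one candidate:
contribs = {
    "years of relevant experience": 0.8,
    "distance from the workplace": -0.3,
    "language certificates": 0.5,
}
print(explain_decision(contribs))
```

The design choice here mirrors the WP29 guidance quoted above: the data subject sees the criteria relied on and their direction of influence, while the numeric weights and the model itself, which may constitute a trade secret, remain undisclosed.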