TechGDPR

Artificial Intelligence and the right to explanation under the GDPR

Friday March 30th, 2018 by Silvan Jongerius

 

abstract image created by Jesse van Mouwerik for TechGDPR

 

Although the GDPR’s mandates for better data storage and collection practices are expected to help the overall economy rather than stifle it, many immediate compliance risks remain, and they should especially concern companies using advanced AI to collect, store, and interpret data. There is, as yet, no clear answer to how companies can comply with the new regulations without inhibiting the ultra-fast learning techniques of decision-making algorithms. Even so, lacking a coherent plan could prove very costly for some of the world’s most innovative and technologically sophisticated companies. Identifying potential solutions requires a brief look at some of the GDPR’s most important legal language.

Understanding Recital 71 and Article 22

As it pertains to GDPR compliance, one critical challenge is addressing data subjects’ right to an explanation concerning the automated decision-making by algorithms, which is mentioned in Recital 71 and Article 22 of the GDPR. Article 22 states the following:

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

The right to an explanation means that data subjects can request that a company explain how an algorithm reached an automated decision. As stated in Recital 71 of the GDPR (included at the bottom of this blog post), data subjects have the right to “obtain an explanation of the decision reached after such assessment.” The purpose is to give people the chance to contest, or opt out of, a decision when they find the algorithm behind it to be biased. But how is this bias determined? The many forms that answer will take are set to have a massive impact on the tech industry. Speed and accuracy are the primary reasons companies adopt automated decision-making, yet if an algorithm is determined to be biased, it can have serious consequences for data subjects, including legal consequences.

Issues with the Right to an Explanation

Implementing the right to an explanation is incredibly difficult, primarily because no one, not even the programmers, knows exactly how an algorithm does what it does in every situation. People have the right to have a decision reviewed by a human being, but it’s not clear that a human being can make coherent sense of everything that goes into an algorithm’s decision-making process. Companies are also very secretive about advances they’ve made in the functionality of their algorithms, some of which are incredibly valuable to their bottom line. Requesting that a company explain why its algorithm made a certain decision is an almost impossible task, especially at the individual level, because the AI has made an immeasurable number of micro-decisions based on large sets of data and an ever-evolving mathematical equation.
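To see what the right to an explanation is asking for, it helps to contrast modern AI with a model whose logic can be fully explained. The sketch below is purely illustrative and not drawn from any real system: the feature names, weights, and threshold are invented. For a transparent linear scoring rule like this, every decision decomposes into per-feature contributions, which is exactly the decomposition a deep, self-trained model does not offer.

```python
# Illustrative only: a hypothetical, fully transparent credit-scoring rule.
# Feature names, weights, and the threshold are invented for this sketch.
WEIGHTS = {"income": 0.4, "debt_ratio": -0.6, "years_employed": 0.2}
THRESHOLD = 0.5

def decide(applicant: dict) -> tuple:
    """Return (approved, per-feature contributions to the score)."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    approved = sum(contributions.values()) >= THRESHOLD
    return approved, contributions

# An individual "explanation" falls straight out of the model:
approved, why = decide({"income": 2.0, "debt_ratio": 1.0, "years_employed": 1.0})
# why → income: +0.8, debt_ratio: -0.6, years_employed: +0.2
# The score falls below the threshold, so the application is declined, and
# each number says exactly how much each feature pushed the decision.
```

A self-trained model with millions of parameters offers no such clean breakdown, and that gap between what the GDPR asks for and what is technically explainable is the heart of the problem described here.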

 

abstract image created by Jesse van Mouwerik for TechGDPR

 

There is a responsibility to provide meaningful information about the logic involved in an algorithm’s decision-making process, but a comprehensive overview of every last variable influencing what a particular individual sees is not feasible, and often not even technically possible. This is because complex algorithms are not built directly by programmers but by other algorithms, which are in turn built by more algorithms. The original algorithm, written by a programmer, is far less complex than its advanced AI-generated successors: the programmer creates one simple algorithm whose main goal is to continuously absorb new information and build better versions of itself. This means that a human, even a very tech-savvy one, may never be fully aware of (or able to keep up with) what an AI has done in order to improve its capabilities. As this video created by C. G. P. Grey explains, algorithms are very tricky to understand, even for their creators.
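The self-improvement loop described above can be made concrete with a toy example (a minimal, hypothetical sketch, not taken from the article or any real product): a few lines of simple update logic, repeated thousands of times over random data, produce model weights that no human wrote or reviewed individually.

```python
import random

# Minimal sketch of why trained models resist explanation: a simple,
# human-written update rule, applied thousands of times, yields weights
# that emerge from the data rather than from any reviewable line of code.
random.seed(0)

weights = [0.0, 0.0]  # the "simple original algorithm": start knowing nothing

def predict(w, x):
    return 1 if w[0] * x[0] + w[1] * x[1] > 0 else 0

# Hidden rule the model learns purely by example: label is 1 when x0 > x1.
for _ in range(10_000):  # thousands of micro-updates, none human-reviewed
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    label = 1 if x[0] > x[1] else 0
    error = label - predict(weights, x)   # perceptron-style correction
    weights[0] += 0.01 * error * x[0]
    weights[1] += 0.01 * error * x[1]

# The final weights encode the rule, but only as accumulated arithmetic:
# answering "why this decision?" means auditing 10,000 tiny corrections.
```

Even in this two-parameter toy, the final weights are the residue of thousands of corrections; scale that to millions of parameters and many generations of self-modification, and the opacity the article describes follows naturally.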

 


 

AI Compliance on a Larger Scale

Although the right to an explanation is meant to protect data subjects from biased and harmful automated decision making, it’s also clear that the lack of certainty on how exactly this protection can be implemented creates serious obstacles to furthering the development of Artificial Intelligence, and therefore poses risks to the economy as a whole. Now that the GDPR has come into full force, companies must be ready to comply to the best of their abilities. They must also be open to discussing the limits of their capacity to track every last action of a decision-making AI. One likely solution comes from Article 22, paragraph 2.b of the GDPR, which states:

“Paragraph 1 shall not apply if the decision: is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests…”

Moving forward

Looking to the future, as governments and companies learn more about their data and the best ways to secure it, amendments may be required to account for the unique challenges of algorithms and their behavior, but this will take time. Ultimately, our economies should not be forced to take actions that seriously hinder the development of AI. In turn, our governments must find the right carrot-and-stick balance to ensure that companies are taking substantial action to make citizens more aware of, and better protected from, the more dangerous aspects of automated decision-making. Lastly, it’s important to note that the right to an explanation should not be fully abolished or disregarded, as it serves an important purpose for both companies and individuals. To ensure that neither party is at a loss, the capabilities of valuable AI, as well as the limits of key-clacking mammals’ ability to keep up with it, need to be better understood by everyone.

 

To learn more about data privacy and the GDPR, follow us on Twitter

 

 

Recital 71 of the GDPR (excerpt):

The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child. […]


 

