Artificial Intelligence and the right to explanation under the GDPR

Although the GDPR’s mandates for better data storage and collection practices are expected to help the overall economy rather than stifle it, many immediate compliance risks remain, and they should especially concern companies that use advanced AI to collect, store, and interpret their data. There is no clear answer to the question of how to comply with the new regulations without inhibiting the ultra-fast learning techniques of decision-making algorithms. Even so, lacking a coherent plan could prove very costly for some of the world’s most innovative and technologically sophisticated companies. Identifying potential solutions requires a brief look at some of the GDPR’s most important legal language.

Understanding Recital 71 and Article 22

One critical GDPR compliance challenge is addressing data subjects’ right to an explanation of automated decision-making by algorithms, which is mentioned in Recital 71 and Article 22 of the GDPR. Article 22 states the following:

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

The right to an explanation means that data subjects can ask a company to explain how an algorithm reached an automated decision. As stated in Recital 71 of the GDPR (included at the bottom of this blog post), data subjects have the right to “obtain an explanation of the decision reached after such assessment.” The purpose is to give people the choice to opt out if they find an algorithm to be biased. But how is this bias determined? The many forms that this answer will take are set to have a massive impact on the tech industry. Speed and accuracy are the primary reasons companies adopt automated decision-making, yet a biased algorithm can have serious consequences for data subjects, including legal consequences.

Issues with the Right to an Explanation

Implementing the right to an explanation is incredibly difficult, primarily because no one, not even the programmers, knows exactly how an algorithm does what it does in every situation. People have the right to have a decision reviewed by a human being, but it’s not clear that a human being can make coherent sense of everything that goes into an algorithm’s decision-making process. Companies are also very secretive about advances they’ve made in the functionality of their algorithms, some of which are incredibly valuable to their bottom line. Asking a company to explain why its algorithm has made a certain decision is therefore an almost impossible task, especially at the individual level, because the AI has made an immeasurable number of micro-decisions based on large sets of data and on an ever-evolving mathematical equation.
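To make the difficulty concrete, here is a minimal, hypothetical sketch in Python. The data, feature names, and model are all invented for illustration: even with full access to the trained model, the most a controller can typically produce is a set of approximate, aggregate feature importances, not a step-by-step rationale for why one particular person was refused.

# Hypothetical sketch: a toy credit model whose only available "explanation"
# is a list of approximate feature importances, not a human-readable rationale.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Invented toy data: income, debt ratio, years of credit history.
X = rng.normal(size=(1000, 3))
y = (X[:, 0] - X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000)) > 0

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

applicant = np.array([[0.2, 1.4, -0.3]])   # one data subject
decision = model.predict(applicant)[0]     # the automated decision
print("approved" if decision else "refused")

# The available "explanation" is aggregate and approximate: it says which
# inputs mattered on average across all decisions, not why the hundreds of
# trees voting inside the model refused *this* applicant.
for name, weight in zip(["income", "debt_ratio", "history"],
                        model.feature_importances_):
    print(f"{name}: {weight:.2f}")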

[Abstract image created by Jesse van Mouwerik for TechGDPR]

There is a responsibility to provide meaningful information about the logic involved in an algorithm’s decision-making process, but a comprehensive overview of every last variable influencing what a particular individual sees is simply not technically feasible. This is because complex algorithms are not built by programmers but by other algorithms, which are in turn built by more algorithms. The original algorithm, the one actually written by a programmer, is far less complex than its advanced AI-generated successors: the programmer created one simple algorithm whose main goal is to continuously absorb new information and build better versions of itself. This means that a human, even a very tech-savvy one, may never be fully aware of (or able to keep up with) what an AI has done in order to improve its capabilities. As this video created by C. G. P. Grey explains, algorithms are very tricky to understand, even for their creators.
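The sketch below illustrates that idea in miniature, under invented assumptions: a short search loop repeatedly mutates a candidate decision rule and keeps whichever version scores better on toy data. The programmer writes only the loop; the weights of the rule that survives were never typed in by a human, which is roughly why the finished system’s behaviour is so hard to narrate.

# Toy illustration of "algorithms building algorithms" (all data invented):
# the loop, not the programmer, produces the final decision rule.
import random

random.seed(1)

def make_data(n=300):
    # Invented examples: three numeric features and a yes/no label.
    data = []
    for _ in range(n):
        features = [random.gauss(0, 1) for _ in range(3)]
        label = (features[0] - features[1] + 0.5 * features[2]) > 0
        data.append((features, label))
    return data

def score(weights, data):
    # Fraction of examples a linear rule with these weights gets right.
    hits = 0
    for features, label in data:
        guess = sum(w * f for w, f in zip(weights, features)) > 0
        hits += (guess == label)
    return hits / len(data)

data = make_data()
best = [random.gauss(0, 1) for _ in range(3)]             # random starting rule
for _ in range(2000):
    candidate = [w + random.gauss(0, 0.1) for w in best]  # small random mutation
    if score(candidate, data) >= score(best, data):
        best = candidate                                  # keep the better rule

print("evolved weights:", [round(w, 2) for w in best])
print("accuracy on toy data:", score(best, data))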

AI Compliance on a Larger Scale

Although the right to an explanation is meant to protect data subjects from biased and harmful automated decision-making, the lack of certainty about how exactly this protection can be implemented creates serious obstacles to the further development of Artificial Intelligence, and therefore poses risks to the economy as a whole. Now that the GDPR has come into full force, companies must be ready to comply to the best of their abilities. They must also be open to discussing the limits of their capacity to track every last action of a decision-making AI. One likely solution comes from Article 22(2)(b) of the GDPR, which states:

“Paragraph 1 shall not apply if the decision: is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests…”
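What “suitable measures” look like in practice will vary, but one pattern often discussed, echoing Recital 71’s right to obtain human intervention, is keeping a human in the loop for decisions that carry legal effects or that the model is unsure about. The sketch below is purely illustrative: the thresholds, field names, and routing rules are invented assumptions, not anything required by the GDPR text itself.

# Hedged sketch of one possible safeguard pattern (thresholds are invented):
# uncertain or legally significant decisions go to a human reviewer.
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    score: float          # model confidence that the application is approved
    legal_effect: bool    # e.g. a credit refusal or contract termination

REVIEW_BAND = (0.35, 0.65)    # hypothetical "unsure" confidence range

def route(decision: Decision) -> str:
    # Decisions with legal effects that are not clearly positive get a human.
    if decision.legal_effect and decision.score < REVIEW_BAND[1]:
        return "human_review"
    # Borderline scores also get a human, regardless of legal effect.
    if REVIEW_BAND[0] < decision.score < REVIEW_BAND[1]:
        return "human_review"
    return "automated"

print(route(Decision("subject-42", score=0.48, legal_effect=True)))   # human_review
print(route(Decision("subject-43", score=0.91, legal_effect=False)))  # automated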

Moving forward

Looking to the future, as governments and companies learn more about their data and the best ways to secure it, amendments to the regulation may be required to account for the unique challenges of algorithms and their behavior, but this will take time. Ultimately, our economies should not be forced to take actions that seriously hinder the development of AI. In turn, our governments must find the right carrot-and-stick balance so that companies take substantial action to make citizens more aware of, and better protected from, the more dangerous aspects of automated decision-making. Lastly, it’s important to note that the right to an explanation should not be fully abolished or disregarded, as it serves an important cause for both companies and individuals. To ensure that neither party loses out, the capabilities of valuable AI, as well as the limits of key-clacking mammals’ ability to keep up with it, need to be better understood by everyone.

To learn more about data privacy and the GDPR, follow us on Twitter.

Recital 71 of the GDPR (excerpt):

The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child. […]


