Respecting Data Subject Rights in AI: A Practical Guide for Businesses

As artificial intelligence (AI) revolutionizes industries, data privacy and data protection have become major concerns for businesses and consumers alike. AI tools enable greater collection and use of personal data, making it more critical than ever for organizations to respect the rights of data subjects. Organizations must design and deploy these technologies in compliance with data protection laws, particularly the data subject rights provided by the GDPR.

Data subject rights (DSRs) are not optional check boxes. They are legally enforceable rights granted to individuals whose personal data is processed. Businesses must respect data subject rights throughout all stages of AI development, deployment, and ongoing system management. The GDPR grants individuals several rights over their personal data. Let us focus on four of these here:

  1. Right to be informed: As with other data protection frameworks, transparency is key under the GDPR. This right takes the form of a duty to inform before processing takes place. Businesses must provide information on how they collect, use, store, and share data; the purpose of processing; the legal basis; data retention periods; and who may receive the data. Privacy notices are the typical repositories for this information. They must be concise, accessible, and written in plain language.
  2. Right of access: Data subjects can request access to the exact personal data a business holds about them. Businesses must provide information about processing activities, data categories, and any third parties with whom they share the data.
  3. Right to rectification: Data subjects can ask organizations to correct inaccurate or incomplete data without undue delay. Businesses must respond promptly and update the data across systems and third-party processors where necessary.
  4. Right to object, right to be forgotten, and right to withdraw consent: These rights allow individuals to exercise control over their data. The European Data Protection Board (EDPB) published a case digest on the rights to object and to erasure. Data subjects must be able to object to the use of their data and request its erasure when it is no longer necessary, when they withdraw consent, or for purposes such as direct marketing.

Incorporate data minimization in AI systems

One of the most effective ways businesses can respect data subject rights is by adhering to the data protection principle of data minimization. This GDPR principle requires businesses to collect and process only the minimum personal data necessary to achieve their specific purpose. Avoid over-collecting data, use anonymized or synthetic data for training, and regularly review AI outputs to remove unnecessary personal information.
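
To make this concrete, here is a short sketch in Python, using purely hypothetical field names and processing purposes, of one way to enforce minimization in practice: each purpose has an allow-list of fields, and anything outside it is dropped before a record reaches the training pipeline.

```python
# Minimal data-minimization sketch: keep only the fields needed for the
# stated purpose before a record enters an AI training pipeline.
# Field names and purposes are hypothetical examples.

ALLOWED_FIELDS = {
    "model_training": {"age_band", "region", "interaction_count"},
    "support_analytics": {"ticket_category", "resolution_time"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields permitted
    for the given processing purpose."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {key: value for key, value in record.items() if key in allowed}

raw_record = {
    "email": "jane@example.com",   # direct identifier, not needed for training
    "age_band": "30-39",
    "region": "EU-West",
    "interaction_count": 42,
}

print(minimize(raw_record, "model_training"))
# {'age_band': '30-39', 'region': 'EU-West', 'interaction_count': 42}
```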

Implement transparent data practices

Transparency is central to building trust and achieving legal compliance. Always define the purpose of processing explicitly, for example the training of AI models. If businesses rely on legitimate interest as a legal basis, they must show that they gave data subjects the chance to object; otherwise, they invalidate that basis.

Clearly inform existing customers in advance when using their data to train AI models, and provide opt-out options before processing begins.

When there is no direct relationship with the individual (such as when using publicly available data or data obtained from data brokers), the GDPR requires the information to be provided within one month of collection (GDPR Article 14).

In 2023, the Italian DPA temporarily banned OpenAI’s ChatGPT, citing a lack of transparency around how it used personal data for training. The DPA later required the company to implement clear privacy notices and provide users with ways to exercise their rights.

Respect the right of access

Can data subjects request access to training data?

This becomes complicated with large language models, but under the GDPR, individuals have the right to know if and how their data is being used.

How can data subjects exercise that right?

Under the GDPR, individuals have the right to know if and how their personal data is used, including data processed by AI systems. While this is straightforward for users with an existing relationship (who can submit data subject access requests via account settings or customer support), it’s more complicated when there’s no direct connection.

In such cases, organizations must ensure proactive transparency by clearly informing people through privacy policies and AI transparency reports. Failure to uphold this right contributes to loss of trust and accountability in AI use and development.

Develop clear processes for data deletion and rectification 

Can data be corrected or deleted after it has been used to train an AI model? 

While this is difficult, companies must explore data architectures that allow personal data contributions to be traced. The GDPR (Recital 26) considers even pseudonymous data, such as randomly generated user IDs, to be personal data, since organizations can technically link it back to a person, directly or indirectly.
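
As a rough illustration of what such traceability could look like, the sketch below keeps a simple index from a pseudonymous subject ID to the training records derived from that person's data, so an erasure or rectification request can at least locate the affected records. The class, method names, and identifiers are hypothetical, not a prescribed architecture.

```python
# Illustrative provenance index: map a pseudonymous subject ID to the
# training records derived from that person's data, so deletion or
# rectification requests can locate every affected record.
from collections import defaultdict

class ProvenanceIndex:
    def __init__(self):
        self._records_by_subject = defaultdict(set)

    def register(self, subject_id: str, record_id: str) -> None:
        """Record that a training example was derived from this subject."""
        self._records_by_subject[subject_id].add(record_id)

    def records_for(self, subject_id: str) -> set:
        """Return all training records linked to the subject."""
        return set(self._records_by_subject.get(subject_id, set()))

    def forget(self, subject_id: str) -> set:
        """Remove the subject's entries and return the record IDs that
        must now be deleted from the training corpus."""
        return self._records_by_subject.pop(subject_id, set())

index = ProvenanceIndex()
index.register("pseud-7f3a", "rec-001")
index.register("pseud-7f3a", "rec-017")
print(index.forget("pseud-7f3a"))   # e.g. {'rec-001', 'rec-017'}
```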

To reduce data subject risk while improving compliance, companies could implement the following measures:

  • Data encryption: Businesses should encrypt personal data both in storage and in transit, especially when handling sensitive personal information.
  • Anonymization and pseudonymization: Where possible, anonymize or pseudonymize data before using it in AI models (see the sketch after this list). Both techniques protect personal data by reducing breach risks and limiting the impact on individuals in case of a data exposure.
  • Access control: Implement strict access controls and monitoring to ensure only authorized personnel can access personal data. This prevents unauthorized exposure of sensitive information.
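
A minimal pseudonymization sketch, assuming a secret salt stored separately from the dataset: direct identifiers are replaced with keyed hashes, so records stay linkable for processing without exposing the raw identifier. As noted above, such data remains personal data under GDPR Recital 26, because the organization holding the salt can still re-link it.

```python
# Minimal pseudonymization sketch: replace a direct identifier with a
# salted (keyed) hash. The salt must be stored separately and
# access-controlled; the result is still personal data because
# re-linking remains possible for the organization holding the salt.
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-secret-kept-outside-the-dataset"  # hypothetical

def pseudonymize(identifier: str) -> str:
    """Return a keyed hash (HMAC-SHA256) of the identifier."""
    return hmac.new(SECRET_SALT, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "region": "EU-West"}
record["email"] = pseudonymize(record["email"])
print(record["email"][:16], "...")  # stable token instead of the raw email
```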

By embedding these practices into AI development pipelines, organizations can take meaningful steps toward compliance, trust-building, and ethical AI deployment.

Ensure security and privacy by design

Organizations should build user trust and meet regulations by embedding privacy from the start, not treating it as an afterthought. This is the core of the privacy by design principle under the GDPR.

Key steps include:

  • Promoting user choice and control: Provide clear opt-out options before processing data, whether in email campaigns, mobile app pop-ups, or web trackers. Empower users with privacy dashboards that let them view, manage, and delete their personal data at any time (see the opt-out sketch after this list).
  • Secure data handling: Businesses must encrypt personal data used in AI training, both in transit and at rest. Implement strict access control mechanisms to ensure that only authorized personnel can interact with sensitive data.
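
As a simple illustration of honoring opt-outs before processing begins, the hypothetical sketch below filters training candidates against a registry of users who have objected; the registry and field names are placeholders rather than a real API.

```python
# Illustrative opt-out check: exclude records of users who objected
# before any training data set is assembled. The opt-out registry and
# field names are hypothetical placeholders.

def build_training_set(records: list, opted_out_ids: set) -> list:
    """Keep only records whose subject has not opted out of AI training."""
    return [r for r in records if r["subject_id"] not in opted_out_ids]

records = [
    {"subject_id": "pseud-7f3a", "age_band": "30-39"},
    {"subject_id": "pseud-91bc", "age_band": "50-59"},
]
opted_out = {"pseud-91bc"}  # e.g. collected via a privacy dashboard

print(build_training_set(records, opted_out))
# [{'subject_id': 'pseud-7f3a', 'age_band': '30-39'}]
```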

Embedding privacy and security into system architecture from the outset not only ensures compliance but also builds trust and supports ethical AI deployment.

Maintain ongoing communication and feedback loops

Transparency shouldn’t stop at data collection. When introducing AI processing, update your privacy notices to reflect the new processing activities, as required by the GDPR. Use layered notices to highlight AI-specific practices such as model training, profiling, or automated decision-making. Importantly, inform users before processing, not after. True consent means giving people a real choice. Build feedback loops, too: user input is essential for improving fairness, spotting issues, and building trust in your AI systems.

Conclusion

As AI continues to shape modern business, respecting data subject rights is not just a legal obligation; it’s a foundation for responsible innovation. By embedding privacy by design, adopting transparent data practices, and enabling user control, organizations can align AI development with GDPR principles and foster long-term trust. Data protection isn’t a compliance checkbox; it’s a strategic imperative for ethical and sustainable AI.

Feel free to reach out to us with any questions about your AI compliance needs.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
