Transparency and information obligations under the GDPR

The European Data Protection Board (EDPB) announced that its 2026 Coordinated Enforcement Action will focus on transparency and information obligations. Articles 12, 13, and 14 of the GDPR require that individuals be informed when their personal data is processed, ensuring transparency and enabling greater control over personal information. Participating data protection authorities will join this action voluntarily in the coming weeks, with enforcement activities scheduled to launch during 2026.
Experian fined over credit checks
As a recent illustration of these transparency obligations, the Dutch data protection authority (AP) last week imposed a 2.7 million euro fine on Experian Nederland. Experian provided credit ratings on individuals to its customers until 2025, collecting data on factors such as negative payment behavior, outstanding debts, and bankruptcies. The AP found that Experian violated the GDPR by improperly using personal data and by failing to adequately inform individuals about it.
Experian created credit reports on individuals at the request of clients such as telecom companies, online retailers, and landlords. People started contacting the AP after they could no longer pay installments or because they suddenly had to pay a high deposit when switching energy suppliers. Only afterward did it become clear that this could be due to Experian’s credit scores. Because people weren’t aware of the credit check, they couldn’t verify in time whether the information was accurate. Experian collected data about people from various sources, both public and private, and failed to adequately explain why this data collection was necessary.
Experian acknowledged violating the law and will not appeal the fine. It has ceased operations in the Netherlands and will delete the database containing all personal data.
More legal updates
DMA and GDPR: The EDPB and the European Commission endorsed joint guidelines on the interplay between the Digital Markets Act (DMA) and the GDPR. The DMA and the GDPR both protect individuals in the digital landscape, but their goals are complementary as they address interconnected challenges: individual rights and privacy in the case of the GDPR and fairness and contestability of digital markets under the DMA. However, several activities regulated by the DMA entail the processing of personal data by gatekeepers and refer to definitions and concepts included in the GDPR (eg, on how to lawfully combine or cross-use personal data in core platform services).
Italy’s new AI law: On 10 October, the Italian law on Provisions and Delegation to Government on Artificial Intelligence, including an age verification requirement, entered into force. It is the first comprehensive legislation adopted by an individual EU member state on research, testing, development, adoption, and application of AI systems and models, with a human-centric approach. The legislation received its final parliamentary approval after a year of debate; the government has appointed the Agency for Digital Italy and the National Cybersecurity Agency to enforce it. The law even provides for prison terms for those who misuse the technology to cause harm, for example by generating malicious deepfakes.
US Bulk Data: The US Department of Justice’s Sensitive Data Bulk Transfer Rule is in effect as of October 6, JD Supra law blog reports. This means if your organisation transfers US sensitive data (from demographic data to cookie data) that hits the bulk thresholds, you need to develop and implement a compliance program, either as a stand-alone program or as part of an existing compliance program (with due diligence and audit procedures).
Electronic patient files

In Germany, the electronic patient record (ePA) for everyone has been tested in model regions since January 2025. Since 29 April, it has been available for use nationwide by practices, hospitals, and pharmacies, among others. As of 1 October, it is generally mandatory for practices and other medical facilities to fill out the records. At the same time, information (eg, on ongoing or further treatment) can only be included in the ePA if the insured person has not fundamentally objected to this with their health insurance provider.
Finally, special consent requirements apply to information from genetic testing for diagnostic purposes, as well as to the records of children and adolescents.
California privacy updates
At the end of September, California finalised regulations to strengthen consumer privacy that go into effect on 1 January 2026. However, there is additional time for businesses to comply with some of the new requirements, namely cybersecurity audits, risk assessments, and requirements for automated decision-making technologies, as well as updates to existing CCPA regulations. The final regulations and supporting materials will be posted on the regulator’s website as soon as they are processed.
ISO/IEC 27701
On 14 October, ISO released ISO/IEC 27701:2025, the latest version of the global Privacy Information Management System (PIMS) standard. For the first time, ISO/IEC 27701 is now a standalone standard, no longer just an extension of ISO/IEC 27001. The standard is designed for personally identifiable information (PII) controllers and processors, who are responsible and accountable for processing PII, to:
- Strengthen data privacy and protection capabilities
- Help demonstrate compliance with global privacy regulations such as the GDPR
- Support trust-building with partners, clients and regulators
- Align with existing ISO/IEC 27001 systems to streamline implementation
- Facilitate accountability and evidence-based privacy management
Updated cookie guidance
The Swiss FDPIC published an updated version of its cookie guidelines, which contains specific clarifications and additions intended to improve the comprehensibility of the text and clarify practical issues. In particular, the FDPIC found it useful to clarify why the use of cookies for the purpose of delivering personalised advertising may require the consent of the data subjects. This is the case when the website operator, by integrating third-party cookies or similar technologies, gives third parties paid access to visitors’ personal information, and those third parties are embedded across several websites. Since the latter are thereby enabled to carry out high-risk profiling, this constitutes a particularly intensive intrusion into the privacy of the data subjects.
AI systems development guidance

In Germany, the Data Protection Conference (DSK) publishes guidance on AI systems with Retrieval Augmented Generation (RAG). It provides legal and technical information on how to harness the potential of such AI systems while simultaneously reducing the risks for those affected. RAG is an AI technology that augments large language models with targeted access to company or government agency knowledge sources to deliver context-specific answers.
Typical application examples include in-house chatbots that access current business data and scientific assistance systems that leverage research databases.
Thus, RAG use must be designed in compliance with data protection by design and by default. Controllers must ensure transparency, purpose limitation, and the protection of data subjects’ rights at all times. Controllers wishing to implement such RAG systems must conduct data protection assessments of the various processing operations on a case-by-case basis and always keep their technical and organisational measures up to date.
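To make the retrieval step concrete, here is a minimal, hypothetical sketch of how a RAG system grounds an answer in an internal knowledge source. The documents, scoring function, and prompt format are illustrative assumptions, not any vendor's implementation; production systems would use vector embeddings and an actual LLM call, and would apply the data protection safeguards described above.

```python
# Hypothetical RAG retrieval sketch: rank internal documents against a
# query, then assemble a context-bearing prompt for a language model.
from collections import Counter

# Illustrative internal knowledge source (e.g. company policy snippets).
DOCUMENTS = [
    "Travel expenses must be filed within 30 days of the trip.",
    "The IT helpdesk is reachable at extension 4242 on weekdays.",
    "Personal data in support tickets is deleted after 12 months.",
]

def tokenize(text):
    return [t.strip(".,?!").lower() for t in text.split()]

def score(query, doc):
    # Simple term-overlap score; real systems use vector embeddings.
    q, d = Counter(tokenize(query)), Counter(tokenize(doc))
    return sum((q & d).values())

def retrieve(query, k=1):
    # Return the k most relevant documents for the query.
    ranked = sorted(DOCUMENTS, key=lambda d: score(query, d), reverse=True)
    return ranked[:k]

def build_prompt(query):
    # Only the retrieved context leaves the knowledge base -- a point
    # where purpose limitation and data minimisation can be enforced.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("When is personal data in tickets deleted?")
```

Because the controller decides which documents are indexed and what context is injected into each prompt, the retrieval layer is a natural place to implement the case-by-case assessments and technical measures the DSK guidance calls for.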
More from supervisory authorities
Union membership: The Latvian data protection authority DVI explains whether an employer needs to know about a worker’s union membership. The answer is that the employer may not request such information from an employee at will. The most appropriate justification for processing such data is where the law establishes such rights for the employer; alternatively, the employer may obtain the employee’s consent, or learn the information because the employee has disclosed it themselves.
Such a question should not be asked during a job interview, when drawing up an employment contract or during an employment relationship, as long as the employer does not intend to terminate the employment relationship with the employee in question. If an employee is to be dismissed, asking about union membership is important because union members may have special protections, such as the need to obtain the union’s consent to termination.
Commercial robocalls: The DVI also explains what a company should consider if it wants to use commercial robocalls. The regulatory framework stipulates that the use of automated calling systems, which operate without human intervention for the purpose of sending commercial communications, is permitted only if the recipient of the service has given their prior free and explicit consent. Thus, sending commercial communications in this way is lawful only if the person concerned has previously (before the call is made) given their free and explicit consent to receive calls from automated calling devices.
Google Analytics fine confirmed by court

In 2023, Sweden’s data protection authority IMY decided after an inspection that the mobile network provider Tele2 must pay a penalty fee of SEK 12 million because it violated the GDPR. The Court of Appeal has now ruled in favor of IMY. The violation concerned the fact that the company, in connection with the use of Google Analytics, transferred personal data to the US without adequate protection.
IMY assessed that the data transferred to the US via Google’s statistical tool was personal data, since the data transferred could be linked with other data that Google had access to and thus enabled Google to distinguish and identify specific persons.
Minors’ data in the EU
On 16 October, the European Parliament’s Committee on the Internal Market and Consumer Protection adopted its report on the Protection of minors online. The report calls for an EU-wide digital minimum age of 16 for accessing social media, video-sharing platforms and AI companions without parental consent, and a minimum age of 13 for any social media use. It urges the European Commission to strengthen enforcement of the Digital Services Act and to swiftly adopt guidelines on measures ensuring a high level of privacy, safety, and security for minors. The Parliament is expected to vote on the final recommendations during the November plenary session.
Microsoft’s use of children’s data
The Austrian data protection authority ruled on a complaint regarding Microsoft’s handling of children’s data under the GDPR. It found that the Federal High School and the Federal Ministry for Education, acting as joint controllers, violated the complainant’s right of access and right to be informed. They failed to provide complete and timely information on the data processed through Microsoft 365 Education (content, log, and cookie data), including cookies and third-party data transfers. Microsoft was also found to have infringed the complainant’s right of access by not providing complete information on cookie data, its own processing purposes, and transfers to third parties such as LinkedIn, OpenAI, and Xandr, digitalpolicyalert.org reports.
Doping scandals and personal data

A CJEU Advocate General has delivered an opinion on the publication of the names of professional athletes who have infringed anti-doping rules. In the underlying Austrian case, four athletes concerned submit that such publication contravenes the GDPR. The publication is provided for by law. It aims, first, to deter athletes from committing infringements of the anti-doping rules and thus to prevent doping in sport.
Second, it aims to prevent circumvention of the anti-doping rules by informing all persons likely to sponsor or engage the athlete in question that he or she is suspended. In that context, the Austrian court asked the Court of Justice to interpret the GDPR. The Advocate General’s opinion is that the practice at issue is contrary to EU law: the principle of proportionality requires account to be taken of the specific circumstances of each individual case. In the Advocate General’s view, publishing the athlete’s name only to the relevant bodies and sports federations, accompanied, for example, by pseudonymised publication on the internet, would make it possible to achieve both those objectives.
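One common way to implement this kind of pseudonymised publication is a keyed hash: the public list carries only an opaque pseudonym, while the authority holding the secret key can still confirm, for an authorised federation, whether a given athlete appears on the list. The sketch below is purely illustrative (the key, names, and pseudonym length are assumptions, not anything from the case or from any real anti-doping system).

```python
# Hypothetical pseudonymisation sketch using a keyed HMAC, so the
# pseudonym cannot be reversed by simply hashing known athlete names.
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-anti-doping-authority"  # illustrative

def pseudonymise(name: str) -> str:
    # Derive a short, stable pseudonym for publication online.
    digest = hmac.new(SECRET_KEY, name.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def matches(name: str, pseudonym: str) -> bool:
    # Authorised re-identification: recompute and compare in
    # constant time, without ever publishing the name itself.
    return hmac.compare_digest(pseudonymise(name), pseudonym)

public_entry = pseudonymise("Jane Athlete")  # what goes on the internet
```

A plain (unkeyed) hash would not suffice here, because athlete names are drawn from a small, guessable set; keying the hash keeps re-identification in the hands of the bodies the Advocate General says may legitimately be informed.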
In other news
Clearview AI fine confirmed: On 7 October, the UK Upper Tribunal confirmed that Clearview AI’s facial recognition business is subject to the EU and UK GDPR. Clearview had argued that its scraping of billions of online images to produce facial recognition services for sale to foreign law enforcement agencies placed it outside the GDPR’s material and territorial scope. The tribunal rejected the claim and made it clear that Clearview’s activities involve ‘behavioural monitoring’. Clearview sought a narrow interpretation of the GDPR, but the tribunal rightly adopted a broader one that clearly encompasses automated processing.
This decision follows the Information Commissioner and Privacy International’s appeal against a 2023 First Tier Tribunal ruling that had quashed Clearview’s 7,552,800 pound fine. Clearview trawls through sites like Instagram, YouTube and Facebook, as well as personal blogs and professional websites. It uses facial recognition technology to extract the unique features of people’s faces, effectively building a gigantic biometrics database. Clearview has previously been found to be in breach of the GDPR in France, Italy, Austria and Greece, resulting in fines totalling 65,200,000 euros.
Meta AI bots: The Guardian reports that parents will be able to block their children’s interactions with Meta’s AI character chatbots. The social media company is adding new safeguards to its “teen accounts”, which are a default setting for under-18 users, by letting parents turn off their children’s chats with AI characters. These chatbots, which are created by users, are available on Facebook, Instagram and the Meta AI app. Parents will also be able to block specific AI characters and get “insights” into the topics their children are chatting about with AI. Meta said the changes would be rolled out early next year, initially to the US, UK, Canada and Australia.
In case you missed it

AI for everyday tasks: As more and more companies are using their users’ personal data to train AI models, the French data protection regulator CNIL explains how to object to this on the main platforms. The practical cases include: Google – Gemini, Meta – Meta AI, OpenAI – ChatGPT, Microsoft – Copilot, X – Grok, DeepSeek, Mistral – Le Chat, Anthropic – Claude, and LinkedIn.
‘Self-aware’ AI: Guernsey’s data protection authority meanwhile publishes its observations on how AI has formed the basis of a number of companion apps and the creation of numerous digital friends and partners. It is important for all of us to remember, personally and professionally, that such products are not ‘living beings’, as more and more news stories emerge of tragic outcomes in which a digital companion played a part. Individuals have the right not to be subject to automated decision-making, which is at the core of such products, without appropriate safeguards being in place. And organisations acting as data controllers remain responsible for any decisions the AI makes or advice it provides to people.