EU Biotech Act
The EDPB and EDPS have adopted a Joint Opinion on the European Commission’s Proposal for a European Biotech Act. The proposal aims to strengthen Europe’s biotechnology and biomanufacturing sectors, including by streamlining the regulatory framework and updating the rules for clinical trials (in the form of proposed amendments to the Clinical Trials Regulation). The privacy regulators welcome the aim of establishing a single legal basis for the processing of personal data by sponsors and investigators in the context of clinical studies. The opinion makes several recommendations to ensure that the proposed simplifications do not lower the level of protection for clinical trial participants:
- Clarifying the controller roles (including joint controllership) of the actors involved in funding and conducting clinical trials
- Limiting data retention for the various personal data collected throughout the clinical trial (except master file storage requirements)
- Further processing for other clinical trials and scientific research
- Coherence with the AI Act
- Appropriate technical and organisational measures (the use of pseudonymisation)
- Regulatory sandboxes
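On the pseudonymisation point, the idea is that identifiers can be replaced with values that are only reversible by whoever holds a separately stored key. A minimal sketch of one common technique (keyed HMAC pseudonyms, not a method the opinion itself prescribes; the key value is a placeholder):

```python
import hmac
import hashlib

# Illustrative only: in practice the key must be generated securely and
# stored separately from the pseudonymised dataset, so that re-identification
# is possible only for whoever controls the key.
SECRET_KEY = b"replace-with-a-key-kept-outside-the-dataset"

def pseudonymise(participant_id: str, key: bytes = SECRET_KEY) -> str:
    """Derive a stable pseudonym from a participant identifier via HMAC-SHA256."""
    return hmac.new(key, participant_id.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym under the same key, so
# records can still be linked across trial datasets without storing names.
```

Because the mapping is deterministic per key, linkage across datasets survives while direct identifiers do not appear in the data itself.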
Stay up to date! Sign up to receive our fortnightly digest via email.
Main developments
Transparency enforcement action: On 18 March, the EDPB launched its Coordinated Enforcement Framework (CEF) action for 2026. Following a year-long coordinated action on the right to erasure in 2025, the CEF’s focus this year will shift to compliance with the obligations of transparency and information under the GDPR. The GDPR ensures that individuals are informed when their data is being processed (under Art. 12, 13 and 14). This right to be informed is a core element of transparency and ensures that individuals have more control over their data. Participating authorities will soon contact controllers from different sectors across Europe.
European Blockchain sandbox: The European Commission has published the results of the third edition of the ‘European Blockchain Sandbox’, an initiative in which European data protection authorities participate alongside other regulators. The selected projects cover all EU/EEA regions and represent a wide range of sectors and issues. Now that the stage of confidential regulatory dialogues has been completed, a report of good practices will follow, as in the first two editions.
Other legal updates

Data Brokers EU study: The Belgian data protection agency and the EDPB commissioned a study to gain greater insights into the ecosystem of data brokerage. In particular, several types of data brokers and providers were identified: personal data brokers, AI platforms integrating personal data, business data brokers, data pools and cleanrooms, data marketplaces, self-generated data providers, data brokers with user control, and aggregated data providers with re-identification risk.
The study shows that the data broker and provider market in Belgium is highly diverse, with varying levels of risk associated with the use of personal data. More than 40 data brokers and providers active in Belgium were identified in the study.
Big Tech compliance with the EU DMA: The gatekeepers designated in 2023 (Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft) have submitted reports on their updated compliance measures under the Digital Markets Act (DMA), outlining the changes they have implemented and the measures they have taken during the past year. The gatekeepers also submitted to the Commission updated, independently audited reports on consumer profiling techniques. The public versions of the updated compliance reports will shortly be available.
US privacy laws development: DLA Piper has published a state-by-state list of recently introduced comprehensive privacy bills (Alabama, Arizona, Iowa, Illinois and more). The bills reflect a continued trend toward expanding individual privacy rights and creating new compliance obligations for businesses that collect and process personal data, covering consent requirements, data minimisation, data brokers, children’s data, geolocation, biometrics and other types of sensitive data.
More from supervisory authorities

Age assurance guide: The Australian Information Commissioner (OAIC) has published new guidance on age assurance technologies to assist entities in ensuring Australians’ privacy is protected when they encounter age checks online. Three months on from the commencement of the Social Media Minimum Age (SMMA) scheme, the OAIC has observed significant growth in age checks taking place in Australia to allow people access to other online services. The guidance calls on entities to:
- establish whether age checks are needed and take a privacy-by-design approach
- undertake due diligence to ensure the security of the entity’s age assurance ecosystem
- assess risk and choose age-assurance methods that are proportionate and data minimising
- ensure clear consent requests are used for the collection of sensitive information (such as biometric templates) or for secondary use or disclosure
- be transparent in privacy notices and ensure meaningful support is available to individuals through simple, easy-to-access complaints processes
IT security in the health sector: The IT security of software products in the healthcare sector has room for improvement. This is the conclusion recently reached by Germany’s Federal Office for Information Security (BSI) after testing the standard configurations of various healthcare software products. As part of the project, four exemplary practice management systems (PMS) were examined for vulnerabilities using penetration tests. The findings included a lack of encryption for data transmission and the use of outdated, and therefore insecure, encryption algorithms.
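A hedged sketch of the kind of configuration check the BSI findings imply: flagging deployments that still allow outdated transport-encryption settings. The protocol and cipher names below are illustrative examples, not the BSI’s test criteria or a complete deny-list:

```python
# Commonly deprecated TLS protocol versions and weak-cipher markers
# (illustrative; real audits should follow current BSI/standards guidance).
DEPRECATED_PROTOCOLS = {"SSLv3", "TLSv1.0", "TLSv1.1"}
WEAK_CIPHER_MARKERS = ("RC4", "3DES", "DES", "MD5", "NULL", "EXPORT")

def audit_tls_config(protocols: list[str], ciphers: list[str]) -> list[str]:
    """Return findings for a given transport-security configuration."""
    findings = []
    for proto in protocols:
        if proto in DEPRECATED_PROTOCOLS:
            findings.append(f"deprecated protocol enabled: {proto}")
    for cipher in ciphers:
        if any(marker in cipher for marker in WEAK_CIPHER_MARKERS):
            findings.append(f"weak cipher suite enabled: {cipher}")
    return findings
```

Both findings the BSI reports, missing encryption and outdated algorithms, surface as configuration facts that a check like this can catch before deployment.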
AI systems monitoring criteria
AI outputs are typically non-deterministic, meaning an AI system may exhibit a range of behaviours under the same input conditions. To that end, the US NIST has published a much-needed analysis of post-deployment AI system monitoring aimed at improving reliability. The study introduces six monitoring categories to support a more organised discussion:
- Functionality: Does the system continue to work as intended?
- Operational: Does the system maintain consistent service across its infrastructure?
- Human Factors: Is the system transparent to humans and of high quality?
- Security: Is the system secure against attacks and misuse?
- Compliance: Does the system adhere to relevant regulations and directives?
- Large-Scale Impacts: Does the system promote human flourishing?
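In practice, the six categories can serve as an organising frame for concrete post-deployment checks. A minimal sketch, where every individual check and metric name is a hypothetical placeholder rather than anything NIST specifies:

```python
# Hypothetical checks, one per NIST monitoring category; thresholds and
# metric names are illustrative assumptions, not NIST requirements.
MONITORING_CHECKS = {
    "functionality": lambda m: m["accuracy"] >= 0.90,        # works as intended?
    "operational":   lambda m: m["p95_latency_ms"] <= 500,   # consistent service?
    "human_factors": lambda m: m["explanations_shown"],      # transparent to humans?
    "security":      lambda m: m["blocked_injections"] == m["attempted_injections"],
    "compliance":    lambda m: m["audit_log_enabled"],       # rules followed?
    "large_scale":   lambda m: m["harm_reports"] == 0,       # broader impacts?
}

def run_monitoring(metrics: dict) -> dict[str, bool]:
    """Evaluate each category against the latest post-deployment metrics."""
    return {category: bool(check(metrics)) for category, check in MONITORING_CHECKS.items()}
```

The value of the taxonomy is that a failing category (here, e.g. `security`) points the operator to a distinct remediation owner rather than one undifferentiated alert stream.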
Web filtering proxy
The French privacy regulator CNIL promotes cybersecurity solutions that comply with the GDPR, both in their use and in their design. To this end, it has published a recommendation to support users and providers of web filtering proxies: devices or services that secure internet access by filtering web content for security and compliance purposes. Web filters can help meet the data security obligation (Art. 32 GDPR), but they themselves rely on data processing that must also comply with the GDPR. The CNIL’s recommendations aim in particular to inform data controllers:
- on compliance with the principles of the GDPR in the use of a web filtering proxy, including the determination of a legal basis, the minimisation of the data collected, the retention periods and the respect of the exercise of rights by the data subjects;
- on the points of attention relating to the use of HTTPS decryption and the implementation of a list of exceptions;
- on the deployment modalities;
- on the security of the access filtering and logging solution.
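Two of those points of attention, an exception list that spares sensitive domains from HTTPS decryption, and logging limited to what the security purpose needs, can be sketched as follows. The domain names and log fields are illustrative assumptions, not CNIL-specified values:

```python
from datetime import datetime, timezone

# Illustrative exception list: categories of sites (banking, health, trade
# union) that CNIL-style guidance suggests excluding from TLS interception.
DECRYPTION_EXCEPTIONS = {"bank.example", "health.example", "union.example"}
BLOCKLIST = {"malware.example"}

def filter_request(domain: str) -> dict:
    """Decide whether to block, inspect, or tunnel a request, with minimal logging."""
    if domain in BLOCKLIST:
        decision = "block"
    elif domain in DECRYPTION_EXCEPTIONS:
        decision = "tunnel"   # pass through without TLS interception
    else:
        decision = "inspect"  # decrypt and scan content
    # Data minimisation (Art. 5(1)(c) GDPR): log only what security
    # monitoring needs -- no user identity, no full URL path, no body.
    return {"domain": domain, "decision": decision,
            "timestamp": datetime.now(timezone.utc).isoformat()}
```

The design choice worth noting is that minimisation is enforced at the point the log entry is built, not retrofitted by deleting fields later.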
In other news

Account deletion and purchase history: The Privacy Commissioner of Canada has issued its findings in an investigation into complaints against Loblaw Companies (the biggest Canadian food retailer) related to the PC Optimum Loyalty Program. Several complainants alleged that Loblaw did not delete their PC Optimum accounts after they requested it, and/or that it had not responded to inquiries about their deletion requests.
The investigation found that, while Loblaw had mechanisms in place for customers to request account deletion or to raise privacy concerns, it took an unreasonable amount of time to address the requests and also failed to respond to some privacy-related inquiries. The investigation further found that Loblaw retained PC Optimum members’ purchase history after their accounts had been deleted, and that removing personal identifiers such as names and email addresses was an insufficient safeguard.
Age assurance technology fine: The Spanish AEPD fined Yoti 950,000 euros following an investigation into its role as an intermediary in identity and age-verification processes. The fine includes 500,000 euros for processing special category biometric data without a valid exemption under Article 9 of the GDPR, 200,000 euros for obtaining consent for research and analytics through pre-ticked boxes in breach of Article 7, and 250,000 euros for retaining data, including biometric and geolocation information, for longer than necessary in violation of the storage limitation principle under Article 5(1).
The AEPD required Yoti to demonstrate within six months that its processing of biometric data, consent mechanisms, and data retention practices comply with the GDPR, digitalpolicyalert.org reports.
More enforcement decisions
Amazon Italy ban: The Italian Data Protection Authority Garante ordered Amazon Italia Logistica to immediately stop processing the personal data of more than 1,800 employees at its Passo Corese (RI) site. The ban concerns workers’ sensitive information, which Amazon systematically collected and stored throughout their employment and retained for up to ten years after they left the company, using an internal platform linked to the attendance tracking system and accessible to numerous managers.
The information was recorded on the platform following interviews conducted when employees returned from periods of absence. It included details about medical conditions such as Crohn’s disease, herniated discs, and pacemaker implants, as well as participation in strikes and trade union activities. In some cases, notes referred to alleged misuse of leave. Personal and family matters were also documented, including references to a terminally ill parent, a sibling with brain cancer and marital separations, according to the Maltese data protection agency analysis.
Intesa Sanpaolo fine: Garante also fined Intesa Sanpaolo 17.628 million euros for unlawful personal data processing. Intesa Sanpaolo had profiled approximately 2.4 million customers identified as “predominantly digital customers” through automated processing of personal data, including age, use of digital channels, absence of investment products, and financial balances below 100,000 euros. This profiling lacked a valid legal basis. The regulator determined that informed consent under Article 6(1) of the GDPR was the only applicable legal basis, and that such consent had not been obtained, digitalpolicyalert.org sums up.
Foreign service providers and the choice of jurisdiction
A DLA Piper analysis looks at a case in California demonstrating the expanding reach of personal jurisdiction over foreign companies operating online platforms. It relates to an appellate court’s decision to reverse a district court’s dismissal of a class action against an Estonian software company for lack of personal jurisdiction. The plaintiffs brought a class action in the Northern District of California against 3Commas Technologies, an Estonian private limited company that provides software services for cryptocurrency trading, based on an alleged data breach.
In the above case, the foreign company collected IP addresses, billing addresses, and location data that could reveal users as California residents, contacted them, and interacted with them for cryptocurrency trades. The appellate court also held that including specific references to California privacy rights can be construed as evidence of intentionally targeting California consumers. Finally, choice-of-law and forum-selection clauses in vendor contracts may also be used as evidence.
And Finally

Data altruism: The French CNIL has also published FAQs on recognised data altruism organisations in the EU. The Data Governance Act (DGA) creates an EU-recognised Data Altruism Organisation (DAO) status. These organisations voluntarily share data for general-interest, non-profit purposes. In particular, Article 18 of the DGA sets out the general conditions for registration: the organisation
- conducts data altruism activities
- is a legal person pursuing objectives of general interest under national law
- operates on a not-for-profit basis and is legally independent of any entity operating for profit
- conducts its data altruism activities through a structure that is functionally separate from its other activities
- complies with a set of common European rules, known as the ‘compendium of rules’, in a transparent, secure and interoperable manner
AI agents and data security: The KrebsOnSecurity blog looks at AI-based assistants: autonomous programs that have access to a user’s computer, files and online services and can automate virtually any task. Their popularity is growing, particularly among developers and IT workers. These powerful new tools are rapidly shifting organisations’ security priorities while blurring the lines between data and code, and between trusted co-worker and insider threat. The article describes various vulnerabilities, including a case where a misconfigured AI agent web interface exposed to the internet allowed external parties to read the bot’s complete configuration file, including every credential, from API keys and bot tokens to signing keys. Another experiment showed how easy it is to mount a successful supply chain attack through a public repository of downloadable “skills” that let AI agents integrate with and control other applications.
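The exposed-configuration failure mode above is mechanical: if the interface serves the config file verbatim, every embedded secret leaks. A minimal sketch of a pre-publication scan for obvious credential patterns; the regexes below are illustrative assumptions, not an exhaustive secret-detection rule set:

```python
import re

# Illustrative credential patterns: key/token/secret assignments, plus a
# common "sk-..." style API-key prefix. Real scanners use far larger rule sets.
SECRET_PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|token|secret|signing[_-]?key)\s*[:=]\s*\S+"),
    re.compile(r"sk-[A-Za-z0-9]{20,}"),
]

def find_exposed_secrets(config_text: str) -> list[str]:
    """Return the lines of a config that look like they contain credentials."""
    return [line.strip() for line in config_text.splitlines()
            if any(p.search(line) for p in SECRET_PATTERNS)]
```

A scan like this catches only the obvious cases; the more robust fix is not serving the configuration endpoint to the internet at all.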