
Data protection digest 19 Oct – 2 Nov 2025: New AI Act and GDPR study & personal data stored on Blockchain

Blockchain applications and data protection    

The Bank of England, in its October statement, confirmed that many firms in the financial sector are already using AI, exploring opportunities to use quantum computing, and piloting DLT applications. One example is stablecoins built on DLT networks, which are already being used at scale by individuals and businesses worldwide for faster, cheaper cross-border payments and automated financial contracting. However, the Bank admits that key barriers to scaling up blockchain solutions are regulatory frameworks that are not entirely suited to digital assets and cross-border initiatives. Blockchain’s inherent characteristics also present unique challenges for GDPR compliance.

When it comes to handling personal data, blockchains present a significant challenge to respecting data subject rights. Their immutability, for example, contradicts the fundamental “Right to be Forgotten”. The global distribution of blockchain nodes also complicates regulatory supervision. Conducting a Data Protection Impact Assessment (DPIA) is not just a legal requirement for high-risk blockchain-based personal data processing, but an important step towards responsible innovation. To help organisations meet these requirements, TechGDPR has created a free downloadable Blockchain DPIA Template, which guides users through all required areas of GDPR compliance:

  • Description of the processing operations
  • Legal basis and necessity assessment
  • Identification of risks
  • Safeguards and technical measures
  • Implementation of privacy by design principles
  • Data subject rights and governance structures

The pre-designed template includes ready-to-use sections, prompts, and examples, significantly saving time and ensuring that no critical aspect of your DPIA is overlooked.
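
The template itself is a document rather than code, but the immutability problem noted above has a widely discussed technical mitigation: keep personal data off-chain and commit only a salted hash to the ledger, so that deleting the off-chain record effectively honours an erasure request. The following minimal Python sketch illustrates the pattern; the names and the in-memory store are hypothetical, and whether a residual salted hash is sufficiently anonymised remains a debated regulatory question.

```python
import hashlib
import os

# Hypothetical erasable off-chain store; a conventional database in practice
off_chain = {}

def record(user_id: str, personal_data: bytes) -> str:
    """Keep personal data off-chain; commit only a salted hash on-chain."""
    salt = os.urandom(32)                        # per-record secret salt
    off_chain[user_id] = (salt, personal_data)
    # This digest is what would be written to the immutable ledger
    return hashlib.sha256(salt + personal_data).hexdigest()

def erase(user_id: str) -> None:
    """Erasure request: delete the data and salt; the on-chain digest alone
    can no longer be linked back to the individual."""
    off_chain.pop(user_id, None)

commitment = record("user-42", b"name=Jane Doe;email=jane@example.com")
erase("user-42")  # the ledger still holds `commitment`, now practically unlinkable
```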

Stay up to date! Sign up to receive our fortnightly digest via email.

UK Adequacy

The European Data Protection Board (EDPB) has issued its opinion on the adequate protection of personal data by the United Kingdom. In July 2025, the European Commission started the process towards the adoption of its draft implementing decision on the adequate protection of personal data by the UK, which extends the validity of certain parts of the previous adequacy decision until December 2031. In particular, the EDPB asks the Commission to further clarify recent changes in UK post-Brexit legislation regarding:

  • removing the direct application of the principles of EU law, including the right to privacy and data protection
  • new powers to introduce changes via secondary regulations, which require less Parliamentary scrutiny (e.g. on international transfers and automated decision-making)
  • changes to the rules governing third-country transfers
  • processing exemptions for law enforcement 
  • restructuring of the Information Commissioner’s Office 
  • the safeguards provided by the EU-US Umbrella Agreement, which are incorporated into the UK-US Cloud Act Agreement
  • the role of encryption, which remains essential for ensuring the security and confidentiality of personal data and electronic communications.

AI Act and the GDPR

The European Parliament has published a study on the Interplay between the AI Act and the EU digital legislative framework, including the GDPR. In particular, the AI Act introduces requirements for fundamental rights impact assessments (FRIAs) in cases that often also trigger data protection impact assessments (DPIAs) under the GDPR. These instruments differ in scope, supervision, and procedural requirements, creating duplication and uncertainty. Transparency and logging obligations are also redundant across both regimes. Moreover, there is ambiguity over how data controllers and AI providers should manage rights of access, rectification, and erasure when personal data becomes embedded in complex AI models. 

In AI contexts, the report states, the GDPR’s “legitimate interests” legal basis is widely regarded as the most relevant and most frequently invoked. Consent, meanwhile, is often impracticable, and the contractual and legal-obligation bases rarely map neatly onto AI training or deployment scenarios. Finally, the AI Act introduces additional governance layers: the AI Office and the European AI Board at the EU level, alongside the national GDPR supervisory authorities with respect to data protection issues, producing a potentially overlapping set of competent supervisory bodies.

Legal updates

Draghi report: The Future of Privacy Forum takes a closer look at the report on European competitiveness issued in 2024 by former Italian Prime Minister Mario Draghi, which calls for simplification of the GDPR and criticises “heavy gold-plating” by Member States in GDPR implementation. The Commission is now set to announce a Digital Omnibus package with proposals to quickly reduce the burden on businesses. However, changes to the GDPR’s fundamental principles could bring any reform into conflict with the TFEU and the Charter and lead to action before the Court of Justice.

GDPR enforcement: On 21 October, the European Parliament passed the regulation on additional procedural rules regarding the enforcement of the GDPR. The document aims to harmonise the criteria for assessing the admissibility of cross-border complaints and clarifies the rights of complainants and entities under investigation. The regulation establishes the same admissibility standards no matter where in the EU the GDPR complaint was filed. Both complainants and companies involved will have the right to be heard at specific stages of the investigation and will receive preliminary findings to express their views before a final decision is issued. 

Data for research: From 29 October, researchers can request data access from very large online platforms and search engines to study systemic risks. Access to public platform data has been available since the Digital Services Act (DSA) came into force in February 2024; researchers now also have the opportunity to request access to platforms’ internal data to investigate their impact on society. Since such datasets can allow direct or indirect inferences about individual users through their interactions, profiles, or other published content, researchers must comply with the requirements of the GDPR when carrying out their projects.

More from supervisory authorities

DSA and the GDPR: The EDPB has closed the consultation on its guidelines on the interplay between the Digital Services Act and the GDPR. One section examines the limits on automated decision-making that involves the processing of personal data by intermediary service providers. The paper also examines the transparency of processing and the deceptive design patterns prohibited by the DSA when these practices involve personal data, and reviews the relationship between profiling restrictions and advertising technology, systemic risk assessments, and the protection of minors’ data.

China privacy updates: China has issued its first national standard for certification of cross-border personal information processing. The standard, which takes effect on March 1, 2026, sets out fundamental principles, security requirements, and obligations for safeguarding individuals’ rights in cross-border data processing. Reportedly, the certification is valid for three years; to keep it in continuous use, the applicant may reapply six months before it expires. In general, under the Chinese Personal Information Protection Law (PIPL), a data handler may transfer personal information outside of China if one of the following three conditions (with some exemptions) is met:

  • Apply for and pass the security assessment;
  • Sign and file the standard contract; or
  • Obtain the personal information protection certification.

Hacked emails

Almost one in ten people affected by cybercrime in the previous year experienced unauthorised access to an online account or email. To provide targeted support to consumers in such cases, the German Federal Office for Information Security (BSI) published a guide – Emergency checklist: Hacked account (in German). If a person can no longer log in despite having the correct password, their email account may have been hacked. Changes in settings or attempts to log in from new devices can also be signs. To protect your account, the BSI recommends securing it with either a strong password combined with two-factor authentication or with passkeys. 
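
The BSI checklist itself is consumer guidance, but for readers curious how the second factor in app-based two-factor authentication is actually derived, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238) in Python; the base32 secret is a well-known documentation example, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period           # time step since the Unix epoch
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # e.g. "492039"; the value changes every 30 seconds
```

Because the server and the authenticator app share the secret and the clock, both can derive the same short-lived code, which is why a stolen password alone is not enough to log in.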

IoT security

According to America’s NIST, IoT products often lack the cybersecurity capabilities that their customers, whether organisations or individuals, can use to help mitigate cybersecurity risks. Manufacturers can help their customers by providing the necessary cybersecurity functionality and the cybersecurity-related information they need. To that end, NIST has closed public consultations on its draft of Foundational Cybersecurity Activities for IoT Product Manufacturers. The publication describes recommended activities that manufacturers should consider performing before their IoT products are sold to customers.

GenAI guidance


The European Data Protection Supervisor (EDPS) has published its revised and updated guidelines on the use of generative AI and the processing of personal data by EU institutions, bodies, offices, and agencies (EUIs), reflecting the fast-moving technological landscape and the evolving challenges posed by generative AI systems. The guidelines introduce several key updates, including:

  • a refined definition of generative AI for greater clarity and consistency
  • a new, action-oriented compliance checklist for EUIs to assess and ensure the lawfulness of their processing activities
  • clarified roles and responsibilities, assisting EUIs in determining whether they act as controllers, joint controllers, or processors
  • detailed advice on lawful bases, purpose limitation, and the handling of data subjects’ rights in the context of generative AI.


Capita fine

The UK’s privacy regulator, the ICO, issued a fine of 14 million pounds to Capita for failing to ensure the security of personal data in connection with a 2023 breach that saw hackers steal millions of people’s information, from pension records to the details of customers of organisations Capita supports. For some people, this included sensitive information such as details of criminal records, financial data or special category data. Capita processes personal information on behalf of over 600 organisations providing pension schemes; 325 of these organisations were also impacted by the breach.

The investigation found that Capita, in its capacity as a data controller, had failed to ensure the security of the processing and lacked appropriate technical and organisational measures. In particular, Capita failed to prevent privilege escalation and unauthorised lateral movement through its network, and did not respond effectively to security alerts once they were detected.

Grindr fine confirmed

On October 21, Norway’s Borgarting Court of Appeal upheld Grindr’s multi-million privacy fine for violating Art. 9 of the GDPR, which forbids the processing of special categories of personal data. The court decided that sharing a dating app user ID with advertisers revealed sensitive information about users’ sexual orientation. It further held that consent was invalid since it was bundled with access to the service, giving users no real choice.

Grindr’s multi-page privacy policy was also unclear concerning the extent and beneficiaries of data sharing, according to the Digital Policy Alert legal blog.

In other news

Data security fine: Australian Clinical Labs (ACL) has been ordered to pay AUD 5.8 million for breaching the Privacy Act 1988 following a 2022 cyber incident that impacted the personal information of over 223,000 individuals. This is the first civil penalty under the Privacy Act, the DLA Piper law blog reports. The incident occurred within the IT environment of ACL’s subsidiary, Medlab Pathology, which had been acquired only three months earlier. Critical vulnerabilities in the subsidiary’s IT systems were not properly identified as part of pre-acquisition due diligence, even though ACL intended to fully integrate them into its own IT environment within the following six months.

Insurance data security fines: The New York State Attorney General secured a 14.2 million dollar fine from eight car insurance companies over data breaches. The companies’ poor cybersecurity failed to protect the private information of more than 825,000 New Yorkers and allowed hackers to steal driver’s license numbers, which were then used to fraudulently obtain unemployment benefits. The companies allowed people to obtain a car insurance price quote using an online tool, and some also provided password-protected tools to insurance agents to generate quotes for customers. The investigation found that data thieves were able to exploit a “pre-fill” function in the companies’ online quoting tools.


Electronic identification services fine: In Finland, the Data Protection Ombudsman has imposed an 865,000 euro fine on Aktia Bank for neglecting information security in its electronic identification service. Due to a short-term disruption, some people who logged into various services with Aktia’s bank codes had access to other customers’ highly personal information, because the service mixed up users’ identities. The regulator found that the bank had shortcomings in the planning, implementation and testing of a technical change made to the service.

Patient data breaches

Polish regulator UODO imposed an approximately 10,000 euro fine on Gyncentrum, a medical centre specialising in, among other things, infertility treatment, for failing to report a personal data breach. The centre sent a communication whose subject line named a genetic test to the wrong recipient: another patient of the centre with the same name. The document contained personal data: first name, last name, bank account number, and address. It also included the transfer amount and the name of the test performed, revealing that it was part of an extensive prenatal diagnostic programme. The affected patient herself learned of the incident only from the other patient at the centre.

In Guernsey, the Medical Specialist Group (MSG) was also fined 100,000 pounds following a cyber-attack. In 2021, the MSG became aware of a personal data breach after it received suspicious emails indicating that its email server had been accessed by cybercriminals. Exploiting vulnerabilities in the server, the criminals were able to access and steal emails stored on it, some of which contained sensitive patient health data. These emails were subsequently used to facilitate multiple phishing campaigns targeting MSG patients over a series of months. The MSG notified the regulator of this breach. The inquiry found that the company had routinely failed to install security updates to its email server over the course of 13 months, including updates directly related to the breach exploit and other critical vulnerabilities.

California privacy violations

California’s Attorney General secured a settlement with Sling TV, a streaming service, resolving allegations that the company violated the California Consumer Privacy Act (CCPA) by failing to provide an easy-to-use method for consumers to stop the sale of their personal information and by failing to provide sufficient privacy protections for children. Sling TV is an internet-based live TV service that offers both a paid subscription and a free, ad-supported streaming service. Unlike traditional television, where advertising is based on the content of the programming, Sling TV uses its internet-based platform to deliver highly targeted advertising, using detailed consumer data such as age, gender, location, and income to personalise ads for viewers, often without their awareness.   

In case you missed it

Digital health care: Privacy International suggests that a Digital Health Technology Assessment (dHTA) is needed to make sure that tools developed by the private sector and relied on by public healthcare providers do not harm people and their rights. The Health Technology Assessment (HTA) is a longstanding practice that is used to assess the effectiveness and safety of technological innovations before they can be used in the diagnosis, treatment, management and prevention of health problems.

Thus, there is an overwhelming need for clear and specific rules that engage with the particular needs and challenges of new and emerging practices.

Multi-party computation: An EDPS blog article states that across sectors, from health research to financial systems, data sharing continues to drive innovation, yet it also intensifies privacy and compliance challenges, making the balance between access to data and confidentiality increasingly difficult. Secure multi-party computation (SMPC) proposes a way to reconcile these seemingly conflicting goals – enabling organisations to jointly compute insights without revealing their underlying data. Under SMPC, multiple parties can work together to compute a result from their private data without ever exposing that data to one another. Unlike traditional encryption, which protects data only while it is stored or transmitted, SMPC ensures confidentiality throughout the computation process itself, for example for:

  • hospitals improving disease prediction models using patient data,
  • banks detecting cross-border fraud patterns,
  • governments analysing the impact of social policies.

From a legal perspective, SMPC challenges traditional interpretations of privacy law. Frameworks like the GDPR were not designed with cooperative computation in mind; SMPC deployments must therefore be embedded within transparent governance frameworks and ethical oversight.
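
To make the underlying technique concrete, here is a minimal sketch of one classic SMPC building block, additive secret sharing, in Python. The party names and values are hypothetical, and a real deployment would add secure channels, input validation, and protections against malicious parties.

```python
import secrets

PRIME = 2**61 - 1  # large prime modulus; all arithmetic is done mod PRIME

def share(value: int, n_parties: int) -> list[int]:
    """Split a value into n additive shares that sum to the value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

# Three hypothetical hospitals each hold a private patient count
inputs = {"hospital_a": 120, "hospital_b": 75, "hospital_c": 310}

# Each party splits its input and hands one share to every party
all_shares = {p: share(v, 3) for p, v in inputs.items()}

# Party i locally sums the i-th share of every input; no raw input is revealed
partials = [sum(all_shares[p][i] for p in inputs) % PRIME for i in range(3)]

# Only combining all partial results reveals the aggregate
print(reconstruct(partials))  # 505: the joint sum, with no single input exposed
```

Each share on its own is uniformly random, so no individual party learns anything about another party’s input; only the agreed aggregate is ever reconstructed.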

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
