Conditional consent

Data protection digest 18 Feb – 2 Mar 2026: ‘Conditional Consent’ for meaningful user control over cookie preferences

Conditional consent vs cookie fatigue

On 10 February, the EDPB and EDPS, in a joint opinion, strongly welcomed the regulatory solution to address cookie fatigue and the proliferation of consent banners. This follows the European Commission’s proposal to switch to automated, machine-readable indications of data subjects’ choices under the Digital Omnibus package. The EU regulators welcome that, pursuant to the proposed Article 88b of the GDPR, harmonisation standards will be developed.

Such standards should cover the communication of data subjects’ choices, from browsers to websites, from mobile phone applications to web services, and ensure that all involved actors use the same automated machine-readable indications and are not simply repackaging consent in a new technical format. 

Stay up to date! Sign up to receive our fortnightly digest via email.

Anticipating that data controllers and browser providers will soon need to accept and act on automated signals, TechGDPR has published Conditional Consent, an open concept paper proposing what automated signalling should look like to give users meaningful control, based on three dimensions:

  • Cookie purpose
  • Website category
  • Third-party processing

The concept paper sets out the main principles, the legal basis and exceptions, and technical specifications, along with a comparison with existing tools and a proposed implementation solution, all available at conditionalconsent.com.
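To make the three dimensions concrete, here is a minimal sketch of how a website might evaluate an automated consent signal against them. All field names and the decision logic are illustrative assumptions, not taken from the concept paper or any existing standard.

```python
from dataclasses import dataclass

# Hypothetical machine-readable consent signal covering the three
# dimensions listed above. Field names are illustrative assumptions.
@dataclass
class ConsentSignal:
    allowed_purposes: set          # e.g. {"functional", "analytics"}
    blocked_site_categories: set   # categories of websites the user opts out of
    allow_third_party: bool        # whether third-party processing is permitted

def cookie_permitted(signal, purpose, site_category, third_party):
    """Decide whether a site may set a cookie, given the user's signal."""
    if site_category in signal.blocked_site_categories:
        return False
    if third_party and not signal.allow_third_party:
        return False
    return purpose in signal.allowed_purposes

signal = ConsentSignal(
    allowed_purposes={"functional", "analytics"},
    blocked_site_categories={"gambling"},
    allow_third_party=False,
)

print(cookie_permitted(signal, "analytics", "news", third_party=False))  # True
print(cookie_permitted(signal, "analytics", "news", third_party=True))   # False
```

The point of such a structure is that browsers and websites evaluate the same signal the same way, rather than repackaging consent banners in a new technical format.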

Main developments 

Prohibited AI practices: A Future of Privacy Forum analysis draws “red lines” around prohibited practices in the new EU AI Act. They concern harmful manipulation and deception, social scoring, individual risk assessment, untargeted scraping of facial images, emotion recognition, biometric categorisation, and real-time remote biometric identification for law enforcement. Prohibited AI practices are regulated by Article 5 of the AI Act, which became applicable in February 2025 and enforceable from 2 August 2025.

AI-generated images: The EDPB has signed a Joint Statement on AI-Generated Imagery and the Protection of Privacy. The statement, coordinated by the Global Privacy Assembly, represents the united position of 61 authorities across the world. It addresses serious concerns about AI systems that generate realistic images and videos depicting identifiable individuals without their knowledge or consent. The co-signatories are especially concerned about potential harms to children and other vulnerable groups, such as cyberbullying and exploitation. Fundamental principles should guide all organisations developing and using AI content generation systems, including:

  • Implement robust safeguards to prevent the misuse of personal information.
  • Ensure meaningful transparency about AI system capabilities, safeguards, acceptable uses and the consequences of misuse. 
  • Provide effective and accessible mechanisms for individuals to request the removal of harmful content involving personal information and respond rapidly to such requests. 
  • Address specific risks to children through implementing enhanced safeguards and providing clear, age-appropriate information to children, parents, guardians and educators.

Digital Omnibus legal study

The European Parliament published a study identifying interlinks and possible overlaps between different legal acts in the field of digital legislation. It analyses the European Commission’s Digital Omnibus package proposals published on 19 November 2025, distinguishing administrative simplification from more substantive recalibration of safeguards across data, privacy, cybersecurity and AI areas. The study highlights key areas of controversy (legal certainty, enforcement capacity, and impacts on rights) and sets out areas for consideration for parliamentary scrutiny, including:

  • Debate over the definition of personal data in the GDPR
  • Integrating ePrivacy into GDPR (cookie fatigue)
  • Concerns about restricting data access rights
  • Data Act consolidation
  • Centralised incident notification via a single entry point (SEP)
  • AI timelines, burden reduction and centralisation.

Ransomware statistics

In 2025, 65 ransomware incidents were reported to the police in the Netherlands, and incident response companies responded to a further 40. Access is usually gained by exploiting vulnerabilities or taking over accounts. In a ransomware attack, malicious software encrypts computer systems and data; hard drives, databases, backups, USB drives and cloud data can all be affected. The victim is then blackmailed: the attacker offers the decryption key in exchange for payment.

Reporting the incident is crucial if you, as a business or individual, have been a victim of ransomware. Even if the criminals have already been paid, filing a report provides the police with vital information: a report may contain information the police can use to unlock the affected systems, and it helps them identify suspects.

More from supervisory authorities

GDPR survey in Germany: The North Rhine-Westphalia data protection commissioner has used a recent survey by the business association Bitkom as an opportunity to reject discussions about the complete or partial centralisation of data protection supervision.

The survey of 603 companies clearly shows that businesses in the state primarily view data protection laws as too complicated. 85% of the companies surveyed in Germany want more understandable data protection regulations, 79% are calling for a reform of the GDPR, and 69% demand better coordination with other regulations.

Just 33% believe that decision-making processes would be faster within a federal agency, while 44% are concerned about losing proximity to their local supervisory authority, and thus a direct contact person (centralisation would also imply the need for additional staff to handle a sharply increasing number of complaints and consultation requests).

Session replay tools: The French data protection regulator CNIL is launching a public consultation on its draft recommendation concerning session replay tools that allow the monitoring and analysis of users’ online behaviour. The objective is to support the actors who design these tools and those who use them in their compliance. Session replay tools are used to reconstruct the complete browsing path of an Internet user on a website or a mobile app. They can, for example, be used to detect and fix bugs or optimise the structure or ergonomics of a website or mobile application. 


More official guidance

GDPR certification criteria: The North Rhine-Westphalia data protection commissioner also approved a nationwide catalogue (available in English and German) of criteria for IT solutions. Companies that meet these criteria will receive a certificate confirming their compliance with European data protection law, which they can then use for advertising purposes. The catalogue was developed by TÜV Nord Group. This is the third such approval issued by the NRW regulator.

Specifically, it addresses so-called information processing services – online banking, accounting, and AI systems, as well as search engines. The certification process, conducted by a specialised certification body, typically involves a detailed audit of the processing operations within the respective company. This audit verifies the technical and organisational measures in place, as well as compliance with the principles of the GDPR. 

Health screening campaigns via phone are possible: In Italy, the data protection authority Garante has approved the use of telephone numbers for screening, provided that adequate safeguards are respected. Healthcare companies may use adult patients’ telephone numbers, provided during previous healthcare services, to promote participation in screening campaigns required by national or regional regulations, even if the privacy notice did not expressly state this purpose at the time the data was collected.

Specifically, healthcare companies will be required to update their privacy notices, specifying that the most recent contact details collected for treatment purposes, subject to verification of their accuracy, may be used exclusively for the promotion of public prevention programmes and not for other purposes (for example, scientific research or administrative activities).

In other news

Employee data access rights: The Lewis Silkin legal blog analyses a recent decision from the French Court of Appeal, which confirmed that employees cannot rely on their right of access to obtain copies of entire work email correspondence or business files merely because their name or email address appears in them. Where the material contains no substantive personal data beyond identifying information, the right of access does not extend to wholesale document disclosure.

Furthermore, the right of access cannot be used as a litigation discovery mechanism (for example, in an employment dismissal dispute, as in this case). The court’s decision also reflects the ICO guidance on the right of access.

Reddit fine: In the UK, Reddit was fined 14.47 million pounds for children’s privacy failures. The Information Commissioner’s investigation found that Reddit did not apply any robust age assurance mechanism. The company did not have a lawful basis for processing the personal information of children under the age of 13. It also failed to carry out a data protection impact assessment to assess and mitigate risks to children before 2025. In the past year, Reddit introduced age assurance measures that include age verification to access mature content and asked users to declare their age when opening an account. The commissioner once again informed Reddit that relying on self-declaration presents risks to children, as it is easy to bypass. 

Samsung consent case: The Texas Attorney General reached an agreement with Samsung Electronics America, concerning the collection of Automated Content Recognition (ACR) viewing data from Texas consumers through Samsung smart televisions. Under the agreement, Samsung must cease collecting or processing ACR viewing data without obtaining Texas consumers’ express consent and must update its smart televisions to implement clear and conspicuous disclosures and consent screens, digitalpolicyalert.org reports.

More enforcement decisions

Ransomware attack followed by privacy fine: In Spain, data protection agency AEPD fined Sprinter Megacentros del Deporte (a sporting goods retailer) 2.6 million euros for a data breach, DataGuidance reports. A ransomware attack encrypted systems and exfiltrated data, affecting 6.3 million individuals. Notification of the breach to data subjects was also not delivered ‘without undue delay’ and lacked specific mitigation information.

Biometric data fine: The Italian Garante has fined eCampus University 50,000 euros for unlawfully processing the biometric data of numerous participants in its online courses. The investigations revealed the lack of a suitable legal basis to justify the use of biometric systems, especially given the availability of less invasive tools.

It also emerged that the University had not conducted a data protection impact assessment before implementing the system. The violations affected a very high number of participants, over 450 students for each lesson.

Data processing agreement fine: The Polish data protection authority UODO has fined DPD Polska more than 2.75 million euros after finding serious failures in how the courier company structured its relationships with external carriers, according to an analysis by grcreport.com. These carriers participated in loading and unloading parcels and had access to address labels containing personal data. In some cases, shipments were transported in vehicles that DPD Polska neither owned nor held under any other legal title. Despite this third-party access, the company did not conclude personal data processing agreements with the carriers.

GDPR does not prevent authorities from being notified of social fraud

The Danish data protection regulator, Datatilsynet, explains that the GDPR does not contain a general prohibition on disclosing information to public authorities. On the contrary, the rules allow data to be disclosed when there is a lawful basis for processing. This may be if the disclosure is necessary to comply with a legal obligation. The question of whether, for example, an insurance company may or must disclose information on possible fraud to a public authority, therefore, depends on the specific legal basis in national legislation, including rules on confidentiality and sector-specific regulations. 

And Finally

AI models and GDPR audit tool: The French CNIL, together with other actors in the digital data domain (the ANSSI, the PEReN and Inria), is launching a call for expressions of interest to test an audit tool called PANAME, which makes it possible to assess the confidentiality of AI models and their compliance with the GDPR. The project aims to develop a tool to audit the privacy of AI models, taking the form of a library for performing data extraction and/or re-identification tests on AI models.

For more than a decade, research has shown that it is possible to extract data, including personal data, that was included in an AI model’s training dataset. This extraction can be carried out via:

  • statistical techniques at the model level, requiring full or partial access to the model, or
  • in the case of generative AI, directly querying the model with instructions (prompts).
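The first of these, a loss-based membership inference test, can be sketched in a few lines: a model typically assigns lower loss to examples it was trained on, so comparing per-example loss against a threshold yields a membership guess. The simulated loss distributions and threshold below are illustrative assumptions, not PANAME’s actual methodology.

```python
import random

random.seed(0)

# Simulated per-example losses: trained-on (member) examples tend to
# receive lower loss than unseen (non-member) examples. The Gaussian
# parameters and the 0.5 threshold are illustrative assumptions.
member_losses = [random.gauss(0.2, 0.1) for _ in range(1000)]
non_member_losses = [random.gauss(0.8, 0.1) for _ in range(1000)]

def infer_membership(loss, threshold=0.5):
    """Guess that an example was in the training set if its loss is low."""
    return loss < threshold

# How often the simple threshold attack guesses correctly
correct = sum(infer_membership(loss) for loss in member_losses)
correct += sum(not infer_membership(loss) for loss in non_member_losses)
accuracy = correct / 2000
print(f"attack accuracy: {accuracy:.2f}")
```

When an attack like this succeeds well above chance, it indicates the model leaks information about its training data, which is exactly the kind of risk a privacy audit library would test for.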

AI geolocation: Privacy International explains that one of the most concerning capabilities of the newest AI systems is inferring geographic location from images. Vision‑Language Models (VLMs) can now determine where in the world a given photo was taken with striking speed and accuracy. Most people are unaware that widely accessible AI tools can identify the location of their personal photos, even when Global Positioning System (GPS) metadata has been removed. Inferring location from images without GPS data may support beneficial activities, such as robotics development or investigative journalism, but these capabilities are not free of privacy risks.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
