Digital omnibus

Data protection digest 18 Nov – 2 Dec 2025: “Digital omnibus” package latest & market price of personal data already estimated

“Digital omnibus” package latest

On 19 November, the European Commission presented proposals to amend EU digital legislation, including the GDPR, the Data Act, the EU AI Act, and the NIS 2 Directive. According to an analysis by digitalpolicyalert.org, the Digital Omnibus would amend the GDPR by:

  • changing the definition of personal data to refer to any entity that is reasonably likely to have the means to identify a person,
  • exempting certain biometric data and data used by AI from the restrictions on processing special categories of personal data,
  • clarifying the rules on further processing of personal data in the public interest or for scientific research purposes, and
  • specifying that processing of personal data that is necessary for a controller’s interests in the development or operation of an AI system can be pursued under “legitimate interests”.

The Digital Omnibus would also exempt personal data processing from the cookie requirements under the ePrivacy Directive. Instead, it would amend the GDPR to maintain the consent requirement, while specifying that certain processing activities, such as electronic communications transmissions, service provision, audience measurement solely for an online service provider, and maintaining or restoring security, would be considered lawful. Websites and apps would have to allow data subjects to consent through automated, machine-readable mechanisms, and browser manufacturers would likewise have to enable users to grant or refuse consent.

Finally, personal data breaches that are likely to result in a high risk to the rights and freedoms of natural persons would need to be reported to a single-entry point within 96 hours of the controller becoming aware of them. Similarly, there would be unified lists of processing activities that do or do not require a Data Protection Impact Assessment, along with a standard DPIA template and methodology.

Stay up to date! Sign up to receive our fortnightly digest via email.

GDPR enforcement

On 17 November, the Council of the EU adopted new rules to improve cooperation between national data protection authorities when they enforce the GDPR, speeding up the handling of cross-border data protection complaints. The main elements of the new EU regulation include:

  • Admissibility: Regardless of where in the EU a complaint is filed, its admissibility will be assessed against the same conditions and information requirements.
  • Rights of complainants and parties under investigation: Common rules will apply for the involvement of the complainant in the procedure, and the right to be heard for the company or organisation that is being investigated.
  • Simple cooperation procedure: For straightforward cases, data protection authorities may, to avoid administrative burden, settle actions without resorting to the full set of cooperation rules.
  • Deadlines: In the future, an investigation should not take more than 15 months. For the most complex cases, this deadline can be extended by 12 months. In the case of a simple cooperation procedure between national data protection bodies, the investigation should be wrapped up within 12 months.

The regulation will enter into force 20 days after its publication in the Official Journal of the EU. It will become applicable 15 months after it enters into force.

More legal updates

The European Commission has launched a whistleblower tool for the AI Act. Whistleblowers can provide relevant information in any of the EU official languages and in any relevant format. The tool provides a secure means to report potential law violations that could compromise fundamental rights, health, or public trust. The highest level of confidentiality and data protection is guaranteed through certified encryption mechanisms. Anyone can access the AI Act Whistleblower Tool, where more information about the tool and frequently asked questions is available.

California privacy updates: California has enacted a bill which amends the state’s data breach notification law to establish strict new reporting timelines. Beginning January 1, 2026, businesses must notify affected California residents within 30 calendar days of discovering a security incident involving personal information. For incidents affecting more than 500 residents, notice to the California Attorney General must be provided within 15 calendar days of the consumer notice. The amendment allows limited exceptions for law enforcement needs or when necessary to determine the scope of the incident and restore system integrity, JD Supra lawblog reports. 

In parallel, starting 1 January 2027, California will prohibit a business from developing or maintaining a browser that does not include functionality, configurable by the consumer, that enables the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser. A business that develops or maintains a browser will have to make clear in its public disclosures how the opt-out preference signal works and its intended effect. Browser makers that include this functionality are granted immunity from liability for violations of those provisions committed by a business that receives the signal.
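One widely deployed opt-out preference signal is Global Privacy Control (GPC), which browsers convey as the `Sec-GPC: 1` request header (and expose to scripts as `navigator.globalPrivacyControl`). As a minimal sketch of how a service could honour such a signal server-side, the header name follows the GPC proposal, while the function names and the data-use categories below are purely illustrative assumptions:

```python
# Illustrative sketch of honouring an opt-out preference signal.
# The Sec-GPC header comes from the Global Privacy Control proposal;
# the functions and use categories here are hypothetical examples.

def honours_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, the signal is the header `Sec-GPC: 1`.
    HTTP header names are matched case-insensitively.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

def allowed_data_uses(headers: dict) -> set:
    """Decide which data uses remain permissible for this request (illustrative)."""
    uses = {"service_delivery", "security"}        # always permitted in this sketch
    if not honours_opt_out(headers):
        uses |= {"sale", "cross_context_sharing"}  # blocked when an opt-out is signalled
    return uses
```

In this model a request carrying `Sec-GPC: 1` is automatically excluded from sale and cross-context sharing, with no banner interaction required, which is the behaviour the California bill expects receiving businesses to implement.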

Child data protection in the EU

On 26 November, the European Parliament adopted a resolution on the protection of minors online as part of an own-initiative procedure on the topic. The resolution calls, among other things, for the implementation of an EU-wide harmonised digital minimum age of 16 for accessing social media, video-sharing platforms and AI companions without parental consent, with 13 as the minimum age for any social media use by children, even with parental consent. 

In parallel, the German Data Protection Conference, DSK, adopted a resolution calling for amendments to the GDPR to strengthen protections for children. It proposes a ban on children’s consent for profiling and advertising, limits on children’s ability to consent to special-category data processing, and clearer rights for children to access counselling and medical services privately. It also calls for a prohibition on children consenting to automated decisions, attention to children in breach notifications, data protection by design and default, and consideration of children’s risks in data protection impact assessments, digitalpolicyalert.org sums up.

Cloud computing

The European Commission has published non-binding Model Contractual Terms for data access and use, and Standard Contractual Clauses for cloud computing contracts. They have been developed to help parties, especially SMEs, implement the provisions of the Data Act. Their use is voluntary, and parties may amend them. Although mainly drafted for business-to-business contracts, they can also be used between businesses and consumers if the relevant consumer protection rules are added.

Three sets of Model Contractual Terms (MCTs) cover the mandatory data-sharing relationships between data holders, users and data recipients with regard to data generated by the use of connected products. In addition, the proposed Standard Contractual Clauses (SCCs) translate the Data Act’s ‘cloud switching’ provisions into ready-to-use contractual terms that can be inserted into data processing contracts:

  • SCC Switching & Exit
  • SCC Termination 
  • SCC Security & Business continuity (including provider notification of significant incidents).

Email security

The German Federal Office for Information Security, BSI, has published a white paper on requirements for the protection, transparency, and user-friendliness of webmail services, aimed at systematically increasing consumer security in a future-proof way. The paper treats not only technical security functions but also usability, transparency and trust as essential components of digital sovereignty. A fundamental part of email security currently still rests on the shoulders of users, who are expected to be familiar with two-factor authentication, passkeys and encryption. The BSI sees responsibility primarily with the providers: they must offer effective procedures for authentication, encryption, spam protection and account recovery that work without major user intervention.

Data Act implementation


The Data Act has been in effect since September 2025. This new European regulation is intended to give consumers within the EU more control over the use of their data. For instance, a car owner will have the right to access the data their car collects. If repairs are needed, they can share the data with a garage of their choice, explains the Dutch data protection agency AP, which will jointly oversee the implementation process at a national level, starting from 21 November.

The Data Act and the implementing laws do not override the rules of the GDPR. In the event of conflicting rules, the GDPR takes precedence. This means that any data sharing involving personal data must comply with the GDPR, stresses the regulator. 

More from supervisory authorities

Market research data processing: In Poland, the data protection regulator UODO approved the “Code of Conduct on the Processing of Personal Data by Private Research Agencies”. The code was developed in response to numerous discrepancies in how research agencies process participants’ personal data: participants in identical surveys could, depending on the entity conducting the study, receive divergent information, for instance on the legal basis for the processing, and information obligations were fulfilled differently. The code also provides guidance to help carry out a risk assessment or, where justified, a data protection impact assessment.

It is worth noting that the code obliges all entities that join it to appoint a Data Protection Officer (DPO).

Sound recording and CCTV: Organisations often choose to conduct video surveillance with sound recording, or fail to disable the camera manufacturer’s default audio function. As a result, the additional risks posed not only by image capture but also by sound recording are not sufficiently assessed. Moreover, the related processing of personal data is not always lawful: recording sound and recording images are two different processing operations, so audio and video each require their own legal basis.

The processing of personal data by performing video surveillance with audio recording is not justified in most cases. There are rare situations where it is legal and permissible, mainly when it is associated with an increased risk to the essential interests of the organisation or society. Often, the legal basis for such processing can be found in the special regulatory framework applicable to a particular industry in which the organisation operates.


Employment clauses and personal data processing

Labour clauses are widely used by both public and private contracting authorities to ensure fair wages and working conditions at suppliers. Contracting entities often require the supplier to document its compliance with the labour clauses, typically in the form of employees’ salaries, timesheets and employment contracts. This raises questions about the supplier’s legal basis for disclosing such personal data to the contracting authority, notes Denmark’s data protection agency. In its view, there will generally be an overriding legitimate interest that can form the basis for the disclosure of the information in question.

TechSonar 2025-2026

The EDPS’s latest guidance on new technologies, the TechSonar report 2025-2026, explores six trends: agentic AI, AI companions, automated proctoring, AI-driven personalised learning, coding assistants and confidential computing. While each of these technologies serves a distinct purpose, they are deeply interconnected. Together, they illustrate how AI is progressively reshaping not only business processes and common daily tasks, but also the human experience of technology. Continue reading the full report.

In other news


Data security in cloud-based EdTech: The US Federal Trade Commission will require education technology provider Illuminate Education, Inc. (Illuminate) to implement a data security program and delete unnecessary data to settle allegations that the company’s data security failures led to a major data breach, which allowed hackers to access the personal data of more than 10 million students.

Illuminate sells cloud-based technology products and collects and maintains personal information about students on behalf of schools and school districts. In its complaint, the FTC alleged that in 2021, a hacker used the credentials of a former employee, who had departed Illuminate three and a half years prior, to breach Illuminate’s databases stored on a third-party cloud provider. 

Medical data breach: The Norwegian data protection regulator upheld its fine against Argon Medical Devices. In 2023, it had issued the American company an infringement fee of approximately 127,000 euros for violating the GDPR. In 2021, Argon discovered a security breach that affected the personal data of all of its European employees, including those in Norway. Argon notified the Norwegian regulator of the breach long after the 72-hour reporting deadline.

Argon believed it did not need to report the breach until it had a complete overview of the incident and all its consequences; this view was enshrined in its procedures and was the basis for the delay. The case is an important reminder that controllers must have appropriate measures in place to determine whether a breach has occurred and to promptly notify the supervisory authority and the data subjects.

Mobile app gaming company fine

California’s Attorney General settled with Jam City, Inc., resolving allegations that the mobile app gaming company violated the state’s Consumer Privacy Act (CCPA) by failing to offer consumers methods to opt out of the sale or sharing of their personal information across its popular gaming apps. Jam City creates games for mobile platforms, including games based on popular franchises such as Frozen, Harry Potter, and Family Guy. In addition to 1.4 million dollars in civil penalties, Jam City must provide in-app methods for consumers to opt out of the sale or sharing of their data and must not sell or share the personal information of consumers under 16 years old without their affirmative “opt-in” consent.

Data brokers fine

The Belgian data protection authority GBA, meanwhile, has imposed a 40,000-euro fine on data broker Infobel for illegally reselling data for marketing purposes, cybernews.com reports. A consumer complained to the GBA after receiving a marketing brochure in the mail from a firm he was not a customer of. When the complainant asked how the firm had obtained his information, he was told it had been supplied by a media agency, which in turn had obtained it via Infobel, a data broker that received it from a telecom operator.

Infobel said it had permission to sell the complainant’s information to the media agency since it had secured approval from data subjects. The data protection authority, however, found that there was no explicit, informed, or unambiguous consent.

Cookie consent fine

On 20 November, the French regulator CNIL fined the French company Condé Nast Publications 750,000 euros for non-compliance with the rules applicable to cookies deposited on the terminals of users visiting the “vanityfair.fr” site. In particular, cookies subject to consent were placed on users’ terminals as soon as they arrived on the site, before they had interacted with the cookie banner to express a choice. Moreover, when a user clicked the “Refuse all” button in the banner, or withdrew their consent to the registration of trackers on their terminal, new cookies subject to consent were nevertheless deposited, and other cookies already present continued to be read.
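The CNIL’s findings illustrate the basic rule that cookies subject to consent may be written or read only after the user has made a positive choice, and must stop being set once consent is refused or withdrawn. A minimal, hypothetical sketch of such gating logic (the function name, consent states and cookie categories below are illustrative assumptions, not CNIL-prescribed terms):

```python
# Hypothetical consent-gating sketch: only strictly necessary cookies are
# exempt from consent; every other category requires a recorded "granted"
# state before it may be written or read.

EXEMPT_CATEGORIES = {"strictly_necessary"}  # e.g. session or load-balancing cookies

def may_set_cookie(category: str, consent_state: str) -> bool:
    """Return True if a cookie of this category may be written.

    consent_state is one of "granted", "refused", "withdrawn", or
    "undecided" (banner shown, no interaction yet).
    """
    if category in EXEMPT_CATEGORIES:
        return True
    return consent_state == "granted"
```

Under this model, merely landing on the page (consent state "undecided") cannot trigger advertising or analytics cookies, and clicking "Refuse all" flips the state so that subsequent writes are blocked: precisely the behaviours found lacking in the sanctioned setup.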

And finally…

Meta multi-million file: A Spanish court has ordered Meta to pay 479 million euros to Spanish digital media outlets for unfair competition practices and infringing the GDPR, a ruling the company will appeal, Reuters reports. The sum, to be distributed among 87 digital press publishers and news organisations, relates to Meta’s use of personal data for behavioural advertising.

The complaint filed by the Spanish outlets centred on Meta’s shift in legal basis for processing personal data after the GDPR took effect in May 2018: Meta switched from “user consent” to “performance of a contract” to support behavioural advertising. Regulators later judged that basis insufficient, and Meta returned to consent as its legal basis in 2023. The judge assessed that Meta generated at least 5.3 billion euros in advertising income during those five years.

Personal data monetisation: The French CNIL commissioned a survey on how French people perceive the use of their personal data. In a representative sample of 2,082 people aged 15 and over, 65% say they would be willing to sell their data. Of these, only 6% would sell it for less than 1 euro per month, while 14% would want a fee of more than 200 euros per month.

The most common valuation was between 10 and 30 euros per month, chosen by 28% of respondents. This is consistent with recent market research based on estimates for Meta services, according to which, at a price of 5 euros, 20% of people would be willing to sell their data and 90% of companies would be willing to buy it. Taken together, these results suggest an approximate market price for personal data of around 40 euros per month (per subscribed service).

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
