GDPR enforcement simplified
A new regulation supplementing the GDPR came into force on 1 January. It speeds up the work of data protection authorities in enforcement cases involving multiple EU/EEA countries. The regulation provides, among other things, for time limits, stages of investigation, the exchange of information between authorities, and the rights of the parties concerned. In future, data protection authorities will, as a rule, have to issue a resolution proposal on a cross-border case within 12–15 months. In the most complex cases, the deadline can be extended by 12 months. The regulation will apply from April 2027.
Stay up to date! Sign up to receive our fortnightly digest via email.
UK Adequacy decision
At the end of December, the EU Commission adopted two new adequacy decisions for the UK – one under the GDPR and the other under the Law Enforcement Directive – valid until 27 December 2031. Under the new decisions, transfers of personal data from the EU to the UK can continue to take place without any specific framework. Following Brexit, the Commission adopted two adequacy decisions vis-à-vis the UK in 2021, each containing a sunset clause. The decisions expired in mid-2025 but were extended until the end of the year. The EDPS has since issued an opinion on these decisions.
More legal updates

US consumer privacy updates: In Kentucky, as well as Indiana, Rhode Island and several other states, GDPR-inspired consumer data privacy legislation took effect on January 1. In Kentucky in particular, the new legislation establishes the rights to confirm whether data is being processed, to correct inaccuracies, to delete personal data provided by the consumer, to obtain a copy of the consumer’s data, and to opt out of targeted advertising, the sale of data, or profiling, along with requirements for the entities that control and process consumer data.
Similarly, in January, new regulations became effective in California regarding a risk-assessment framework for certain high-risk data processing activities, as well as transparency and notice requirements, disclosure of sensitive personal information, data breach reporting, consumer rights requests, and data collection and deletion by data brokers.
AI use by banks
The Hungarian data protection regulator issued a report on the processing of personal data by AI systems used by banks in Hungary (available in English). Some good practices indicated by the report include:
- AI recognition of images, voices and texts must be reliable, without compromising data security. Principles of data minimisation and storage limitation must be observed.
- The quality of the data used for AI training is important, as well as identifying whether or not the training data needs to be linked to a specific natural person. In many cases, pseudonymisation or anonymisation can be used to mitigate privacy risks before training.
- The use of ‘Shadow AI’ is a growing phenomenon. It covers cases where users within an organisation use AI systems – whether for work or personal purposes, and on the organisation’s IT infrastructure – in a manner that is unregulated, non-transparent and uncoordinated from the organisation’s point of view.
- Certain banks under review also use analytical models to analyse and predict creditworthiness and product affinity; the precise classification of these models may raise questions. They often operate on a statistical basis but may also have an AI-based component, and the appropriate safeguards must be applied.
More from supervisory authorities

EU Data Act: The French privacy regulator CNIL explained how the EU Data Act will reform the EU digital economy as it is gradually implemented through 2026–2027. The Act sets fair rules on access to and use of personal and non-personal data generated by connected objects. It allows anyone who owns or uses a connected product to access the data that product generates. It also facilitates the sharing of that data with other actors, in particular by prohibiting unfair contractual clauses.
This regulation must be implemented in conjunction with the GDPR. In particular, the Data Act provides that, in the event of a contradiction between the two texts, the GDPR prevails where personal data is concerned.
Similarly, the Data Governance Act, which established new trusted intermediaries to encourage voluntary data sharing, should be taken into account.
Bodycam use: At the end of December, the CJEU ruled in a case concerning a data controller’s obligation to provide information when collecting personal data via body-worn cameras used by ticket inspectors on public transport. The collection of personal data by means of body-worn cameras constitutes collection directly from the data subject, so the information obligation under Article 13 of the GDPR must be respected at the time of collection. The obligation can be fulfilled at several levels: the most important information can, for example, be stated on a warning sign, while the remaining information can be provided in another appropriate and easily accessible way.
Disney US settlement

On 31 December, a federal judge required Disney to pay $10 million to settle FTC allegations that the company allowed personal data to be collected from children who viewed child-directed videos on YouTube without notifying parents or obtaining their consent, as required by the Children’s Online Privacy Protection Rule (COPPA Rule). The complaint alleged that Disney violated the COPPA Rule by failing to properly label some videos that it uploaded to YouTube as “Made for Kids”.
By mislabeling these videos, the complaint alleged, Disney allowed personal data to be collected, through YouTube, from children under 13 who viewed child-directed videos, and that data was used for targeted advertising to children.
More enforcement decisions
TikTok investigations: According to vitallaw.com, the Spanish and Norwegian data protection authorities have issued warnings to TikTok users regarding the company’s transfer of personal data to China, where national laws could require that data be shared with Chinese authorities. TikTok already faces EU fines over violations of the GDPR and was ordered to stop transferring personal data to China.
So far, TikTok has been granted an interim injunction that allows the company to continue transferring personal data to China until the case is resolved. As a result, regulators are warning users to read the online platform’s notifications and privacy policies, check their privacy settings and think about what they share in the app. It is also recommended that businesses consider whether to continue using TikTok and conduct risk assessments.
PCRM software fine: Finally, the French CNIL has fined Nexpublica €1,700,000 for failing to provide sufficient security measures in a tool for managing relationships with users in the field of social action. Nexpublica (formerly Inetum Software) specialises in the design of computer systems and PCRM software used in particular by homes for disabled people.
At the end of 2022, Nexpublica customers filed data breach notifications with the CNIL because users of the portal had access to documents concerning third parties. The CNIL then carried out inspections of the company, which revealed the inadequacy of its technical and organisational measures. The CNIL considered that the vulnerabilities found:
- mostly resulted from a lack of knowledge of the state of the art and of basic security principles;
- were already known to and identified by the company through several audit reports.
Despite this, the flaws were only patched after the data breaches.