The Data Act is almost here

In February, the European Commission published a set of updated technical FAQs on the implementation of the legal provisions of the Data Act, applicable as of 12 September 2025. The Data Act enhances data sharing and enables a fair distribution of data value by establishing clear rules on the access and use of data within the EU – B2B, B2C, and B2G. The guide elaborates, among other things, on:
- the definitions of data users, data holders and third parties,
- cloud and service interoperability requirements,
- fairness of data-sharing contracts, and
- enforcement and dispute resolution frameworks.
The GDPR remains fully applicable to all personal data processing activities under the Data Act. In some cases, the Data Act specifies and complements the GDPR (e.g., real-time portability of data from IoT devices). The Data Act also restricts the re-use of data by third parties. In the event of a conflict between the two, the GDPR rules on the protection of personal data prevail.
Stay up to date! Sign up to receive our fortnightly digest via email.
US data transfers
The Norwegian regulator Datatilsynet has answered FAQs about the rules for US data transfers, prompted by the political situation in Washington. Although the current rules – the Data Privacy Framework – make it easy to transfer personal data to the US, the regulator expects that they will sooner or later be challenged before the CJEU. An adequacy decision remains in force until it is revoked by the Commission.
This means that any changes in the US will not automatically result in the lapse of the adequacy decision. At the same time, if it is revoked, there will most likely not be a transition period. It is important to be aware of this when purchasing US services. Also, the use of US cloud services on European soil could be negatively affected if the adequacy decision is lifted. The most important advice for your business is to have an exit strategy for what you will do if you can no longer transfer personal data to the US in the same way as today.
DORA implementation updates
On 18 February, the European Supervisory Authorities (ESAs) – the EBA, EIOPA, and ESMA – published a roadmap for designating critical ICT third-party service providers (CTPPs), such as cloud services and data hosting companies, that are critical to the functioning of financial entities under the Digital Operational Resilience Act. By 30 April, the competent authorities must submit the Registers of Information to the ESAs. These registers will list information on all ICT third-party arrangements that financial entities have submitted to the authorities.
By July, the ESAs will notify the affected ICT third-party service providers if they have been classified as critical, and by the end of 2025 will begin overseeing their compliance (risk management, testing, contractual agreements, location requirements, etc.).
Legal updates worldwide
China data audits: With effect from May 1, 2025, Chinese regulators will place greater focus on the data protection compliance audit requirements under the Personal Information Protection Law, according to DLA Piper’s legal analysis. The measures set out the conditions and rules for both self-initiated and regulator-requested compliance audits, to be conducted on a regular basis and covering the whole data lifecycle (for large-scale and high-risk data processing, audits will be conducted every two years), along with possible rectification steps and further enforcement.
US privacy enforcement: In the past two months, New York state has amended several rules on data breach notification. The amended law requires that New York residents be notified of a data breach, fixing a 30-day deadline for businesses; in addition, responsible persons must inform the state’s Attorney General, Department of State, State Police and, for covered entities only, the Department of Financial Services about the timing, content and distribution of the notices, and the approximate number of affected individuals. A copy of the template notice sent to affected persons must also be provided.
Meanwhile, Virginia passed a bill requiring social media platforms to use commercially reasonable methods, such as a neutral age-screen mechanism, to determine whether a user is a minor (under 16 years of age), and to limit a minor’s use of the platform to one hour per day, per service or application, while allowing a parent to give verifiable parental consent to increase or decrease the daily limit. The amendment takes effect on January 1, 2026.
Automated decision CJEU ruling
The top European court ruled that a data subject is entitled to an explanation of how an automated decision was taken in respect of him or her. According to the judgment delivered on 27 February, the explanation provided must enable the data subject to understand and challenge that decision.
The case concerns a mobile telephone operator in Austria that refused to allow a customer to conclude a contract because of her credit standing. The operator relied on an automated assessment of the customer’s credit standing carried out by Dun & Bradstreet Austria. The contract would have involved a monthly payment of 10 euros.
Algorithmic discrimination and the GDPR
The European Parliament’s recent research meanwhile states that one of the AI Act’s main objectives is to mitigate discrimination and bias in the development, deployment and use of high-risk AI systems. To achieve this, the act allows ‘special categories of personal data’ to be processed, under a set of privacy-preserving conditions, to identify and avoid discrimination. The GDPR, however, is more restrictive in that respect. The legal uncertainty this creates may need to be addressed through legislative reform or further guidance, the report states.
More from supervisory authorities
DPIA guidance: The Swedish Data Protection Authority IMY has published guidance on impact assessments for activities that process personal data (in Swedish). The practical guide is intended to facilitate the work of impact assessments and to reduce uncertainty about how the various steps are carried out and how the regulations should be understood. It also contains some legal interpretation support, as well as detailed templates for an assessment.
Urban data platforms: As municipalities move towards becoming smart cities or smart regions, more and more systems are being equipped with communication interfaces, states the German Federal Office for Information Security. These include sensors for recording parking spaces, measuring river water levels, or smart garbage cans. Urban data platforms (UDPs) can be used to bundle various information streams and enable efficient decision-making, for example on optimized traffic control, early warning systems in the event of disasters, or urban planning.
To that end, the regulator has prepared technical guidance for developers, solution providers and operators of such platforms (in German). It analyses various existing IT security standards and examines existing UDPs for vulnerabilities.
Employment records: The UK ICO has updated its guidance aimed at employers who keep employment records. Data protection law does not stop employers from collecting, holding and using records about workers; it helps to strike a balance between employer needs and every worker’s right to a private life.
The terms ‘worker’ and ‘former worker’ cover all employment relationships, including employees, contractors, volunteers, and gig or platform workers. The guidance can be combined with other ICO guidance on data protection and employment – in particular, its detailed guidance on workers’ health information and on monitoring workers.
Insurance companies’ data swaps
The North Rhine-Westphalia Data Protection Commissioner has initiated investigations against ten insurance companies in North Rhine-Westphalia for an illegal exchange of personal data. Specifically, the companies, together with almost 30 other insurers, shared data from customers in international travel health insurance to uncover cases of fraud and identify fraud patterns. Since the insurance companies are based in ten federal states and other European countries, a joint coordinated investigation was launched. To exchange data, the insurers used a closed email distribution list, on which several employees of the companies involved were usually registered.
Privacy policy
The Latvian DVI looks at the most common shortcomings in the privacy policies of the organisations it has investigated, and asks data controllers to take them into account:
- Privacy policy is hard to find
- Complex and unclear text
- Not all legal bases and purposes of data processing are listed
- The purpose of data processing is not linked to the legal basis
- Failure to specify the organization’s legitimate interests
- Unclear information about the storage period
- Failure to specify recipients of personal data
Finally, policies often lack guidance on data subjects’ rights and how to exercise them, or provide overly complicated mechanisms for doing so.
Emotion recognition
The Dutch Autoriteit Persoonsgegevens has requested feedback on the AI Act’s ban on AI systems that recognize emotions in the workplace or in education (unless for medical or safety reasons). The conditions set out in data protection legislation must also be fulfilled if emotion recognition involves personal data. Clarity is required on the definitions of emotions and biometric data, and on the boundaries of “workplace” and “educational institutions.”
In particular, under the GDPR the definition of ‘biometric data’ is linked to processing that allows or confirms the unique identification of a natural person. The AP notes that the term ‘biometric data’ in the AI Act must be interpreted in the light of the GDPR. The distinction between emotions and physical states, and between emotions and easily visible expressions, also remains unclear.
In other news
Web browsing data fine: The US FTC requires Avast to pay 16.5 million dollars (to be used to compensate consumers) and prohibits the company from selling or licensing any web browsing data for advertising purposes, to settle charges that the company and its subsidiaries sold such information to third parties after promising that its products would protect consumers from online tracking. The FTC alleged that Avast sold that data to more than 100 third parties through its Czech subsidiary, unfairly collected consumers’ browsing information through the company’s browser extensions and antivirus software, stored it indefinitely, and sold it without adequate notice and consumer consent.
Refused bank loan: A loan applicant’s data may not be further processed if no customer agreement has been concluded with the bank, the Polish Supreme Administrative Court confirmed in a recent judgment. The court agreed with the data protection regulator UODO that the processing of data for creditworthiness assessment and credit risk analysis, relating to inquiries that did not end in a loan being granted, cannot be based on the legitimate interest of the data controller (whether the bank or the credit information bureau).
Data security
Location data: The Data Protection Commissioner in North Rhine-Westphalia warns citizens against being too careless with their location data. If people are careless when selecting an app and sharing personal data, they make it easier for third parties to collect location data and resell it to data traders. The data traders could then use the location information in conjunction with the device-specific ID to create individual movement profiles.
Ideally, consumers should pick up their smartphone and check the system settings to see which apps have been granted access rights. If in doubt, revoke the permission.
Self-declared GDPR compliance: The Liechtenstein data protection authority asks organisations to be careful with self-declared GDPR compliance of software solutions or cloud services. Instead, it is necessary to check whether the respective service can achieve the required level of protection with appropriate settings or measures. Security measures in the cloud include encryption mechanisms and rules on access rights. Under certain conditions, this check must be carried out in the form of a data protection impact assessment (DPIA).
If the data stored in the cloud is transferred to a third country outside the EU/EEA, it must also be checked whether that country offers a level of protection equivalent to that in the EU/EEA, or whether such protection can be ensured through suitable measures and guarantees under the GDPR. In addition, providers of cloud services are usually contracted as data processors, so a legally compliant data processing contract must be in place.
In case you missed it
AI from non-EU countries: A number of European regulators have drawn attention to the risks associated with the use of AI tools like DeepSeek. Although this generative AI model is freely accessible on the Internet, the manufacturer did not design it for the European market. Based on current knowledge, it can be assumed that the requirements of the AI Act and, in particular, the GDPR are not met. The regulators suggest some practical steps:
- Pay attention to the transparency of the provider and appropriate documentation.
- Use a separate, secure IT environment to avoid data leaks.
- If no privacy-preserving measures are known, it is reasonable to assume that none exist (and inform your employees of the risks associated).
- Take into account the AI literacy requirements and the ban on prohibited AI practices, which apply from February under the AI Act.
- Make sure that the manufacturer of the AI application, if it is also responsible for data protection and is not based in the EU, has appointed a GDPR representative (otherwise, effective enforcement of the rights of those affected can become very difficult).
AI in education: The Future of Privacy Forum meanwhile highlights the spectrum of AI in education in its latest infographic. While generative AI tools that can write essays, generate and alter images, and engage with students have attracted increased attention, schools have been using AI-enabled applications for years for predictive or content-generating purposes too, including reasoning, pattern recognition, and learning from experience.
In practice, they often help with: automated grading and feedback, student monitoring, curriculum development, intelligent tutoring systems, school security and much more.