How does the EU Data Act interact with the GDPR?
The Data Act will become applicable in the EU starting on 12 September 2025. In the run-up, the European Commission has published an FAQ on the new legislation. Together with the Data Governance Act, the Data Act enables a fairer distribution of value by establishing clear rules on access to and use of data within the EU’s data economy. While the Data Act does not regulate the protection of personal data, the GDPR remains fully applicable to all personal data processing activities under the Act.
This includes the powers and competences of supervisory authorities and the rights of data subjects. In some respects, the Data Act complements the GDPR (eg, real-time portability of data from Internet-of-Things objects). In other cases, it restricts the re-use of data by third parties, such as for profiling purposes (unless this is necessary to provide the service to the user). In the event of a conflict between the GDPR and the Data Act, the GDPR rules shall prevail (see Art. 1(5) of the Data Act).
Corrective powers under the GDPR
The CJEU has ruled that a supervisory authority is not obliged to exercise a corrective power in all cases of breach and, in particular, to impose a fine. It may refrain from doing so where the controller has already taken the necessary measures on its own initiative. The case concerns a savings bank in Germany, one of whose employees had consulted a customer’s data on several occasions without authorisation. The employee had confirmed in writing that she had neither copied, retained, nor shared the data, and the bank had taken disciplinary measures. The controller nevertheless notified the data protection authority of the breach.
More legal updates
California tech updates: Among over a dozen new bills covering personal data and generative AI, Governor Gavin Newsom signed a bill on training data sources into law. It requires developers to report on the sources or owners of their datasets, describe the data points they contain, state whether the datasets include personal information or material protected by copyright, trademark, or patent, explain how the datasets further the intended purpose of the AI system or service, and more. The changes take effect on 1 January 2026.
California has also expanded the definition of personal data to cover more abstract digital formats, including compressed or encrypted files, metadata, and artificial intelligence systems capable of outputting personal information. At the same time, a landmark artificial intelligence safety bill was vetoed by the governor after strong opposition from major technology companies. The draft bill would have required the most powerful AI models to undergo safety testing and other oversight obligations.
Lax social media privacy controls: The Federal Trade Commission has examined the data practices of major social media and video streaming services, revealing that they engaged in vast surveillance of consumers to monetize their personal information while failing to adequately protect users online, especially minors. Among other things, companies fed users’ and non-users’ personal information into their automated systems, including for use by their algorithms, data analytics, and AI, without proper testing and oversight. Meanwhile, data subjects had little or no way to opt out of how their data was used by these automated systems.
Who determines how to secure data?
The Polish Supreme Administrative Court has issued a final ruling on whether a data controller can leave it to an employee to determine how to secure data. In the underlying case, a probation officer at a district court lost an unencrypted pen drive containing the personal data of 400 people. The analysis of the case showed that the controller had not correctly fulfilled its security obligations.
Before the incident, the controller issued the device and instructed the probation officer to implement security measures on their own. The obligation to register and encrypt such media was introduced only after the officer lost the drive. Additionally, employees were given only basic training in data protection, which did not equip them to secure digital media or assess the risks of data loss. As a result, the employee decided to protect the data by carrying the drive in a locked bag.
More from supervisory authorities
Data accountability from A to Z: The Luxembourg data protection and cybersecurity authorities have recently developed DAAZ, a GDPR compliance tool (available in English) aimed at start-ups and small and medium-sized enterprises. The tool responds to the data protection challenges faced by SMEs in particular, which are often at a disadvantage compared with large organisations in terms of resources and expertise.
Mobile applications: The French CNIL has published the final version of its recommendations to help professionals design privacy-friendly mobile applications. From 2025, these will be the subject of a specific control campaign. According to the latest data, a typical French consumer downloads 30 apps and uses their mobile phone for an average of 3 hours and 30 minutes per day. Among other things, the recommendations include best practices for stakeholders to ensure that users understand whether the requested permissions are really necessary for the application to function.
AI Act and GDPR: Finally, the Belgian regulator has published an information guide (available in English) on the EU AI Act from a GDPR perspective. It includes sections on the definition of AI systems and on data protection principles such as purpose limitation, data minimisation and data subject rights in an AI context. It also emphasizes accountability, security measures and human oversight in AI development.
Termination of employment
Although former employees have the right to request the deletion of their data, this right is not absolute, according to the Latvian regulator. For example, a former employer has the right to temporarily retain an employee’s mailbox for a certain period to ensure continuous communication with the company’s customers (eg, by forwarding e-mails) and to access information essential to the operation of the company. However, the employer must clearly define how long this e-mail address will be retained and communicate this to employees.
This does not mean that the employer can use the information found in the mailbox for other purposes; the principle of purpose limitation applies here. If an employer recovers, for example, a computer or smartphone used by an employee after the end of the employment relationship, they may discover that private e-mail or other communication accounts were accessed on it. If the employee has not logged out of these accounts, the employer has no right of access, despite owning the device.
Data requests via a representative
Finland’s data protection commissioner has stated that a person can make an inspection request for their data with the help of an agent and, for example, ask the organisation to provide that information to the agent. Data protection legislation does not prevent the exercise of data protection rights through another person. An individual who contacted the regulator’s office had asked the Tax Administration to deliver all information about them to their representative’s postal address. However, the Tax Administration refused, arguing that the information could only be provided to the person directly.
More enforcement decisions
Commercial legitimate interest: Hogan Lovells’ law blog reports that a Dutch court has once again overturned a decision of the data protection authority for its overly strict interpretation that purely commercial interests cannot be legitimate interests under the GDPR. The court ruled in favour of the unnamed company, suspending a 120,000 euro fine, as there was still room for legal debate.
The cumulative criteria for a valid legitimate interest (eg, for direct commercial marketing) require a careful assessment, including whether the data subject could reasonably expect the data processing. Additionally, the personal data concerned must be strictly necessary for the legitimate interests pursued, and, finally, the fundamental rights and freedoms of the data subject must be preserved.
Meta fine for password storage in plaintext: The Irish Data Protection Commission has fined Meta Ireland 91 million euros. The inquiry was launched in April 2019, after the company notified the regulator that it had inadvertently stored certain social media users’ passwords in ‘plaintext’ (ie, without cryptographic protection or encryption) on its internal systems. The passwords were not made available to external parties.
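For readers unfamiliar with the technical issue: the accepted alternative to plaintext storage is a salted, deliberately slow one-way hash, so that even an internal leak does not expose the passwords themselves. A minimal illustrative sketch (not Meta’s actual implementation) using only Python’s standard library:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted scrypt hash; store (salt, digest), never the password."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

Because scrypt is memory-hard and each password carries its own salt, bulk cracking of a leaked table is far more expensive than with plaintext or a fast unsalted hash.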
Selling data to competitors: A man in the UK has pleaded guilty and been fined for unlawfully retaining and selling thousands of customer records from the car leasing company he worked for. Shortly before resigning from his role as a sales consultant at Leaseline Vehicle Management Ltd, he sold over 3,600 pieces of personal information taken from the company’s internal customer database. He approached multiple competitor companies with this information, whilst claiming that the data belonged to him.
Data security
Facial recognition: The German Data Protection Conference observes that some authorities are already using biometric facial recognition in public spaces, citing non-specific criminal procedural rules. However, the legal framework and the civil liberties of those affected – potentially all citizens – are not sufficiently taken into account. For this reason, the European legislators have excluded certain applications in the AI Act and set strict limits for others. The regulator calls upon the national legislators to create specific and proportionate legal bases for the use of facial recognition systems in public spaces.
Minors’ data: Following UK Ofcom’s publication of the draft Children’s Codes of Practice, due to come into effect in early 2025, Instagram has changed the way it works for minors, connectedworld.clydeco.com reports. For all under-18s, the new “teen accounts” will activate several privacy settings by default, such as preventing non-followers from seeing their material and requiring them to manually accept new followers.
Also, the only way for 13- to 15-year-olds to change these settings is to add a parent or guardian to their account. Strict guidelines will also be applied to sensitive content to avoid suggesting potentially dangerous material, and notifications will be muted overnight (“sleep mode”).
Portability right: A new portability right applies to employees and consumers in Québec, JD Supra law blog reports. Its purpose is to allow individuals in the private and public sectors to access their data and transfer it to another legally authorised organization of their choice. It only applies to data that is already digitally stored and was directly provided by the individual. Though the legislation does not specify any particular format, PDFs, pictures, and proprietary formats that require additional software or costly licensing should be avoided in favour of formats like CSV, XML, or JSON.
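To make the format point concrete, here is a small sketch of what a machine-readable export might look like. The record structure and field names are hypothetical, not drawn from the Québec legislation; only the recommended formats (CSV, JSON) come from the guidance above:

```python
import csv
import io
import json

# Hypothetical records directly provided by the individual.
records = [
    {"name": "A. Tremblay", "email": "a.tremblay@example.com", "city": "Montréal"},
    {"name": "B. Gagnon", "email": "b.gagnon@example.com", "city": "Québec"},
]

def export_json(rows: list[dict]) -> str:
    """Serialise records as JSON, a structured format needing no special software."""
    return json.dumps(rows, ensure_ascii=False, indent=2)

def export_csv(rows: list[dict]) -> str:
    """Serialise records as CSV, openable in any spreadsheet or text editor."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Either output can be handed to the individual or transmitted to the receiving organisation without proprietary software or licensing costs, which is precisely what the portability guidance favours.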