How to determine your GDPR role
The French privacy regulator CNIL reviews the criteria for, and practical consequences of, determining GDPR roles as data controller or processor. The qualification does not always depend on a contractual choice but on the facts: who decides what, and who executes what, concerning personal data. The controller is the natural or legal person who determines both the purposes and the means of the processing, that is, the "why" and the "how" of the use of personal data. The controller ensures compliance with the GDPR but does not necessarily have actual access to the data. The means fall into two categories:
- Essential means: what personal data is collected and used, for how long, who the recipients are, etc.
- Non-essential means: technical implementation, such as the choice of software.
- Where two or more controllers jointly determine the purposes and means of the processing, they are joint controllers.
The processor, meanwhile, is a person or body that processes personal data on behalf of the controller. It must always comply with the instructions given by the controller. Sometimes it can choose the technical means that seem most suitable, as long as this respects the objectives set by the controller. A processor that decides on the objectives and the means itself exceeds its GDPR role: in that case, it is considered a data controller and may be sanctioned.
Only under certain conditions may the processor reuse the data entrusted to it by the controller for its own purposes. For example, a processor may reuse data to improve its cloud computing services; such reuse could be considered compatible with the original processing, subject to appropriate safeguards such as anonymisation. Reuse for commercial prospecting purposes, on the other hand, would hardly satisfy the "compatibility test".
Stay up to date! Sign up to receive our fortnightly digest via email.
UK data reform
The Data Use and Access Bill (DUAB) has passed Parliament and now awaits Royal Assent, at which point it will become law. The bill introduces a framework of 'smart data' schemes to regulate the access, sharing and protection of customer and business data across various sectors. Among other things, it introduces a recognised legitimate interest list to streamline data use for public safety, as well as interoperable medical records with timely access for professionals, while maintaining a risk-based approach to automated decision-making and sensitive personal information. The UK Information Commissioner is tasked with enforcing the regulations that will be introduced under the bill. The UK currently benefits from the EU's adequacy regime for personal data transfers, which was extended by six months on the Commission's recommendation until the end of 2025. This allows the UK government to complete the DUAB ahead of Brussels' next adequacy assessment.
More legal updates
EDPB latest: The European Data Protection Board has published the final version of its guidelines on data transfers to third-country authorities. The EDPB clarifies how organisations can best assess the conditions under which they can lawfully respond to requests for personal data from non-European authorities. For example, the updated guidelines address the situation where the recipient of a request is a processor, or where a parent company in a third country receives a request from that country's authority and then requests the personal data from its subsidiary in Europe.
The EDPB also published training material on AI and data protection addressed to professionals with a legal and technical focus, such as data protection officers, privacy professionals, cybersecurity professionals, developers or deployers of high-risk AI systems.
High-risk AI: As part of the implementation of the AI Act, the European Commission opened a consultation, running until 18 July, on the classification of AI systems as high-risk. AI systems classified as high-risk must be developed and designed to meet requirements on data and data governance, documentation and record-keeping, transparency and provision of information to users, human oversight, robustness, accuracy and security, among others. The targeted consultation aims to collect stakeholder input on practical examples of AI systems and on issues to be clarified in the Commission's guidelines.
Australia privacy updates: The Bird & Bird legal blog explains that Australia's statutory tort for serious invasions of privacy comes into force on 10 June 2025. Passed by Parliament last year as part of a privacy reform, it sets out the elements of the new cause of action and the available remedies: a) invasion of privacy, b) a reasonable expectation of privacy, c) a fault element, d) seriousness, and e) public interest balancing. Read more details on who will be exempt from these rules in the original publication.
Pixel tracking
The French regulator CNIL opened a public consultation on its draft recommendation (in French) on the use of tracking pixels in emails. The objective is to help actors using these trackers to better understand their obligations, particularly regarding the collection of user consent. Tracking pixels are an alternative tracking method to cookies. They take the form of a 1-pixel-by-1-pixel image, integrated into a website or an email but invisible to the user. Loading this image, whose URL contains a user ID, allows the sender to know that the tracked user has visited a page or opened an email. The consultation will close on 24 July.
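To make the mechanism concrete, the following is a minimal sketch of how such a pixel is built; the endpoint, the `open.gif` image name and the parameter names are illustrative assumptions, not taken from the CNIL draft:

```python
from urllib.parse import urlencode

# Hypothetical tracking endpoint (not a real service)
TRACKER_HOST = "https://tracker.example.com"

def tracking_pixel_html(user_id: str, campaign: str) -> str:
    """Return an invisible 1x1 <img> tag whose URL embeds a user ID.

    When the mail client or browser loads the image, the HTTP request to
    the tracker reveals that this specific user opened the email or page.
    """
    query = urlencode({"uid": user_id, "c": campaign})
    return (
        f'<img src="{TRACKER_HOST}/open.gif?{query}" '
        'width="1" height="1" alt="" style="display:none">'
    )

html = tracking_pixel_html("user-42", "summer-sale")
print(html)
```

The point the CNIL draft addresses is precisely this request: because the image URL is unique per user, simply rendering the email transmits personal data, which is why consent requirements come into play.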
More from supervisory authorities
Federated learning: The EDPS elaborated on the benefits and limitations of Federated Learning (FL), an approach to Machine Learning (ML) that allows multiple data sources (devices or entities) to collaboratively train a shared model while keeping the data decentralised. From a personal data protection perspective, FL offers significant benefits through data minimisation (data exchanged among the client devices and the resulting ML models can be treated as anonymous data) and purpose limitation. However, a primary concern remains the potential for data leakage through model updates: even without direct access to raw data, an attacker could infer sensitive information by analysing the gradients or weights shared between devices. Continue reading the EDPS analysis here.
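The scheme the EDPS describes can be sketched as a toy federated averaging loop; the toy "gradient step" and the data values are invented for illustration and stand in for real local training:

```python
from statistics import mean

def local_update(weights, local_data, lr=0.1):
    """Toy stand-in for a client's local training step: nudge each weight
    toward the mean of the client's private data. Raw data never leaves
    this function; only the updated weights are shared."""
    target = mean(local_data)
    return [w - lr * (w - target) for w in weights]

def federated_average(client_updates):
    """Server-side aggregation: average the clients' weight vectors.
    The server sees only model updates, never the underlying data."""
    n = len(client_updates)
    return [sum(ws) / n for ws in zip(*client_updates)]

global_model = [0.0, 0.0]
clients = [[1.0, 2.0, 3.0], [5.0, 6.0, 7.0]]  # private local datasets

for _ in range(3):  # a few federation rounds
    updates = [local_update(global_model, data) for data in clients]
    global_model = federated_average(updates)
```

The leakage concern in the EDPS analysis maps onto the `updates` list: although no raw data crosses the wire, each client's weight vector reflects its local data (here, its mean), which is exactly the kind of signal an attacker could exploit.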
Unintentional disclosure: Situations in which personal data is unintentionally disclosed are increasingly common, according to the Bulgarian regulator CPDP. The most frequent cases concern: a) unintentionally or thoughtlessly providing data in a phone conversation or electronic communication with services such as brokerage, investment and marketing research services, b) lost documents containing personal information, including copies of IDs, c) documents incorrectly provided to service providers, and d) responding to misleading messages through phishing, smishing and vishing. If you have inadvertently disclosed your personal information in the situations described above:
- Save all messages, emails, phone numbers, documents and other relevant evidence.
- If you have sent information to the wrong address, immediately contact the actual recipient, or the person to whom you intended to send the message, to inform them and seek assistance.
- If you have managed to establish contact with the actual recipient, request to exercise your right to erasure.
- Change passwords and enable two-factor authentication wherever possible.
- Monitor your bank accounts, social media accounts, and other online platforms.
- Tell your family, friends and colleagues so that they can take precautions too.
Vodafone multimillion fines
The German federal data protection authority BfDI imposed fines totalling 45 million euros, as well as a reprimand, on Vodafone. The company uses different distribution channels, including local shops, some of which are operated by partner agencies. Investigations found privacy-related weaknesses in the processes for supervising and auditing these processors, as well as weaknesses in the IT systems, leading to the risk of customer data being misused for fraud. Such risks actually materialised in some cases.
Furthermore, Vodafone offers an online service portal for its customers. Investigations found weaknesses in the authentication process for customer accounts when the portal was used in combination with the company's hotline, which could lead to the misuse of eSIMs, among other things.
Spotify and Vinted fines upheld
In Sweden, an appeal court upheld the fine of approximately 5.2 million euros imposed on Spotify AB for noncompliance with the GDPR; the company must therefore pay the penalty fee. Spotify did not provide, in a clear and easily accessible manner, the information necessary for data subjects to exercise their rights. It also failed to provide information about storage periods and the criteria for determining them, and did not provide sufficient information about appropriate safeguards when transferring personal data to a third country or an international organisation.
Similarly, the Regional Administrative Court in Lithuania rejected the complaint of UAB Vinted against decisions taken by the State Data Protection Inspectorate VDAI. The court found that all the examined factual circumstances and legal norms were assessed properly, and that the regulator acted in accordance with the law and within the limits of its competence. Last year, the VDAI fined the company 2.3 million euros for GDPR violations:
- improper processing of data subjects' requests to delete their data, and insufficient and unclear information provided;
- improper implementation of the accountability principle;
- processing of personal data through so-called shadow blocking, which was carried out without a clear and lawful basis.
In other news
Pixel tracking fine: The Norwegian regulator has audited six websites' use of tracking pixels. All of them shared visitors' personal data with third parties without any legal basis (eg, visitors were "duped" into consenting), and in several of the cases the data was sensitive. The websites included an online pharmacy, services for vulnerable children, medical services, information about various diseases, conditions and diagnoses, and a website selling bibles. The information included which websites people visited, what actions they took and what they added to their shopping cart.
The regulator also found violations of the duty to provide information. In one of the cases, it imposed a fine of approximately 22,000 euros.
Online pharmacy user tracking fine: Finland's data protection authority meanwhile issued a 1.1 million euro fine against the pharmacy company Yliopiston Apteekki over data protection shortcomings, also related to the use of tracking services. The regulator started investigating the company's practices after being contacted by a doctoral researcher from the University of Turku. Using network traffic analysis, the researcher had found data protection deficiencies in Finnish online pharmacies as part of research on the functioning of health-related online services.
Yliopiston Apteekki had used cookies and other tracking technologies for its online pharmacy in a manner that transmitted data on users’ interactions with the shop related to prescription medicines and over-the-counter medicines directly to Google and Meta, among others. For example, the tracking service providers received data on when a customer added a product to their basket and clicked the purchase button. The transmitted data also included users’ IP addresses and other identifying data. If a user was logged in to their Google or Facebook account when they used the online pharmacy, Google and Meta could have directly identified them.
23andMe bankruptcy case
23andMe’s customers should be given the opportunity to consent to the sale of their personal data to whoever buys the company’s assets, a consumer privacy ombudsman has told the bankruptcy court handling 23andMe’s case, the VitalLaw legal blog reports. An alternative safeguard would be for the consent request to come from the winning bidder. The question of what happens to 23andMe’s data upon sale has attracted significant interest from privacy advocates, lawyers and politicians, with US congressional hearings and calls for legislation to protect genetic data. You can view the whole 211-page ombudsman report into 23andMe’s planned sale of customers’ personally identifiable information here.
In case you missed it
Diversity at work: In a context of increased awareness of the fight against discrimination, more organisations want to measure the diversity within their workforce. Diversity measurement surveys distributed by employers to their employees collect personal, sometimes sensitive, data, explains the French CNIL, and must be accompanied by guarantees, in accordance with the GDPR. These surveys must remain optional, and employees or agents must be properly informed and their rights respected. The CNIL also recommends favouring anonymous surveys and limiting the data collected with closed-ended questions. Further advice for employers (in French) can be read here.
AI assistants industry: Building AI assistants that fit into our daily lives is a top priority for the AI sector. Privacy International says that companies in this field need to respond to concerns about how they will secure our data. Some tasks demand more processing power than a personal device can provide, so corporations address the problem with cloud-enabled synchronisation. Once the data leaves the device, businesses could use it to train their systems, and they might grant their employees and service providers access to it, going beyond what a consumer may reasonably expect. Therefore, AI firms must answer users' questions such as:
- How do I have granular control over access to sensors, data and apps?
- How can I easily access settings to retract consent?
- Where is the clear information on what data is used to respond to a query?
- How can I access and delete any data accessed and used by the Assistant?
According to PI, this is why it is crucial that users insist that their data be processed on their devices as much as possible and used only for specific and limited reasons.