Constant video monitoring and screengrabs at work
A company that used software designed to log periods of alleged “inactivity” and to take frequent screenshots of its employees’ computer screens was fined 40,000 euros by the French data protection regulator CNIL. Employees were also under constant video surveillance that recorded both image and sound. In particular, the company had installed software on some of its workers’ PCs to track their teleworking activity. To deter property theft, it had also set up a continuous video surveillance system covering both a workspace and a break area. Because of the company’s modest size and its immediate removal of the software during the inspection, the regulator decided not to name it.
Stay up to date! Sign up to receive our fortnightly digest via email.
GDPR fines clarified
The CJEU clarified the calculation of GDPR fines for undertakings. The top EU court aligned the GDPR ‘undertaking’ concept with that of the TFEU, stating that the maximum amount of the fine is to be determined based on a percentage of the undertaking’s total worldwide annual turnover in the preceding business year. The concept of ‘undertaking’ must also be taken into account to assess the actual or material economic capacity of the recipient of the fine and thus to ascertain whether the fine is at the same time effective, proportionate and dissuasive.
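The CJEU’s reading means the cap on the fine follows the turnover of the whole undertaking (the economic group), not just the legal entity sanctioned. A minimal sketch of the Art. 83(5) upper limit, using a hypothetical helper function (the higher of EUR 20 million or 4% of the undertaking’s total worldwide annual turnover in the preceding business year):

```python
def max_gdpr_fine(undertaking_turnover_eur):
    """Upper limit for the most serious GDPR infringements (Art. 83(5)):
    the higher of EUR 20 million or 4% of the undertaking's total
    worldwide annual turnover in the preceding business year."""
    return max(20_000_000, undertaking_turnover_eur * 4 / 100)

# A subsidiary's own turnover may be small, but per the CJEU the cap is
# computed on the group-level (undertaking) turnover.
max_gdpr_fine(100_000_000)    # 4% would be 4M, so the 20M floor applies
max_gdpr_fine(2_000_000_000)  # 4% of 2 billion: an 80M ceiling
```

The same group-level turnover is also what the regulator weighs when checking that a fine is effective, proportionate and dissuasive.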
AI system definition
The European Commission has published non-binding guidelines on prohibited AI practices, as defined by the AI Act, as well as guidelines on the definition of an AI system, to facilitate the application of the first AI Act rules that apply as of 2 February. The guidelines specifically address practices such as harmful manipulation, social scoring, emotion recognition, and real-time remote biometric identification, among others.
The guidelines on the AI system definition explain the practical application of the legal concept. The definition adopts a lifecycle-based perspective encompassing two main phases: the pre-deployment or ‘building’ phase and the post-deployment or ‘use’ phase. It comprises seven main elements (not all of which are required to be present continuously throughout both phases):
- a machine-based system;
- that is designed to operate with varying levels of autonomy;
- that may exhibit adaptiveness after deployment;
- and that, for explicit or implicit objectives;
- infers, from the input it receives, how to generate outputs;
- such as predictions, content, recommendations, or decisions;
- that can influence physical or virtual environments.
Legal updates worldwide
China data privacy updates: The Cyberspace Administration released measures for the administration of compliance audits on personal data protection, including cross-border data transfer rules. They apply to all personal information processors operating within the country. Processors handling the data of over 10 million individuals must conduct audits at least every two years, and processors handling the data of over 1 million individuals must appoint a data protection officer. These and a number of other measures take effect on 1 May 2025.
UK privacy law reform: The Data (Use and Access) Bill completed its House of Lords stages and had its first and second readings in the House of Commons. Several significant amendments were made to the Bill, including new clauses on compliance with UK copyright law by operators of web crawlers, transparency for general-purpose AI models, and deepfakes, as well as an extension of the direct marketing ‘soft opt-in’ from the commercial sector to the charity sector.

The Bill will allow automated decision-making (with exceptions for processing with a legal or similarly significant effect) with no limitation on which lawful basis an organisation can rely on, subject to putting specific safeguards in place. Finally, in a debate focused on concerns about using research provisions for AI development, Parliament chose to limit the provision by adding a public interest test rather than imposing a blanket ban.
Direct marketing advice generator
The UK Information Commissioner launched a free online tool to help organisations ensure their direct marketing activities comply with the Privacy and Electronic Communications Regulations (PECR) and the UK GDPR. Compliance allows organisations to promote their products and services to both new and existing customers while making sure they contact only people who are happy to hear from them. The tool covers email, SMS, direct mail, social media, telemarketing, etc.
TIA
The French CNIL published the final version of its Data Transfer Impact Assessment guide (in French). Data transfers outside Europe concern a very large number of data controllers and processors, regardless of their status and size. A TIA must be carried out by the exporter subject to the GDPR, with the assistance of the importer, before transferring data to a country outside the EEA where the transfer is based on one of the tools of Art. 46 of the GDPR (standard contractual clauses, binding corporate rules, etc.). There are two exceptions to this obligation for the data exporter:
- the country of destination is covered by an adequacy decision of the European Commission;
- the transfer is made based on one of the derogations listed in Art. 49 of the GDPR.
More from supervisory authorities
Age assurance and digital services: The best interests of the child should be a primary consideration for all parties involved in processing personal data, states the EDPB. So far, the GDPR has introduced minimum age requirements in the context of information society services (Art. 8), and the Digital Services Act references age verification as a risk mitigation measure (Art. 35). Several Member States have implemented minimum age requirements for performing legal acts, exercising certain rights or accessing certain goods and services.
The risk-based approach is also crucial when balancing the potential interference with natural persons’ rights and freedoms against children’s safety. This would therefore require that a Data Protection Impact Assessment (Art. 35 GDPR) be conducted before processing, taking into account the nature, scope, context and purposes of the processing. Furthermore, any automated decision-making in the context of age assurance should also comply with the GDPR.
Customer data checklist: The personal data that telecommunications providers typically process includes name, date of birth, postal address, bank details, email address and telephone numbers. This data is of interest to attackers in itself. Mobile phone numbers and email addresses are also often used as security anchors for other services. In addition, the business model of telecommunications providers involves dealing with expensive hardware. Taking into account the state of the art and the implementation costs, an appropriate level of protection must therefore be guaranteed in each case. To that end, the German Federal Commissioner for Data Protection and Freedom of Information (BfDI) offers a checklist for handling customer data in sales for telecommunications companies from a data protection perspective, to facilitate the analysis of risks related to personal data (in German).
Search engine and anonymity
QWANT is a French company that launched its search engine in 2013. The data used in the sale of the search engine’s advertising space, operated via Microsoft, was presented as anonymous (a truncated IP address, or a hashed IP address used to construct an identifier). However, in 2019, following a complaint, the French CNIL found that, despite the strong precautions taken to avoid the re-identification of individuals, the dataset transmitted to Microsoft was not anonymised but only pseudonymised.
In 2020, the company reportedly modified its privacy policies (in various languages, due to cross-border processing) to mention:
- the transmission of “pseudonymous” data to Microsoft; and
- the legal basis and advertising purposes for the data transmission.
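The distinction the CNIL drew can be illustrated with a minimal sketch (hypothetical function names, not QWANT’s actual pipeline): truncating an IP address discards information, while hashing produces a stable identifier that still singles out the same user across requests, which is why a hashed IP is pseudonymous rather than anonymous data.

```python
import hashlib

def truncate_ip(ip):
    # Drop the last octet, e.g. 192.0.2.41 -> 192.0.2.0.
    # Information is discarded; many users share the result.
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"])

def hashed_id(ip, user_agent, salt):
    # A salted hash is deterministic: the same inputs always produce the
    # same identifier, so the user can still be singled out and tracked.
    # That makes the output pseudonymised, not anonymised, data.
    raw = f"{salt}|{ip}|{user_agent}".encode()
    return hashlib.sha256(raw).hexdigest()
```

Whoever holds the salt (or can replay the hashing on candidate IPs) can link the identifier back to an individual, which is the re-identification risk the CNIL pointed to.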
Former employee data from personal email
The Danish Data Protection Authority has decided a case in which a company had accessed and downloaded emails from a former employee’s private email account in connection with a dispute between the parties and a police report. The company informed the regulator that it was processing the data on the basis of legitimate interest. The regulator criticised the move, noting that the company’s investigation was directed at the former employee’s work computer and that access to the personal email account was discovered by accident.
Nonetheless, the company continued its search even after becoming aware that it was a personal email account.
More enforcement decisions
Transaction logs failure: According to Data Guidance, the Spanish data protection authority AEPD resolved a case in which it fined GENERALI ESPAÑA (insurance and finance services) 4 million euros for a data breach. An attacker used insurance broker credentials to access the personal information of policyholders, former policyholders, and other people (about 1.5 million) as a result of a technical glitch in a customer maintenance system update. Furthermore, the lack of transaction logs made it impossible to determine the true extent of the intrusion immediately. Names and surnames, ID numbers, phone numbers, dates and places of birth, and IBANs were among the personal information breached.

Hidden video monitoring in neonatology: Similarly, the Polish UODO imposed a fine of approximately 275,000 euros on Centrum Medyczne Ujastek in Kraków for installing image recording devices in two rooms of the neonatology department, and for failing to apply technical and organisational measures appropriate to the risk for data processed on memory cards located in the monitoring devices. The images showed newborns and their mothers performing intimate activities, including feeding and caring for the children.
The children whose images were recorded no longer required intensive care, so their health was not at risk. Neither patients nor employees were informed about the recording. At the same time, the medical centre reported to the UODO the loss or theft of memory cards from the image recording devices in the above-mentioned rooms. The investigation determined that the memory cards on which the recordings were stored were not encrypted and that the recording devices were not configured properly. Finally, the risk analysis neither covered the risk that caused the incident nor specified security measures that could have prevented it.
Data security
Data scraping: The Guernsey Data Protection Authority reported a recent suspected data scraping incident in which an online business directory appeared to have been scraped by a third party using an automated tool; the scraper then attempted to sell the data. The regulator recommends key measures for any website that hosts business directories or user profiles, or that stores personal data in any other form:
- Rate limiting, also known as throttling, is a technique used to limit the number of actions a user can make on a website in quick succession, safeguarding against automated bots.
- CAPTCHA is a widely used tool which requires users to confirm that they are human by completing a quick and simple task.
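The rate-limiting measure above can be sketched as a simple sliding-window limiter (an illustrative implementation, not the regulator’s recommendation of a specific design): each client key, typically an IP address, is allowed at most a fixed number of requests per time window, which throttles automated scrapers while leaving ordinary visitors unaffected.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window rate limiter: allow at most `limit` requests per
    `window` seconds for each client key (e.g. an IP address)."""

    def __init__(self, limit, window):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # key -> timestamps of recent hits

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[key]
        while q and now - q[0] > self.window:
            q.popleft()  # drop timestamps that fell out of the window
        if len(q) >= self.limit:
            return False  # over the limit: likely an automated scraper
        q.append(now)
        return True
```

A directory endpoint would call `allow(client_ip)` before serving each page and return HTTP 429 when it is False; real deployments usually do this at the reverse proxy or CDN layer rather than in application code.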
Data breach notification: The Swiss data protection authority FDPIC published guidelines on reporting data security breaches. As a rule, the report must contain a description of the circumstances of the breach and the controller’s assessment of its implications, including in particular details of the type, time, duration and extent of the breach and its already known and anticipated effects on the data subjects. The regulator also accepts voluntary reports where the controller does not assess the breach as posing a high risk to the data subjects but wishes to inform the FDPIC for other reasons. At the same time, data security breaches that lead to serious breaches of professional or manufacturing secrecy but do not affect personal data do not fall within the scope of the reporting obligation.
Big Tech
Gig economy: What would you do if your employer suddenly fired you or reduced your pay without telling you why?, asks Privacy International. Unfortunately, this is the reality for many millions of gig workers driving or delivering for platforms like Uber, Deliveroo and Just Eat, whose algorithms manage everything from hiring and firing to dynamically adjusting pay and allocating jobs. To that end, PI has produced three demands for platforms to implement:
- Maintain a public register of the algorithms used to manage workers;
- Accompany all algorithmic decisions with an explanation of the most important reasons and parameters behind them;
- Allow workers, their representatives and public interest groups to test how the algorithms work.
Shift from third-party cookies to device fingerprinting? Research by DLA Piper examines Google’s plan to lift its ban on device fingerprinting (which entails gathering and combining data about a device’s hardware and software to identify the device) for businesses that use its advertising tools, with effect from 16 February. This comes after Google decided in July 2024 to keep third-party cookies. See the original analysis for the implications of such a move for consent requirements and reduced user control.
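The fingerprinting technique described above can be sketched as follows (the attribute names are illustrative examples, not Google’s actual signals): a set of hardware and software characteristics is canonicalised and hashed into a single stable identifier. Unlike a cookie, nothing is stored on the device, so the identifier cannot be cleared by the user, which is the reduced-control concern the analysis raises.

```python
import hashlib
import json

def device_fingerprint(attrs):
    """Combine device attributes into one stable identifier.
    Serialising with sorted keys makes the hash independent of the
    order in which attributes were collected."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Illustrative signals only; real fingerprinting scripts combine many more.
fp = device_fingerprint({
    "user_agent": "Mozilla/5.0 ...",
    "screen": "2560x1440",
    "timezone": "Europe/Paris",
    "fonts": ["Arial", "Helvetica"],
})
```

Because the same device yields the same combination of signals on every visit, the hash re-identifies it across sites without any stored state.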
Agentic AI: The Future of Privacy Forum takes a deep dive into a new technology described as “AI agents.” Unlike automated systems and even LLMs, these systems go beyond previous technology by having autonomy over how to achieve complex tasks, such as navigating a user’s web browser to take actions on their behalf (from making restaurant reservations and resolving customer service issues to coding complex systems). See the original publication for the data protection considerations of such systems, including data collection, a lawful basis for model training, data subject rights, accuracy of output, data security, and adequate explainability.