GDPR compensations
In Denmark, an individual has been awarded financial compensation for non-material damage resulting from a data breach (Art. 82 of the GDPR). A High Court ruled on 20 August that a woman should receive approx. 335 euros in compensation after a municipality mistakenly shared her health information with a third party. The decision has been appealed to the Supreme Court, where the woman and her lawyer will, among other things, try to have the GDPR compensation increased and awarded to her spouse as well.
Until now, Danish practice has been that claims for compensation without financial loss must be assessed under the provisions of the Danish Civil Liability Act, and the courts have generally required a qualified damaging effect. The August decision could, if upheld by the Supreme Court, mark a breakthrough in Danish law and possibly in European law. The compensation of 335 euros is a small amount, but if thousands of citizens choose to file a lawsuit in connection with the same breach – for example via a class action – the consequences for companies and authorities could be extensive.
Stay up to date! Sign up to receive our fortnightly digest via email.
EU-US data transfers and immigration control

On 17 September, the European Data Protection Supervisor (EDPS) issued an Opinion on a framework agreement between the EU and the US on the exchange of information for security screenings and identity verifications. Individual Member States would be empowered to sign bilateral agreements for the exchange of data from their national systems. It would be the first agreement concluded by the EU to entail the large-scale sharing of personal data, including biometric data (fingerprints), for border and immigration control purposes with a third country.
More legal updates
Data transfers for medical research: The German Data Protection Conference (DSK) adopted a paper on data transfers to third countries for scientific research in the medical sector. The admissibility of transferring personal data to third countries under data protection law cannot be assessed in general terms, but only on a case-by-case basis, as numerous circumstances play a role in the assessment. This also applies to scientific research for medical purposes. It must always be examined whether the data subjects have been adequately informed about the (intended) transfer in accordance with the GDPR. In scientific research for medical purposes, broad consent is an established legal basis for data processing. Since there may be special interactions between Broad Consent and the basis for transfer under the GDPR, these are explained in detail in the DSK paper (in German).
The European Innovation Act: The European Commission concluded its consultation and evidence-gathering for an impact assessment to assist in the creation of the European Innovation Act. The Commission seeks information on ways to overcome obstacles that innovative entities encounter, including fragmented regulations, restricted access to infrastructure and funding, underutilised innovation procurement, and inadequate commercialisation of findings from publicly funded research and innovation. The Act aims to create sector-wide horizontal conditions as opposed to sector-specific programs.
Political online targeting ban in the EU: Political parties will soon be prohibited from targeting voters online with political advertisements. A new European regulation on the Transparency and Targeting of Political Advertising (TTPA) will take effect on 10 October. It aims to prevent voters from being secretly influenced during election campaigns – a practice that can involve the processing of personal data and undermine trust in fair elections.
LinkedIn AI training

Users who do not want LinkedIn to use their data to train AI models must disable this before 3 November, and the European data protection authorities are urging people to do so. The data concerned includes profile information and public content shared in the past. Once this data is in LinkedIn’s AI systems, it will be impossible to remove, and users will lose control over it. All LinkedIn users’ data will automatically be used for AI training unless the setting is actively disabled.
Anyone who does not want personal data used for LinkedIn AI training must opt out before 3 November in the app under “Settings & Privacy > Data Privacy > Data for Generative AI Improvement” by disabling the switch.
Vehicle data in the era of the Data Act
On 12 September, the European Commission published the “Guidance on Vehicle Data, accompanying the Data Act.” The document defines the categories of data falling within the scope of the regulation and outlines the access rights granted to users and to third parties designated by them. It clarifies, first of all, that a vehicle qualifies as a “connected product” when it meets two cumulative requirements: it must generate or collect data concerning its use or its surrounding environment, and it must have the ability to communicate such data via an electronic communications service.
More from supervisory authorities
‘Neighbour’s camera’ a major annoyance: The Dutch data protection authority (DPA) is receiving a growing number of complaints from people concerned about their privacy due to their neighbours’ doorbells or security cameras. The regulator wants to prevent the improper use of doorbell cameras as much as possible. It is therefore urging manufacturers to configure doorbell cameras to be privacy-friendly by default. It also wants to raise consumer awareness, for example, by providing information about what is and isn’t permitted.
AI risks in the health profession: A bill sponsored by the California Medical Association (CMA) that addresses dangers associated with the use of AI in health care has passed out of the Legislature and is headed for the Governor’s signature. It prohibits AI systems from being misrepresented as licensed medical professionals and gives California’s state health profession boards the authority to enforce title protections for health care workers and to ensure that new technologies in health care are deployed in ways that protect patient safety, preserve trust, and support the physician-patient relationship.
Medical records: The Swiss FDPIC has published a factsheet on the forms that are given to patients to sign when they go to the doctor. It takes account of the various opinions expressed on the subject and aims to clarify a number of issues raised by these forms: a) the distinction between the duty to provide information on data collection and the issue of patient consent to data processing; b) secure data communication; c) the question of proportionality, regarding what data a patient can legitimately be asked to provide. The document is available in English.
Digital communication and minors

In France, the regulatory authority for audiovisual and digital communication (Arcom) released the results of its study on online risks for minors, digitalpolicyalert.org reports. Over four out of five children use at least one very large online platform on a daily basis, according to the study. 42 per cent of minors use social networks before the age of 13 by lying about their age, and the average age of initial use is 12 years old.
According to the study, 83 per cent of children are regularly exposed to at least one of six risks, including harmful or shocking content, cyberbullying, dangerous challenges, malicious contact from adults, and online scams.
E-health data security
The European Union Agency for Cybersecurity (ENISA) has published a good practice guide to support entities of the health sector in strengthening their digital security. The health sector is classified among those at risk, highlighting a significant gap between its cybersecurity maturity and its critical importance: medical systems and data have become growing targets of cybercrime, with ransomware and phishing campaigns on the rise. The actionable practices are designed to be simple to implement and to enhance the preparedness and security of all types of health entities, from hospitals and service providers to individual medical specialists. The recommendations cover areas such as systems and network protection, safeguarding devices and patient data, and addressing challenges in the ICT supply chain.
Reporting AI incidents
The European Commission has issued draft guidance and a reporting template on serious AI incidents. Under the EU AI Act, providers of high-risk AI systems will be required to report serious incidents to national authorities. This new obligation, set out in Art. 73, aims to detect risks early, ensure accountability, enable quick action, and build public trust in AI technologies. While the rules will only become applicable from August 2026, the draft guidance and reporting template are already available, and both documents will help providers prepare. The draft guidance clarifies definitions, offers practical examples, and explains how the new rules relate to other legal obligations.
Drone use and personal data
The Latvian data protection authority elaborated on this topic, which is increasingly relevant as drones are used in defence, business, and people’s private lives. Personal data processing occurs when a drone captures material in which a specific person can be identified. It therefore cannot be said that personal data processing takes place in every case where a person comes into a drone’s view. If the materials are intended to be distributed publicly, the processing may be justified on the basis of legitimate interests, following a balancing of interests in which the proportionality of the processing in relation to the interests of the people depicted is assessed. In some cases, the use of drones may also be linked to the public interest, or to processing for journalistic purposes.
Video games and personal accounts
In the audiovisual and video game sectors, the purchase of digital content can justify long retention of data. The French CNIL reminds professionals of the rules to follow when managing inactive accounts while respecting users’ rights. Professionals must guarantee uninterrupted access to purchased digital content, as provided for in consumer law. In these sectors, access is often provided through a personal account that acts as a video library, allowing users to find their films, series or games at any time. Deleting accounts that have seen no user activity for two years is considered proportionate; it is recommended that affected users be notified before this deadline so they can keep their accounts active.
‘Face boarding’ at airports

Italian data protection regulator Garante has recently blocked the use of facial recognition (so-called face boarding) at Italian airports. In a provision adopted against Società per Azioni Esercizi Aeroportuali, it suspended the use of the specific technological solution adopted, finding it incompatible with the GDPR. The Garante specifies that the use of facial recognition technologies at airports is permitted in principle, but requires technological solutions that balance the need for simplified boarding procedures with the protection of personal data in compliance with current European regulations, particularly regarding the processing of biometric data.
In other news
Automated-decision fine: The Hamburg Data Protection Commissioner (HmbBfDI) has imposed a fine of almost 500,000 euros on a financial company for violating the rights of customers affected by automated individual decision-making. Despite good credit ratings, several customers’ credit card applications were rejected by automated decisions – decisions made by algorithms, without human intervention. When the affected customers subsequently demanded reasons for the rejections, the company failed to adequately fulfil its statutory information and disclosure obligations.
Hospital data fine: The Italian regulator Garante has fined a university hospital 80,000 euros for failing to properly configure its health records. The hospital used two applications, for patient and hospitalisation records, through which all healthcare personnel could search patients’ medical histories, even if they were not involved in their treatment. The applications lacked adequate access profiling and security measures, such as alerts or the logging of operations in dedicated log files. Furthermore, patients were unaware of the processing carried out through the records and were therefore unable to give or withhold consent, or to decide whether to obscure certain information, such as data subject to greater protection.
HIPAA violation: A 182,000 dollar settlement has been agreed between the HHS’ Office for Civil Rights and five Delaware healthcare providers to resolve alleged violations of the HIPAA Privacy and Breach Notification Rules. The settlement concerns the posting of patients’ protected health information (PHI) on social media without first obtaining HIPAA-compliant authorizations to use PHI for a purpose not expressly permitted by the HIPAA Privacy Rule, and the subsequent failure to notify individuals of the impermissible use and disclosure.
Candid cameras against theft
The French CNIL fined SAMARITAINE, which operates the store of the same name, 100,000 euros for concealing cameras in the store’s stockrooms. In 2023, following an increase in thefts of goods from its stockrooms, SAMARITAINE installed new cameras in two of them. These cameras were disguised as smoke detectors and could record sound. Discovered by employees, they were removed shortly afterwards. In principle, to meet the requirement of fairness, video surveillance filming employees must be visible, not concealed. In exceptional circumstances and under certain conditions, however, the data controller can temporarily install cameras that are not visible to employees. The company did document the thefts committed in the stockrooms and explained that the device was temporary (which its technical characteristics seem to confirm).
It nevertheless did not carry out any prior analysis of compliance with the GDPR, nor did it document the temporary nature of the installation.
In case you missed it

Human oversight in AI: The EDPS’s latest TechDispatch episode explores human oversight of automated decision-making. While human oversight can occur at different stages of an AI system’s lifecycle, including before deployment (ex-ante), real-time oversight of system operations is considered the most consequential: an operator can still review the system’s behaviour and intervene before its output takes effect, helping to prevent harm to human lives or infringements of individuals’ fundamental rights.
Dark Net: Sweden’s privacy protection authority IMY answers questions about how data controllers should act after an IT attack in which personal data was published on the Darknet. It is NOT recommended to search for or download the published information: the files found may contain, for example, additional malware. IMY also recommends that organisations first and foremost contact their data processors, in accordance with their data processing agreements. In addition, organisations have a duty to notify the affected data subjects of the personal data breach as soon as possible, as there is a high risk to the rights and freedoms of natural persons.
Patients’ data and AI boom: Privacy International reports a boom for the UK’s technology sector, with American tech firms collectively investing billions of pounds in the UK’s AI and tech infrastructure. The UK government hailed these investments as part of a new ‘Tech Prosperity Deal’, and a key area mentioned in it is healthcare. Last summer, the UK released its 10-year health plan, which emphasised the centrality of technology, innovation and AI for the National Health Service. The plan states that to move the NHS into the 21st century, its unique advantages will be used, including the NHS’s ‘world-leading data’.