In this issue we highlight SOCMINT as a new standardised procedure, data processing in LLMs and LLM-supported AI systems, an updated standard data protection model, third-party tracking technologies in health and care, and much more.
Stay up to date! Sign up to receive our fortnightly digest via email.
LLMs and personal data
The Hamburg Data Protection Commissioner discusses whether Large Language Models store personal data. The discussion paper distinguishes between an LLM as an AI model (e.g. GPT-4) and as a component of an AI system (e.g. ChatGPT). The mere storage of an LLM does not constitute processing. Thus, data subject rights cannot relate to the model itself. Claims for information, deletion or correction can instead relate to the input and output of an AI system of the responsible provider or operator.
To the extent that personal data is processed in an LLM-supported AI system, the processing operations must comply with the requirements of the GDPR. This applies in particular to the output of such a system. Similarly, any data protection violations during training do not affect the lawfulness of using such a model in an AI system. See the full discussion paper here.
The most recent clarifications by the French CNIL on the deployment of Generative AI systems and the official EU AI Compliance Checker might be useful for your organisation. The latter also recommends that you obtain expert legal advice before using AI solutions.
Privacy notice
The UK Information Commissioner encourages people to check how an app plans to use their personal information before they sign up. It is far too easy to just click “agree” when installing a new app. But signing up often involves handing over large amounts of your sensitive personal information, especially with apps that support our health. An organisation that values your privacy will make its privacy notice easy to understand and set out how it will use your personal information, with whom it will be shared, what security measures are in place, and whether your data will be deleted when you stop using the app.
CCTV
The operation of CCTV in gym facilities should, on the one hand, aim to ensure the protection of the facilities in question while, on the other hand, respecting the right of customers and employees to privacy, reiterates the Cyprus data protection authority. CCTV can be permitted at a gym’s entrance/exit, parking space, reception (only the cashier) and the general perimeter of the gym property.
It is not allowed in areas where persons exercise, kitchens, restrooms/changing rooms, and offices. Audio recording is not allowed under any circumstances. Video material must be accessible only from a device located within the premises of the gym, to which only the director and/or an authorised person has access. Access to said material from a personal device, and on an ongoing basis, is not permitted.
More official guidance
EU-US DPF: The EDPB has published the EU-US Data Privacy Framework FAQ for European individuals and businesses: how to benefit from it, how to lodge a complaint, and how this complaint should be handled by the EU and US authorities. It also covers what to do before transferring personal data to a DPF-certified company in the US (as data controllers or processors) and the self-certification of US subsidiaries of EU/EEA businesses.
DPIA: Industry professionals and interested parties are invited by the Latvian data protection authority DVI to share their thoughts and provide real-world examples of the Data Protection Impact Assessment. A DPIA is a procedure by which, through risk inventory, analysis and evaluation of prospective outcomes (identifying severity and likelihood), an organisation can identify potential dangers to natural persons that may arise from planned data processing. The DPIA also includes the identification of measures to mitigate possible risks. The draft guidance can be read here (in Latvian).
AI projects sandbox: The Danish data protection authority has selected two AI projects for examination in its sandbox project. One wants to develop an AI insurance assistant for structuring and summarising accident claims (to determine the degree of injury more quickly than today). The other is a public-private innovation project to develop a solution that will ease the documentation burden for employees in health and care.
Social media monitoring
According to Privacy International, social media monitoring, or SOCMINT, is becoming more common and standardised but is still mostly uncontrolled and inconsistent. One of the most vivid examples is fraud investigations by the UK Department for Work and Pensions. Alongside covert surveillance tactics, the department’s staff guide has an entire section on “Open Source Instructions” on the use of publicly available information.
However, such invisible monitoring exceeds individuals’ reasonable expectations and undermines their ability to anticipate intrusive examination.
GDPR in practice
The Fundamental Rights Agency recently published the report “GDPR in practice – the experience of data protection authorities”. All the improvement areas directly or indirectly target the availability of human, financial and technical resources. In particular, underfunded and understaffed authorities are obliged to prioritise complaints handling over other regulatory tasks that the GDPR has entrusted to them – such as promoting awareness and providing advice, undertaking their own investigations and external cooperation.
SDM 3.0
The German Data Protection Conference published the updated Standard Data Protection Model – a method for data protection advice and testing based on uniform objectives, Data Guidance reports. In particular, the model transfers the legal requirements into technical and organisational measures required by the GDPR, which are detailed in the catalogue of reference measures. The SDM is aimed at both the supervisory authorities and those responsible for processing personal data.
EHDS
In the next couple of years, patients, healthcare providers, and authorised researchers within the EU will start using the European Health Data Space, for which a DLA Piper legal blog outlines the standards for electronic health record systems. Interoperability and logging are two essential components of the software that makes up such a records system. Further requirements for conformity can be read in the original analysis.
More legal updates
Dark patterns: The Canadian Privacy Commissioner, together with other counterparts, conducted a review of over 1,000 websites and apps and found that nearly all had at least one deceptive design element that potentially violated privacy requirements. These include complex and confusing language, interface interference, nagging, obstruction, and forced action (tricking users into disclosing more personal information to access a service than is necessary). When two or more deceptive design patterns are used together, they can become more effective.
HBNR: In July, amendments to the US Health Breach Notification Rule went into effect. These extend the rule to health apps and similar technologies not covered by the Health Insurance Portability and Accountability Act (HIPAA). The HBNR requires vendors of personal health records and related entities to notify individuals, the Federal Trade Commission, and, in some cases, the media of a breach of unsecured personally identifiable health data. It also requires third-party service providers to notify such vendors and related entities.
Rhode Island became the nineteenth US state overall, and the seventh in 2024, to enact a comprehensive privacy law, the Future of Privacy Forum sums up. The law will take effect in 2026. It includes familiar terminology and core obligations – controller/processor responsibilities; rights of access, correction, deletion and portability; express consent for processing sensitive data; and disclosure requirements – but lacks data minimisation requirements or an obligation for controllers to recognise universal opt-out mechanisms.
Enforcement decisions
Smart cameras in Turin: The Italian regulator Garante sent a request for information to the Municipality of Turin on a new video surveillance system that, reportedly, would also use AI. It would allow municipal police to understand in real time whether it is necessary to intervene in an emergency or for safety reasons. The Municipality was given 15 days to clarify the cameras’ advanced features and to provide a copy of the technical documentation, together with the purposes and legal basis of the personal data processing.
Personal details on the intranet: The Finnish regulator ruled that a company (a bus operator) did not have the right to publish 300 employees’ personal phone numbers on the intranet. The company argued that it is important for drivers to communicate with each other while working: on their work phones they can only call predefined numbers, and sending text messages is blocked. The regulator countered that work numbers should be the primary means of communication between drivers. In addition, employees’ data may only be processed by persons whose job duties require it, such as supervisors or HR.
Local government data: The UK Information Commissioner issued the London Borough of Hackney council with a reprimand following a cyberattack in 2020 that led to hackers gaining access to and encrypting 440,000 files. The data included residents’ racial or ethnic origin, religious beliefs, sexual orientation, health, economic data, criminal offences, and other data including basic personal identifiers such as addresses. Hackers also deleted 10% of the council’s backups. The systems were disrupted for many months and, in some instances, services were not back to normal until 2022.
Drugstore visitors’ tracking
The Dutch data protection authority (AP) has imposed a fine of 600,000 euros on AS Watson BV, the parent company behind the drugstore Kruidvat. The company tracked millions of visitors to Kruidvat.nl without their knowledge or permission and was able to create personal profiles noting which pages they visited, which products they added to their shopping cart and bought, and which recommendations they clicked on. In the cookie banner on Kruidvat.nl, the boxes consenting to the placement of tracking software were checked by default, and visitors who wanted to refuse had to go through several steps.
More data on the use of third-party tracking technologies in the health and care sector can be read here.
Background checks: The province of British Columbia and the Privacy Commissioner of Canada have joined forces to investigate Certn Inc., a business that provides landlords with tenant screening services. They will look at whether Certn complies with the requirements of both the federal Personal Information Protection and Electronic Documents Act and British Columbia’s Personal Information Protection Act (the province where the company is based). In particular, the investigation will examine whether the data Certn gathers, uses, and discloses for tenant screening is sufficiently accurate, complete, and up to date.
Data security
Differential privacy: The latest US NIST cybersecurity insights discuss protecting trained models in Privacy-Preserving Federated Learning. The techniques must be combined with an approach for output privacy, which limits how much can be learned about individuals in the training data after the model has been trained.
Differential privacy is the most robust known form of output privacy. To protect against privacy threats, techniques for differentially private machine learning incorporate random ‘noise’ into the model during training. The training data cannot later be recovered from the model because the random noise prevents it from memorising details of the training set.
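The noise-calibration idea behind differential privacy can be illustrated in a much simpler setting than model training. The sketch below (plain Python; the function and variable names are our own illustrations, not taken from the NIST post) applies the classic Laplace mechanism to a counting query, whose sensitivity is 1:

```python
import random

def dp_count(records, predicate, epsilon):
    """Return an epsilon-differentially-private count via the Laplace
    mechanism: the true count plus Laplace(0, 1/epsilon) noise.
    A count query has sensitivity 1 (adding or removing one person's
    record changes it by at most 1), so noise with scale 1/epsilon
    suffices for epsilon-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    # The difference of two i.i.d. exponential samples with mean
    # 1/epsilon follows a Laplace distribution with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

random.seed(7)  # fixed seed so the sketch is reproducible
ages = [23, 35, 44, 51, 29, 62, 38, 47]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)  # true count is 4
```

In differentially private machine learning the same principle applies, but the calibrated noise is added to clipped gradients at each training step rather than to a final count, which is why the finished model cannot be made to reveal individual training records.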
Global IT outage: A Reuters analysis briefly explains the recent global IT outage, in which a CrowdStrike software update caused Microsoft Windows machines to crash. Companies such as CrowdStrike provide cloud-based solutions for virus scanning, early warning of possible cyberattacks, and barriers against hackers accessing company networks without authorisation. This time, a conflict between CrowdStrike’s code and the Windows operating system’s code caused certain PCs to crash even after they were rebooted.
Big Data
Chromebooks: The Danish data protection authority has assessed that 52 municipalities are now complying with its order from January to stop passing on the personal data of school children to Google for unauthorised purposes. There have been adaptations to the contract that ensure that personal data will only be processed following the instructions of the municipalities. The Danish regulator has also asked for the EDPB’s opinion on a final assessment of the data processing chain in the municipalities’ use of Google’s products (including the maintenance of infrastructure on the supplier’s side).
Oracle reaches a $115 million privacy settlement in the US. Oracle allegedly sold marketers the digital files of hundreds of millions of people, reportedly containing where they browsed online, did their banking, bought gas, dined out, shopped, and used their credit cards. The company also agreed not to gather in future user-generated information from URLs of previously visited websites, or text that users enter in online forms other than on Oracle’s own websites.