Data protection digest 18 Mar – 02 Apr 2024: AI and DP standardisation, patient medical data, human factor in data security

The need for AI and data protection standardisation, best practices on customer and employee data protection, rules on restricted cross-border data transfers, tips for DPOs, CISOs, IT specialists, and much more in our latest digest.

Stay tuned! Sign up to receive our fortnightly digest via email.

AI and data protection standardisation

The French CNIL elaborates on the contribution of the ISO/IEC 27701 and 42001 standards to compliance with data protection laws. For many years, IT security has benefited from two recognised international standardisation frameworks: ISO/IEC 27001 and 27002, which detail best practices for implementing the necessary security measures. ISO/IEC 27701, published in 2019, complements these two standards by defining and detailing a “privacy management system”.

At the same time, the new ISO/IEC 42001, published in 2023, proposes a “management system for AI” for organisations. This standardisation tool describes the processes for managing concerns related to the reliability of AI systems: security, safety, fairness, transparency, and data and system quality throughout the lifecycle. In addition, it provides a series of operational measures to implement them, including assessing the various impacts and risks of an AI system, ensuring responsible development and use, and documenting and monitoring.

Public tasks and AI

The Swedish IMY is starting a regulatory sandbox project to test how generative AI can make data processing more efficient when issuing public documents. The goal of Lidingö city’s project, “Right to transparency 2.0”, is to use generative AI to help mask personal data and confidential information. In addition to IMY, the company Atea Sweden will participate with technical expertise and know-how.
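Independently of the IMY/Lidingö project (whose actual tooling is not public), the masking task itself can be illustrated with a minimal rule-based sketch: a regular expression that redacts anything shaped like a Swedish personal identity number (“personnummer”) before a document is released. The pattern and placeholder below are illustrative assumptions, not part of the sandbox project.

```python
import re

# Swedish personnummer: 8 or 6 leading digits, an optional "-" or "+"
# separator, then 4 digits, e.g. 19850709-9805 or 850709-9805.
PNR_PATTERN = re.compile(r"\b(\d{8}|\d{6})[-+]?\d{4}\b")

def mask_personnummer(text: str, placeholder: str = "[MASKED]") -> str:
    """Replace anything that looks like a personnummer with a placeholder."""
    return PNR_PATTERN.sub(placeholder, text)

doc = "Sökanden (19850709-9805) begär ut handlingen."
print(mask_personnummer(doc))  # Sökanden ([MASKED]) begär ut handlingen.
```

A generative model would go further, catching names and free-text identifiers that no regex anticipates; the appeal of the sandbox is testing whether that extra coverage can be achieved lawfully.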

CPPA enforcement

California’s Privacy Protection Agency has issued its first enforcement advisory, on applying data minimisation to consumer requests. Businesses should apply this principle to every purpose for which they collect, use, retain, and share consumers’ personal information. For example, a business shall not require a consumer to provide additional information beyond what is necessary to process an opt-out request, (regarding the selling/sharing of their data), or when determining the method by which to verify the consumer’s identity. What is the minimum personal information necessary to achieve this purpose? Read more in the original guidance.

More official guidance

Patient medical apps: The Italian ‘Garante’ has published a guide on apps and sites that connect patients with healthcare professionals, including general practitioners and paediatricians, concentrating on free choice, the booking of visits, and the sending and archiving of health documents, (in Italian only). The compendium provides clarifications concerning three macro types of processing:

  • patient data, necessary to offer them online services,
  • data of healthcare professionals processed for various purposes,
  • data on the health of patients, processed for diagnosis and treatment purposes.

Tech vendors and HIPAA: The US government reminds us of the correct use of online tracking technologies by covered entities and business associates under the Health Insurance Portability and Accountability Act, (HIPAA). As a rule, they are not permitted to use tracking technologies in a manner that would result in impermissible disclosures of protected health information, (PHI), to tracking technology vendors, (eg, via user webpages and mobile apps). This primarily includes the disclosures of PHI for marketing purposes without a user’s HIPAA-compliant authorisation.

AI-powered employment practices: Privacy International has responded to the UK ICO’s draft guidance for employers and recruiters on deploying AI tools. Its response focuses on the processor/controller designation of recruiters and the third-party LLMs they outsource to, and on candidates’ employment rights that may be undermined by algorithmic decision-making. PI’s submission covers the different technologies used and the different types of data collected, the use of candidate data for model training purposes, the role of DPIAs, and what constitutes meaningful human intervention.

UK standard clauses

As of 21 March 2024, any contracts depending on the old EU SCCs for data transfers with the UK should have been upgraded to the UK IDTA or UK Addendum. From 21 September 2022, organisations had to utilise the IDTA or the Addendum if they intended to enter into new, (or update existing), arrangements for transfers subject to the UK GDPR. The deadline is further explained in the TechGDPR blog post.

German healthcare data

The country’s new Health Data Use Act entered into effect on 26 March, IAPP News reports. By allowing pharmaceutical corporations to access patient health data for research purposes, the act seeks to further health research. Researchers will only be permitted to access pseudonymised data, and any violations of patient privacy would result in administrative sanctions. The original legal text in German can be consulted here.
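The act does not prescribe a specific pseudonymisation technique, but a common approach is keyed hashing: the same patient identifier always maps to the same pseudonym, yet the mapping cannot be reversed without a secret key held separately from the research dataset. The key, field names, and truncation length below are illustrative assumptions, not taken from the German legislation.

```python
import hashlib
import hmac

# Illustrative only: in practice the key lives in a separate, access-controlled
# key management system, never alongside the pseudonymised data.
SECRET_KEY = b"held-separately-from-the-research-dataset"

def pseudonymise(patient_id: str) -> str:
    """Deterministic, non-reversible pseudonym via HMAC-SHA256 (truncated)."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "DE-12345", "diagnosis": "E11.9"}
research_record = {
    "pseudonym": pseudonymise(record["patient_id"]),  # replaces the direct identifier
    "diagnosis": record["diagnosis"],
}
print(research_record)
```

Because the pseudonym is deterministic, records about the same patient can still be linked across datasets for research, which is exactly what distinguishes pseudonymisation from full anonymisation.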

More legal updates

Florida’s under-16 law: The Florida Governor signed a bill that bans children under 14 from social media platforms and requires 14- and 15-year-olds to get parental consent. The measure requires social media platforms to terminate the accounts of people under 14, and those of people under 16 who do not have parental consent. It also requires the use of a third-party verification system to screen out those who are underage. The measure becomes law on 1 January 2025. Critical views can be read in the original analysis by Reuters.

Australia’s doxxing reform: The Government proposes new provisions to address doxxing as part of the Privacy Act Review. ‘Doxxing’ is the intentional online exposure of an individual’s identity, private information or personal details without their consent, (eg, for de-anonymising or targeting purposes). A new statutory tort for serious invasions of privacy would allow individuals to seek redress through the courts if they have fallen victim to doxxing, as well as access, objection and erasure rights, and the right to correct their personal information.

Chinese restricted transfers: The Cyberspace Administration finalised guidelines setting out exemptions to certain cross-border data transfer laws, DLA Piper reports. These include collection outside of mainland China, cross-border HR management, cross-border contracts, volume thresholds and others. The guidelines include updated filing templates for those still falling outside the exemptions and a reminder that consent and contractual/other measures remain in place. More details on the current security assessments and standard contracts for data exporters are available here.


UK data protection reform

UK civil society organisations have issued an alert on the financial surveillance powers proposed in the UK Data Protection and Digital Information Bill, (now at the Committee stage). It introduces mass algorithmic surveillance aimed at scrutinising banks and any third-party accounts, purportedly to detect welfare fraud and errors. Reportedly, there are no restrictions on the type of information that can be requested. Enacting a law that allows for disproportionate mass surveillance could also impact the UK’s EU adequacy status.

Facial recognition abuse at the workplace

Facial recognition to check attendance in the workplace violates employee privacy, stated the Italian ‘Garante’ when sanctioning five companies all engaged in various capacities at the same waste disposal site, for having unlawfully processed the biometric data of a large number of workers. In particular, three companies had shared the same illegal biometric detection system for more than a year, without having adopted adequate technical and security measures. The companies had not provided clear and detailed information to workers nor had they carried out an impact assessment. They should have more appropriately used less invasive systems to control the presence of their employees in the workplace, (such as by badge). 

More enforcement decisions

Cookie walls: The Danish data protection authority has confirmed its decisions in the cases concerning the use of cookie walls by JFM, (a media company), and GulogGratis, (an online marketplace). In particular, statistics were not a necessary part of the paid-access alternative: the processing of personal data to generate statistics was not directly linked to financing the content. The marketing purpose, unlike the statistical purpose, made it possible for advertising partners to buy access to banner advertisements etc. on the website, process personalised ads and thus generate advertising revenue.

Access and log control: The Norwegian data protection authority has issued an approximately 1.7 million euro fine and several injunctions to the Norwegian Labour and Welfare Administration, (NAV). NAV lacked management and understanding of the importance of safeguarding data confidentiality through access management and log control. The majority of Norwegian citizens receive benefits from NAV at one time or another during their lives.

There is therefore an inherently high privacy risk in NAV’s operations. Yet local offices were given great freedom to organise themselves in their own ways. As a result, special categories of personal data were often processed over long periods and by a large number of people, without the necessary security measures being established, and despite repeated calls for compliance.

Retailer’s indefinite data storage: The Finnish data protection commissioner has ordered an online retailer to pay an administrative fine of 856,000 euros, as the company had not defined how long the data of online store customer accounts would be kept. The limitation of the data retention period was left to the responsibility of the customer. In addition, the company’s policy of requiring the creation of a customer account for online purchases violated data protection regulations.

Data breaches

Ransom attack: The Estonian privacy regulator has explained the recent Asper Biogene data leak, in which sensitive personal health data was exposed. The company learned of the intrusion through a ransom demand. Thanks to the notification made by the data controller, affected people learned about the situation, which allowed them to protect themselves from possible fraudulent letters. The leak involved a healthcare service provider and an authorised processor, (Asper Biogene). In this case, the agreement concluded between the controller and the authorised processor largely helped to confirm the parties’ roles and goals in data processing.

Data security 

Human factor: What is the weakest link in the data security chain? The Estonian regulator states that it is still the person who interacts with the data. Every month there are cases where the requirements for personal data processing are violated due to an employee’s mistake, carelessness or lack of organisation in the workplace. Some recent cases resolved by the regulator include:

  • an intranet was accessible from the public internet, where the only measure protecting its content was a username and password shared by multiple persons;
  • employees of a cafe discovered that paper documents concerning the inmates of a detention facility had been left there;
  • a hosting company sent a newsletter to its customers in a way that made the email addresses of other recipients visible to all;
  • an employee of a financial company was mistakenly given access to a bank account used for the company’s salary payments;
  • people’s debt data was published in various default registers without a legal basis;
  • ransomware and code-injection attacks, hijacked employee emails, and phishing.
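The newsletter incident above is one of the most common slips: putting the whole subscriber list in the visible To or Cc header. A simple defence is to never address the list at all and instead build one message per recipient. The function and addresses below are a hypothetical sketch, not the regulator’s guidance.

```python
import smtplib  # needed only for the actual send, shown commented out below
from email.message import EmailMessage

def build_messages(sender: str, recipients: list[str], subject: str, body: str):
    """Yield one message per recipient, so no subscriber ever sees
    anyone else's address in the headers."""
    for rcpt in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = rcpt  # only this recipient's own address, never the full list
        msg["Subject"] = subject
        msg.set_content(body)
        yield msg

# Sending requires a real SMTP server, e.g.:
# with smtplib.SMTP("smtp.example.com") as smtp:
#     for msg in build_messages("news@example.com", subscribers, "Digest", text):
#         smtp.send_message(msg)
```

Per-recipient messages cost a little more bandwidth than a single Bcc blast, but they make the “everyone can see everyone” failure mode structurally impossible.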

Latest technology guide: The French CNIL has published a new edition of its Personal Data Security Guide, (available in English). The new version restructures the guide and introduces new fact sheets, including tips on artificial intelligence, mobile applications, cloud computing, and application programming interfaces. Current practices such as the use of BYOD have also been added to the existing fact sheets. The guide is a reference for DPOs, CISOs and IT specialists, as well as for the CNIL’s own assessments.

Big Tech

Google Incognito data deletion: The Guardian reports that Google settled a lawsuit alleging it surreptitiously monitored the internet activities of users who believed they were browsing privately on its Chrome browser, and agreed to delete billions of data records. Users alleged that Google’s analytics, cookies and apps let the Alphabet unit improperly track people who set Chrome to “incognito” mode and other browsers to “private” browsing mode. As part of the settlement, Google will update its disclosures about the data it gathers during “private” browsing, and users in incognito mode will also be able to disable third-party cookies.

Mozilla/Onerep data brokerage case: Mozilla, the nonprofit that supports the Firefox web browser, is winding down its new partnership with Onerep, an identity protection service recently bundled with Firefox that offers to remove users from hundreds of people-search sites. The move comes just days after a report by US cybersecurity expert Brian Krebs forced Onerep’s CEO to admit that he had founded dozens of people-search services over the years.

In the US, data brokers, people-search services like Onerep, and online reputation management firms exist because virtually all US states exempt so-called “public” or “government” records from consumer privacy laws. Those include voting registries, property filings, marriage certificates, motor vehicle records, criminal records, court documents, death records, professional licenses, bankruptcy filings, social media data and known associates.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
