EU Product Liability
The new Product Liability Directive has been published in the Official Journal of the European Union and will enter into force 20 days after publication. The new law extends the definition of “product” to digital manufacturing files and software (and does not exclude AI manufacturers in the future). Online platforms can also be held liable for a defective product sold on their platform, just like any other economic operator, if they act like one. Equally, under the new rules, to make sure that consumers are compensated for damage caused by a product manufactured outside the EU, the company importing the product or the EU-based representative of the foreign manufacturer can be held liable.
Stay up to date! Sign up to receive our fortnightly digest via email.
More legal updates
UK privacy legislation: The new government has proposed reforms to data protection and e-privacy laws through the new Data (Use and Access) Bill, DLA Piper reports. This follows the previous government’s unsuccessful attempts to reform these laws post-Brexit, which led to the abandonment of the Data Protection and Digital Information Bill in the run-up to the general election.
The new proposal retains several of the previously proposed changes to the UK data protection regime: definitions for scientific research and special categories of personal data, broader consent for research purposes, simplified consent requirements, new criteria for a recognised legitimate interest, and wider use of automated decision-making.
US data transfers: The European Data Protection Board has reviewed the implementation of the EU-US Data Privacy Framework. On the commercial side, the EDPB notes that the US Department of Commerce has taken all relevant steps to implement the certification process. In addition, the redress mechanism for EU individuals has been implemented, and comprehensive complaint-handling guidance has been published on both sides of the Atlantic. However, the regulator recommends that the Commission monitor future developments related to the US Foreign Intelligence Surveillance Act, in particular the extended reach of Section 702 after its re-authorisation by the US Congress earlier this year.
Data brokers: The California Privacy Protection Agency (CPPA) is conducting a public investigative sweep of data broker registration compliance under the Delete Act. Covered businesses must register by 31 January if they operated as a data broker during the previous year. The Delete Act also requires data brokers to pay an annual fee, which funds the registry and the development of a first-of-its-kind deletion mechanism (DROP). Once established, it will allow a consumer, in a single request, to direct all data brokers to delete their personal information. DROP will be available to consumers in 2026.
Blockchain
The Spanish AEPD has published a technical note on blockchain infrastructures from a data protection perspective (text in Spanish, with a video version in English). It discusses real-life cases of implementing changes and managing governance that are common in such infrastructures. It then sets out policies, including organisational and technical measures, for implementing the right to erasure in a blockchain infrastructure, including for information relating to smart contracts.
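To make the erasure discussion concrete, here is a minimal sketch of one commonly described pattern: keep personal data off-chain and record only a salted hash on-chain, so that deleting the off-chain record and its salt leaves nothing linkable on the immutable ledger. This is an illustrative assumption, not a summary of the AEPD’s recommended measures, and the store and ledger names are hypothetical.

```python
# Illustrative pattern only: off-chain personal data, on-chain commitment.
import hashlib
import os

off_chain_store = {}   # record_id -> (salt, personal_data); deletable
on_chain_ledger = []   # append-only; holds only commitments

def register(record_id: str, personal_data: str) -> None:
    salt = os.urandom(16)
    commitment = hashlib.sha256(salt + personal_data.encode()).hexdigest()
    off_chain_store[record_id] = (salt, personal_data)
    on_chain_ledger.append({"record_id": record_id, "commitment": commitment})

def erase(record_id: str) -> None:
    # Erasure: drop the off-chain data and salt; the immutable commitment
    # can no longer be tied back to the data subject.
    off_chain_store.pop(record_id, None)

register("subject-42", "Jane Doe, jane@example.com")
erase("subject-42")
```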
Digital identities
“Verifiable credential,” “digital wallet,” and “mobile driver’s license” are terms that refer to a growing ecosystem around what we are calling “verifiable digital credentials”, explains America’s NIST. Though the concept seems simple, deploying it and understanding its impact on security, privacy and usability in practice can be challenging. NIST’s new blog post series helps readers navigate the terminology, technology, data formats, and protocols that underpin this new and rapidly evolving ecosystem, drawing on the collective expertise of stakeholders across government and industry.
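At its core, a verifiable credential is a set of claims signed by an issuer so that a verifier can check its integrity and origin without contacting anyone. The sketch below uses a hypothetical credential format and Python’s third-party cryptography package; it is not a NIST-defined data model and shows only the signing and verification step.

```python
# Hypothetical credential format; only the sign/verify mechanics are real.
import json
from cryptography.hazmat.primitives.asymmetric import ed25519

issuer_key = ed25519.Ed25519PrivateKey.generate()

credential = {
    "issuer": "did:example:dmv",          # assumed identifier scheme
    "subject": "did:example:alice",
    "claims": {"licence_class": "B", "over_18": True},
}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# The holder presents (credential, signature); the verifier checks it against
# the issuer's public key. verify() raises InvalidSignature on failure.
issuer_key.public_key().verify(signature, payload)
print("credential verified")
```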
More official guidance
PETs costs and benefits: The UK government and the Information Commissioner have published the Privacy Enhancing Technologies (PETs) Cost-Benefit Awareness Tool. This resource is designed to help organisations understand and assess the costs and benefits associated with adopting a variety of PETs. Alongside it, the Commissioner has also published a checklist to support organisations. Examples of PETs include homomorphic encryption, trusted execution environments, secure multi-party computation and differential privacy.
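As an illustration of the last PET on that list, here is a minimal differential privacy sketch (our own example using numpy, not taken from the awareness tool): the Laplace mechanism adds calibrated noise to a query result so that the presence or absence of any single individual is hard to infer.

```python
# Laplace mechanism: noisy count with scale = sensitivity / epsilon.
import numpy as np

def dp_count(values, epsilon=1.0, sensitivity=1.0):
    """Return a differentially private count of the input values."""
    true_count = len(values)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

patients_with_condition = ["p1", "p2", "p3", "p4", "p5"]
print(dp_count(patients_with_condition, epsilon=0.5))
```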
Healthcare data: The Finnish data protection authority has elaborated on frequently asked questions about healthcare, with instructions for checking, correcting, deleting and disclosing your personal information. The answers cover whether an incorrect diagnosis can be corrected, what to do if you suspect that your patient records have been viewed without justification, and what to do if a patient wants to refuse contact from the healthcare provider about a scientific research finding. The content of these guidelines will soon also be available in English on the regulator’s website.
Genomics
The UK Information Commissioner has also prepared a report on genomics, a field that could soon affect everyday life in remarkable ways: hospitals might use DNA to predict and prevent diseases, insurers could adjust policies based on genetic health markers, and wearable tech could personalise fitness plans based on genetic tendencies.
Genomics also continues to expand into sectors such as insurance, education, and law enforcement. The report explores various scenarios to illustrate potential data protection concerns, including Data Security, Discrimination or Bias, Transparency and Consent, and Purpose of Use. The potential expansion of genomic data use beyond its original purpose raises concerns around data minimisation and purpose limitation.
Top exploited vulnerabilities
Cybersecurity agencies around the globe have jointly reported that malicious cyber actors exploited more zero-day vulnerabilities to compromise enterprise networks in 2023 than in 2022, allowing them to conduct operations against high-priority targets. The authoring agencies strongly encourage vendors, designers, developers, and end-user organisations to implement several recommendations, including secure-by-design practices at each stage of the software development life cycle, secure-by-default configurations, and timely patching of systems. For more findings and technical details, see the original publication.
Data Clean Rooms (DCRs)
Data Clean Rooms are cloud data processing services that let companies exchange and analyse data, constrained by rules that limit how the data can be used, explains America’s FTC. They are typically used when two companies want to exchange limited information about their customers (eg, to measure the efficacy of an advertisement by identifying grocery sales made to newspaper subscribers). In some cases, DCRs can add privacy protections to the handling of consumer data.
In others, disclosure of consumer data via DCRs presents the same privacy risks as disclosure through other means like tracking pixels, states the regulator.
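A minimal sketch of the idea, assuming a simple aggregate-only rule with a minimum cohort size (our illustration, not an FTC specification): each party’s row-level data stays inside the clean room, and only aggregates over sufficiently large overlaps are released.

```python
# Clean-room-style join: release aggregates only, above a minimum cohort size.
import pandas as pd

subscribers = pd.DataFrame({"customer_id": [1, 2, 3, 4], "subscribed": True})
grocery_sales = pd.DataFrame(
    {"customer_id": [2, 3, 4, 5, 6], "spend": [40.0, 25.0, 60.0, 10.0, 15.0]}
)

MIN_COHORT = 3  # suppress results computed over fewer individuals

def clean_room_overlap_spend(left, right):
    joined = left.merge(right, on="customer_id")   # row-level data stays inside
    if len(joined) < MIN_COHORT:
        return None                                # too small to release safely
    return {"matched_customers": len(joined), "avg_spend": joined["spend"].mean()}

print(clean_room_overlap_spend(subscribers, grocery_sales))
```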
Data breach case study
The Guernsey Data Protection Authority has published its latest breach statistics. In one recent instance, a retailer filed a breach report after the police notified it of a claim that a staff member had shown a member of the public CCTV footage taken inside the shop. The footage contained images of customers and was not viewed in accordance with the retailer’s policy on CCTV use. This event highlights how crucial it is to limit employee access to personal information to what is necessary for them to carry out their jobs. In data governance, a “need-to-know” basis can greatly decrease the chances of a data breach, states the regulator. It further highlights the importance of having audit trails for instances where personal data is misused.
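In code, a need-to-know rule plus an audit trail can be as simple as the sketch below (the role-permission mapping is hypothetical and purely illustrative): every access attempt is checked against the user’s role and written to a log that can later be reviewed.

```python
# Need-to-know access check with an audit trail for every attempt.
from datetime import datetime, timezone

ROLE_PERMISSIONS = {
    "security_officer": {"cctv_footage"},
    "cashier": {"till_records"},
}
audit_log = []

def access(user: str, role: str, resource: str) -> bool:
    allowed = resource in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "allowed": allowed,
    })
    return allowed

print(access("alice", "cashier", "cctv_footage"))          # False, and logged
print(access("bob", "security_officer", "cctv_footage"))   # True, and logged
```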
More enforcement decisions
Cookie fine: Data Guidance reported a case where the Spanish AEPD fined SEAT SA 20,000 euros (reduced to 12,000 euros) for placing non-technical cookies without user consent on its website.
SEAT’s website set cookies, including functionality and segmentation cookies, at the start of a session and before any user action, and continued to do so even after users withdrew consent. In principle, functionality or preference cookies are not considered strictly necessary for the basic functioning of the website, which means that, under the GDPR, the user’s prior consent must be requested before they are installed, since they affect the personalised experience even though they are not invasive in terms of data collection. More of the original decision can be read in Spanish here.
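A minimal sketch of consent-gated cookies (hypothetical code using Flask, not taken from the decision): strictly necessary cookies are set unconditionally, while functionality and segmentation cookies are only set when a valid, unwithdrawn consent signal is present.

```python
# Set non-essential cookies only after consent; respect withdrawal.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<h1>Home</h1>")
    # Strictly necessary cookies (e.g. session) may be set without consent.
    resp.set_cookie("session_id", "abc123", httponly=True)
    # Functionality / segmentation cookies only with prior, unwithdrawn consent.
    if request.cookies.get("cookie_consent") == "granted":
        resp.set_cookie("ui_language", "es")
        resp.set_cookie("segment", "returning-visitor")
    return resp

@app.route("/consent/<decision>")
def consent(decision):
    resp = make_response("Preference saved")
    resp.set_cookie("cookie_consent", "granted" if decision == "accept" else "denied")
    return resp

if __name__ == "__main__":
    app.run(debug=True)
```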
Electronic services: Finland’s Data Protection Commissioner has ordered Posti (a delivery service) to pay a penalty of 2.4 million euros for automatically creating an electronic OmaPosti mailbox for customers without a separate request. The Commissioner states that electronic services are a significant part of the digital society and must be implemented in accordance with data protection rules. The OmaPosti mailbox was linked to a wider service package, which also included, for example, mail forwarding and the Oma Noutopiste service. The investigation revealed that customers could not choose whether or not to use the OmaPosti mailbox, because the different services were tied together in one contract, and the electronic mailbox could not be discontinued without the other services also being discontinued.
Data security
Log auditing: The Danish regulator has reported the results of a 2023 inspection visit to Kerteminde Municipality. The inspection focused on logging and log auditing, internal procedures for data handling, notification and registration of breaches of personal data security (including the use of auto-complete), testing of backups, testing of preparedness, procedures for deletion, and impact assessments. The municipality had not previously implemented fixed procedures or spot checks for ongoing log review to ensure that users only accessed information they had a work-related need for; logs were only checked when there was a specific suspicion of abuse. In addition, the municipality must continue to identify which processing activities require impact assessments and draw up a plan for carrying them out.
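The kind of ongoing control the regulator expects can be approximated by routine spot checks of the access log rather than reviews triggered only by suspicion. The sketch below is purely illustrative; the assignment table and log format are hypothetical.

```python
# Periodic spot check: sample log entries and flag accesses outside assigned cases.
import random

access_log = [
    {"user": "case_worker_1", "record": "citizen_17"},
    {"user": "case_worker_1", "record": "citizen_99"},   # not assigned
    {"user": "case_worker_2", "record": "citizen_42"},
]
assignments = {
    "case_worker_1": {"citizen_17", "citizen_23"},
    "case_worker_2": {"citizen_42"},
}

def spot_check(log, sample_size=2):
    sample = random.sample(log, min(sample_size, len(log)))
    return [e for e in sample if e["record"] not in assignments.get(e["user"], set())]

print(spot_check(access_log))  # entries needing follow-up, if sampled
```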
Thinking of using AI to assist recruitment? The UK Information Commissioner has shared key questions organisations should ask when procuring AI to help with their employee recruitment. Any recruiter may be looking to procure these tools to improve the efficiency of their hiring process, helping to source potential candidates, summarise CVs and score applicants. If not used lawfully, however, AI tools may negatively impact job seekers who could be unfairly excluded from roles or have their privacy compromised.
For instance, some features in these tools could enable discrimination, such as search functionality that lets recruiters filter out candidates with certain protected characteristics. The tools could also estimate or infer people’s gender, ethnicity, and other characteristics from their job application, or even just their name, rather than asking candidates directly. Moreover, this information could be processed without a lawful basis or the candidate’s knowledge.
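One simple check a recruiter could run on a screening tool’s output, before and after deployment, is comparing shortlisting rates across groups. The sketch below uses a hypothetical adverse-impact-style ratio on made-up data; it is our illustration, not a metric prescribed by the ICO guidance.

```python
# Compare shortlisting rates across groups and compute a simple impact ratio.
import pandas as pd

results = pd.DataFrame({
    "candidate": ["a", "b", "c", "d", "e", "f"],
    "group": ["X", "X", "X", "Y", "Y", "Y"],   # e.g. self-reported, with consent
    "shortlisted": [1, 1, 0, 1, 0, 0],
})

rates = results.groupby("group")["shortlisted"].mean()
impact_ratio = rates.min() / rates.max()
print(rates.to_dict(), f"impact ratio: {impact_ratio:.2f}")
# A ratio well below 1 (commonly flagged below 0.8) warrants investigation.
```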
Big Data
There are two ways to have your information removed from the Internet: by deleting it from the relevant website or by having the content removed from the search engine, states the Hamburg Data Protection Authority. The Federal Court of Justice in Germany ruled earlier this year on a case where data on an association’s board of directors was still available in the online register of associations 20 years after the board had changed. The register entry therefore had to be removed from the online register and could only be made available to third parties who demonstrated a legitimate interest. In parallel, the CJEU has just ruled that personal data published on the Internet that is not subject to a disclosure obligation under commercial law must be deleted on request and that, in case of doubt, documents may only be published in redacted form.
Digital Health, Edtech, Surveillance… Privacy International has posted a series of analyses: on Big Tech’s dominant vision of digital health, which may pose risks to fundamental rights and the autonomy of society, as these digital tools have not always been designed with people’s privacy in mind; on the unchecked implementation of Edtech, which can jeopardise students’ rights through potential privacy violations, discrimination, and a lack of student input in the adoption of these technologies; and on surveillance databases, which are on the rise all around us, from countering terrorism and investigating crimes to border management and migration control.