
Data protection digest 3 – 17 Apr 2024: non-material damage dilemma when losing control of your data

In this issue: an alternative to the ‘pay or okay’ consent model, the right to compensation for non-material damage, updates on FISA reauthorisation and the GDPR enforcement procedural rules, and AI development and personal data.

Stay tuned! Sign up to receive our fortnightly digest via email.

Non-material damage under the GDPR

In one of its recent decisions, the CJEU clarified the right to compensation for non-material damage for data subjects. The request was made in proceedings between a natural person and Juris GmbH, concerning compensation for the damage the claimant suffered as a result of various processing operations involving his personal data, carried out for marketing purposes despite the objections he had sent to that company. The CJEU upheld its previous finding, (of 25 January 2024, MediaMarktSaturn, C‑687/21), that an infringement of provisions of the GDPR which confer rights on the data subject is not, in itself, sufficient to constitute ‘non-material damage’, irrespective of the gravity of the damage suffered by that person:

“The existence of ‘damage’, material or non-material, or of ‘damage’ which has been ‘suffered’ constitutes one of the conditions for the right to compensation laid down in Art. 82 (1) of the GDPR, as does the existence of an infringement of that regulation and of a causal link between that damage and that infringement, those three conditions being cumulative.” 

At the same time, it is not sufficient for the data controller, in order to be exempted from liability, to claim that the damage in question was caused by the failure of a person acting under its authority, within the meaning of Art. 29 of the GDPR. Further legal reasoning in the case, as well as the rules for determining the amount of damages due as compensation, can be read in the court ruling.

‘Pay or okay’ consent model


The EDPB adopted a long-awaited Opinion on Valid Consent in the context of Consent or Pay models implemented by Large Online Platforms. In most cases, it will not be possible for large online platforms to comply with the requirements for valid consent if they only offer users a binary choice between consenting to the processing of personal data for behavioural advertising purposes and paying a fee. The EDPB underlines that personal data cannot be considered a tradeable commodity, and controllers should consider the need to prevent the fundamental right to data protection from being transformed into a feature that data subjects have to pay to enjoy. 

Thus, controllers should also consider offering a further alternative, free of charge and without behavioural advertising, for example with a form of advertising that involves the processing of less, or no, personal data.

GDPR enforcement: new rules, strict deadlines, dispute resolution

On 10 April, the European Parliament adopted amendments to a proposal laying down additional procedural rules relating to the enforcement of the GDPR. In its 2023 work programme, the Commission announced that it would propose harmonising some national procedural aspects to improve cooperation between national data protection authorities. The MEPs’ amendments include:

  • the right of all parties to equal and impartial treatment regardless of where their complaint was lodged;
  • their right to be heard before any measure is taken that would adversely affect them, and 
  • their right to procedural transparency, including access to a joint case file. 

MEPs want to standardise procedural deadlines: a supervisory authority would have to acknowledge receipt of a complaint and declare it admissible or inadmissible within set time limits. The authority would then have to determine whether the case is a cross-border one and which authority should act as the lead authority. Draft decisions must be delivered within nine months of receiving the complaint, except in certain exceptional situations.

MEPs also want to clarify the rules on amicable settlements, (consensual, negotiated resolutions to disputes). However, these do not prevent a DPA from opening an own-initiative investigation into the matter. Finally, all parties to complaint procedures have the right to effective judicial remedies, for example when DPAs do not take the necessary actions or do not comply with deadlines.

FISA Section 702 reauthorisation

Last week the US House of Representatives voted to reauthorise Section 702 of the Foreign Intelligence Surveillance Act, (FISA), extending for another two years a crucial provision that allows American citizens’ communications to be swept up without a warrant. The law has made it possible to monitor foreign communications in great detail, but it has also resulted in the gathering of phone conversations and correspondence of US individuals.

Some privacy protections, such as the ban on sweeping up communications about a target along with communications to or from the target, were maintained. However, other amendments, including a new definition of internet service providers, might broaden FISA’s application. The measure now goes to the Senate, ahead of the statutory expiration of Section 702 on 19 April. More analysis by the Lawfare Institute can be read here.

More legal updates

Child safety online: On 10 April, the European Parliament endorsed certain derogations to the E-Privacy Directive to combat online child sexual abuse. In particular, MEPs adopted a temporary extension that allows the voluntary detection, by internet platforms, of child sexual abuse material, (CSAM), online. The implementation measures follow strict data protection safeguards pursuant to the GDPR, (legal basis for data processing, data retention policies, restricted data transfers, etc.). The derogation will be extended until 3 April 2026 so that an agreement on the long-term legal framework can be reached. The provisional rules will now have to be formally adopted by the Council before they can become law. 

US privacy legislation: Last week, a bipartisan group of lawmakers in Congress announced a federal privacy bill, the American Privacy Rights Act, (APRA), with the likelihood of long months of discussions before the bill’s passage. This comprehensive draft legislation promises clear, national data privacy rights and protections for Americans, boosts data minimisation in the commercial sector, curbs large data holders and brokers, harmonises the existing state data privacy laws, and establishes new enforcement mechanisms and a private right of action for individuals. At the same time, the Federal Trade Commission would still have the authority to provide further recommendations and rules covering a significant portion of the APRA.

Right of access basics 

The Luxembourg data protection authority has published a new illustrative factsheet, (only available in French), on the right of access. Any individual can ask a private or public entity, (the data controller), whether it holds their personal data and obtain a copy of the data processed. In particular, this right allows individuals to check whether the data is correct. Organisations can be asked to provide the categories of data processed, retention periods, explanations on how to exercise your rights, the lawful basis for processing, other recipients of your data, data transfers to third countries, data sources, and explanations of decisions made by automated processing or profiling.

However, the right of access is not an absolute right. In some cases the organisation may refuse to provide data about third parties, or a confidentiality obligation may be imposed by law. The organisation must respond to the request within one month, including any justification for a refusal or for possible delays in providing the information. If the organisation does not respond, does not meet the deadline or you are not satisfied with its response, you can submit a complaint to the data protection authority.

AI development and data protection guide

The French data protection authority CNIL has published its first recommendations on the development of artificial intelligence, in a way that respects personal data. The recommendations, (in French only), concern the development of AI systems involving the processing of personal data, (Machine Learning, general purpose AI, systems that are trained “once and for all” or continuously). The points addressed in the initial recommendations make it possible to:

  • determine the applicable legal regime;
  • define a purpose;
  • determine the legal qualification of the actors;
  • define a legal basis;
  • perform tests and verifications in case of data reuse;
  • carry out an impact assessment if necessary;
  • take data protection into account when making system design choices;
  • take data protection into account in the collection and management of data.

More official guidance

Legal basis for customer health data processing: When obtaining data from a person about their health condition, their explicit consent is required, an administrative court in Poland confirms. In the case at hand, a law firm contacted people injured in traffic accidents, offering to represent them in court against insurance companies in order to obtain compensation and pensions, as well as reimbursement of treatment and rehabilitation costs. The company obtained information about potential customers from, among other things, press releases, online publications or content available on social media, as well as information provided or disseminated by organisations engaged in charitable activities.

Subsequently, when meeting prospective clients, a representative of the law firm obtained only oral consent to the processing of personal data ahead of a possible conclusion of a contract with these persons, and did not record or register it in any way. Moreover, the collection of this data was not necessary for the performance of a contract, because the persons from whom the data was obtained were not yet customers. However, the data was processed for other purposes, (e.g. examining the profitability of concluding a contract with a potential customer and possibly contacting such a person again).

Recruitment data: The Latvian data protection regulator reminds employers to avoid excessive data processing when selecting applicants. For example, a job advertisement should indicate as specifically as possible what information the employer expects from candidates, and the employer should develop its own CV form. Also, after submitting their data, applicants as data subjects have the right to submit information requests asking for clarification on various aspects of the processing of their personal data, so the employer must ensure that it is able to respond to such requests. Finally, there must be established procedures for how information obtained during the selection process, including information about applicants who are not hired, is stored and deleted.

If, after data collection, the employer concludes that the data could also be processed for a purpose different from the one for which it was originally collected, the employer must assess whether this purpose is compatible with the initial processing, and also ensure that the applicant is informed. If the employer chooses to use the services of recruitment companies to find suitable employees, it is important to determine the role of such service providers; if the company is considered a data processor, a data processing agreement must be concluded.


Avast non-anonymised data fine

Internet security company Avast has contested a fine of approximately 13 million euros from the Czech data protection agency over the transfer of non-anonymised data of 100 million users to its subsidiary Jumpshot in 2019. Although Avast stated that it used robust anonymisation techniques, it was proven that at least some of the data subjects using its antivirus program and browser extensions could be re-identified. Moreover, the purpose of processing this data was not (only) to create statistical analyses, as Avast had stated.

In fact, the pseudonymised Internet browsing history was linked to a unique identifier. Jumpshot, among other things, presented itself as a company that made data available to “marketers,” providing them with insight into online consumer behaviour and offering “atomic-level” tracking of user journeys. The decision, (a cross-border case under the EU one-stop-shop procedure), comes after a $16.5 million fine from the US Federal Trade Commission and restrictions on selling user data for advertising. Avast, now part of Gen Digital, faces challenges both in the Czech Republic and the US.
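
The linkability problem at the heart of the case can be shown with a minimal, purely hypothetical sketch, (the identifiers and URLs below are invented, not drawn from the decision): as long as every browsing record carries the same persistent identifier, the records can be regrouped into a single user journey, and one identifying URL in that journey is enough to re-identify the person behind all the others.

```python
# Hypothetical illustration only: the device IDs and URLs are invented, not
# taken from the Avast/Jumpshot case. A persistent unique identifier lets
# anyone holding the data set regroup "pseudonymised" browsing records into
# one user journey; a single identifying URL in that journey (e.g. a personal
# profile page) then re-identifies every other visit under the same ID.
from collections import defaultdict

clickstream = [
    ("device-7f3a", "https://example-shop.test/checkout"),
    ("device-7f3a", "https://social.test/profile/jane-doe"),  # quasi-identifier
    ("device-7f3a", "https://clinic.test/appointments"),
    ("device-2c91", "https://news.test/politics"),
]

journeys = defaultdict(list)
for device_id, url in clickstream:
    journeys[device_id].append(url)

for device_id, urls in journeys.items():
    # One linkable trail per persistent identifier: "atomic-level" tracking.
    print(device_id, "->", urls)
```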

Other enforcement decisions

Biometrics abuse in the workplace: In the UK, dozens of companies, including national leisure centre chains, are reviewing or pulling facial recognition technology and fingerprint scanning used to monitor staff attendance, after a clampdown by the Information Commissioner’s Office. In February, the regulator found that the biometric data of more than 2,000 employees had been unlawfully processed at 38 centres managed by Serco Leisure. The ICO’s latest recommendations require companies to consider alternative and less intrusive options, rather than biometric scanning, to meet their staff management objectives. In light of the ICO decision, a number of other leisure centre operators, like Virgin Active and 1Life, are either reviewing or stopping the use of similar biometric technology, according to The Guardian.

Ransom attack on a healthcare system: The Italian privacy regulator Garante issued fines against several technical and administrative entities, (in the Lazio region), in proceedings opened after a cyberattack on a regional healthcare system back in 2021. The ransomware was introduced into the system through a laptop used by an employee. It blocked access to many health services, preventing, among other things, the management of reservations, payments, the collection of reports and the registration of vaccinations. Local health authorities, hospitals and nursing homes were unable to use some regional information systems, through which data on the health of millions of patients is processed, for periods ranging from a few days to a few months.


Outdated systems and inadequate management of the data breach meant that the negative consequences of the attack could not be mitigated: from the IT service provider’s inability to determine which servers had been compromised, to the failure to prevent further propagation of the malware across the numerous healthcare facilities under the umbrella of the data controller, (the regional administration).

Audit methodology

The UK ICO conducted a consensual data governance audit of East Surrey College, (ESC). The regulator’s recommendations not only provided ESC with independent assurance of compliance but could also serve as guidance for other organisations concerning:

  • Data Governance and Accountability, (creating a privacy culture; comprehensive and up-to-date data maps and ROPA, as sketched after this list; training needs analysis).
  • Records Management, (e.g. creating a local-level asset register alongside the ROPA; correct use of attachments, encryption and the security of personal data in transit).
  • Data Sharing, (reviewing, updating and creating data sharing policies, procedures and registers; documenting and appropriately justifying the lawful basis for sharing personal data; data sharing agreements containing sufficient detail; documenting and regularly reviewing technical and organisational security arrangements with data sharing parties, etc).
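
As a purely illustrative sketch, (the values are invented and not taken from the ICO’s audit of ESC; the field names loosely follow Art. 30(1) GDPR), a single ROPA entry can be kept as structured data so the register is easy to review and update alongside a local-level asset register:

```python
# Illustrative only: a minimal ROPA entry as structured data. Field names
# loosely follow Art. 30(1) GDPR; the values are invented for this sketch.
from dataclasses import dataclass, field

@dataclass
class RopaEntry:
    processing_activity: str
    controller_contact: str
    purposes: list[str]
    categories_of_data_subjects: list[str]
    categories_of_personal_data: list[str]
    categories_of_recipients: list[str]
    third_country_transfers: list[str]
    retention_period: str
    security_measures: list[str] = field(default_factory=list)

entry = RopaEntry(
    processing_activity="Student enrolment",
    controller_contact="dpo@example-college.test",
    purposes=["administration of enrolment and course funding"],
    categories_of_data_subjects=["applicants", "enrolled students"],
    categories_of_personal_data=["contact details", "prior qualifications"],
    categories_of_recipients=["national funding agency"],
    third_country_transfers=["none"],
    retention_period="6 years after course completion",
    security_measures=["role-based access control", "encryption at rest"],
)

print(entry.processing_activity, "-", entry.retention_period)
```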

Data security

Underestimated risks to data subjects: The Dutch data protection authority AP reports that an excessive number of Dutch organisations that suffer cyberattacks neglect to notify individuals that their personal information has been compromised. In approximately 70% of cases, organisations underestimate the likelihood of such an attack. As a result, the individuals whose personal information was compromised are unable to defend themselves against potential fraud or other crimes committed by online criminals. Those criminals often target IT suppliers that manage large amounts of personal data; however, the organisations contracting such suppliers generally remain responsible if anything happens to this data.

Countering cyber threats: An organisation that takes security measures seriously will not only be able to protect its data but will also be a trusted partner and a role model for others. The Estonian privacy regulator reiterates some simple but important recommendations on how to safely handle personal data in everyday work: 

  • data encryption and pseudonymisation for long-term data storage, (see the sketch after this list);
  • strong password rules or at least two-factor authentication;
  • monitoring system activity and detecting unusual activity or requests;
  • an incident response plan that is reasonable and clear;
  • regular training or testing so that employees recognise scams and phishing emails;
  • security audits and testing;
  • involvement of the data protection specialist;
  • implementation of information security standards;
  • authorised processor due diligence.
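
As a minimal sketch of the first recommendation, (not the Estonian regulator’s own example; the key and record below are invented placeholders), keyed pseudonymisation replaces direct identifiers with an HMAC tag before long-term storage, so records stay linkable for analysis while re-identification requires a secret key held separately from the data set:

```python
# Minimal sketch, assuming HMAC-SHA256 as the pseudonymisation function.
# The secret key shown here is a placeholder: in practice it belongs in a
# key vault or HSM, stored separately from the pseudonymised data set.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-key-from-a-key-vault"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for a direct identifier such as an email address."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.org", "logins_last_year": 42}
stored = {"subject": pseudonymise(record["email"]), "logins_last_year": record["logins_last_year"]}

print(stored)  # the stored record no longer contains the raw identifier
```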

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
