Data protection & privacy digest 4 – 17 March 2023: position of DPOs, user behavior analysis, creditworthiness and profiling

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress

DPO enforcement action: The EDPB launches a coordinated enforcement action on the designation and position of data protection officers. DPOs across the EU will be sent questionnaires, with the possibility of further formal investigations and follow-ups. According to the Portuguese data protection agency, it will ask DPOs to participate in the action voluntarily, and they do not have to identify themselves or the organisation concerned. The Spanish privacy regulator says it will analyse the practices of tens of thousands of public and private sector entities, (education, banking, health, security, financial solvency, etc.).

The questions will relate, among other things, to the designation, knowledge, and experience of the data protection officers, their tasks, and resources. Special attention will be paid to the independent and effective performance of the DPO’s tasks and their possible conflicts of interest, (where they exercise additional functions as compliance officers, IT managers, etc.), explains the Bavarian data protection supervisor. The requirement for DPOs to report directly to the highest management level of the controller or processor, and their operating conditions, (based on organisational charts, annual reports, etc.), will also be checked.

UK Data Protection reform resumes: The Data Protection and Digital Information Bill was reintroduced in the House of Commons. Following a rapid change in the UK government last summer, the reading of the original bill did not take place as expected. Much of the new bill is the same as the withdrawn one. The new document also followed a detailed co-design process with industry, business, privacy, and consumer groups. According to the government, it would reduce burdens on companies and researchers and boost the economy by 4.7 billion pounds over the next decade. The research briefing on the draft reform bill is available here.

Creditworthiness and profiling risks: The CJEU’s Advocate General suggests that the automated establishment of a person’s ability to service a loan constitutes profiling under the GDPR. In the related case, a German company governed by private law, (SCHUFA), provided a credit institution with a score for the citizen in question, which served as the basis for a refusal to grant credit. The citizen requested that SCHUFA erase the entry concerning her and grant her access to the corresponding data. SCHUFA merely informed her of the relevant score and of the principles underlying the calculation method, without informing her of the specific data included, arguing that the calculation method is a trade secret. Other related cases concerned the lawfulness of the storage of citizen data from public registers, (on discharge from remaining debts), by credit information agencies.

Official guidance

Data subject access rights: The Latvian data protection agency DVI explains what the right to access your data means. Every natural person has the right to obtain accurate information about their data, (or a copy of it), held by an organisation. For example, a person who attended a job interview but was not selected can contact the company to find out whether it has stored their personal data and, if so, demand an explanation of the purpose for which it is processed. The individual must first contact the organisation using the communication channels or methods specified in the privacy policy. The request should be as clear as possible, and include:

  • identifying information of the requester, (the organisation has the right to additional information, so the person can be identified correctly);
  • an indication whether the information is desired for all data or for a specific case;
  • an indication of the period for which information is to be provided;
  • precise requests referring to all or any of the above questions.

The organisation may refuse the request if it has already been answered, if it is disproportionately large, if the requester cannot be identified, or if the information is covered by other regulatory acts. But if the organisation does not respond to the request within a month, and does not provide the information, (or the reasons for refusal), the person has the right to file a complaint with the data protection authority.

Dematerialised receipts: The French privacy regulator CNIL looked at dematerialised receipts that merchants can offer you in place of traditional printed ones. You still must have the choice of whether or not to receive one, (via email, sms), as dematerialisation is not provided for by law. Dematerialised receipts allow the merchant to collect and reuse your data for advertising: but they must respect your rights by asking for your consent or by allowing you to opt out. If a merchant offers to let you retrieve your receipt by scanning a QR code with your smartphone, only the technical data necessary to establish the connection between the devices should be collected. Finally, the creation of a loyalty or online account is not mandatory to obtain your receipt.

User and Entity Behavior Analysis: UEBA techniques have a multitude of applications that always have something in common: recording user behavior in the past, modeling this behavior in the present, and, if possible, predicting what it will be like in the future. According to the Spanish privacy regulator AEPD, techniques used online collect massive amounts of data and almost always apply machine learning or AI. Users are always people, while entities can be animals, vehicles, mobile devices, sensors, etc. The application of these techniques depends on the specific domain, since it may be of interest to analyse the individual behavior of people or their behavior from a social perspective, in three main domains:

  • service and marketing optimisation; 
  • cybersecurity; 
  • health and safety.

When personal data is processed, the principles established in the GDPR are mandatory, including transparency, data minimisation, and purpose limitation. But in many cases, users are not informed about the types of techniques that are being used, the depth of the treatment, the scope of data sharing, or the potential impact that a data breach may have.
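The UEBA loop described above (record past behaviour, model it, flag present deviations) can be sketched in a few lines. This is a purely hypothetical illustration; the user metric (daily login counts) and the threshold are invented, and real UEBA systems use far richer features and models.

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], observed: float, threshold: float = 3.0) -> bool:
    """Flag `observed` if it lies more than `threshold` standard
    deviations from the baseline built on the user's past behaviour."""
    mu = mean(history)       # model of past behaviour: the average...
    sigma = stdev(history)   # ...and the typical spread around it
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# A user who normally logs in 4-6 times a day suddenly logs in 40 times.
logins_last_30_days = [5, 4, 6, 5, 5, 4, 6, 5, 4, 5] * 3
print(is_anomalous(logins_last_30_days, 40))  # True: the spike is flagged
print(is_anomalous(logins_last_30_days, 5))   # False: normal behaviour
```

Even this toy version shows why the transparency concerns above matter: the profile (`mu`, `sigma`) is itself personal data derived from the user without their necessarily knowing it.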

Algorithmic fairness: The UK privacy regulator ICO decided to update its guidance to help organisations adopt new technologies while protecting people and vulnerable groups. New content was added on AI and inferences, affinity groups, special category data, as well as things to consider as part of your DPIA. The updated guidance explains the differences between fairness, algorithmic fairness, bias, and discrimination. It also explains the different sources of bias that can lead to unfairness and possible mitigation measures. There is a new section about data protection fairness considerations across the AI lifecycle, from problem formulation to decommissioning. Technical terms are also explained in the updated glossary.

Enforcement decisions

Irish queries: The Irish data protection authority DPC in its 2022 report stated that the most frequent GDPR topics for queries and complaints were: access requests, fair processing, disclosure, direct marketing, and the right to be forgotten, (delisting and/or removal requests). At the same time, breach notifications were down 12% on 2021 figures. The most frequent cause of reported breaches was correspondence inadvertently sent to the wrong recipients, at 62% of the overall total. Where possible the DPC endeavoured to resolve individual complaints informally – as provided for in the Data Protection Act 2018. Overall, the DPC concluded 10,008 cases in 2022, of which 3,133 were resolved through formal complaint handling.

Medical research data: The French privacy regulator CNIL reminds two medical research organisations of their legal obligations – to carry out an impact assessment on data protection and to properly inform individuals. Health research must be authorised by the CNIL or comply with a reference methodology. These methodologies require a DPIA to be carried out before starting the research. A single analysis may cover a set of processing operations that present similar risks, (eg, similar projects, using the same IT tools). 

Information notices provided by the two organisations also did not specify the nature of the information collected, its retention period, the contact details of the data protection officer, or the procedures for appealing to the CNIL. Finally, an information notice stated that the data was anonymised, which was not the case, since the identity of the patients was merely replaced by a three-digit “patient number” and a “patient code” composed of the initials of the person’s first name and surname.
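The regulator’s point about the “patient code” can be illustrated with a short hypothetical sketch (the names and codes here are invented): a sequential number plus an initials-based code is pseudonymisation, not anonymisation, because the code still links back to an identifiable person.

```python
# Hypothetical version of the coding scheme the CNIL objected to.
patients = ["Alice Martin", "Bob Dupont", "Carla Nguyen"]

records = {}
for i, full_name in enumerate(patients, start=1):
    first, last = full_name.split()
    patient_number = f"{i:03d}"                   # three-digit patient number
    patient_code = (first[0] + last[0]).upper()   # initials-based patient code
    records[patient_number] = patient_code

print(records)  # {'001': 'AM', '002': 'BD', '003': 'CN'}

# Re-identification is trivial if a candidate list of names is known,
# which is why this is pseudonymised data, still in scope of the GDPR:
code_to_name = {(n.split()[0][0] + n.split()[1][0]).upper(): n for n in patients}
print(code_to_name["AM"])  # Alice Martin
```

Anonymisation, by contrast, would require that no such link back to the individual can reasonably be re-established by anyone.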

Political affiliation data: In Romania, a political party was fined following a data breach notification. The data stored on an operator’s server hosting an application became subject to a phishing attack. It was found that the operator did not implement adequate technical and organisational measures to ensure an appropriate level of security, such as the encryption or pseudonymisation of the personal data stored. This led to a loss of confidentiality through unauthorised access to and use of personal data such as name, surname, personal numeric code, e-mail, telephone number, and political affiliation.

Non-conformant data breach notice: The Norwegian data protection authority Datatilsynet imposed a fine of approx. 220,000 euros on the US company Argon Medical Devices for breaching the GDPR. In July 2021, Argon discovered a security breach that affected the personal data of all their European employees, including in Norway. Argon believed that they did not need to report the security breach until after they had a complete overview of the incident and all its consequences. The US company sent a notice to the Norwegian regulator only in September 2021, long after the 72-hour deadline for reporting a breach under Art. 33 of the GDPR. The security breach concerned personal data that could be used for fraud and identity theft.

Data Security

PETs: The OECD offers guidance on emerging privacy-enhancing technologies – digital solutions that allow information to be collected, processed, analysed, and shared while protecting data confidentiality and privacy. This often includes zero-knowledge proofs, differential privacy, synthetic data, anonymisation, and pseudonymisation tools, as well as homomorphic encryption, multi-party computation, federated learning, and personal data stores. However, the majority of these tools lack standalone applications, have limited use cases, and are still in the early stages of development.
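As a concrete taste of one PET from this list, here is a minimal, hypothetical sketch of differential privacy: a counting query with Laplace noise added, so an analyst learns a useful aggregate while any single individual’s contribution is masked. The dataset and the epsilon value are invented for illustration.

```python
import random

def dp_count(values, predicate, epsilon: float = 1.0) -> float:
    """Differentially private count: the true count plus Laplace noise
    with scale 1/epsilon (the sensitivity of a counting query is 1).
    The Laplace sample is built as the difference of two exponentials."""
    true_count = sum(1 for v in values if predicate(v))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Invented example data: ages of 8 individuals.
ages = [34, 29, 45, 52, 41, 38, 60, 27]
# The true count of people aged 40+ is 4; the published value is 4 plus noise.
print(dp_count(ages, lambda a: a >= 40, epsilon=1.0))
```

A smaller epsilon means more noise and stronger privacy; choosing it, and accounting for repeated queries, is exactly the kind of open deployment question the OECD guidance flags.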

Big Tech

Meta and Dutch users: Facebook Ireland acted unlawfully when processing the personal data of Dutch users, states an Amsterdam court. Between 2010 and 2020, users’ personal information was processed illegally for marketing purposes. Additionally, it was distributed to third parties devoid of legal justification and without properly informing users about it. Also, consent was not obtained before processing sensitive personal data for advertising purposes, such as sexual orientation or religion. This concerned both information voluntarily provided by users and information that Facebook Ireland collected by observing users’ online browsing patterns outside the Facebook service. 

Meta tracking tools: According to the Austrian data protection authority DSB, the use of Facebook’s tracking tools (Login and Meta Pixel) is a violation of both the GDPR and the “Schrems II” ruling. As a result of US surveillance laws requiring companies like Facebook to disclose users’ information to the authorities, the CJEU determined in 2020 that using US providers violates the GDPR. According to the NOYB foundation, which launched the complaint, numerous websites track users using Meta tracking technology to display personalised ads. Websites using this technology also send all user data to US multinationals. And while the EU-US Data Privacy Framework is waiting for approval from the European Commission, the US government continues bulk surveillance of EU users.

Meta’s WhatsApp settlement in the EU: The European Commission and the European network of consumer authorities have closed their investigation into Meta’s messaging app WhatsApp following a complaint made by the BEUC, (the European Consumer Organisation). WhatsApp has committed to better explain the policy changes it intends to make and to give users a possibility to reject them as easily as to accept them. Unfortunately, this will only apply to future changes to the app. However, the complaint identified multiple breaches of consumer and data subject rights since 2021 including aggressive commercial practices, and unclear and misleading terms of use and notices to its users. 

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
