
Data protection digest 18 Jun – 2 Jul 2024: end-to-end algorithmic audit, DPOs for small business, Vinted fine

In this issue we look at an end-to-end algorithmic audit, the multimillion-euro Vinted fine, Meta and Apple AI projects frozen in the EU, the fight against addictive feeds for minors in the US, and the Avanza Bank and Meta Pixel error case.

Stay up to date! Sign up to receive our fortnightly digest via email.

End-to-end algorithmic audit

The EDPB offers a non-binding auditing methodology for AI systems, specifically focused on impact assessment. A socio-technical, end-to-end algorithmic audit (E2EST/AA) should inspect a system in its actual implementation, processing activity and running context, looking at the specific data used and the data subjects impacted. It is designed to inspect algorithmic systems used in ranking, image recognition and natural language processing. An AI system may be composed of several algorithms, and an AI service or product may include several AI systems.

The audit is also an iterative process of interaction between the auditors and the development teams. The method provides templates and instructions to guide this interaction, specifying the data inputs auditors need to complete the assessment and validate results. One key template is the ‘model card’: a document designed to compile information about the training and testing of AI models, as well as the features and motivations of a given dataset or algorithmic model.
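As an illustration only, the kind of information a model card compiles can be sketched as a small data structure. The field names, the example model and all of its values below are assumptions for illustration, not taken from the EDPB templates:

```python
from dataclasses import dataclass, field

@dataclass
class ModelCard:
    """Minimal sketch of a model card: a structured summary of how a model
    was trained and evaluated, and why its dataset was chosen."""
    model_name: str
    intended_use: str
    training_data: str        # provenance and motivation of the dataset
    evaluation_metrics: dict  # metric name -> score on the test set
    limitations: list = field(default_factory=list)

    def summary(self) -> str:
        # One-line digest an auditor might skim before requesting full inputs
        metrics = ", ".join(f"{k}={v}" for k, v in self.evaluation_metrics.items())
        return f"{self.model_name}: {self.intended_use} ({metrics})"

# Hypothetical card for a ranking system under audit
card = ModelCard(
    model_name="cv-ranker-v2",
    intended_use="rank job applications for human review",
    training_data="2019-2023 anonymised applications, chosen to cover all job families",
    evaluation_metrics={"ndcg@10": 0.81, "demographic_parity_gap": 0.04},
    limitations=["not validated for non-EU labour markets"],
)
print(card.summary())
```

In an iterative audit, a card like this would be filled in by the development team and cross-checked by the auditors against the system's actual data flows.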

Vinted fine

The Lithuanian Data Protection Inspectorate (VDAI) imposed a €2,385,276 fine on Vinted, an online second-hand clothing trade and exchange platform. The violations concern transparency of information, notification, and the conditions for exercising data subject rights. VDAI investigated complaints from 2021 and 2022, forwarded by the French and Polish supervisory authorities, regarding the company’s possible improper handling of requests for data deletion (the “right to be forgotten”) and the right of access to data.

In response to the requests, the company stated that it would not take action because the individuals did not substantiate their requests under Art. 17 of the GDPR. It was also established that, to ensure the platform’s and its users’ safety, the company applied “shadow blocking” without individuals knowing about such processing (leaving them unable to exercise other rights established by the GDPR and their remedies). In addition, the company did not take sufficient technical and organisational measures to ensure, and to be able to demonstrate, that it took (or reasonably refused to take) steps regarding the right of access to data.

Meta non-compliance under DMA

The European Commission has found that Meta’s “Pay or Consent” advertising model fails to comply with the Digital Markets Act. In response to regulatory changes in the EU, Meta introduced a binary offer whereby EU users must choose between a monthly subscription to an ad-free version of its social networks, or free-of-charge access with personalised ads. This binary choice forces users to consent to the combination of their data and fails to provide them with a less personalised but equivalent version of Meta’s social networks.

A possible solution would be for users who do not consent to still get access to an equivalent service that uses less of their data. In cases of non-compliance, the Commission can impose fines of up to 10% of the gatekeeper’s total worldwide turnover, rising to 20% for repeated infringement. The Commission is also empowered to adopt additional remedies, such as obliging a gatekeeper to sell a business or parts of it, or banning it from acquiring additional services.

Non-material damage under the GDPR

The CJEU has found that the damage caused by a personal data breach is not inherently less serious than a physical injury. In the underlying case, a data controller operated a trading application in which a data subject opened accounts, entering personal data to do so. In 2020, those data were seized by third parties whose identity and purposes remain unknown.

An individual requesting compensation under the GDPR must prove not only that the infringement occurred but also that the violation caused them harm; this cannot be automatically assumed. In the event of identity theft, as in the above case, the data must have been misused by a third party. Also, determining the damages payable is up to the legal system of each Member State in each given context. 

Apple AI delayed in the EU

Apple has decided to delay the release of three new AI features in Europe due to EU competition rules requiring competing goods and services to be compatible with its devices. The company is concerned that, to meet the interoperability requirements of the Digital Markets Act, it may have to compromise the integrity of its devices in ways that endanger user privacy and data security. The features will debut in the US this autumn but won’t reach Europe until 2025.

More legal updates

US privacy legislation: On July 1, the Florida Digital Bill of Rights, the Oregon Consumer Privacy Act, and the Texas Data Privacy and Security Act entered into effect, joining laws in California, Colorado, Connecticut, Virginia, and Utah. Among other things, they guarantee consumers the rights to access, correct, and delete their data, and to opt out of its sale, targeted advertising, and certain profiling. There are also provisions on data minimisation, children’s data, consent for sensitive data, biometric data, and impact assessments.

Foreign adversaries: On June 23, the Protecting Americans’ Data from Foreign Adversaries Act of 2024 entered into effect. It makes it unlawful for a data broker to sell, license, rent, trade, transfer, release, disclose, or otherwise make available specified personally identifiable sensitive data of individuals residing in the US to North Korea, China, Russia, Iran, or an entity controlled by those countries. Sensitive data includes government-issued identifiers, financial account numbers, biometric information, genetic information, precise geolocation information, and private communications.

Minors’ data: To safeguard children’s internet privacy, New York State has enacted two new laws. The SAFE For Kids Act covers operators that offer minors an “addictive feed” as a significant part of their online or mobile service. Addictive feeds rely on signals such as the user’s past interactions, device privacy or accessibility settings, content displayed or blocked by the user, private communications, search queries, and chronological ordering. The other piece of legislation, the Child Data Protection Act, governs the processing of minors’ data by operators, processors and third parties, with obligations reminiscent of the GDPR.

More official guidance


Messenger standardised audit: The EDPB offers the Standardised Messenger Audit initiative to inspect, from a data protection perspective, any messenger service used within businesses. It consists of two documents: a requirement catalogue and an audit methodology. The requirements in the catalogue are formulated so that a distinction is made between MUST, SHOULD and MAY requirements for the respective data protection principles. The catalogue is also closely based on the structure and outline of the GDPR.
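The MUST/SHOULD/MAY distinction works like the requirement levels familiar from technical standards. As a toy sketch only (the requirements, field names and pass rule below are invented, not taken from the actual catalogue), a catalogue-driven check could look like this:

```python
from enum import Enum

class Level(Enum):
    MUST = "must"      # mandatory for a positive audit result
    SHOULD = "should"  # expected unless deviation is justified
    MAY = "may"        # optional good practice

# Toy catalogue entries: (requirement, level) pairs
catalogue = [
    ("End-to-end encryption is enabled for all chats", Level.MUST),
    ("Message retention periods are documented", Level.SHOULD),
    ("Disappearing messages are offered to users", Level.MAY),
]

def audit(findings: dict) -> bool:
    """A service passes this toy audit only if every MUST requirement is met;
    SHOULD and MAY items would feed into a qualitative report instead."""
    return all(findings.get(req, False) for req, lvl in catalogue if lvl is Level.MUST)

print(audit({"End-to-end encryption is enabled for all chats": True}))
```

Separating hard requirements from recommendations in this way lets one catalogue serve both as a pass/fail checklist and as a maturity guide.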

Data processor: According to the Latvian data protection regulator, for an organisation to be considered a processor it must meet two basic conditions: be a separate and independent organisation, and process personal data on behalf of the controller. An organisation usually appoints a processor when it lacks the necessary knowledge, resources, etc. Selecting a processor requires a feasibility assessment: whether the security measures chosen by the processor match the controller’s wishes and needs, as well as the processor’s reputation and accountability. Finally, the signing of the agreement indicates the readiness of both parties to cooperate. Further guidance can be read here.

Joint controllership: The Bavarian State Data Protection Commissioner has published new guidance (in German) on the legal concept under which two or more controllers jointly determine the purposes and means of processing. The GDPR requires a clear allocation of responsibilities, including where a controller determines the purposes and means of processing jointly with other controllers, or where a processing operation is carried out on behalf of a controller. However, joint controllership may still feel less “familiar” than the long-established contractual data processing arrangement.

DPOs getting into small business

The Data Protection Officer is a profession increasingly represented in small enterprises, according to the French data protection regulator CNIL. The regulator drew this conclusion from a survey of 3,625 DPO respondents in the country, comprising 2,842 internal, 366 shared and 417 external DPOs. Certain characteristics, such as age distribution, geographical spread, and contract type, have stabilised, while others have changed significantly between 2019 and 2024: 57% of respondents now work in structures with fewer than 250 employees (+19% compared to 2019). Also, 91% are convinced of the social usefulness of the DPO’s function and profession for the protection of customers’, users’ and citizens’ personal data.

Digital identity

The US NIST meanwhile has launched a collaborative project to adapt its digital identity guidelines to support public benefits programs, such as those designed to help beneficiaries pay for food, housing, medical and other basic living expenses. In response to heightened fraud and related cybersecurity threats during the COVID-19 pandemic, some benefits-administering agencies began to integrate new safeguards such as individual digital accounts and identity verification, also known as identity proofing, into online applications.

However, the use of certain approaches, such as those relying on facial recognition or data brokers, has raised questions about privacy and data security (and about potential biases that disproportionately impact communities of colour and marginalised groups).

Enforcement decisions

Avanza Bank and Meta Pixel: Sweden’s privacy regulator fined Avanza Bank AB €1.3 million for failing to implement security measures, leading to the unauthorised transfer of the personal data of more than half a million data subjects to Meta after two functions of the Meta Pixel analytics tool were accidentally switched on. The controller used Meta Pixel to measure the effectiveness of the bank’s Facebook advertising. Two new functions of the tool, Automatic Advanced Matching and Automatic Events (which recognise form fields and buttons used on the page), were activated by mistake.

Avast browsing data: The US Federal Trade Commission will require Avast to pay $16.5 million and will prohibit the company from selling or licensing any web browsing data for advertising purposes. The FTC alleged that UK-based Avast Limited, via its Czech subsidiary, unfairly collected consumers’ browsing information through the company’s browser extensions and antivirus software, stored it indefinitely, and sold it without adequate notice or consumer consent.

Car retail software: A cyber outage at a major retail software provider for automobile dealers delayed car sales throughout North America (approx. 15,000 retail locations), the Guardian reports. CDK, which provides various kinds of software to car dealerships, proactively shut down most of its systems and is working to reinstate its services.

Cloud banking security

Outsourcing cloud services to outside providers entails serious risks in terms of data security, operational continuity, and regulatory compliance, according to a new analysis by DLA Piper. For example, financial institutions retain full operational responsibility even when they outsource critical services, including risk management, performance monitoring, and vendor selection. To that end, the EU has established two legal frameworks concerning the provision of cloud and ICT services (DORA and NIS 2), complementing guidelines issued by the European Central Bank.

Neuro data processing

In addition to privacy and data protection, fundamental rights such as human dignity and physical and mental integrity are jeopardised by certain uses of neuro data, states an EDPS analysis. AI systems may also make it technically possible for private entities to exploit neuro data for workplace or commercial surveillance. Certain uses of neuro data pose unacceptable risks to fundamental rights and are likely unlawful under EU law.

In other cases (e.g. controlling specific aspects of a video game, monitoring concentration in educational environments, or managing chronic pain by modifying brain activity), mitigating techniques should always include impact assessments, data minimisation, transparency, accuracy, necessity and fairness of processing, local storage of raw data, and efficient anonymisation for re-use and analysis.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation

