Data protection digest 2 – 16 Oct 2024: knowing your processors and sub-processors, automated driving, election technologies

Reliance on processors and sub-processors

The EDPB has issued an opinion on the interpretation of certain duties of controllers relying on processors and sub-processors, arising from Art. 28 of the GDPR, as well as the wording of controller-processor contracts. In particular, controllers should have information on the identity of all processors, sub-processors, etc. readily available at all times, regardless of the risk associated with the processing activity. To this end, the processor should proactively provide the controller with all this information and keep it up to date at all times. Download the opinion here
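As an aside, one way a controller might keep such information readily available is a simple, continuously updated register of its processing chain. The sketch below (Python, with hypothetical fields and entries; nothing in it is prescribed by the opinion) illustrates the idea of answering “who touches this data?” without further lookups:

from dataclasses import dataclass, field
from datetime import date

@dataclass
class ProcessorRecord:
    """One entry in a controller's register of processors and sub-processors."""
    name: str             # legal entity name
    role: str             # "processor" or "sub-processor"
    engaged_by: str       # the controller, or the processor that engaged it
    activity: str         # the processing activity the entity is involved in
    location: str         # country, plus transfer mechanism if outside the EEA
    last_verified: date   # when the controller last confirmed this entry

@dataclass
class ProcessorRegister:
    records: list = field(default_factory=list)

    def add(self, record: ProcessorRecord) -> None:
        self.records.append(record)

    def chain_for(self, activity: str) -> list:
        """Return every processor and sub-processor involved in one activity."""
        return [r for r in self.records if r.activity == activity]

# Hypothetical entries: the register answers "who processes this data?" at any time.
register = ProcessorRegister()
register.add(ProcessorRecord("CloudHost Ltd", "processor", "Controller GmbH",
                             "email hosting", "Ireland", date(2024, 10, 1)))
register.add(ProcessorRecord("SpamFilter Inc", "sub-processor", "CloudHost Ltd",
                             "email hosting", "USA (SCCs)", date(2024, 10, 1)))
print([r.name for r in register.chain_for("email hosting")])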

More legal updates

Scaling up user tracking: The EDPB also clarifies the applicability of the ePrivacy Directive to emerging tracking solutions. It explains several key elements, namely ‘information’, ‘terminal equipment of a subscriber or user’, ‘gaining access’ and ‘storage of information’. For instance, ‘information’ covers both non-personal and personal data, regardless of how the data was stored and by whom (third party, user, manufacturer, or any other scenario).

Also, it would be incorrect to conclude that a third party does not require consent to access user information simply because it did not store that information. The consent requirement applies even when a read-only value is accessed (eg, requesting the MAC address of a network interface via the OS API). It applies to a non-exhaustive list of use cases including URL and pixel tracking, local processing, tracking based on IP only, intermittent and mediated Internet of Things reporting, and unique identifiers.
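To make the read-only example concrete, the short snippet below (Python standard library, purely illustrative) reads the MAC address of a network interface without writing anything to the device; under the guidelines, this kind of access to information on terminal equipment would still require consent:

import uuid

# Reading the MAC address of a network interface via the operating system:
# a pure read operation - no cookie or file is ever written to the device.
mac_int = uuid.getnode()   # 48-bit hardware address as an integer
mac = ":".join(f"{(mac_int >> shift) & 0xff:02x}" for shift in range(40, -8, -8))
print(mac)                 # a stable device identifier that could be used for tracking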

Legitimate interest assessment: The CJEU’s recent decision that legitimate interests can cover purely commercial interests is now being followed by new EDPB guidelines. For processing to be based on legitimate interest, three cumulative conditions must be fulfilled: a) the pursuit of a legitimate interest by the controller or by a third party; b) the need to process personal data for the legitimate interest(s) pursued; and c) the interests or fundamental freedoms and rights of the data subjects concerned do not take precedence over the legitimate interest(s) of the controller or of a third party. The assessment should be done before carrying out the relevant processing activity, with special attention when the data subjects are children.
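Since the conditions are cumulative, failing any one of them rules out legitimate interest as a legal basis. A schematic illustration only (the real assessment is a documented, case-by-case analysis):

def may_rely_on_legitimate_interest(
    interest_is_legitimate: bool,    # (a) lawful, clearly articulated, real and present
    processing_is_necessary: bool,   # (b) no less intrusive way to achieve the interest
    rights_do_not_override: bool,    # (c) balancing test, stricter where children are involved
) -> bool:
    # All three conditions must hold; failing any one rules out Art. 6(1)(f) GDPR.
    return interest_is_legitimate and processing_is_necessary and rights_do_not_override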

Consent management in Germany

The German government has tabled a new regulation on cookie consent management. It establishes a recognised consent management service, intended to provide a user-friendly alternative to the multitude of individual decisions that end users currently have to make through cookie banners. The aim is to strengthen trust in such services through a recognition procedure run by an independent body. For providers of digital services, this process offers a way to request and store consent “without having to disturb the end user” by displaying a consent banner each time. Read further technical details in the original publication (in German).
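As a rough illustration of how such a service could spare users repeated banners, the hypothetical flow below has a digital service check for an already recorded decision before displaying anything. All names and the lookup function are assumptions for the sketch, not part of the regulation:

from typing import Optional

def stored_consent(user_token: str, provider: str, purpose: str) -> Optional[bool]:
    """Return True/False if the user already decided for this provider and purpose,
    or None if no decision is on record. (Illustrative stub - a real recognised
    service would be queried over an authenticated interface.)"""
    decisions = {("user-123", "example-news.de", "analytics"): False}
    return decisions.get((user_token, provider, purpose))

decision = stored_consent("user-123", "example-news.de", "analytics")
if decision is None:
    print("No stored decision - a consent banner must still be shown.")
else:
    print(f"Reuse the stored decision ({decision}) without disturbing the end user.")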

AI programming assistants: As AI usage continues to intensify, the use of AI programming assistants has already spread to numerous public and private entities. These tools are employed at different stages of the software development process – primarily to generate source code, to help developers familiarise themselves with the source code of new projects, or to generate tests and documentation. The French and German information security agencies have prepared recommendations (in English) on the risks associated with the use of AI programming assistants, along with concrete mitigation measures: internal security guidelines, training, instructions on permissible tools and data usage, and risk and success assessments.

More official guidance

Children and the digital environment: The Spanish regulator AEPD stresses the importance of having an age verification system where the burden of proof is on the person who is of the age required to access the content, and never on the minor. The system does not need to verify a specific age or date of birth, only that the established age threshold has been exceeded. By default, these measures protect minors from risks related to accessing adult content, such as contact with people who may put them in danger, the contracting of products and services, the monetisation of their data, and the incitement of addictive behaviours that affect their physical or mental integrity, among other aspects.
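The threshold-only principle can be sketched as follows (a minimal illustration, not the AEPD’s specification): the verifying side learns a single yes/no answer, never the date of birth or the exact age:

from datetime import date

def over_threshold(date_of_birth: date, threshold_years: int, today: date) -> bool:
    """Return only whether the age threshold has been exceeded - nothing else."""
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= threshold_years

# The service relying on the result sees a boolean, not the underlying data:
print(over_threshold(date(2007, 5, 1), 18, date(2024, 10, 16)))   # False - access denied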

Data protection audit framework: A new toolkit from the UK Information Commissioner’s Office helps organisations assess their compliance with some of the key requirements under data protection law. Data controllers, auditors or data protection specialists may use it for various purposes, such as creating a privacy management programme, auditing existing practices against the ICO’s expectations, improving those practices, recording and tracking progress, or increasing senior management engagement and privacy awareness across the organisation.

Automated driving: Several data protection authorities in Germany are consulting with Volkswagen AG about new types of data processing. Volkswagen intends to use sequences of sensor and image data of the vehicles’ surroundings, collected from customer vehicles, to develop driver assistance systems and automated driving functions more quickly and continuously, as key technologies for improving road safety. From the fourth quarter of 2024, the company plans to start triggering the extraction of such data and processing it in some vehicle series – initially only in Germany – based on predetermined, narrowly defined scenarios, subject to the consent of vehicle users.

Enforcement decisions

US hotels fine: The US FTC is taking action against Marriott and Starwood over multiple data breaches between 2014 and 2020 that impacted more than 344 million customers worldwide. Marriott and Starwood failed to implement appropriate password controls, access controls, firewall controls or network segmentation, patch outdated software and systems, adequately log and monitor network environments, and deploy adequate multifactor authentication. In addition to monetary and other penalties (including certifying compliance to the FTC annually for 20 years), the companies must now provide a method for consumers to request a review of unauthorised activity in their loyalty rewards accounts and restore any loyalty points stolen by malicious actors.

“Afraid of answering the phone”: The UK Information Commissioner meanwhile issued hefty fines to two companies for predatory marketing campaigns, often targeting elderly people with dementia. These calls were made to people who had explicitly opted out of receiving marketing communications. Some individuals were subjected to repeated phone calls, attempting to pressure them into buying warranties for white goods, such as fridges and washing machines, that they did not need. 

To that end, the ICO is encouraging the public to take proactive steps to safeguard their loved ones: a) look out for rogue direct debits being paid for unknown reasons; b) ensure they are registered with the TPS (Telephone Preference Service), which provides a free and easy way to opt out of unwanted marketing calls; c) if they are still receiving unsolicited marketing calls despite opting out, report these incidents to the regulator without delay.

‘Deposit and return’ app

The Danish data protection authority has investigated Dansk Retursystem’s app “Pant” (a deposit and return system for bottles and cans), which allegedly processed users’ financial information. The investigation showed that the app has a built-in component that needs to obtain the user’s account information in order to pay out money to the right account. But the component, which is made available by a third party, can also collect information about the user’s balances, identity information, transaction history, etc.

If the app’s APIs allow the processing of more personal data than is necessary for its intended use, the authority can decide to issue a warning for non-compliance. Such risks arise especially with APIs and services provided by an external supplier.
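As an illustration of the data minimisation point, the sketch below (hypothetical names, not the “Pant” app’s actual integration) requests and retains only the field needed for payouts and discards everything else the third-party component might expose:

from dataclasses import dataclass

@dataclass
class PayoutDetails:
    account_number: str   # the only field needed to pay deposit refunds out

def extract_payout_details(component_response: dict) -> PayoutDetails:
    # Deliberately keep nothing but the account number; balances, identity data
    # and transaction history offered by the supplier are never stored.
    return PayoutDetails(account_number=component_response["account_number"])

response = {
    "account_number": "DK50 0040 0440 1162 43",   # needed for the payout
    "balance": 1234.56,                           # not needed - dropped
    "transactions": ["..."],                      # not needed - dropped
}
print(extract_payout_details(response))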

Data security

Police access to personal data: The CJEU has ruled that police access to data contained in a mobile telephone is not necessarily limited to the fight against serious crime. The review must strike a fair balance between the legitimate interests relating to the investigation and the fundamental rights. Such access must, moreover, be subject to a prior review carried out either by a court or an independent administrative authority. The data subject must be informed of the grounds on which the authorisation to access their data is based, as soon as the communication of that information is no longer liable to jeopardise the investigations. 

Meta AI avoiding the EU market: Meta has introduced its AI assistant in the UK and Brazil after launching it in the US and Australia. However, because of strict regulations in the EU, services are still not available there. Users must complete an objection form found in the privacy settings of their applications if they would like to prevent Meta from using their Instagram and Facebook posts to train its AI models, The Guardian reports. Users of Meta’s AI products, however, are unable to prevent the Llama model from being trained and improved by their interactions with the AI tools.

Election technologies

Electors’ data: When it comes to elections around the world, we find ourselves in a terrain increasingly populated by digital technologies (Biometric Voter Registration, Electronic Voter Identification, and Result Transmission), explains Privacy International. This calls for changing customs and procedures to guarantee free, fair, and transparent elections. Election observers must also learn new techniques and skills. Use of biometric information should only occur when it is required to properly identify or authenticate voters. It must be kept secure, separate from other information, and not on any publicly accessible record where access may be purchased.

If the digital system fails, backup plans should be in place, such as distributing hardcopy registers to voting locations. No further use of the collected data, including sharing with law enforcement or security agencies, is permitted. The lowest possible access level should be the default setting. Modern encryption and secure data channels should be used for transmission. When there is less than 100% internet coverage across all stations, for example, a backup mechanism, like using satellite phones, should be provided. 
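As a rough sketch of the encryption recommendation (illustrative only; it assumes the third-party Python ‘cryptography’ package and a key provisioned securely out of band), a polling station’s result record could be encrypted before it leaves the station, so that no transmission channel, internet or satellite fallback, ever carries the tally in the clear:

import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice provisioned securely before election day
cipher = Fernet(key)

result = {"station_id": "ST-0421", "ballots_cast": 1083, "spoiled": 12}
token = cipher.encrypt(json.dumps(result).encode("utf-8"))

# Only the encrypted token is transmitted; only the central tallying system holds the key.
print(cipher.decrypt(token))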

Party political use of personal data: Finally, on a related item, ahead of the recent UK general election the NGO Good Law Project asked its supporters to contact all of Britain’s political parties requesting that they stop processing their personal data (eg, political parties can combine the electoral roll with other data for targeting campaigns) and refrain from using it. Every party complied except Nigel Farage’s Reform Party. The NGO has sent Reform a pre-action protocol letter warning that it is breaking the law.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.
