
Data protection digest 3 – 17 Jan 2024: digital services transparency and risk assessment in the regulatory spotlight

Our latest data protection bulletin focuses on digital services transparency and safety, from decentralised clinical trials and health apps to electronic payments and audience measurement. Data transfer impact assessments and the performance of DPOs also feature in this issue.

Sign up to receive our fortnightly digest via email.

Legal processes

Digital Services Act: Online services will face new obligations when the EU’s digital services regulation becomes applicable on 17 February. The new regulation aims to reduce illegal content and to increase the transparency of advertising and recommendation systems and the protection of minors. The internet giants have already been supervised and regulated directly by the European Commission since mid-2023, whereas Member States become responsible for the supervision of smaller platforms as of mid-February.

EU adequacy decisions list: The European Commission successfully concluded its review of 11 existing adequacy decisions. Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay thus continue to benefit from adequate data protection safeguards. The Commission also monitors the more recent arrangements in place with the UK, US, Japan and South Korea.

Regulatory updates

Decentralised clinical trials: To support sponsors in designing decentralised clinical research projects, the French data protection regulator CNIL and its state partners are setting up a pilot phase from January to June 2024. Twenty projects will be selected to receive targeted support. In 2022, the European Commission published European recommendations on decentralised clinical trials in the wake of the COVID-19 pandemic. Each application must include:

  • a specific question mentioning the decentralised component and summarizing the problem encountered;
  • a proposal for a complete scenario for the implementation of the decentralised element of the research project, a summary of the protocol and the information notice for future participants.

DPO evaluation: The EDPB identified areas of improvement to promote the role and recognition of data protection officers. In 2023, thousands of organisations and DPOs were contacted across the EEA, covering a wide range of sectors, and more than 17,000 replies were received and analysed. The majority of the DPOs surveyed declare that they have the necessary skills and knowledge to do their work and receive regular training; they have clearly defined tasks in line with the GDPR and do not receive instructions on how to exercise their duties. They generally have sufficient resources to carry out their tasks and are, in most cases, involved in decisions relating to personal data.

However, the answers highlight a significant disparity in resources between the DPOs of large companies and those of small public bodies: public-sector DPOs often carry out their duties alone, while their private-sector counterparts generally have a team.

Transfer Impact Assessment

A Transfer Impact Assessment must be undertaken by controllers or processors acting as data exporters, with the assistance of the importer, before transferring data from a European Economic Area country to a third country where the transfer is based on an Art. 46 GDPR transfer tool. Since the importer holds much of the information needed for this assessment, its cooperation is essential to completing the TIA. To that end, the French data protection authority has published guidance on how the analysis can be carried out by following the steps set out in the EDPB’s recommendations. The draft TIA guide is available in English; the consultation on it is open until 12 February.

(If the country of destination is covered by an adequacy decision by the European Commission, the exporter is not subject to this obligation. The same applies if the transfer is carried out based on one of the derogations listed in Art. 49 of the GDPR).

Cookies and audience measurement

The Spanish data protection authority published a guide on the use of cookies for audience measurement (in Spanish). Managing a website or mobile application generally requires a publisher to use traffic or performance statistics. The information processed through cookies for this purpose can be managed directly by the publisher or by a provider offering a comparative audience measurement service. In that case, the provider acts as a data processor for one or more publishers.

Cookies used to obtain traffic or performance statistics may be exempt from consent under certain conditions (limited strictly to what is necessary for the provision of the service). To remain exempt from consent, these cookies or similar technologies must not result in the data being cross-referenced with other processing operations or transmitted to third parties. In addition, they must not allow aggregate tracking of a person’s navigation across different applications or browsers (as is the case with audience measurement offers available on the market).

Similarly, the Austrian data protection authority published a FAQ on cookies and data protection (in German). In particular, it explains what “technically necessary” cookies are, how to use industry standards or “cookie consent tools”, and how to identify the GDPR-governed roles and responsibilities of controller and processor when cookies are set for your digital services.

More official guidance

Fitness trackers: Such apps and devices are usually connected to the Internet, as well as to other apps and devices of various kinds. This multiplies the sensitive data processed and shared, and the related IT security risks. According to the Italian data protection agency, it is therefore always good to adopt some important precautions when using these tools:

  • always read the information notice carefully (who will process your data, and how);
  • minimise data collection (disable features that are not essential, use a pseudonym, delete data);
  • do not grant permissions that are not essential for the device or app to function (such as access to contacts, photos, the agenda or the microphone);
  • put safety first (complex and secure authentication, downloads via official channels, periodic updates);
  • if you don’t use it, turn it off or uninstall it from your device; and
  • avoid the use of devices and apps by minors unless supervised by an adult.

Generative AI: Meanwhile, the UK Information Commissioner’s Office (ICO) has launched a consultation series on generative artificial intelligence. Generative AI models are being used across the economy to create new content, from music to computer code. The first consultation examines when it is lawful to train generative AI models on personal data scraped from the web. The ICO is seeking views from a range of stakeholders, including developers and users of generative AI, legal advisors and consultants working in this area, civil society groups and other public bodies with an interest in generative AI. The first consultation is open until 1 March.


CJEU ruling

Controller’s (non) strict liability: In one of its recent decisions the CJEU held that a controller will be held liable for a breach committed intentionally or negligently by a processor carrying out processing operations on its behalf. However, the processor may be held solely liable if it carried out the processing:

  • for its own purposes;
  • in a manner incompatible with the framework of, or arrangements for, the processing as determined by the controller; or
  • in such a manner that it cannot reasonably be considered that the controller consented to it.

The case relates to the development of a COVID-19 mobile application, raising questions of joint controllership between the IT service provider and the Lithuanian public health centre that ordered its creation but did not enter into a contract for its publication. The app was eventually made available on Google Play, and its privacy policy still referenced both the public centre and the service provider as controllers.

Unsolicited marketing

Food delivery spam: The UK Information Commissioner fined food delivery company HelloFresh 140,000 pounds for 79 million spam emails and 1 million spam texts sent over seven months. The marketing messages were sent based on an opt-in statement that made no reference to marketing via text. Whilst there was a reference to marketing via email, it was included in an age confirmation statement that was likely to unfairly incentivise customers to agree. Customers were also not given sufficient information that their data would continue to be used for marketing purposes for up to 24 months after cancelling their subscriptions.

“Do not call” register: The UK Commissioner also fined Poxell Ltd 150,000 pounds for making over 2.6 million unlawful marketing calls between March and July 2022. The company made dozens of calls to individuals with dementia and other serious illnesses, offering home improvement solutions. Its aggressive sales staff failed to identify themselves, to allow their number to be displayed to the person receiving the call, or to provide a contact address or freephone number when asked. After receiving the initial investigation letter, the company continued to make unsolicited direct marketing calls until its account was terminated by its communications service provider.

Customer data deletion: The Danish data protection regulator imposed a fine of approx. 33,000 euros against the Royal Theater for not having laid down rules for deleting customer information for marketing use. The theatre stored information on approx. 520,000 customers and newsletter recipients for marketing purposes, without having set deletion deadlines or established fixed procedures or guidelines for deleting the information. The information was only deleted in cases where individual customers specifically requested deletion or revoked their consent to receive direct marketing. 

Data breaches

Inappropriate coding: The Danish data protection regulator also recommended a record fine of approx. 2 mln euros against Netcompany. As a data controller, it had not implemented appropriate security measures in connection with the development of a digital post system. The system enabled users to read and respond to their digital correspondence from the authorities, while also being able to access their medical records and pay bills. Netcompany used inappropriate coding in the component that authenticated users. When the system was put into operation in March 2022, an error therefore occurred almost immediately as several users logged on and accessed other users’ sensitive information.
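The published summary does not detail the exact coding flaw, but "several users logged on and accessed other users' information" describes a classic failure class: an authentication component that keeps the current user in state shared across concurrent requests. The class names below are hypothetical, a minimal sketch of the anti-pattern and one common fix, not Netcompany's actual code:

```python
import threading

class BrokenAuthenticator:
    """Anti-pattern: the authenticated user lives on the shared instance,
    so concurrent logins overwrite each other and sessions mix."""

    def __init__(self):
        self.current_user = None  # one slot shared by ALL requests

    def login(self, username):
        self.current_user = username

    def fetch_inbox(self):
        # Returns mail for whoever logged in *last*, not the caller.
        return f"inbox of {self.current_user}"


class SafeAuthenticator:
    """Fix: keep the user in per-request state (per-thread here), or pass
    the session explicitly, so requests can never see each other's user."""

    def __init__(self):
        self._local = threading.local()

    def login(self, username):
        self._local.user = username

    def fetch_inbox(self):
        return f"inbox of {self._local.user}"
```

With the broken version, two users logging in at nearly the same time can each be served the other's correspondence; eliminating the shared mutable variable (or isolating it per request) removes the race entirely, which is why such flaws are typically caught by concurrency-aware testing before go-live.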

Password recycling: Finally, tech giant 23andMe, a DNA-testing company, is blaming its users for a data breach, according to reports. The October breach exposed the 23andMe accounts of about 6.9 million users. Customers received a letter from the corporation informing them that 23andMe was not responsible for the incident. Rather, it resulted from users’ failure to safeguard their account credentials: the key that allowed criminal actors to abuse 23andMe’s DNA Relatives matching service was supplied by customers who recycled passwords exposed in prior data breaches targeting other websites. The corporation has since been sued many times, with every claim citing inadequately secured customer information.

More enforcement decisions

Electronic payments: The French data protection regulator imposed a fine of 105,000 euros on NS CARDS France. The company publishes the “Neosurf” website and mobile app, which allow users to make online payments after registering. The company had set a ten-year retention period at the end of which user accounts were deactivated, but not deleted; the account data was therefore kept indefinitely. In addition, the ten-year retention period was applied to all user accounts, without distinguishing the data that actually had to be kept, for example under consumer-law obligations. Another failing concerned passwords: the complexity rules were insufficiently robust, and passwords were stored in plain text in the database, associated with the users’ email addresses and IDs.
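Plain-text password storage of the kind sanctioned here is avoidable with standard-library primitives alone. As an illustrative sketch (function names are my own, not from the decision), the widely recommended pattern is to store only a salted, slow key-derivation hash and recompare on login:

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> str:
    """Derive a salted PBKDF2-HMAC-SHA256 hash; only this string is stored,
    never the password itself."""
    salt = os.urandom(16)  # unique per user, so identical passwords differ
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash from the stored salt/iterations and compare
    in constant time to avoid timing side channels."""
    _, iterations, salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```

Because the database then holds only salt and digest, a leak of the table does not directly expose user credentials, and recycled passwords from other breaches cannot be read off the records.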

The regulator also noted the deposit of Google Analytics cookies on the user’s terminal without their consent. NS CARDS France also used a reCAPTCHA mechanism, provided by Google, when creating the account and logging in to the website and mobile application. The collected data was transmitted to Google for analysis but the company did not provide any information to the user and did not obtain their prior consent.

Risk assessment failed: Meanwhile, the Dutch data protection authority imposed a fine of 150,000 euros on International Card Services (ICS). ICS failed to carry out a DPIA before the company started digitally identifying customers in the Netherlands in 2019. Furthermore, the personal information used for identification was sensitive. In addition to customers’ names, addresses, telephone numbers and e-mails, this included a photo that customers had to take of themselves and send via a mobile phone or webcam. ICS then used these photos to compare them with copies of customers’ IDs. 

Data security

Data breach types: The Danish data protection authority focuses on 10 typical breaches of personal data security and offers concrete proposals on how they can be avoided (in Danish). These include auto-complete causing e-mails to be sent to the wrong recipients, overly broad access to data on network drives, unauthorised access to data due to poor design, coding errors and insufficient testing, failure to delete data using digital tools, loss or theft of portable devices holding unencrypted data, disclosure of data stored in template and form solutions, and more.

My Health My Data: Washington State published a FAQ on the My Health My Data Act. It is the first privacy-focused law in the United States to protect personal health data collected and shared outside the scope of federal healthcare privacy regulations. It concerns information that can identify a consumer’s past, present, or future physical or mental health status. For example, information about the purchase of toilet paper or deodorant is not consumer health data, while data from an app that tracks someone’s digestion or perspiration is. Regulated entities and small businesses shall:

  • publish a separate and distinct link to their consumer health data privacy policy on their homepage;
  • secure valid authorisation from a customer to sell their data. 

Consumers have a right to withdraw consent and a right to have their data deleted. The act takes effect on 31 March for regulated entities. Small businesses have until the end of June to comply with new rules.

Big Data

Meta’s “Pay or okay” consent model: Privacy-advocacy group NOYB claims that Meta unlawfully ignores users’ right to easily withdraw consent, and has filed a new complaint with the Austrian data protection authority. According to Meta, its Facebook and Instagram services abide by EU rules requiring that users have the choice of whether their data may be gathered and used for targeted advertising: users who agree to be tracked receive a free service funded by advertising income. However, while one click is enough to consent to tracking, users can only withdraw their consent by switching to a paid subscription, NOYB concludes.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation

