
Data protection digest 18 Dec – 2 Jan 2024: EDPB says too early to revise GDPR, cross-border enforcement challenge ahead

In this issue, you will find the main trends in data privacy that 2024 will inherit from last year. The main areas of concern include GDPR modernisation and cross-border enforcement, the fair use of AI, international data transfers, the balance between data security and the data-driven economy, as well as children’s privacy online.

Regulatory updates

5 years of the GDPR: The EDPB considers the application of the GDPR in its first five and a half years to have been successful. It is too early to revise the regulation, although several important challenges lie ahead, such as procedural rules relating to cross-border enforcement. The EDPB will continue to support the implementation of the GDPR, in particular by SMEs, seeking greater clarity and uniformity in the guidance and powers available. The existing tools in the GDPR have the potential to achieve this goal, provided they are used in a sufficiently harmonised way. In addition, the supervisory authorities need sufficient resources to continue carrying out their tasks.

“Cookie fatigue”: The EDPB also welcomed the voluntary business pledge initiative by the European Commission to simplify how consumers manage cookies and personalised advertising choices. It would ensure that users receive concrete information on how their data is processed, as well as on the consequences of accepting different types of cookies. Users would therefore have greater control over the processing of their data. However, the EDPB flagged that adherence to the cookie pledge principles by organisations does not equal compliance with the GDPR or ePrivacy Directive.

COPPA: The US Federal Trade Commission plans to strengthen children’s privacy rules to further limit companies’ ability to monetize children’s data. The new rule would require targeted ads to be off by default, limit push notifications, restrict surveillance in schools, limit data retention, and strengthen data security. COPPA rules require US websites and online services that collect information from children under 13 to obtain verifiable parental consent before collecting, using, or disclosing these children’s personal information (including persistent identifiers, geolocation data, photos, videos, and audio).

UK BCRs

The UK Information Commissioner has updated its guide on binding corporate rules for organisations managing data transfers between the UK and the EU. Organisations with an existing EU BCR can add the UK Addendum, thereby creating a new UK BCR that covers UK restricted transfers. The Addendum contains all the relevant provisions of Art. 47 of the UK GDPR, meaning that your EU BCR will work in the UK. Finally, under the terms of the UK BCR Addendum, if your EU BCR is suspended, withdrawn or revoked, this also suspends, withdraws or revokes your UK BCR. In that case you must not transfer personal data under your UK BCR and must use another international transfer mechanism.

Log data access

An administrative court in Finland has published a decision regarding the right to inspect log data. An employee of a bank, who was also its customer, demanded to know which persons had reviewed his customer information during the bank’s internal audit. The bank refused to disclose the identity of the employees, because the log data generated by viewing the information constituted the personal data of the employees in question. However, the bank did give the reason why the customer data had been viewed.

The person complained about the bank’s procedure to the data protection commissioner’s office. The regulator rejected the request and stated that the bank did not need to provide information about the identity of its employees. The case ultimately reached the CJEU. The EU top court ruled that everyone has the right to know when and why queries were made on their data. However, there is no right to receive information about the persons who processed that data under the authority of their employer and in accordance with the employer’s instructions.

Health data processing

Certain processing of health data is subject to preliminary formalities with the data protection authority. To facilitate the procedures of the bodies concerned and the compliance of their processing, the French regulator CNIL has published reference standards (in French) to which they must refer.

Other official guidance

Sports archives: The storage of sports archives must comply with the regulations on the protection of personal data. Some personal data collected on athletes, federal officials or club presidents, such as results, awards, photographs and posters, may be of historical interest, which players in the ecosystem (in particular institutions, clubs, sports federations and professional leagues) invoke to justify retaining the data without limitation in time. In practice, the purposes associated with the retention of this data are very numerous, and the retention periods will vary.

Also, depending on the status of the person who produced or received them, these records are either public or private. For example, the results of a sports competition organised by a delegated federation (eg, the results of the French championships) constitute public archives. On the other hand, if a sports competition is organised by the same delegated federation in the context of a gala, the documents produced constitute private archives, since the gala does not fall within the scope of the public service missions assigned to the organising federation.

Purchase data: The Finnish data protection authority considers that keeping purchase data for the entire duration of the customer relationship does not adhere to the data minimisation principle. In the related case against Kesko, a retail company, detailed, product-specific purchase data from a loyalty scheme had been processed for various purposes, including business development and the targeting of marketing. Customers themselves had been able to see their purchase information for five years. Kesko was ordered to clearly define retention periods, clarify the purposes for which personal information is used, and delete or anonymize data that had been stored longer than necessary.
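For illustration only, the following minimal Python sketch shows what an automated retention routine of this kind can look like: detailed purchase lines older than a defined retention period are deleted, while older aggregate rows are kept but unlinked from the customer. The table names, columns and the two-year period are assumptions made for the example, not Kesko’s actual systems or figures prescribed by the authority.

```python
import sqlite3
from datetime import datetime, timedelta

# Illustrative retention period; the real value must come from a documented retention policy.
RETENTION = timedelta(days=2 * 365)

def purge_or_anonymise(conn: sqlite3.Connection, now: datetime) -> None:
    """Delete product-level detail past retention and unlink older aggregate rows from the customer."""
    cutoff = (now - RETENTION).isoformat()
    # Hard-delete detailed, product-specific rows that are no longer needed.
    conn.execute("DELETE FROM purchase_lines WHERE purchased_at < ?", (cutoff,))
    # Keep aggregate rows for statistics, but remove the link to the individual.
    conn.execute("UPDATE purchases SET customer_id = NULL WHERE purchased_at < ?", (cutoff,))
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE purchases (id INTEGER, customer_id TEXT, purchased_at TEXT)")
    conn.execute("CREATE TABLE purchase_lines (purchase_id INTEGER, product TEXT, purchased_at TEXT)")
    conn.execute("INSERT INTO purchases VALUES (1, 'cust-42', '2018-03-01T10:00:00')")
    conn.execute("INSERT INTO purchase_lines VALUES (1, 'coffee', '2018-03-01T10:00:00')")
    purge_or_anonymise(conn, datetime(2024, 1, 2))
    print(conn.execute("SELECT * FROM purchases").fetchall())       # customer link removed
    print(conn.execute("SELECT * FROM purchase_lines").fetchall())  # detail deleted
```

In practice such a job would run on a schedule, and whether old rows are deleted outright or merely de-identified depends on the purposes documented in the retention policy.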

Cross-border enforcement

Joint controllership: The EDPB published the final decision of the Hungarian supervisory authority concerning an infringement of Art. 26 of the GDPR. The Slovak supervisory authority had objected to processing carried out by a foundation as the presumed controller of two Hungarian-language websites. Certain recordings available on the foundation’s websites presumably featured children from a Slovak primary school performing and singing. The Hungarian regulator established that there was no arrangement between the foundation and the school, within the meaning of Art. 26(1) of the GDPR, concerning joint processing and their respective responsibilities.

Sanctions

Illegal university telemarketing: In the US, the Federal Trade Commission has sued Grand Canyon University for deceptive advertising and illegal telemarketing. The agency says the university, its marketer, and its CEO deceptively advertised the cost and course requirements of its doctoral programs and made illegal calls to consumers. Prospective students were told that the total cost of “accelerated” doctoral programs was equal to the cost of just 20 courses.

In reality, the school requires that almost all doctoral students take additional “continuation courses” that add thousands of dollars in costs. The defendants also used abusive telemarketing calls to try to boost enrollment. The university advertised on websites and social media urging prospective students to submit their contact information on digital forms. Telemarketers then used the information to illegally contact people. 

AI facial recognition banned: Also in the US, Rite Aid will be prohibited from using facial recognition technology for surveillance purposes, as part of a settlement of charges that the retailer failed to implement reasonable procedures and prevent harm to consumers in hundreds of stores. From 2012 to 2020, Rite Aid deployed AI-based facial recognition technology to identify customers who may have been engaged in shoplifting or other problematic behaviour. The complaint, however, charges that the company failed to take reasonable measures to prevent harm to consumers, who, as a result, were falsely accused of wrongdoing.

Deleted CCTV footage: The Greek data protection agency fined Alpha Bank for failing to satisfy a customer’s right of access to recorded material from the store’s video surveillance system. It emerged that the bank failed to deal with the complainant’s request promptly, resulting in the material being deleted when the retention period expired. The authority found a violation of Arts. 12 and 5 of the GDPR.

Audit reports

Cyber security framework: The UK Information Commissioner has carried out a voluntary data protection audit of Lewisham and Greenwich NHS Trust. One area for improvement identified was that the cyber security framework should be further embedded, by integrating new cyber staff roles into the organisation and ensuring that staff with key cyber security responsibilities complete additional specialised training relevant to those responsibilities.

This should be supported by maintaining the security controls in place, such as plans to implement multi-factor authentication to protect higher-risk or more sensitive personal data processing activities, and a regular programme of practical social engineering or phishing tests to ensure staff are familiar with such scams and know what action to take.

Cyber risks relating to third-party suppliers should be reviewed periodically to ensure the Trust has assurance that cyber security controls are in place and effective. Further to this, Data Protection Impact Assessments should identify cyber risks and mitigating controls. Additionally, Information Asset Owners should be actively involved in assessing the cyber risks and monitoring the effectiveness of the mitigating controls. 

Ongoing work to replace or decommission legacy devices that cannot receive security patches and phase out or update servers with unsupported operating systems should continue. All network devices should be able to receive security patches that address cyber vulnerabilities, and systems approaching the end of life should be removed or updated on time.

Data breaches

Car parking data stolen: Europe’s largest parking app operator has reported itself to information regulators in the EU and UK after hackers stole customer data. EasyPark Group, the owner of brands including RingGo and ParkMobile, said customer names, phone numbers, addresses, email addresses and parts of credit card numbers had been taken, but said parking data had not been compromised in the cyber-attack, the Guardian reports. The breach brings to light the centralisation of parking services, as physical meters and parking attendants are gradually replaced by websites and apps.

Data security

Children’s privacy: The Spanish data protection authority presented its age verification system. It consists of the principles that an age verification system must comply with, a technical note with project details, and practical videos that demonstrate how the system works on different devices and with several identity providers. The age verification methods currently used on the internet, eg self-declaration or sharing credentials with the content provider, have demonstrated clear risks: locating minors, a lack of certainty about the declared age, exposure of identity to multiple participants, and mass profiling.

PETs: Privacy-enhancing and preserving technologies generally refer to innovations that facilitate the processing and use of data in a way that preserves the privacy of individuals. While there is no unified definition denoting a technology as a PET, the Centre for Information Policy Leadership’s year-long study investigates and provides 24 case studies on its three main categories: 

  • cryptographic tools that allow certain data elements to remain hidden while in use; 
  • distributed analytics tools where data is processed at the source; and 
  • tools for pseudonymisation and anonymisation (a minimal sketch of which follows below). 
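As a concrete illustration of the third category, here is a minimal Python sketch of keyed-hash pseudonymisation: a direct identifier is replaced by an HMAC, so records can still be linked for analytics, while re-identification requires the separately stored key. The key handling and function names are assumptions made for this example and are not drawn from the CIPL study.

```python
import hmac
import hashlib

# The pseudonymisation key must be stored separately from the data set;
# whoever holds it can re-identify individuals, so the output remains personal data under the GDPR.
SECRET_KEY = b"replace-with-a-key-from-a-key-management-system"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym, so records can be joined
# across data sets without exposing the underlying identifier.
print(pseudonymise("jane.doe@example.com"))
print(pseudonymise("jane.doe@example.com"))  # identical output
```

A keyed hash is preferred over a plain hash here because, without the key, an attacker cannot simply hash a list of known email addresses to reverse the pseudonyms.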

Authentication: Logging in with a password is still one of the most commonly used forms of authentication. Depending on what you have to protect, this may also be enough, states the Dutch data protection authority. Yet logging in with a single factor remains unsafe. It is better to use multiple factors, such as a password combined with a code sent via SMS. Using biometric data, even if very reliable, demands extra protection and must therefore meet stricter security requirements. Another alternative is a digital token – the unique series of numbers is not generated from your characteristics but is stored on a chip in your access card. However, this only works if the token is, and remains, strictly personal.
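To make the “password plus a second factor” pattern concrete, the following minimal Python sketch combines a salted password hash with a time-based one-time password (RFC 6238) rather than an SMS code. All names, parameters and the enrolment flow are illustrative assumptions, not the Dutch authority’s recommended implementation.

```python
import hashlib
import hmac
import os
import struct
import time

def hash_password(password: str, salt: bytes) -> bytes:
    # Slow, salted hash so a leaked credential store is not directly reusable.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    # RFC 6238: HMAC over the current 30-second time window, dynamically truncated.
    counter = int(at // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_otp(secret: bytes, otp: str, step: int = 30) -> bool:
    # Accept the current window and its neighbours to tolerate small clock drift.
    now = time.time()
    return any(hmac.compare_digest(totp(secret, now + d * step), otp) for d in (-1, 0, 1))

def login(password: str, otp: str, salt: bytes, stored_hash: bytes, totp_secret: bytes) -> bool:
    # Both factors must pass: something you know (password) and something you have (the OTP device).
    password_ok = hmac.compare_digest(hash_password(password, salt), stored_hash)
    return password_ok and verify_otp(totp_secret, otp)

# Enrolment: normally done once, with the salt, hash and TOTP secret stored server-side.
salt, totp_secret = os.urandom(16), os.urandom(20)
stored = hash_password("correct horse battery staple", salt)

# A user presents the right password plus the code from their authenticator device.
print(login("correct horse battery staple", totp(totp_secret, time.time()), salt, stored, totp_secret))  # True
print(login("correct horse battery staple", "000000", salt, stored, totp_secret))  # almost certainly False
```

Compared with SMS codes, an authenticator-style TOTP keeps the second factor on a device the user holds and does not depend on the mobile network, which is one reason many guidance documents treat it as the stronger of the two.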

Big Data

TikTok Australia: The Australian Information Commissioner has launched an inquiry into the platform’s use of marketing pixels to track people’s online habits, The Guardian reports. This can include where they shop, how long they stay on websites, and personal information such as the email addresses and mobile phone numbers of non-TikTok users. The probe will determine whether TikTok is harvesting the data of Australians without their consent. ByteDance, the Chinese conglomerate that owns the video-sharing platform, has denied violating Australian privacy laws. New privacy legislation responding to the review of the Privacy Act is expected to land in the Australian parliament this year and will allow more inquiries like this.

Body-related data: Organisations building immersive technologies, from everyday consumer products like mobile devices and smart home systems to advanced hardware like extended reality headsets, often rely on large amounts of data about individuals’ bodies and behaviours, states the Future of Privacy Forum. It therefore offers detailed, illustrated instructions on how to document body-related data categories (raw voice recordings, facial geometry, fingerprints), handle complicated data practices (eg, eye tracking), evaluate privacy and safety risks, and implement best security practices. Download the framework here.

Cookie deprecation: Google has begun the next step toward phasing out third-party cookies in Chrome: testing Tracking Protection, a new feature that limits cross-site tracking by restricting website access to third-party cookies by default. The company will roll this out to 1% of Chrome users globally, a key milestone in its Privacy Sandbox initiative to phase out third-party cookies for everyone in the second half of 2024. Participants for Tracking Protection are selected randomly, and if you’re chosen, you’ll be notified when you open Chrome on desktop or Android.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
