Data protection digest 15 Sep – 1 Oct 2023: cross-border cases get the highest level of attention from regulators

In this issue, cross-border cases get the full attention of the EDPB: it weighs in on proposed procedural rules to complement GDPR enforcement, resolves a complex case on TikTok's processing of children's data, and is asked to make permanent an EU-wide ban on Meta's behavioural advertising.

Legal processes and redress: cross-border enforcement, Grindr fine, EU Data Governance Act, UK-US data transfers

Cross-border cases: The EDPB and the EDPS welcomed a proposal by the European Commission to complement the GDPR by specifying procedural rules for cross-border cases. The regulators' recommendations include harmonising the admissibility of complaints, as well as the consensus-finding process during the preliminary and final stages of an investigation, to minimise the need for further procedures such as dispute resolution. Regarding the amicable settlement of complaints, the regulators call on the co-legislators to enable its efficient implementation, particularly in Member States that do not have such procedural laws.

Grindr fine confirmed: In Norway, the Privacy Appeals Board has decided the Grindr case, upholding the data protection authority's administrative fine of approx. 5.7 million euros. Grindr is a location-based dating app for the LGBTQ+ community. In 2020, the Norwegian Consumer Council complained about the app because Grindr shared information about GPS location, IP address, mobile phone advertising ID, age and gender – in addition to the fact that an individual was a Grindr user – with several third parties for marketing purposes. The data protection authority concluded that Grindr disclosed personal data about users to third parties for behavioural advertising without a legal basis.

The case concerns Grindr's practices in the period from when the GDPR became applicable until 2020, when Grindr changed its consent mechanism. The data protection authority has not assessed the legality of Grindr's current practices. The board points out, among other things, that users were not given a free choice to consent to the disclosure of their data during registration in the app, and that the relevant information about data sharing was only included in the privacy policy. Moreover, information revealing that someone is a Grindr user may constitute a special category of personal data.

UK-US adequacy decision: Regulations establishing a UK-US data adequacy decision were introduced to the UK parliament. The 'Data Bridge' will take effect on 12 October. From then on, organisations in the UK will be able to transfer personal data to US businesses certified under the "UK Extension to the EU-US Data Privacy Framework" without additional safeguards such as international data transfer agreements (the UK version of the EU's standard contractual clauses) or binding corporate rules. Both UK and US organisations will also have to update their privacy policies. In parallel, the US Department of Justice will designate the UK as a qualifying jurisdiction, whose citizens can seek legal redress under the data privacy framework.

Data Governance Act applicable since September: The act sets up common European data spaces, involving both private and public players, in sectors such as health, environment, energy, agriculture, mobility, finance, manufacturing, and public administration. Both personal and non-personal data are concerned. The act also defines a set of rules for providers of data intermediation services to ensure that they function as trustworthy organisers of data sharing or pooling. One example is Deutsche Telekom's data marketplace, in which companies can securely manage, provide and monetise good-quality data to optimise processes or entire value chains.

Official guidance: biometrics, AI transparency, gossip at work

Biometrics and employment: The use of biometric data can be considered excessive on the part of the employer and not justified by the requirements of regulatory acts, states the Latvian data protection regulator. A desired goal, for example recording working hours or controlling access to the office, can often be achieved with less interference in the employee's privacy. The biggest "stumbling block" for employers when implementing a biometric data processing system is not only security, but how to process the data lawfully.

Biometric data is a special category of data, the processing of which is permitted for employers only in certain cases (GDPR Art. 9 exceptions in conjunction with Art. 6 legal bases). For example, if companies plan to use their employees' fingerprints or face scans to control entry to the workplace, the processing of biometric data must be based on the employees' consent. That consent must be freely given, specific and informed, and there should be no situation in which an employee suffers negative consequences for refusing it.

AI Transparency: The proposed EU AI Act, whose material scope is AI systems, establishes a concept of transparency that differs from the same term in the GDPR, whose material scope is the processing of personal data. Transparency within the framework of each regulation involves different actors and is intended for different recipients, explains the Spanish data protection authority. Transparency under the proposed AI Act concerns information on AI systems, their providers, and the entities that deploy them. When AI systems are included in, or are a means of, processing personal data, data controllers must also comply with the GDPR.

Typically, personal data processing is implemented through various types of systems, such as cloud systems, communication systems, mobile systems, and encryption systems, and some of them could be AI systems. AI system designers, developers, suppliers and deploying entities can be data controllers and/or processors in various scenarios. At the same time, the natural persons who could be affected by these systems are not always data subjects as defined in the GDPR – for example, when natural persons are merely recipients of multimedia content created by an AI system.

Gossip and personal data: There are ongoing examples of employees gaining unauthorised access to personal data. The Danish data protection authority states that most often this is only discovered when an individual becomes aware that someone is using information about them. It can be genuinely difficult for the data controller to detect employees using their system access in a way that is not related to work. Abuse of access rights cannot be completely prevented, but it can be limited through systematic rights management, good control procedures and effective enforcement on the part of the data controller. If, despite these measures, employees snoop on other people's information, they can be punished with a fine or even reported to the police.

Enforcement decisions: electronic monitoring, recruitment, data deletion

Electronic surveillance: A privacy fine of approx. 10,000 euros was issued against the University of Iceland over electronic monitoring. Complaints were made about surveillance cameras inside and outside the university buildings (a total of 97 security cameras, 75 indoors and 22 outdoors) with no visible markings to indicate that electronic surveillance was in place. There was also a complaint that the purpose, nature, scope, location and other aspects of the monitoring, which had been operational for several years, had never been communicated. The institution hosts around 15,000 students and 4,900 employees per year, and holds hundreds of annual events.

Certain aspects of the monitoring were found to be in the university's legitimate interest, but in light of the scope of the surveillance camera system, the number of people recorded and the duration of the violation, the regulator decided to impose a fine. The university claimed that, due to repeated break-ins, it had decided to increase the use of access cards and the number of security cameras; beyond that, the institution had defined nothing about the nature or extent of the electronic monitoring. On top of the fine, the regulator also ordered the installation of lawful electronic-monitoring signage in the university's buildings and outdoor areas.

Excessive recruitment data: Meanwhile, the French regulator CNIL fined SAF Logistics 200,000 euros for excessive employee data collection and lack of cooperation. SAF Logistics is an air cargo service whose parent company is located in China. As part of internal recruitment for a position within the parent company, it requested information about employees' family members – their identity, contact details, occupation, employer and marital status – along with sensitive data such as blood type, ethnicity and political affiliations. It also stored extracts from criminal records. When the CNIL asked the company to translate the employee questionnaire, which was written in Chinese, the incomplete translation omitted the fields on ethnicity and political affiliation.

Data (non)deletion: The hotel chain Arp-Hansen has been fined approx. 134,000 euros by a court in Denmark for violating personal data storage rules. The hotel chain did not comply with the erasure deadline it had set itself (one year). The Danish data protection authority estimated that approx. 500,000 customer profiles should have been deleted by the time of the inspection visit. The case clarified which financial statements should serve as the starting point when calculating a fine: the amount was determined after the court considered the hotel chain's revised and published annual accounts for 2018, which reflected the company's financial situation during the period of the offence.

Data security: US healthcare and mergers data

Healthcare data: The US FTC and HHS outlined the privacy and security laws and rules that impact consumer health data. In the US, the collection, use, and sharing of consumer health information is governed primarily by four sources: the Health Insurance Portability and Accountability Act (HIPAA); the HIPAA Privacy, Security, and Breach Notification Rules; the FTC Act; and the Health Breach Notification Rule. The publication addresses some of the basic questions: Which entities are covered? What do you have to do to maintain the privacy and security of consumers' health information? And so on. You can also check out the FTC-HHS Mobile Health App Interactive Tool as you design, market, and distribute a mobile health app.

M&A and data protection: US researchers from the Electronic Privacy Information Center are urging the Department of Justice to include data protection and consumer privacy as factors in the newest Merger Guidelines. In a data-driven economy, businesses' mass accumulation of personal data can have anticompetitive effects that further undermine consumer privacy and data security. Mergers frequently involve the consolidation of data sets, which can extend a firm's dominant market position, impede entry for smaller firms, and exacerbate the effects of harmful consumer data practices. As a result of such mergers, there is no meaningful opportunity for firms to compete on better privacy practices.

Big Data: Meta behavioural ads, TikTok minors' privacy enforcement

Norway case goes to the European level: The Norwegian data protection authority has requested a binding decision from the EDPB in the Meta case. It asked that Norway’s temporary ban on behavioural advertising on Facebook and Instagram be made permanent and extended to the entire EU/EEA. The Norwegian regulator is only authorised to make a temporary decision in this case. The decision expires on 3 November. Earlier this year, the authority found that Meta processes personal data for illegal behavioural advertising and intrusive monitoring of users in the context of the Facebook and Instagram services. For this reason, it imposed a temporary sanction on the company. The regulator also won against Meta in court. Nonetheless, the company continues its activities and has not yet complied with the decision. Meta has submitted several administrative complaints against the Norwegian data protection authority’s decision so far. 

TikTok minors' data: The Irish data protection commission adopted its final decision regarding TikTok's processing of minors' data and age verification during the registration procedure, imposing fines totalling 345 million euros, with an order to bring the processing into compliance. The investigation found:

  • children’s accounts were set to public by default, 
  • certain features were enabled that exposed users under the age of 13,
  • privacy gaps in the “family pairing” function, 
  • misleading “dark patterns” during account creation and video uploading, and
  • failure to convey appropriate information to minors.

Interestingly, objections to the Irish regulator’s draft decision were raised by other concerned supervisory authorities working on the cross-border investigation, which uncovered additional infringements, including privacy-intrusive dark patterns. The case went to the EDPB for dispute resolution, and the Board obliged the DPC to amend its draft decision to include the new findings.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
