Data protection & privacy digest 3 – 17 Apr 2023: US data transfers and AI tools occupy EU

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress

US data transfers: A full parliamentary vote on the upcoming EU-US Data Privacy Framework is planned in the coming weeks. So far, a resolution adopted by Civil Liberties Committee MEPs argues that the European Commission should not grant the US an adequacy decision deeming its level of personal data protection essentially equivalent to that of the EU and allowing for transfers of personal data between the two. However, this resolution will not be binding on the European Commission.

MEPs note that the framework still allows for bulk collection of personal data in certain cases, does not make bulk data collection subject to independent prior authorisation, and does not provide for clear rules on data retention. The transparency and independence of the new redress mechanism for EU data subjects are also under question. Finally, the US Intelligence Community is still updating its practices based on the framework, so an assessment of its impact on the ground is not yet possible, say MEPs.

CCPA/CPRA: The updated CCPA regulations were approved by the state of California and come into effect in three months’ time. These revisions reflect the CCPA’s amendment by the California Privacy Rights Act of 2020, which added new business obligations addressing: consumer rights regarding the sharing, sale, and restriction of sensitive personal data, information notices, user-enabled privacy controls, opt-out options, contractor and third-party contract requirements, and more.

Employee data: In its recent judgement the CJEU ruled on important aspects of data processing in the employment context, interpreting Art. 88 of the GDPR. The preliminary ruling concerns the lawfulness of a system for the live streaming of classes by videoconference introduced in state schools in Hesse, Germany, without the prior consent of the teachers. Art. 88 of the GDPR enables national legislators to enact “more specific rules” on employee data protection. However, these should not be general clauses that simply repeat the GDPR’s provisions.

Instead, they should include suitable and specific measures to safeguard the data subject’s human dignity, legitimate interests and fundamental rights, with particular regard to the transparency of processing. For organisations and employers this means that in the absence of valid national provisions GDPR rules must be complied with, including the balancing tests for the appropriate legal basis for employee data processing (employment contract, legitimate interest or consent).

In response to the decision, the Hamburg data protection commissioner also stated that Section 23 of the Hessian data protection act does not constitute a ‘more specific rule’, and that the moment had arrived for a new federal employment data protection act. 

Automated employment tools: Meanwhile, on the other side of the Atlantic, the New York City Department of Consumer and Worker Protection promulgated its final regulations on the Automated Employment Decision Tools Law (AEDTL). Once enforcement begins, the law will restrict employers’ ability to use machine learning, statistical modelling, data analytics or AI tools in hiring and promotion decisions within New York City. Employers who use automated employment decision tools must also disclose this to candidates before the tool is used, as well as systematically undergo and disclose independent “bias audits”. Read the full analysis here.

EDPB guidance

A set of updated guidance and studies, along with the annual 2022 report, was published by the EDPB.

National administrative rules: The EDPB conducted a study on the national administrative rules applicable when national supervisory authorities carry out their duties under the One-Stop-Shop (OSS) procedure. For instance, the requirements for the admissibility of complaints from individuals vary considerably from one country to another. Furthermore, the possibility to reach an amicable settlement between controllers or processors and complainants does not exist in all countries, and there is no clear indication of differing regulations’ impact on the OSS procedure. Finally, there is no convergence regarding the prior notification of forthcoming investigations or exercise of corrective powers. Read more challenges and possible solutions in the original publication.

Entities outside the EEA: Another study by the EDPB looks at the enforcement of GDPR obligations against entities established outside the EEA (California, the UK and China). It aimed to analyse the possibilities available to enforce supervisory authorities’ investigative and corrective powers against third-country controllers/processors that fall under the scope of the GDPR but are not willing to cooperate with regulators and have not designated an EEA representative. These included the possibility of summoning third-country controllers/processors to appear before the SA’s office or in the SA’s national courts or tribunals, choice of jurisdiction, and additional restrictive measures.

Right of access: The right of access of data subjects is enshrined in Art. 8 of the EU Charter of Fundamental Rights and Art. 15 of the GDPR, says the EDPB’s latest guidance. The overall aim of the right of access is to provide individuals with sufficient, transparent and easily accessible information about the processing of their personal data so that they can be aware of and verify the lawfulness of the processing and the accuracy of the processed data. This will make it easier – but is not a condition – for the individual to exercise other rights such as the right to erasure or rectification. 

Personal data breach notification: The EDPB considers that complying with the notification requirement has a number of benefits. When notifying the supervisory authority, controllers can obtain advice on whether the affected individuals need to be informed. Breach notification should be seen as a tool for enhancing compliance. At the same time, failure to report a breach to either an individual or a supervisory authority may expose the controller to sanctions. Controllers and processors are therefore encouraged to plan in advance and put in place processes to be able to detect and promptly contain a breach.

Lead supervisory authority: The EDPB has noticed that there was a need for further clarifications, specifically regarding the notion of main establishment in the context of joint controllership and taking into account the concepts of controller and processor in the GDPR. Correct identification of the main establishment is in the interests of controllers and processors because it provides clarity in terms of which supervisory authority they have to deal with in respect of their various compliance duties under the GDPR. 

The most complex situations are when it is difficult to identify the main establishment or to determine where decisions about data processing are taken. This might be the case where there is cross-border processing activity and the controller is established in several Member States, but there is no central administration, or none of the EEA establishments is taking decisions about the processing.

Other official guidance

Generative AI risks: The UK privacy regulator, the ICO, poses eight questions about generative AI that developers and users need to answer. The EU legal backlash against ChatGPT is just the beginning of the journey, the analysis states, and organisations developing or using generative AI should be considering their data protection obligations from the outset, taking a data protection by design and by default approach. This isn’t optional – if you’re processing personal data, it’s the law (and data protection law still applies when the personal information you’re processing comes from publicly accessible sources):

  • Are you a controller, joint controller or processor? 
  • What is your lawful basis for processing personal data? 
  • How will you comply with individual rights requests? 
  • How will you limit unnecessary processing? 
  • How will you mitigate security risks? 
  • Have you prepared a Data Protection Impact Assessment? 
  • Will you use generative AI to make solely automated decisions? 
  • How will you ensure transparency?

To know more, here’s the ICO publication.

AI-assisted employment: Meanwhile the Spanish data protection authority AEPD explains how to apply AI tools in employment activities. In essence, the data controller decides when designing the programme whether or not to include an additional operation of human supervision of the results produced by the AI system. AI systems form part of the nature of the processing when they are included in any of the operations necessary for its explicit purpose. This may include AI systems implemented locally or in the cloud, mobile systems, outsourced data processors, etc. Therefore, the fact that decision-making is automated is not a feature of the AI system itself.

For example, the procedure to guide candidates to complete an application form where they would include their CVs could be implemented using a chatbot. In addition, the number of applications, and therefore the number of CVs, could be so large that the manager could decide to use an AI system for the automatic selection of the most interesting CVs, according to certain criteria that the manager should also establish. The manager could go further and implement the evaluation of the candidates through another AI system that performs and evaluates the tests for the previously selected candidates. 

Sports industry: A large amount of personal data, including special categories, is generated in digitised sports, states the German federal data commissioner. If these data are not so comprehensively anonymised that it is impossible to trace them back to individual athletes, data protection rules on purpose limitation, storage limitation, lawfulness, data minimisation, transparency, and data security apply. This extends to all bodies and organisations that process athletes’ personal data – coaches, associations, anti-doping agencies, sports facility operators, scientific institutes, doctors, laboratories, consultants, agents, and sometimes also sponsors, betting shops or even manufacturers of hardware and software.

Investigations and enforcement decisions

Data breach statistics: The Guernsey data protection agency ODPA published the latest personal data breach statistics: Nearly 10 million people were reported to be affected by 38 personal data breaches from January to March. Reportedly, the majority of those were customers of a UK-based company which was the victim of a large cyber-attack. Although the company is not based locally, it reported the breach to data protection regulators in all jurisdictions where its customers are based. Additionally, the most striking examples of personal data breaches involved:

  • people using personal email accounts to send work-related information (email providers are outside the control of the organisation, meaning usual security policies do not apply and the organisation does not know what its data is being used for);
  • accounts or devices shared by couples (the boundaries of your personal life and your job intersect in a way that is not helpful for you or your workplace, which means information could fall into the wrong hands).

Failed data subjects’ right of access: Following a complaint, the Spanish AEPD fined Banco Bilbao Vizcaya Argentaria, or BBVA, 84,000 euros, according to Data Guidance. Despite ceasing to be a client of BBVA in 2012, the complainant discovered in 2021 that there were two debts registered in their name in the Bank of Spain’s Risk Information Center. Regarding the use of the right of access, the AEPD explained that BBVA had asked the complainant for additional details in order to recover the recordings, which constituted an unfair burden on the data subject for the fulfilment of their request.

In another recent enforcement decision by the AEPD, the claimant requested access to the images from the video surveillance system located at a commercial centre. Unable to find a way to make a request in person, the claimant submitted one via electronic means of communication (using the company’s marketing email address). This email address is not related to the processing of personal data, nor was it a means of contact enabled for the exercise of any rights. However, the company responded only to state that such access was not possible, except when there is a prior complaint, or when requested by the police or authorised personnel. The regulator found that the right of access of the complainant to their personal data was not respected, as established in Art. 15 of the GDPR.

Data security

Established cooperation: A long-term relationship between a controller and a processing entity does not guarantee data security, states the Polish privacy regulator UODO. In the related case, the verification of the competence of the processor was not formalised, because it consisted only of conducting an interview, and the services provided by the entity (a file depository service) had not raised objections from the controller. The explanations of both the controller and the processor indicated that these entities only applied the controller’s internal regulations (the Personal Data Protection Policy). The lack of any risk analysis resulted in the selection of inadequate measures.

The mere signing of a contract for entrusting the processing of personal data without proper assessment of the processing entity cannot be considered as fulfilment of the data security obligation. The determinant for such an assessment cannot be only long-term cooperation and the use of the services of a given processor. In the opinion of UODO, positively assessed cooperation may only be a starting point when verifying whether the processing entity provides sufficient guarantees for the implementation of appropriate technical and organisational measures. 

Certifying employees’ qualifications: The Hungarian data protection agency NAIH has published detailed recommendations on how to handle documents certifying employees’ qualifications in line with data protection requirements. The employer may require the employee to present such a document on the basis of its legitimate interest. The employer can also keep its own internal records of each employee’s education, and of the date and the method of proof of education. However, “objective evidence” (as defined in ISO 9000:2015 Quality management systems) needs to be supported by documented information.

A copy of a document certifying education or training does not have the power to prove that it is an authentic copy of a valid public document, so it is not suitable for establishing the authenticity of the data contained therein, and it may include additional unnecessary personal information.

Instead, the organisation may prepare a note or protocol stating that the given employee presented the original documents certifying their education, the relevant data of which is now recorded by the organisation (e.g. serial number of the document, date of qualification).

Tracking pixels: The Norwegian data protection authority encourages all Norwegian businesses to review their websites for tracking pixels or other tracking technologies. Recent media reports revealed that a large number of European online pharmacies have shared customers’ personal data through tracking technologies. For website users this is potentially a major privacy risk, while for the websites it poses a significant legal and reputational risk. Unless the business has assessed the tools, has an overview of data flows and is confident that their use is in line with privacy rules, the trackers should simply be removed.
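As a starting point for such a review, a first pass can be automated: tracking pixels are typically 1x1 images or images loaded from known tracker endpoints. The sketch below, written for illustration only, scans a page’s HTML for both patterns; the `SUSPECT_HOSTS` list is a made-up example, not a maintained blocklist, and a real audit would use curated filter lists and also inspect scripts and cookies.

```python
from html.parser import HTMLParser

# Illustrative, non-exhaustive list of tracker endpoints; a real review
# would rely on a maintained blocklist (e.g. those used by ad blockers).
SUSPECT_HOSTS = ("facebook.com/tr", "google-analytics.com", "doubleclick.net")

class PixelScanner(HTMLParser):
    """Flags <img> tags that look like tracking pixels: 1x1 images,
    or images fetched from a known tracker endpoint."""

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        src = a.get("src", "")
        tiny = a.get("width") == "1" and a.get("height") == "1"
        tracked = any(host in src for host in SUSPECT_HOSTS)
        if tiny or tracked:
            self.findings.append(src)

def scan(html: str) -> list:
    """Return the src attributes of suspected tracking pixels in the page."""
    scanner = PixelScanner()
    scanner.feed(html)
    return scanner.findings
```

A negative scan result is not proof of compliance – server-side tracking and JavaScript-injected pixels would not appear in the static HTML – but a positive hit is a clear prompt for the assessment the regulator describes.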

Cyber risks management: The German Federal Office for Information Security updated its manual on ‘Management of Cyber Risks’. It is dedicated to a comprehensive corporate culture that takes cyber security into account at all times, aiming to increase the resilience of companies. As cyber security starts with senior management, IT managers need the necessary support and the right level of understanding from company management. The guide formulates six basic principles that support management and supervisory boards when considering cyber risks:

  • Understanding cyber security as a component of company-wide risk management.
  • Understanding and closely examining the legal implications of cyber risks.
  • Ensuring access to cyber security expertise and regular exchange.
  • Implementing suitable frameworks and resources for cyber risk management.
  • Preparing risk analysis based on business risk appetite, goals and strategies.
  • Encouraging company-wide collaboration and sharing of best practices.

Big Tech

Meta binding decision: The EDPB adopted a dispute resolution decision concerning a draft decision of the Irish data protection authority DPC on the legality of data transfers to the US by Meta Ireland for its Facebook service. The decision will be announced soon and may include an order suspending Facebook’s transatlantic data flows. The Irish regulator shall adopt its final decision, addressed to Meta Ireland, on the basis of the EDPB binding decision and taking into account the EDPB’s legal assessment, at the latest one month after the EDPB publishes its decision.

In January this year the DPC, also instructed by the EDPB, ordered Meta to pay a hefty fine for making users accept targeted ads and directed it to bring its processing operations into compliance with the GDPR within a period of 3 months. The EDPB also directed the DPC to conduct a fresh investigation of all of Facebook and Instagram’s data processing operations, examining the special categories of personal data that may or may not be processed. However, the DPC stated that the EDPB is not entitled to instruct and direct a national authority to engage in a new “open-ended and speculative” investigation.

TikTok privacy fine: Finally, the UK fined TikTok 12.7 million pounds for misusing children’s data. More than one million British children under 13 were estimated to be on TikTok in 2020, contrary to its terms of service. As a result, personal data belonging to children was used without parental consent. TikTok “did not do enough” to check who was using its platform or to take sufficient action to remove underage users. Since the conclusion of the TikTok investigation, the ICO has published a statutory Children’s Code to help online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
