In this issue: EU legislators moved closer to implementing a digital data strategy intended to give European organisations the ability to grow and compete globally. Despite its potentially transformative effects, civil society groups and experts warn that such consolidation of European data can undermine individual data protection rights. Meanwhile, EU regulators and courts are working to strike a balance between market power and consumer privacy, as in the case of the scoring of individuals by credit information agencies.
CJEU decisions
Automated decision-making: The EU’s top court identified data processing practices by credit information agencies that contradict the GDPR. While the so-called ‘scoring’ of individuals is permitted only under certain conditions, the prolonged retention of information relating to the granting of a discharge from remaining debts is contrary to the GDPR (the case concerns SCHUFA, a private company providing credit information to clients in Germany).
As regards the ‘scoring’ of individuals, the court holds that it constitutes an automated individual decision prohibited in principle by the GDPR, in so far as SCHUFA’s clients, such as banks, attribute to it a determining role in the granting of credit. The court also considers it contrary to the GDPR for private agencies to keep such data for longer than the public insolvency register does. The discharge from remaining debts is intended to allow the data subject to re-enter economic life and is therefore of existential importance to that person.
Non-material damage: In another decision, the CJEU concludes that fear of possible misuse of personal data is capable of constituting non-material damage. Nonetheless, the mere fact that cybercriminals gained unauthorised access to or disclosed personal data does not, by itself, allow courts to conclude that the protective measures put in place by the data controller were ineffective. Courts must assess the security measures concretely, taking into account the risks associated with the processing concerned and assessing whether the nature, content and implementation of those measures are appropriate to those risks. Finally, the controller may be required to compensate data subjects who have suffered damage, unless it can prove that it is in no way responsible for that damage.
EU’s AI Act
Agreement reached: On 8 December, the legislative trilogue on the draft AI Act concluded with a provisional agreement. AI systems will be regulated according to how much risk they pose to society and fundamental rights, with a list of high-risk and prohibited practices backed by tiered monetary fines. Limited exceptions will be available for law enforcement purposes. General-purpose AI systems will also be subject to transparency obligations, with additional codes of practice imposed on the most powerful models.
Allocation of GDPR-governed roles: Meanwhile, the German Data Protection Conference demands that the forthcoming AI Act properly allocate responsibilities along the entire AI value chain. This, the regulatory body states, is the only way to protect the fundamental rights of those whose data is processed by AI. Any legal uncertainty in this area would harm citizens, and especially small and medium-sized companies, which must bear the brunt of legal responsibility. The upcoming AI regulation should therefore specify for all those involved, including manufacturers and providers, which requirements they must meet.
EU regulatory updates
Workforce monitoring: The Council and the Parliament have reached a provisional agreement on a proposed directive to improve working conditions for platform workers. In particular, it will help ensure that workers who have been wrongly classified as self-employed gain easier access to their rights as employees under EU law. The proposal also establishes the first EU rules on the use of algorithmic systems in the workplace.
Digital labour platforms regularly use algorithms for human resources management. As a result, platform workers often face a lack of transparency about how decisions are taken and how personal data is used. Under the new rules, these algorithms would be monitored by qualified staff, who enjoy special protection from adverse treatment. The new law also prohibits the processing of certain types of personal data by automated monitoring or decision-making systems, including:
- emotional or psychological state,
- private conversations,
- actual or potential trade union activity,
- racial or ethnic origin, migration status, political or religious beliefs, or health status,
- biometric data, other than data used for authentication.
Youth data protection: The Dutch data protection authority has objected to a bill that would lead to large-scale data collection in youth care. The proposal is intended to enable research into the availability of youth care within municipalities, including child protection, assistance for young people with psychological problems and the probation service. However, the authority says it is insufficiently clear why so much sensitive information from young people and their parents, healthcare providers and municipalities must be shared for such research. The availability of youth care could be investigated in far less invasive ways (eg, random sampling, analysis of waiting-time distributions or the development of new statistics).
European Health Data Space
Pros: Both the Parliament and the Council have agreed their positions on the European Health Data Space (EHDS). The new legislation would make it easier to exchange and access health data at the EU level. The proposed regulation aims to improve individuals’ access to and control over their electronic health data, enable certain data to be reused for research and innovation, and foster a single market for digital health services and products. The new rules would, for example, make it possible for a Spanish tourist to pick up a prescription in a German pharmacy, or for doctors to access the health information of a Belgian patient undergoing treatment in Italy.
Cons: However, several civil society groups and experts have already warned about the privacy shortcomings of cross-border exchange of electronic health data. The Irish Council for Civil Liberties recommends that the EHDS specify a legal basis consistent with the GDPR and be specific about the permitted purposes of secondary use of electronic health data. It should also further narrow the categories of health data allowed for secondary use, to reduce risks to fundamental rights. An international consortium of experts believes the proposal significantly reduces transparency requirements compared with the GDPR, as it:
- introduces waivers related to the provision of individual-level information to data subjects;
- disfavours consent as a legal basis for data sharing; and
- builds up large datasets that may be extensively used for secondary purposes, increasing the risk of re-identification.
US privacy updates
FISA 702 short extension: US lawmakers reached a deal to temporarily extend major federal surveillance programs until mid-April while talks on reforming the intelligence powers continue. Section 702 permits the government to conduct warrantless surveillance of foreign nationals abroad to gather “foreign intelligence information.” However, when Americans communicate with people under surveillance, their data is collected as well. Privacy campaigners warn that any reauthorisation of the intelligence powers must come with safeguards against abuse.
Opt-out preference signals: Meanwhile, the California Privacy Protection Agency has approved a legislative proposal that would require browser vendors to include a feature allowing users to exercise their California privacy rights through opt-out preference signals. Through such a signal, a consumer can opt out of the sale and sharing of their personal information with every business they interact with online, without having to make individualised requests to each one. To date, only a limited number of browsers offer native support for opt-out preference signals: Mozilla Firefox, DuckDuckGo and Brave. Google Chrome, Microsoft Edge and Apple Safari, which together account for over 90% of the market, have declined to offer these signals; notably, their makers rely heavily on advertising business models.
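For background, the most widely deployed opt-out preference signal is the Global Privacy Control (GPC), which supporting browsers transmit as a `Sec-GPC: 1` request header and expose to scripts as `navigator.globalPrivacyControl`. The sketch below is a minimal illustration of how a business might detect the signal; the function names and surrounding flow are assumptions for the example, not part of the CPPA proposal.

```typescript
// Minimal sketch for detecting the Global Privacy Control (GPC) signal.
// Header and property names follow the GPC proposal; the function names
// and the data-sale gate below are illustrative assumptions.

// Server side: supporting browsers send `Sec-GPC: 1` with each request.
function optedOutViaHeaders(headers: Record<string, string | undefined>): boolean {
  return headers['sec-gpc'] === '1'; // "1" signals an opt-out of sale/sharing
}

// Browser side: the same signal is exposed on the navigator object.
function optedOutViaNavigator(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// Example gate a business might apply before selling or sharing data.
function maySellOrShare(headers: Record<string, string | undefined>): boolean {
  return !optedOutViaHeaders(headers);
}
```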
Data subject rights
Right to delete: Every time personal data is processed, the question arises of how long the data controller may store it. As a starting point, Art. 5 of the GDPR provides the principles of purpose limitation, data minimisation and storage limitation. In addition, data subjects whose personal data has been processed have a right to deletion under Art. 17 of the GDPR, allowing them to request the deletion of their data under certain conditions. There are also legal retention and deletion obligations with which the controller must comply. The Liechtenstein data protection agency has put together information on its website (in German) that sheds light on the topic from the perspectives of both the data subject and the controller.
Employment guidance
The UK Information Commissioner’s Office produced an online resource with topic-specific guidance on employment practices and data protection, with two new pieces of guidance now out for public consultation: a) keeping employment records, b) recruitment and selection. Data protection law applies whenever you process your workers’ personal information. The law does not stop you from collecting, holding and using records about workers. It helps to strike a balance between your need to keep employment records and workers’ right to private lives, explains the regulator.
Additionally, the labour market supply chain can be complex, with end-to-end recruitment processes often involving several organisations. The use of novel technologies in recruitment processes means that organisations are processing increasingly large amounts of information about people – candidates, prospective candidates, employees, contractors, volunteers or gig and platform workers, referees, emergency contacts, and dependants.
UK-US data transfers
The ICO also offers a guide on how to comply with restricted transfers of personal data to the US using an Art. 46 transfer mechanism under the UK GDPR. There are various reasons why you may wish to use it, including where:
- your US recipient is not certified under the UK Extension to the EU-US Data Privacy Framework, or the restricted transfer is not covered by your recipient’s certification;
- none of the eight exceptions set out in Art. 49 of the UK GDPR apply to your restricted transfer;
- you are making the restricted transfer under UK Binding Corporate Rules; or
- you or your US recipient prefers to use the Addendum or the International Data Transfer Agreement as the standard transfer mechanism.
You can make restricted transfers to recipients in the US under Art. 46 only if you have first completed a transfer risk assessment. This includes up-to-date analysis of US laws on access to and use of personal information by US agencies for national security and law enforcement purposes, the circumstances of each transfer, and the commercial practices of you and your recipient. The requirement to complete a transfer risk assessment applies regardless of which mechanism you use or why.
Investigations
DPO for public services: The Luxembourg data protection regulator CNPD concluded an investigation into the appointment of data protection officers by municipalities. Under article 37.1.a) of the GDPR, any data controller or processor must designate a DPO where “the processing is carried out by a public authority or body, except for courts acting in their judicial capacity”. At the time the investigation was opened (in 2022), four out of six municipalities had either not appointed a DPO or not communicated the DPO’s contact details to the CNPD. No further corrective measures were taken, as the municipalities regularised their situation over the course of the investigations.
Enforcement decisions
Google Workspace at school: Meanwhile in Sweden, a penalty fee was issued against a municipality that had been using Google Workspace in 24 of its schools since autumn 2020 without assessing the impact. Among other things, the platform was used for feedback on students’ school assignments. The personal data of nearly 6,000 students and 1,300 employees was processed without a proper data protection impact assessment (Art. 35 of the GDPR). In particular, when the student system was put into use, the municipality relied on an older assessment from 2014, carried out by another municipality on the use of Google solutions in education, and considered that satisfactory.
Employee data requests: The Italian privacy regulator fined Autostrade per l’Italia and Amazon Italia Transport 100,000 and 40,000 euros respectively for failing to give timely, reasoned responses (not even a refusal or deferral) to data access requests submitted by employees and former employees. In the first case, the employees had requested information on how their pay slips were calculated. When asked for an explanation by the regulator, the company said it had not responded so as not to compromise its right of defence in court, as several legal proceedings were under way between the company and the workers over the calculation of severance pay.
In Amazon’s case, the authority acted on the complaint of a former employee about the company’s failure to respond to a request for data relating to his employment relationship. The company had not responded because the request was drawn up in a very broad and generic manner. In both cases, the regulator concluded that the data controller should at least have responded with its reasons for not acting on the request or, as with Amazon, asked for more details.
Reprimands
Failed TOMs: Meanwhile in the UK, Finham Park Multi Academy Trust was reprimanded in respect of Arts. 5 and 32 of the GDPR. An unauthorised third party used compromised credentials to access and encrypt Finham Park’s systems. The incident affected 1,843 data subjects, and the ICO’s investigation found that Finham Park did not have adequate account lockout or password policies in place.
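An account lockout policy of the kind the ICO found missing typically suspends sign-in after repeated failed attempts. The sketch below is a generic, hypothetical illustration with assumed thresholds; it does not describe the Trust’s systems or any configuration required by the ICO.

```typescript
// Generic account-lockout sketch (illustrative thresholds, in-memory state).
const MAX_ATTEMPTS = 5;            // assumed failure threshold
const LOCKOUT_MS = 15 * 60 * 1000; // assumed 15-minute lockout window

interface LoginState { failures: number; lockedUntil: number }
const state = new Map<string, LoginState>();

// Record a failed sign-in and lock the account once the threshold is hit.
function recordFailure(account: string, now = Date.now()): void {
  const s = state.get(account) ?? { failures: 0, lockedUntil: 0 };
  s.failures += 1;
  if (s.failures >= MAX_ATTEMPTS) s.lockedUntil = now + LOCKOUT_MS;
  state.set(account, s);
}

// Reject sign-in attempts while the lockout window is still open.
function isLocked(account: string, now = Date.now()): boolean {
  const s = state.get(account);
  return s !== undefined && s.lockedUntil > now;
}

// Clear the counter after a successful sign-in.
function recordSuccess(account: string): void {
  state.delete(account);
}
```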
The regulator also reprimanded Bank of Ireland UK for mistakes made on more than 3,000 customers’ credit profiles. It sent incorrect outstanding balances on 3,284 customers’ loan accounts to credit reference agencies, organisations that help lenders decide whether to approve financial products. This inaccurate data could have potentially led to these customers being unfairly refused credit for mortgages, credit cards or loans, or granted too much credit on products they were potentially unable to afford.
Data security
IoB and data protection: In its latest TechSonar report, the EDPS explains the privacy concerns behind the so-called ‘Internet of Behaviours’ (IoB). It is described as a “network in which behavioural patterns would have an IoB address in the same way that each device has an IP address in the Internet of Things (IoT)”. An example could be the use of patients’ and employees’ location data in hospitals during the COVID-19 pandemic to identify the behaviours that spread or mitigate the virus.
In general, the IoB relies on the collection and processing of data from different IoT devices, such as wearables, smart cameras or Bluetooth and Wi-Fi sensors. It therefore suffers from transparency and control issues, because it often lacks appropriate means of informing its users: data collection is seamless and the means of exerting control over the processing are limited, the report states.
Password storage: The Italian data protection regulator and the national cybersecurity agency offer new Password Retention Guidelines (in Italian). Too often, identity theft is caused by the use of authentication credentials stored in databases that are not adequately protected with cryptographic functions. Stolen data is then used to illicitly enter entertainment sites, social media and e-commerce portals, and can enable fraudulent access to forums and to websites for paid and financial services. The guidelines, illustrated by a brief hashing sketch after the list below, are aimed at:
- data controllers or data processors that store their users’ passwords on their systems, where those passwords relate to a large number of data subjects (eg, digital identity providers, email service managers, banks, insurance companies, telephone operators, healthcare facilities),
- subjects who access databases of particular importance or size (eg, public administration employees), or
- types of users who usually process sensitive or judicial data (eg, healthcare professionals, lawyers, magistrates).
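As a rough illustration of the cryptographic protection the guidelines call for, the sketch below stores passwords as salted hashes using Node’s built-in scrypt key derivation function. The storage format and parameters are assumptions for the example, not the guidelines’ own recommendations (which are published in Italian).

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from 'node:crypto';

// Hash a password with a unique random salt and a memory-hard KDF,
// so a stolen database does not directly expose user credentials.
function hashPassword(password: string): string {
  const salt = randomBytes(16);                // per-user random salt
  const hash = scryptSync(password, salt, 64); // scrypt-derived 64-byte key
  return `${salt.toString('hex')}:${hash.toString('hex')}`;
}

// Verify a login attempt in constant time against the stored record.
function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(':');
  const candidate = scryptSync(password, Buffer.from(saltHex, 'hex'), 64);
  return timingSafeEqual(candidate, Buffer.from(hashHex, 'hex'));
}
```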
Big Data
Data breach notification for telecoms: The US Federal Communications Commission adopted rules updating its 16-year-old data breach notification regime to ensure that providers of telecommunications, interconnected Voice over Internet Protocol and telecommunications relay services adequately safeguard sensitive customer information. These providers often collect large quantities of sensitive customer data, including the telephone numbers a person has called and mobile phone location data showing the places they have been. The new rules cover certain personally identifiable information that carriers and providers hold about their customers and expand the definition of “breach” to include inadvertent access, use or disclosure of customer information. They also eliminate the mandatory waiting period before notifying customers once the commission and law enforcement agencies have been notified.
Apple push notification data: Apple says it now requires a judge’s order before handing over information about its customers’ push notifications to US law enforcement, bringing the iPhone maker’s policy in line with rival Google’s, Reuters reports. Smartphone users receive push notifications alerting them to new messages, breaking news and the like, and almost all of these alerts pass through Apple’s and Google’s servers. This has placed the two companies in a unique position to assist government monitoring of users’ activity in particular applications.
Google location data: Meanwhile, Google has announced updates to Location History and new controls coming soon to Maps. For example, when users first turn on Location History, the auto-delete control will be set to three months by default, meaning any data older than that is automatically deleted; previously this option was set to 18 months. In addition, for users who choose to turn Location History on, the timeline will be saved only on their device. As before, users can delete all or part of the information at any time or disable the setting entirely.