
Data protection digest 18 Apr – 02 May 2024: EU-US redress mechanism and European Health Data Space taking shape

As part of the new EU-US redress mechanism, data subjects in the EU/EEA will have access to specific complaint forms if they suspect violations concerning their data transferred to the US, whether related to commercial use or to unlawful access by US signals intelligence activities.

Stay tuned! Sign up to receive our fortnightly digest via email.

EU-US redress mechanism

The EDPB has completed its much-anticipated Information Note and Complaint Form for EU/EEA individuals alleging violations of US law concerning personal data collected by US national security authorities. The mechanism applies regardless of the tool used to transfer the complainant’s data to the US (Data Privacy Framework, standard or ad hoc contractual clauses, binding corporate rules, codes of conduct, certification mechanisms, or derogations). However, it only covers data transmitted after 10 July 2023.

In short, after receiving and verifying the complaint, the data protection authority (DPA) will transmit it, in an encrypted format, to the EDPB Secretariat, which then forwards it to the US authorities for a binding decision taken by the Office of the Director of National Intelligence’s Civil Liberties Protection Officer (CLPO). Complainants can appeal the CLPO’s decision before the Data Protection Review Court within 60 days of receiving the DPA’s notification. Commerce-related violations can also be raised directly with EU DPAs.

In July 2023, the European Commission decided that the US ensures an adequate level of protection for personal data transferred from the EU to organisations in America included in the ‘Data Privacy Framework List’, without the need to rely on Art. 46 GDPR transfer tools (standard data protection clauses, binding corporate rules). The US Government has meanwhile committed to safeguards against bulk and targeted signals intelligence collection (eg, under FISA Section 702) that apply to all data transferred to the US, regardless of the transfer tool used by EU exporters.

More legal updates

FISA Section 702 reauthorised: In parallel, a new US bill just signed into law extends a key US surveillance program for another two years. Legislators claim the surveillance tool, first authorised in 2008, is crucial for disrupting terrorist attacks, cyber intrusions, and foreign espionage. It permits the government to collect the communications of non-Americans located outside the country without a warrant. Amendments that would have required a prior warrant from a judge to protect Americans’ communications when they are in contact with targeted foreigners failed to pass.

UK adequacy threatened: The European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (LIBE) has criticised the overall direction of the UK Government’s data policies. The Government’s current actions are removing constraints arising from European and international law and limiting the influence of European court jurisdiction and case law on UK law. Concerns also exist about UK intelligence agencies, especially their bulk collection of communications data, which is not in line with the EU Charter of Fundamental Rights. As a result, the UK could become a transit country for data that cannot be sent directly from the EU/EEA to “inadequate” third countries.

UK data protection reform moves on: The new Data Protection and Digital Information Bill has completed the final examination of its committee stage. After the final reading, followed by the consideration of amendments stage in Parliament (which can be a lengthy process), it will be presented for Royal Assent to become law. The new law promises to reduce the complexity of the current regulatory regime, cut compliance costs, and remove barriers to responsible innovation so that firms, public sector organisations and consumers can take “full advantage of the benefits” of data.

Data scraping

Data scraping by private actors is almost always illegal, explains the Dutch data protection authority AP. Scraping is the automated collection and storage of information from the internet. In several cases it is plainly not allowed, including: a) scraping the internet to create profiles of people and resell them; b) scraping information from protected social media accounts or private forums; and c) scraping data from public social media profiles for insurance purposes, and so on.

A widespread misunderstanding is that scraping is allowed because everything on the internet is already available to everyone. Public availability does not imply the individual’s consent. Nor can private businesses or individuals rely on legitimate interest when the sole purpose of scraping is making money. Scraping can, however, be justified when a company collects information about its own activities from media outlets.

More official guidance


Targeted advertising: A CJEU Advocate General’s opinion in the Schrems/Meta case (C-446/21) states that processing data for personalised advertising purposes cannot be justified merely by meeting the “manifestly made public” condition for special category data. That condition does not remove the particular protection granted to special categories of data under Art. 9 of the GDPR: at a minimum, the data must still be handled like “ordinary” personal data, processed lawfully, transparently and proportionately, and in line with the purpose limitation principle.

BCRs maturity test: The French data protection authority CNIL has published a self-assessment tool to test the maturity of organisations’ Binding Corporate Rules for restricted data transfers. It is aimed at multinational private businesses established in several EU countries and abroad. The set of resources covers all stages of a BCR project, from preparation to the approval procedure, and the test should be completed by the data protection officer or whoever else is in charge of the project.

Health Breach Notification: The US Federal Trade Commission has finalised changes to the Health Breach Notification Rule. The rule underscores its application to health apps and similar technologies not covered by HIPAA, and obliges them to notify individuals, the Commission, and, in some cases, the media of a breach of unsecured personally identifiable health data. It also requires third-party service providers to such vendors and related entities to notify them upon discovering a breach.

Safe biometric technology use

The Dutch data protection authority AP answers some frequently asked legal questions about facial recognition. The document is intended for privacy professionals and organisations that want to use the technology. Facial recognition is in principle prohibited. One exception is where it is necessary for authentication or security purposes (eg, securing a nuclear power plant or military production). Even then, it is only permitted once a data protection impact assessment (DPIA) has been carried out demonstrating that the use is necessary and that there is an important public interest.

The AP also defines the conditions under which facial recognition can qualify as ‘personal or household use’. One example is unlocking a phone with facial recognition, provided the biometric data is stored on the phone itself and the user decides what happens to it. It must also remain the user’s choice whether to unlock the phone with a PIN code or with face recognition.

European Health Data Space

MEPs approved the creation of the European Health Data Space, improving citizens’ access to their health data and boosting secure sharing in the public interest. Universal electronic health records (EHRs) will include patient summaries, electronic prescriptions, medical imagery and laboratory results. They will be available to health professionals across the EU (with the patient’s consent) and to trusted entities such as clinical researchers, statisticians and policy-makers (in an anonymised or pseudonymised format). Once the Council has given its approval and the regulation is officially published, it will apply two years later, with some temporary exceptions for specific categories of data.


Illicit SIM card activation fine

A company in Italy that manages two phone shops will have to pay 150,000 euros for having illicitly activated SIM cards, subscriptions and charges for the purchase of mobile phones and GPS trackers using the personal data of hundreds of users without their knowledge. The company had activated 1,300 SIM cards using data and identity documents extracted from the systems of the telephone operator whose products it sold and unduly saved in-store. In one case, a complainant’s credit card was charged for the activation of a new contract in the name of her deceased husband.

The company had also activated unsolicited services by inducing customers to sign on a tablet without clarifying the consequences of such consents, and had sold mobile phones that customers had neither requested nor received. By evading the telephone operator’s controls and its provisions on the processing of user data, the company acted as an independent data controller.

More enforcement decisions

Cookie collection without notice: The Croatian data protection regulator imposed administrative fines of 15,000 and 20,000 euros on operators of gambling and betting activities for the illegal processing of personal data through cookies, without allowing users to give or withdraw informed, voluntary consent. In particular, the controllers did not provide a separate cookie banner or enable data subjects to consent to different purposes (marketing, analytics/statistics) individually.

The controllers also did not adequately inform users about the legal basis, the groups/types of cookies, the function/purpose of each cookie, and the cookie storage period. In addition, a data controller was fined for processing data subjects’ data at the very moment the website loaded, before they had been informed about the processing.
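For illustration only, the sketch below models the kind of per-purpose consent record such decisions point towards: nothing beyond strictly necessary cookies runs on page load, each purpose gets its own explicit choice, and withdrawal is as simple as giving consent. It is a minimal TypeScript sketch; the purpose names and helper functions are assumptions for this example, not taken from the Croatian decision.

```typescript
// Minimal sketch (assumed names): per-purpose cookie consent, never presumed on page load.
type Purpose = "essential" | "marketing" | "analytics";

interface ConsentRecord {
  given: Partial<Record<Purpose, boolean>>; // one explicit choice per purpose
  timestamp: string;                        // when the choice was recorded
}

// Before the user interacts with the banner, only strictly necessary cookies may run.
let consent: ConsentRecord = {
  given: { essential: true },
  timestamp: new Date().toISOString(),
};

// Record the user's choices only after they interact with the banner.
function updateConsent(choices: Partial<Record<Purpose, boolean>>): void {
  consent = {
    given: { ...consent.given, ...choices },
    timestamp: new Date().toISOString(),
  };
}

// Gate every non-essential cookie or tracker behind the recorded choice.
function mayRun(purpose: Purpose): boolean {
  return consent.given[purpose] === true;
}

// Withdrawing consent must be as easy as giving it.
function withdrawAll(): void {
  updateConsent({ marketing: false, analytics: false });
}
```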

Prohibited employment practices: The French CNIL formally notified a company to minimise the collection of candidates’ data. The company required applicants to provide their place of birth, nationality, and marital status details (spouse’s name and surname, date and place of birth, profession, and the number and ages of children), as well as all salaries received at previous employers. This information was not necessary for assessing a candidate’s ability to perform the job; an aggregate level of detail on nationality (French, EU and non-EU categories) would suffice. Candidates may, however, provide any useful information on their own initiative, including to justify their salary expectations.

Ring case

In the US, following a settlement with Ring, the Federal Trade Commission is returning more than 5.6 million dollars to customers. The company allowed employees and contractors to access consumers’ private videos and failed to implement security protections, enabling hackers to take control of consumers’ accounts, cameras, and videos. Ring also deceived its customers by failing to restrict that internal access and by using customer videos to train algorithms without consent.

Data security

Ransomware attack: The EDPB provided a summary of a recent Greek regulator fine against a company (Hellenic Post Services ELTA SA) that failed to implement technical and organisational measures, resulting in unauthorised access by third parties. In the first incident, a malicious third-party attack encrypted data in order to demand a ransom; in the second, personal data was leaked and subsequently published on the dark web.

Cybersecurity tool: The UK National Cyber Security Centre issued the latest version of the Cyber Assessment Framework, reflecting the increased threat to critical national infrastructure. The guide is for all organisations responsible for securing critical network and information systems, covering remote access, privileged operations, user access levels and multi-factor authentication (principles B2a and B2c). Other organisations may find the tool useful too.

Strong password rule: In the UK, makers of phones, TVs, and other internet-connected smart devices are now legally required to meet minimum security standards, states the Department for Science, Innovation and Technology. Manufacturers are banned from shipping weak default passwords such as ‘admin’ or ‘12345’, and if a common password is present, the user will be prompted to change it on start-up.
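As a rough illustration of what the rule implies for manufacturers, here is a minimal TypeScript sketch that rejects common default passwords at first start-up; the denylist, the length threshold and the function names are assumptions for this example, not requirements quoted from the legislation.

```typescript
// Minimal sketch (assumed names): refuse universal default passwords at first boot.
const COMMON_DEFAULTS = new Set(["admin", "password", "12345", "123456", "default"]);

// Assumed policy for this example: not a known default and at least 8 characters long.
function isAcceptablePassword(candidate: string): boolean {
  const normalised = candidate.trim().toLowerCase();
  return !COMMON_DEFAULTS.has(normalised) && normalised.length >= 8;
}

// On first start-up, keep prompting until the user sets an acceptable password.
function enforcePasswordChangeOnFirstBoot(promptUser: () => string): string {
  let chosen = promptUser();
  while (!isAcceptablePassword(chosen)) {
    console.warn("Default or weak password rejected; please choose another.");
    chosen = promptUser();
  }
  return chosen;
}
```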

Big Tech 

Data brokerage: A new data broker restriction was signed into law in the US on 24 April, the JDSupra law blog reports. The ‘Protecting Americans’ Data from Foreign Adversaries Act of 2024’ prohibits data brokers from sharing sensitive personal information with a broad range of entities that may have ties to Russia, China, Iran, and North Korea. This covers data on finances, genetics, health, biometrics, communication contents, precise geolocation, and data about minors. Under the Act, a “data broker” is any organisation that, in exchange for a fee, provides data to another organisation that is not acting as its service provider.

US TikTok/China row: ByteDance would prefer to shut TikTok down rather than sell it if the Chinese owner exhausts its legal options against legislation banning the platform from US app stores, according to Reuters. The US recently passed legislation allowing the popular service to be suspended over widespread concerns that China may access Americans’ data or use the app for spying. TikTok’s major assets include its algorithms, source code, user data, and product operations and management. However, Chinese rules protect TikTok’s intellectual property, making it difficult for US buyers to acquire the source code and similar assets.

“Cookie pledge” fails: As Google delays the demise of third-party cookies, a European Commission campaign to get Big Tech companies to voluntarily commit to a “cookie pledge” has reportedly failed. The draft pledging principles would have ensured that users receive concrete information on how their data is processed and on the consequences of accepting different types of cookies, and that consent is not requested again for a year once it has been refused. Some companies lost interest in the proposal because they depend on data harvesting for income, while others were worried that it would not comply with existing laws.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
