TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes: UK IDTA, EU Clinical Trials Regulation, digital surveillance and international law
The implementation of the UK (post-Brexit) international data transfer agreement (IDTA) entered its final stage after being laid before Parliament. If no objections are raised, the IDTA, the Addendum to the EU Commission's Standard Contractual Clauses and the transitional provisions come into force on 21 March. All documents will be of immediate use to organisations complying with Art. 46 of the GDPR when making restricted transfers outside the UK to countries not covered by adequacy decisions. The IDTA and Addendum replace the current standard contractual clauses for international transfers. They also take into account the binding judgment of the CJEU in the case commonly referred to as "Schrems II", which invalidated the EU-US Privacy Shield data transfer framework. Read more on UK restricted transfers, including a checklist with various examples and exemptions for organisations, here.
The EU Clinical Trials Regulation, enacted back in 2014, took effect on 31 January. It repealed the Clinical Trials Directive and national implementing legislation in the EU Member States. Under the Regulation, clinical trial sponsors can use the Clinical Trials Information System (CTIS) from 31 January, but are not obliged to do so immediately, in line with a three-year transition period. The CTIS provides a single entry point for clinical trial application submission, authorisation and supervision in the EU/EEA, while ensuring the highest levels of protection and safeguarding the integrity of the data generated from the trials. Recently, the European Federation of Pharmaceutical Industries and Associations also confirmed that its GDPR Code of Conduct on Clinical Trials and Pharmacovigilance had progressed to the final phase of review by data protection authorities prior to formal submission to the EDPB for approval.
Privacy International published an updated analysis of international law and digital surveillance, prompted by the rapid growth in the technological capacity of governments and corporate entities to intercept, extract, filter, store, analyse and disseminate the communications of whole populations. The 282-page document includes legal updates on UN resolutions, independent expert reports, and the jurisprudence of European and international human rights bodies. The right to privacy is analysed through the lens of legality, necessity, proportionality and adequate safeguards. In particular, it offers a deep dive into: a) the extraterritorial application of surveillance capabilities (intelligence data sharing, adequacy mechanisms, the EU-US data transfer dilemma); b) distinctions in safeguards between metadata and content; c) the right to privacy and the roles and responsibilities of companies; d) encryption; e) biometric data processing; and much more.
Official guidance: GDPR-CARPA, health industry PETs, commercial management data, US Health Breach Notification
The EDPB adopted its opinion (the first of its kind) on the GDPR-CARPA nationwide certification scheme submitted by the Luxembourg supervisory authority, the CNPD. It is a general scheme which does not focus on a specific sector or type of processing, but helps data controllers and processors demonstrate compliance with the GDPR. The EDPB believes that organisations adhering to it will gain greater credibility, as individuals will be able to quickly assess the level of protection of their processing activities. After approval by the CNPD, the certification mechanism will be added to the register of certification mechanisms and data protection seals in accordance with Art. 42 of the GDPR. However, the EDPB stresses that GDPR-CARPA is not a certification under Art. 46 of the GDPR and therefore does not provide appropriate safeguards for transfers of personal data to third countries or international organisations. Read the full report here.
The UK Information Commissioner's Office (ICO) is inviting organisations in the health sector to participate in workshops on privacy-enhancing technologies (PETs). The aim is to facilitate safe, legal and valuable data sharing in the health sector and to understand what is needed to help organisations use these technologies. According to the ICO's Director of Technology and Innovation, PETs help organisations build trust and unlock the potential of data by putting data protection by design into practice, but their adoption appears to be incredibly slow. The information gathered from the workshops will help the ICO develop updated guidance and advice. It welcomes participants from both the private and public sectors, namely:
- health organisations and health technology start-ups that aren’t using PETs yet;
- health or care organisations already using PETs;
- academic experts and researchers in this field;
- suppliers of PETs; and
- legal and data protection experts.
Interested organisations can sign up through this link until 14 February.
The French regulator CNIL has published two new standards, on commercial management and on the management of outstanding payments. Both tools provide legal certainty to organisations and allow them to bring their processing of personal data into compliance. The guidelines are not mandatory: organisations can deviate from their recommendations provided they can justify their choices. The framework applies to the management of orders, deliveries, performance of services or supply of goods, management of invoices and payments, unpaid debts, loyalty programmes, monitoring of customer relations through satisfaction surveys, management of complaints and after-sales service, and commercial prospecting. Some processing activities are excluded from the standards, such as fraud detection and prevention, and processing implemented by debt management and collection organisations. Nor do they cover the scoring of outstanding debts, the sharing of data with or from third parties, etc. Both documents can be read here and here.
The US Federal Trade Commission (FTC) has updated its guidance on the Health Breach Notification Rule, JD Supra reports. For most hospitals, doctors' offices and insurance companies, the Health Insurance Portability and Accountability Act (HIPAA) governs the privacy and security of health records stored online. The Health Breach Notification Rule requires certain organisations not covered by HIPAA to notify their customers, the FTC and, in some cases, the media, if there is a breach of unsecured, individually identifiable health information. Makers of health apps, connected devices and similar products must comply with the rule if they qualify as a vendor of personal health records (PHRs), a PHR-related entity, or a third-party service provider to a vendor of PHRs or a PHR-related entity. Read more on the definitions of these terms, as well as what to do if a breach occurs, whom to notify and when, and what information to include, in the original publication.
Data breaches, investigations and enforcement actions: lack of specific consent, SIM card fraud, failed anonymisation, IAB Europe/TCF
The EDPB published an analysis, at the request of the Spanish data protection regulator AEPD, of the recent €3 million fine against Caixabank (Payments & Consumer). The case relates to a lack of specific and informed consent regarding profiling and decision-making for commercial purposes. The financial establishment and payment institution's business activities include marketing credit and debit cards, credit accounts with or without a card, and loans through three channels: directly, through an agent, or through prescribers (points of sale with which it collaborates, for example IKEA). In the framework of its commercial activities, Caixabank builds profiles for the following purposes:
- Analysing the risk of default upon application for a product.
- Analysing the risk of default during the life of the product.
- Selecting target audiences.
Consent is requested in the various channels of prescribers and agents for study and profiling purposes. In this case, the interested party was provided only with generic information on the different profiling operations and was not able to know exactly what processing they were consenting to. Nor was there any provision for the person concerned to express a choice on each of the purposes for which the data are processed. The controller also has to bring its processing operations into compliance with the GDPR within six months of the decision.
The AEPD has also fined Vodafone €3.9 million for accountability and security failings (Art. 5 of the GDPR), Data Guidance reports. Several customers lodged complaints with the AEPD as victims of fraud involving the deceitful use of their SIM cards. Reportedly, the criminals obtained replicas of the data subjects' SIM cards through Vodafone, and consequently carried out various bank transfers from online banking services and concluded contracts at the expense of those affected. The investigation found that Vodafone:
- had not properly checked the identity of the fraudsters before issuing the SIM cards;
- was unable to prove that it had verified the identity of the person requesting the duplicate, the invoices issued, or the effectiveness of the measures implemented;
- allowed any person holding a data subject's basic personal data to circumvent its security policy and obtain a replica of that data subject's SIM card;
- attributed the duplication of SIM cards to human error, indicating a deeper problem within the organisation and a lack of foresight of the risks;
- left data subjects without the power to organise and control their personal data, as a SIM card allows access to apps and services that require authentication or password retrieval via SMS.
You can read the full decision (in Spanish) here.
The Greek data protection authority imposed fines totalling €9.2 million on telecommunications companies COSMOTE and OTE for personal data breaches and illegal data processing. The regulator investigated the circumstances under which the breaches took place, the legality of record-keeping, and the security measures applied. A leaked file contained subscribers' traffic data, retained for a period of 90 days from the date of the calls in order to handle any problems and malfunctions. At the same time, the file was also "anonymised" (in fact pseudonymised) and kept for 12 months, once enriched with additional simple personal data, to draw statistical conclusions about the optimal design of the mobile telephony network. As a result, the companies were found responsible for a poor data protection impact assessment, poor anonymisation, inadequate security measures, insufficient information provided to subscribers, and a failure to allocate the GDPR-governed roles between the collaborating companies.
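For context (this sketch is illustrative, not taken from the decision): hashing or tokenising an identifier such as a subscriber number is pseudonymisation rather than anonymisation, because the original value can often be recovered by enumerating the identifier space. A minimal Python sketch with made-up numbers:

```python
import hashlib

def pseudonymise(msisdn: str) -> str:
    """Replace a subscriber number with its SHA-256 digest.
    The output looks anonymous but is only pseudonymised."""
    return hashlib.sha256(msisdn.encode()).hexdigest()

# A "de-identified" traffic record, as it might appear in a retained file.
record = {"caller": pseudonymise("+306912345678"), "duration_s": 214}

# Re-identification: anyone able to enumerate the identifier space
# (national mobile numbers span only ~10^8 values) can rebuild the mapping.
rainbow = {pseudonymise(f"+30691{n:07d}"): f"+30691{n:07d}"
           for n in range(2345670, 2345680)}  # tiny slice for illustration

print(rainbow.get(record["caller"]))  # -> +306912345678
```

Because the mapping can be reconstructed, such data remains personal data, and GDPR obligations such as retention limits, security measures and information duties continue to apply in full.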
The Belgian data protection authority has found that the Transparency and Consent Framework (TCF), developed by the Interactive Advertising Bureau (IAB) Europe, fails to comply with a number of provisions of the GDPR. The TCF is a widely used mechanism that facilitates the management of users' preferences for online personalised advertising, and plays a pivotal role in so-called real-time bidding (RTB). When a user accesses a website or application with advertising space, technology companies representing thousands of advertisers can instantly bid behind the scenes for that advertising space through an automated algorithmic auction, in order to display targeted ads. The draft decision was examined within the GDPR cooperation mechanism (the one-stop-shop mechanism) and was approved by all the authorities concerned, representing most of the thirty countries in the EEA. IAB Europe now has two months to present an action plan to bring its activities into compliance.
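To make the mechanism concrete, here is a deliberately simplified auction sketch (hypothetical vendor names and prices; real systems use the OpenRTB protocol, and the TCF signal determines which vendors may process the user's personal data at all):

```python
from dataclasses import dataclass

@dataclass
class Bid:
    vendor: str
    cpm_eur: float         # price offered per 1,000 impressions
    has_tcf_consent: bool  # whether the TCF signal permits this vendor
                           # to process the user's data for targeting

def run_auction(bids: list[Bid]) -> Bid | None:
    """Second-price auction over eligible bids: the highest bidder wins
    but pays the runner-up's price (a common, simplified RTB model)."""
    eligible = [b for b in bids if b.has_tcf_consent]
    if not eligible:
        return None
    eligible.sort(key=lambda b: b.cpm_eur, reverse=True)
    winner = eligible[0]
    if len(eligible) > 1:
        winner.cpm_eur = eligible[1].cpm_eur
    return winner

bids = [Bid("dsp-a", 4.20, True), Bid("dsp-b", 3.80, True), Bid("dsp-c", 5.00, False)]
print(run_auction(bids))  # dsp-a wins, paying dsp-b's price of 3.80
```

The point of contention is precisely the consent flag above: the Belgian authority found that the way the TCF records and propagates such preferences does not meet GDPR requirements.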
Individual rights: blocking user tracking methods
The French regulator CNIL published a user-oriented guide (in French) on new online tracking methods and the solutions to protect yourself against them. Cookies are not the only means used to track your online activity. Online operators are increasingly using alternatives such as:
- unique digital fingerprinting, which uses the technical information exposed by your computer, phone or tablet (language preference, screen size, browser type and version, hardware components, etc.), sometimes combined with the collection of the IP address;
- tracked links (one of the most common is the insertion of web beacons in emails to find out whether a message has been opened by its recipient);
- unique identifiers (most often the email address: when you give your email address, for example to register for a site or a newsletter or to place an order online, it is hashed to generate a unique identifier; see the sketch after this list).
The main solutions involve blocking either the technique itself or the provider behind it (e.g. blocking the domains that use these techniques, link cleaning, web beacon blocking, browser extensions, one-time email addresses, etc.).
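Both fingerprinting and hashed email identifiers boil down to deriving a stable identifier from data the device or the user already reveals. A minimal, illustrative Python sketch (hypothetical attribute values, not code from the CNIL guide):

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine browser-exposed attributes into one stable identifier."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "language": "fr-FR",
    "screen": "1920x1080",
    "browser": "Firefox/96.0",
    "timezone": "Europe/Paris",
}
print(fingerprint(device))  # same device -> same ID on every site visited

def email_identifier(email: str) -> str:
    """Hash a normalised email address into a cross-site identifier."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# The same address given to a shop and to a newsletter yields the same
# identifier, letting the two databases be joined without the clear text.
print(email_identifier("Jane.Doe@example.com") ==
      email_identifier("jane.doe@example.com "))  # True
```

This is why the countermeasures listed above target the linkage itself: blocking fingerprinting scripts or using one-time email addresses prevents the same stable identifier from appearing across sites.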
Big Tech: supermarket age verification system, mental health helpline
Technology used in checkout-free supermarkets is being trialled to identify underage drinkers in several UK supermarket chains, BBC Tech reports. Designed to cut waiting times in queues, the automated age verification system, which requires the customer's consent, uses an algorithm to estimate how old they are, based on a sample of 125,000 faces aged six to sixty. If it decides they are under 25, ID is required at the till. The maker, Yoti, claims that on average the system is accurate to within 1.5 years for 16 to 20 year-olds. This is not facial recognition, Yoti stresses, which tries to match individual faces to those in a database, and the system will not retain the images it takes.
US-based mental health helpline Crisis Text Line (CTL) is ending data sharing with AI customer support firm Loris.ai, Politico and BBC Tech report. Nonprofit CTL, a giant in its field, says it has "the largest mental health dataset in the world". However, it spun off Loris.ai as very much a for-profit venture, and Loris uses the data to create and market customer service software. One CTL board member now says they were "wrong" to share the data with Loris, even anonymised, and transfers have been stopped. CTL insisted that any initial responses to calls for help included a consent feature, and that it was "transparent" about data sharing. Critics, however, questioned the validity of that consent in many cases, considering the state of mind of callers in crisis.