TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes: new UK data protection draft bill, rules to prevent child abuse online
A new UK data protection draft bill has been published on a parliamentary website. The document is intended to update and simplify the UK’s data protection framework, reducing the burden on organisations while maintaining high data protection standards. The bill was introduced to the House of Commons and given its first reading on 18 July; this stage is a formality and takes place without debate. MPs will next consider it at the second reading on 5 September. The main provisions of the bill include:
- greater flexibility on how to comply with certain aspects of the data protection legislation (eg, relying on legitimate interest or amending the requirement for controllers to keep logs relating to processing);
- improving the clarity of the framework, particularly for research organisations;
- more certainty and stability for cross-border flows of personal data;
- changes to the Privacy and Electronic Communications Regulations 2003 relating to the confidentiality of terminal equipment (eg, cookie rules), unsolicited direct marketing communications (eg, nuisance calls), and communications security (eg, network traffic and location data);
- a framework for providing digital verification services in the UK to secure those services’ reliability and enable digital identities to be used with the same confidence as paper documents;
- a wider application of provisions on information standards, extending to providers of IT, IT services or information processing services used, or intended for use, in connection with the provision of health care or adult social care in England;
- smart data schemes to allow for the secure sharing of customer data (eg, held by a communications provider or financial services provider), upon the customer’s request, with authorised third-party providers;
- use of personal data for law enforcement and national security purposes.
Meanwhile, the Irish government has approved the expansion of the Data Protection Commission (DPC). The intention is to appoint two additional commissioners to support the evolving organisational structure, governance and business needs of the DPC. The appointments are to be made under the Data Protection Act 2018, which allows up to three commissioners to be appointed. The commission and its stakeholders, such as the Irish Council for Civil Liberties, have regularly highlighted its growing workload and the complexity of its investigations. Ireland acts as the one-stop shop for many Big Tech companies with their EU headquarters in the country, and the DPC’s GDPR enforcement capacity, especially in cross-border cases, has been a point of debate across Europe in recent years.
The EDPB and EDPS have adopted a joint opinion on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse. In their view, the proposal lacks clarity on critical elements, such as the notion of “significant risk”. Furthermore, the entities in charge of applying the safeguards, from private operators through to administrative and/or judicial authorities, enjoy a very broad margin of appreciation, which leads to legal uncertainty on how to balance the rights at stake in each case. The EDPB and EDPS also believe that scanning audio communications is particularly intrusive and must remain outside the scope of the obligations in the proposed regulation, for both voice messages and live communications. The regulators express doubts about the effectiveness of blocking measures and consider that requiring providers of internet services to decrypt online communications in order to block those concerning CSAM would be disproportionate.
Official guidance: UK BCRs, use of biometric data, age verification online
The UK Information Commissioner’s Office (ICO) has released updated guidance on GDPR-governed Binding Corporate Rules (BCRs), together with application forms and tables for data controllers and processors. The concept of BCRs as adequate safeguards for making restricted transfers was developed under EU law and continues to be part of UK law under the UK GDPR (specifically, Art. 47). BCRs are intended for use by multinational corporate groups, groups of undertakings or enterprises engaged in a joint economic activity, such as franchises, joint ventures or professional partnerships. The guidance is intended to assist controllers in preparing the UK BCR pack for approval: the application form, the binding instrument, and any supporting documents. Since EU and UK BCR requirements currently overlap, the ICO has simplified the UK BCR approval process for applicants.
The Spanish privacy regulator AEPD published a blog post (in Spanish) on the use of biometric data from a data protection perspective. Biometric processing techniques are based on collecting and processing people’s physical, behavioral, physiological or neural traits through devices or sensors, creating signatures or patterns that enable the identification, monitoring or profiling of people. Some methods require the cooperation of the individual, while others can capture biometric data remotely, without the individual’s cooperation or awareness. When demonstrating that the processing complies with the GDPR, it is useful to apply classification criteria for biometric operations:
- the purpose of the operations with biometric data relative to the purpose of the processing,
- legal framework,
- scope of the processing,
- qualified human intervention,
- transparency,
- free choice of the data subject,
- adequacy, sustainability and necessity,
- data minimisation,
- degree of user control,
- implicit collateral effects of the biometric operation (eg, proctoring), etc.
How can age be verified on a website? The French CNIL offers some effective and privacy-friendly solutions. After analyzing existing systems, the French privacy regulator recommends developing new ones. Age verification to protect young people is compatible with the GDPR, provided that sufficient safeguards are in place to minimise privacy risks and to prevent age verification from becoming an opportunity for publishers to collect additional data on the internet users visiting their site. In addition, the data must not be captured by a third party for malicious purposes (biometric data breach, phishing, spoofing, blackmail).
Age can be verified, for example, through automated credit card validation or through analysis of facial features. However, these solutions must be operated by third parties with sufficient security and reliability to avoid data theft and to ensure that the additional risks generated by their use are taken into account. Another solution is possible, says the CNIL, although it presents specific technical difficulties or lower maturity: a trusted third party is provided with reliable proof of age by an administration or a company that knows the internet user and can certify their age. This proof is then transmitted, by the trusted third party or by the user themselves, to the site the user wants to access. The system recommended by the CNIL would provide triple protection of privacy (a minimal sketch of such a flow follows the list below):
- the person providing proof of age knows the identity of the user, but does not know which site is being visited;
- the person who transmits the proof of age to the site may know the site or service consulted, but does not know the identity of the user;
- the site or service subject to age verification knows that the user is of legal age and that a person is consulting it, but does not know their identity.
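For illustration only, here is a minimal sketch, in Python, of how such a double-blind age attestation could be relayed. It assumes an Ed25519 signature from the `cryptography` package; the token format, field names and freshness window are invented for the example and do not reflect any system specified by the CNIL.

```python
# Hypothetical sketch of a CNIL-style double-blind age attestation.
# The age-proof provider knows the user but not the site; the site verifies
# the proof and learns only that the bearer is over 18.
import base64
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Age-proof provider (eg, an administration or bank that knows the user) ---
provider_key = Ed25519PrivateKey.generate()
provider_public_key = provider_key.public_key()  # published for verifiers

def issue_age_proof() -> bytes:
    """Sign a minimal claim: no identity, no reference to the destination site."""
    claim = {"over_18": True, "issued_at": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = provider_key.sign(payload)
    return base64.urlsafe_b64encode(payload) + b"." + base64.urlsafe_b64encode(signature)

# --- Site subject to age verification: learns only "bearer is over 18" ---
def verify_age_proof(token: bytes, max_age_seconds: int = 300) -> bool:
    payload_b64, _, signature_b64 = token.partition(b".")
    payload = base64.urlsafe_b64decode(payload_b64)
    try:
        provider_public_key.verify(base64.urlsafe_b64decode(signature_b64), payload)
    except InvalidSignature:
        return False
    claim = json.loads(payload)
    # A short validity window limits replay of the same proof.
    return claim.get("over_18") is True and time.time() - claim["issued_at"] <= max_age_seconds

# The proof is relayed by the user or by an intermediary that never sees their identity.
token = issue_age_proof()
print(verify_age_proof(token))  # True
```

In the CNIL’s model, the proof would pass through an independent transmitter, so that no single party sees both the user’s identity and the site being visited.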
Investigations and enforcement actions: vehicle rental, progressive health research, wrongful patient referral, passwords in plain text, cookie violations
The supervisory authorities (SAs) of the Baltic States have launched coordinated preventive supervision of the compliance of personal data processing in the field of short-term vehicle rentals, the EDPB reports. The SAs have agreed that supervision will cover enterprises whose services are mainly aimed at natural persons (eg, electric scooter rentals). Primarily, merchants whose principal place of business is located in one of the Baltic States and who offer their services throughout the Baltics will be monitored. At its own discretion, each SA may extend the scope of the supervision to enterprises that are active in only one Member State.
The EDPB has published its criteria for selecting cases of strategic importance, where there is a likely high risk to the rights and freedoms of natural persons. The degree of public debate and media attention is not a separate criterion, but the data protection authorities can take these factors into account. A case may be proposed if it concerns:
- a structural or recurring problem in several Member States;
- a case related to the intersection of data protection with other legal fields;
- a case that affects a large number of data subjects in several Member States;
- a large number of complaints in several Member States;
- a fundamental issue falling within the scope of the EDPB strategy;
- a case where the GDPR implies that high risk can be assumed, such as the processing of special categories of data, processing regarding vulnerable people such as minors, situations where a data protection impact assessment (DPIA) is required, or situations where a DPIA is required based on the criteria for processing operations that are likely to result in high risk (as laid down in the EDPB Guidelines).
The Italian privacy regulator ‘Garante’ gave a favorable decision on the processing of data by a hospital aimed at the study of patients suffering from neoplastic, infectious, degenerative, and traumatic pathologies of the thoracic region. The project envisages the creation of a database and research activity in nine areas that will be the subject of further specific protocols and submitted to the competent ethics committees for each area. To give the green light, however, the authority asked the researchers to base the collection – and the subsequent processing of health data for medical research purposes – on “progressive stages” consent.
The Garante had previously authorized the collection and storage of data in the “Torax” database on the basis of an initial consent expressed by patients at the time of joining the study, provided that the hospital subsequently acquired specific consent from the patients. The Garante also ruled on the data of deceased or no longer contactable patients, once the research projects were better defined and approved by the territorially competent ethics committees. The authority took favorable note of the technical measures implemented by the hospital to eliminate the risk of patient identification, deeming them suitable for ensuring the anonymization of the data processed. However, the hospital must periodically check these measures and adjust them if necessary.
Meanwhile, the Polish supervisory authority UODO imposed an administrative fine on the University Clinical Center of the Medical University of Warsaw. The decision was due to the failure to notify the UODO of a personal data breach and the failure to notify the data subject. A patient received a referral from a doctor to a specialist clinic containing personal data about another person: their name, surname, address, identification number, and information about the diagnosis and the purpose of the consultation. The controller confirmed that another patient’s personal data had been entered on the referral by mistake but, after analyzing the incident, concluded that the referral used the personal data of a person who did not exist in reality. Although the controller qualified the incident as a security incident, it did not consider it to have significant effects on the rights and freedoms of the data subject.
In the opinion of the UODO, there was a personal data breach consisting of the disclosure of personal data to an unauthorized person (another patient) as a result of an error by the doctor issuing the referral to a specialist clinic. The document issued by the doctor contained only one error in the affected person’s data, while the rest of the data in the referral, eg, name, address and identification number, did apply to that person. Hence, it cannot be considered that the event concerned a non-existent person: despite the single error, the data subject can easily be identified.
The Danish data protection authority criticized EG Digital Welfare ApS and issued two orders against it. The Mediconnect IT system offered by EG is used by, among others, municipalities, regions and insurance companies to handle sensitive and confidential information about citizens; in this context, EG acts as a data processor. It appears from the case that passwords are stored in the Mediconnect IT system in plain text, opening the possibility of access to special categories of data that are protected only by a username and password. The regulator ordered EG to apply irreversible encryption to the passwords and to ensure that login does not rely exclusively on a username and password (eg, by using multi-factor login, certificates, tokens or a PKI solution).
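For context on the “irreversible encryption” order, the sketch below shows, purely as an illustration, salted password hashing with a slow key-derivation function from Python’s standard library. The parameters and storage format are assumptions for the example and do not represent EG’s actual remediation.

```python
# Illustrative only: store a salted, irreversibly hashed password instead of plain text.
import hashlib
import hmac
import os

def hash_password(password: str) -> str:
    """Derive a one-way hash with a per-user random salt (scrypt from the stdlib)."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt.hex() + "$" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.scrypt(password.encode(), salt=bytes.fromhex(salt_hex), n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, bytes.fromhex(digest_hex))

record = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", record))  # True
print(verify_password("wrong guess", record))                   # False
```

Hashing alone would not satisfy the second order, which also requires that login not rely exclusively on a username and password.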
Spain’s AEPD fined Vueling Airlines 30,000 euros for cookie violations. According to the complaint, when accessing Vueling’s website, users could not reject cookies or purchase tickets without accepting the sending of commercial communications and promotions. Vueling’s misuse of cookies on its website constituted a violation of Art. 22 of the country’s Information Society Services and Electronic Commerce legislation. The fine was subsequently reduced to 18,000 euros following Vueling’s acknowledgement of responsibility and voluntary payment.
Audits: an insurance company’s data processing
The UK ICO has audited the data processing of Somerset Bridge Insurance Services Ltd, with the company’s consent. It was agreed that the audit would focus on direct marketing: the processes in place where an organisation undertakes marketing activities directed at customers on its database and/or obtained from third-party lists. This would include controls for management structures, policies and procedures, monitoring and reporting, training, fairness and transparency, lawful consent, accuracy and integrity of records, operations, and data subjects’ rights. The summary of the audit was as follows:
- The company processes personal data from customers obtaining insurance quotes and policies.
- It collects personal data directly from its customers through its website, aggregator sites, or telephone calls.
- It only relies on active opt-in consent for any form of marketing, including via email, phone, or SMS.
- It currently does not use soft opt-in. Electronic marketing is mainly through a monthly newsletter. Each email to the customer includes the option to unsubscribe.
- It does not process special category data when processing data for marketing purposes.
- Automated marketing calls are not made.
- It does not buy in marketing lists from third parties.
The ICO auditors reported a high level of assurance that the direct marketing activities conducted by the company were compliant with the UK GDPR, DPA 2018 and the Privacy and Electronic Communications Regulations.
Data security: ransomware attacks
The EU cybersecurity agency ENISA states that ransomware has been one of the most devastating types of cybersecurity attack over the last decade and has grown to impact organisations of all sizes across the globe. Over the past year:
- About 10 terabytes of data were stolen each month by ransomware threat actors. 58.2% of the data stolen included employees’ data.
- At least 47 unique ransomware threat actors were found.
- For 94.2% of incidents, it is unknown if the company paid the ransom.
- When negotiations fail, attackers usually publish the data on their web pages; this happened in 37.88% of incidents.
- The remaining 62.12% of companies either came to an agreement with the attackers or found another solution.
Several different ransomware business models emerged from the study: a) individual attackers; b) a ransomware-as-a-service model; c) a data brokerage model; and d) a model aimed mostly at achieving notoriety. The ENISA report recommends the following (a toy check of the 3-2-1 backup rule is sketched after the list):
- keep an updated backup of your business files & personal data;
- keep this backup isolated from the network;
- apply the 3-2-1 rule of backup: 3 copies, 2 different storage media, 1 copy offsite;
- run security software designed to detect most ransomware in your endpoint devices;
- restrict administrative privileges, etc.
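As an illustration of the 3-2-1 rule only, here is a toy check of a backup inventory in Python; the inventory structure is invented for the example.

```python
# Toy check of ENISA's 3-2-1 backup rule: at least 3 copies of the data,
# on at least 2 different storage media, with at least 1 copy kept offsite.
from dataclasses import dataclass

@dataclass
class BackupCopy:
    medium: str    # eg, "disk", "tape", "object-storage"
    offsite: bool

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    return (
        len(copies) >= 3
        and len({copy.medium for copy in copies}) >= 2
        and any(copy.offsite for copy in copies)
    )

inventory = [
    BackupCopy("disk", offsite=False),           # local working copy
    BackupCopy("tape", offsite=False),           # second medium, isolated from the network
    BackupCopy("object-storage", offsite=True),  # offsite copy
]
print(satisfies_3_2_1(inventory))  # True
```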
Big Tech: Paramount Global, US tech in Russia, TikTok in US, Manchester City’s smart scarf
Paramount Global, owner of CBS, is facing a class action lawsuit alleging that the Hollywood giant tracked and collected CBS.com subscriber data and sold it to Facebook without users’ consent. Paramount is accused of violating the Video Privacy Protection Act, and Facebook has already acknowledged that it uses CBS.com subscriber data obtained via the Facebook Tracking Pixel deployed by Paramount.
Russia continues to tighten the regulatory screws on US tech firms, with fines imposed on Snapchat, WhatsApp and Tinder for failing to store the data of their Russian users on local servers. Local data storage has been a requirement since a 2019 law took effect; many western companies have fallen foul of it, and their number is growing.
China’s TikTok has paid a 92-million-dollar settlement in a 2019 case brought in a federal court in Illinois, alleging multiple data protection and privacy violations and the illegal collection of biometric data. As part of the deal, TikTok must now restrict what it collects, disclose it in its privacy policy, and end the secret sending of data overseas.
Technology incorporated into clothing can give useful feedback on a range of things. Now Manchester City have made their fans a scarf that gives the club extensive information about the wearer’s match experience: an EmotiBit sensor can read blood pressure, heart rate, and emotional arousal or stress levels. The club has partnered for the pilot stage with Cisco, tech and production company Unit9, and sports marketers Octagon UK, although Man City is being coy for the moment about just what personal data will be collected and shared, and with whom.