TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes and redress: compliant cookie banner, CEO liabilities, litigation data, virtual currencies
NOYB privacy foundation has launched a second wave of complaints against deceptive cookie banners, after the campaign first started last spring: “Another 270 draft complaints were sent to website operators whose banners don’t comply with the GDPR”, the statement on its website says. NOYB also offers guidelines for companies on how to comply, and only files formal GDPR complaints against those who remain non-compliant after a 60-day grace period. Overall, NOYB claims, the first wave of complaints was successful, with more and more websites implementing compliant cookie banners. NOYB also published screenshots of sites and their improved banners, including Nikon, Domino’s Pizza and Unilever, available for download. In the coming months, NOYB will continue to review, warn and enforce the law on up to 10,000 websites. It will extend its scope to pages that use consent management platforms (CMPs) other than OneTrust, such as TrustArc, Cookiebot, Usercentrics and Quantcast.
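The core of NOYB’s compliance criteria is that rejecting cookies must be as easy as accepting them: equally prominent options on the first layer, no pre-ticked boxes, and no non-essential cookies set before an explicit opt-in. As a purely illustrative sketch of that pattern (the element IDs and the storeConsent helper below are hypothetical, not taken from NOYB’s guidelines or any particular CMP), a first-layer banner might look like this:

```typescript
// Illustrative only: a first-layer cookie banner with equally prominent
// "Accept" and "Reject" choices, no pre-ticked boxes, and no non-essential
// cookies set before an explicit opt-in.

type ConsentDecision = "accepted" | "rejected";

function storeConsent(decision: ConsentDecision): void {
  // Persist only the decision itself; nothing else is set on "rejected".
  localStorage.setItem("cookie-consent", decision);
}

function renderBanner(onDecision: (d: ConsentDecision) => void): void {
  const banner = document.createElement("div");
  banner.id = "cookie-banner";

  const accept = document.createElement("button");
  accept.textContent = "Accept all";

  const reject = document.createElement("button");
  reject.textContent = "Reject all";

  // Equal prominence: identical styling for both options, so refusing
  // consent takes the same single click as granting it.
  for (const btn of [accept, reject]) {
    btn.className = "consent-button";
    banner.appendChild(btn);
  }

  accept.addEventListener("click", () => onDecision("accepted"));
  reject.addEventListener("click", () => onDecision("rejected"));
  document.body.appendChild(banner);
}

renderBanner((decision) => {
  storeConsent(decision);
  document.getElementById("cookie-banner")?.remove();
  if (decision === "accepted") {
    // Only now would non-essential scripts (analytics, ads) be loaded.
  }
});
```

The design point, under these assumptions, is that the tracking code path is never reached unless the user has actively opted in; many of the banners NOYB complains about invert this by loading trackers first and burying the reject option behind extra clicks.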
A German court recently ruled that a CEO was personally liable for a data privacy breach after they hired a detective to investigate possible criminal acts by the plaintiff, Technologyquotient reports. Under Art. 82 of the GDPR, anyone who suffers non-material damage as a result of a GDPR infringement has the right to receive compensation for the damage suffered. In the case at hand, the CEO, on behalf of the defendant company, commissioned a detective to investigate possible criminal acts committed by the plaintiff, who had submitted a membership inquiry to the company. The detective’s findings revealed that the plaintiff had been involved in criminal acts. When the company’s shareholders were informed of this, they rejected the membership application. The court ruled that:
- the CEO hiring a detective violated data protection law and the plaintiff was awarded 5,000 euros in non-material damages;
- the CEO was personally liable for the data protection violations and the damage claim, alongside the company;
- the CEO qualified as a data controller, which distinguishes them from an employee who is bound by instructions.
Since the European Court of Justice has tended to apply a very broad interpretation of the notion of a data controller, it seems likely that other courts will follow suit.
Italy’s Ministry of Economy and Finance has published its recent decree on the registration of virtual currency and digital wallet service providers operating on Italian soil, Data Guidance reports. They will have to register in a special section of the currency exchange register run by the Body for the Management of the Lists of Financial Agents and Credit Brokers (‘OAM’). Legal trading will not be possible without registration. Once the decree comes into force, the OAM has 90 days to set up the system, and companies already operating in Italy or online in the country will have a further 60 days to register. Before the OAM processes any personal data, its technical and organisational security measures will need endorsement by the national data protection authority, the ‘Garante’.
The US Department of Justice has reportedly knocked a Senate-passed cybersecurity bill as having “serious flaws,” criticizing its lack of direct reporting to the FBI. The bill, the Strengthening American Cybersecurity Act, passed the Senate unanimously on Tuesday night. It would require companies in critical sectors to alert the government to potential hacks or ransomware. The legislation would require cyber incidents to be reported to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency, though Justice Department officials argue that incidents should also be reported to the FBI.
Chinese data security laws increasingly create roadblocks for litigants seeking discovery in US courts, Technology Law Dispatch reports. Two Chinese information security laws, the Data Security Law (DSL) and the Personal Information Protection Law (PIPL), are creating difficulties for parties to US litigation seeking discovery materials stored in China. Both require data processors to obtain approval from the Chinese government before transferring any data stored in China to a foreign court or law enforcement authority, or otherwise face significant penalties such as fines in the millions of dollars. In particular:
- The DSL broadly applies to “data processing activities”, which include collection, use, processing, transmission, disclosure, and data management, where “data” means any record of information in electronic or other form.
- The DSL applies to extraterritorial data processing activities, as well as activities within China that would be detrimental to its national interests.
- Similarly, the PIPL applies to the processing of personal information about individuals in China.
Official guidance: CoCs as data transfer tools and for clinical trial data, direct marketing
The EDPB has adopted final Guidelines on codes of conduct (CoCs) as tools for personal data transfers. Its executive summary notes that the GDPR requires controllers and processors to put in place appropriate safeguards for transfers of personal data to third countries or international organisations. To that end, the GDPR diversifies the appropriate safeguards that organisations may use under Art. 46 for framing transfers to third countries, introducing, among others, CoCs as a new transfer mechanism (Art. 40(3) and Art. 46(2)(e)). Once approved by the competent supervisory authority and granted general validity within the Union by the European Commission, a CoC may be used by controllers or processors located in third countries and not subject to the GDPR for the purpose of providing appropriate safeguards for data transferred to third countries. The guidelines clarify the roles of the different actors involved in setting up a code intended as a tool for transfers, and illustrate the adoption process with flow charts.
Meanwhile, the Spanish data protection authority, AEPD, published (in Spanish) its first CoC on the processing of personal data in clinical trials, DLA Piper reports. The Code was drawn up in collaboration with an association that brings together the majority of pharmaceutical companies established in Spain. It is the first sectoral code of conduct approved in Spain since the GDPR came into force, as well as the first code approved in the EU in this field. Thus, while its territorial scope is limited to Spain, it could become a benchmark at the EU level. The Code regulates how clinical trial sponsors and contract research organisations that decide to adhere to it should implement the GDPR within the scope of clinical trials, as well as when fulfilling the obligations imposed by pharmacovigilance regulations for the detection and prevention of adverse effects of medicines already on the market. It covers:
- establishment of protocols facilitating the application of the GDPR;
- details on the codification of the data;
- the responsibility of each participant in the clinical trial;
- the establishment of protocols for the collection of information on possible adverse reactions, depending on who makes the notification;
- the establishment of a voluntary, free-of-charge mediation procedure, which allows for an agile response to possible claims brought by data subjects against member entities. The CoC is available in Spanish on the AEPD website.
The German Data Protection Conference (‘DSK’) published revised guidance (in German) on the processing of personal data for direct marketing purposes, DataGuidance reports. The guidance elaborates on information obligations and the conditions for valid consent, namely:
- informed consent requires that the type of intended advertising (eg, letter, email, SMS, telephone, or fax), the products or services to be advertised, and the advertising companies are all mentioned;
- as a rule, a separate text or text section without any other content should be used;
- if the declaration of consent under data protection law is to be given together with other declarations, in particular contractual declarations, in writing or in electronic form, it must be presented in a manner that is clearly distinguishable from those other matters (Art. 7(2) of the GDPR);
- apart from explicit consent under Art. 9, the GDPR contains no standard legal basis for the processing of special categories of personal data for advertising purposes (it must be examined in each individual case whether conclusions about a person’s health can be drawn from the fact that they are a customer of a certain company in the health sector), etc. You can read the guidance here.
Enforcement actions: former employees’ email accounts, technical and organisational measures, verification of the processor
The Slovakian data protection authority has ruled on two cases where employers failed to deactivate former employees’ email accounts, an Iuslaboris blog post reports. In both cases, one in the private and one in the public sector, the employers were found to be in breach of data privacy rules. In the first case:
- A former manager objected that the employer had not deactivated his email account after the termination of his employment, and that it was still active and monitored by another manager within the company. In its defense, the employer invoked legitimate interest (protection of the employer’s property, business contacts, client responses).
- The regulator stated that legitimate interest can be a suitable legal basis for this kind of processing; however, the processing may only be carried out for as long as necessary, and ten months cannot be considered necessary.
In the second case, after the termination of her employment, a former employee of a municipality created a fake email account. She then used this fake account to send a question to her former work email address at the municipality, in order to find out whether or not the account had been deactivated. Once she received an answer, and thus had proof of a possible breach of the GDPR, she filed a complaint with the regulator:
- The municipality claimed that the former employee had failed to hand over her agenda properly (communication with various state authorities, social security agencies, health insurance companies, rental apartment agendas).
- The municipality was therefore obliged to monitor this email account to prevent itself from being held liable for potential damages or unlawful conduct.
- The regulator, however, found no proof of a demonstrable legal basis for these processing activities.
The Polish data protection authority, UODO, imposed a record-breaking penalty (approx. 1 million euros) on Fortum Marketing and Sales Polska for failure to implement appropriate technical and organisational measures ensuring the security of personal data, and for failure to verify the processor, which was also fined approx. 50,000 euros. After analyzing the company’s notification of a personal data breach, the supervisory body initiated ex officio administrative proceedings. Here are some facts from the case:
- The data breach consisted of the copying of the controller’s client data by unauthorized persons.
- It happened while changes were being introduced in the ICT environment.
- The changes were made by a processor with which the controller cooperated on the basis of concluded contracts, including a contract for entrusting the processing of personal data.
- During the changes, an additional customer database was created.
- This database was then copied by unauthorized persons, because the server on which it was deployed did not have properly configured security.
- The controller learned about the incident not from the processor, but from two independent internet users.
Moreover, the security functions were not tested in the course of the work carried out. The processor acted inconsistently with commonly known ISO standards, and at the same time against the provisions of its own security policy. The processor also failed to comply with the contract for entrusting the processing of personal data, in which it had undertaken, inter alia, to implement pseudonymisation of the data, a mechanism that was to be treated as guaranteeing an appropriate level of data security (a minimal sketch of the technique follows).
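Pseudonymisation in the GDPR sense (Art. 4(5)) means replacing direct identifiers so that records can no longer be attributed to a person without additional information that is kept separately. As a minimal, hypothetical sketch of that idea (not the processor’s actual setup; all names here are illustrative), customer identifiers could be replaced with keyed hashes, with the key stored apart from the database:

```typescript
import { createHmac, randomBytes } from "node:crypto";

// Minimal sketch of pseudonymisation in the Art. 4(5) GDPR sense:
// direct identifiers are replaced with keyed hashes, and the key
// (the "additional information") is kept separately from the dataset,
// e.g. in a secrets manager rather than on the database server.

interface CustomerRecord {
  email: string;
  contractId: string;
}

interface PseudonymisedRecord {
  emailPseudonym: string;
  contractId: string;
}

// In practice the key would come from a separate secret store;
// it is generated here only to keep the example self-contained.
const pseudonymisationKey = randomBytes(32);

function pseudonymise(record: CustomerRecord): PseudonymisedRecord {
  const emailPseudonym = createHmac("sha256", pseudonymisationKey)
    .update(record.email.toLowerCase())
    .digest("hex");
  return { emailPseudonym, contractId: record.contractId };
}

// A copied database of such records exposes no direct identifiers
// unless the attacker also obtains the separately stored key.
console.log(pseudonymise({ email: "jan.kowalski@example.com", contractId: "C-1024" }));
```

Had a measure of this kind actually been in place, the copied database would have contained pseudonyms rather than directly identifying customer data, which is precisely why UODO treated the omission as aggravating.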
Individual rights: health apps data
Privacy International published a ‘long-read’ on how health apps could exploit users’ data: “Digital health apps of all kinds are being used by people to better understand their bodies, their fertility, and to access health information. But there are concerns that the information people both knowingly and unknowingly provide to the app, which can be very personal health information, can be exploited in unexpected ways”. Key findings of the report are:
- Apps that support women through pregnancy are one example where data privacy concerns are brought sharply into the spotlight.
- Reproductive health information is highly sensitive, and the implications of services that do not respect that fact can be serious.
- Apps that take on the responsibility of collecting that data need to take it seriously, but as PI has repeatedly found, many don’t (concerns include the involvement of the DPO, the availability of privacy policies, difficulties with anonymising health data, and more).
Big Tech: algorithmic discrimination ban, identity proofing systems
Starting in March, China outlaws algorithmic discrimination, Wired reports. Under the new rules, companies will be prohibited from using personal information to offer users different prices for a product or service. The regulations, known as the Internet Information Service Algorithmic Recommendation Management Provisions, were drafted by the Cyberspace Administration of China, a powerful body that enforces cybersecurity, internet censorship, and e-commerce rules. Among other things, they prohibit fake accounts, manipulating traffic numbers, and promoting addictive content. They also provide protections for delivery workers, ride-hail drivers, and other gig workers. Companies that violate the rules could face fines, be barred from enrolling new users, have their business licenses pulled, or see their websites or apps shut down. However, some elements of the new regulations may prove difficult or impossible to enforce (eg, it can be technically challenging to police the behavior of an algorithm that is continually changing due to new inputs).
America’s Internal Revenue Service (IRS) says taxpayers will no longer have to provide facial scans to the private identity proofing system ID.me to create an online account at irs.gov, KrebsOnSecurity reports. All biometric data already held by ID.me will be destroyed, and any data collected to create new accounts in the future will be destroyed once the account is operational. ID.me will now offer the option of a live video interview, while the IRS is also rolling out Login.gov, already used by 28 other government agencies. Critics say this federal system provides excellent digital identity security and should be a core government service, but is underfunded and under-resourced.