TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes: Google Analytics case in Austria, EU Parliament breach, French health database, the Irish DPC
The Austrian data protection authority, the DSB, ruled that the use of Google Analytics violates the GDPR. The case in evidence concerned a health-focused website, netdoktor.at, on which an IP address “anonymization” function had not been properly implemented. By embedding GA services, the website had been exporting visitors’ data to the US-based company in violation of Chapter V of the GDPR. While the regulator upheld the complaint against netdoktor, it did not find against Google’s US business for receiving/processing the data, deciding that the rules on data transfers apply only to EU entities and not to US recipients, TechCrunch reports.
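For context, IP anonymisation in Google Analytics (Universal Analytics, the version at issue) is not applied by default: the site operator has to enable it explicitly in the tag configuration. The snippet below is only an illustrative sketch of that setting using gtag.js; the property ID is a placeholder and the ambient declaration of gtag stands in for Google’s own loader script, which real pages include separately.

```typescript
// Illustrative sketch only: enabling IP anonymisation in Universal Analytics via gtag.js.
// Assumes Google's gtag.js loader has already been added to the page;
// 'UA-XXXXX-Y' is a placeholder property ID.
declare function gtag(...args: unknown[]): void;

gtag('config', 'UA-XXXXX-Y', {
  anonymize_ip: true, // without this flag, the visitor's full IP address is sent to Google
});
```

Note that in the Austrian case the anonymisation function had not been properly implemented in the first place, as described above.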
The complaint was filed by the NOYB privacy foundation on the basis of the “Schrems II” CJEU decision, which invalidated the Privacy Shield framework for EU-US data transfers. The Austrian DSB assessed various measures taken by Google to protect the data in the US, such as encryption at rest in its data centers, but did not find safeguards sufficient to effectively block US intelligence services from accessing the data.
Because the Austrian data exporter in the given case has since merged with a German company, the DSB will also refer the question of banning future data transfers to the relevant authority at the new headquarters. The Dutch data protection authority, the AP, has likewise warned that the use of Google Analytics may soon not be allowed. The AP is currently investigating two complaints about the use of Google Analytics in the Netherlands and, once that investigation is completed in early 2022, will be able to decide on the future of GA. In response to the Austrian decision, Google defended itself in a blog post, stating that:
- Organizations use Google Analytics because they choose to do so. They, not Google, control what data is collected and how it is used.
- Organizations retain ownership of the data they collect using GA, and Google only stores and processes this data per their instructions, to provide them with reports about how visitors use their sites and apps.
- Organizations can, separately, elect to share their Analytics data with Google for one of a few specific purposes, including technical support, benchmarking, and sales support.
- Organizations must take explicit action to allow Google to use their analytics data to improve or create new products and services. Such settings are entirely optional.
- Organizations are required to give visitors proper notice about the GA features they use, and about whether this data can be connected to other data the organization holds about them.
- Google offers browser add-ons that enable users to disable measurement by GA on any site they visit, among other measures.
Meanwhile, the European Parliament was also found to be in breach of EU rules on data transfers and cookie consent. The assembly had hired a company to provide mass Covid-19 testing via a dedicated website for members and officials. The page attracted a number of complaints, filed by several MEPs with the support of NOYB, over the presence of third-party trackers and confusing cookie consent banners, among a raft of other compliance issues. In particular, the test booking site was found to be dropping cookies associated with Google Analytics and the US digital payments company Stripe, yet the parliament failed to demonstrate it had applied any special measures to ensure that the associated personal data transfers would be adequately protected. The European Data Protection Supervisor, which oversees EU institutions’ compliance with data rules, gave the assembly one month to fix the privacy flaws.
EU Commissioner for Justice Reynders rejected the criticism that has been levelled at the Irish data protection regulator, the DPC. As the lead data protection authority for Big Tech companies with EU headquarters in Ireland, the DPC has been criticised for insufficient investigation and cooperation actions, and at the end of 2021 some Members of the EU Parliament asked for infringement proceedings to be initiated against the DPC. In his response Reynders stated that: a) it is too early to come to definitive conclusions about the efficiency and functioning of the GDPR cooperation mechanism; b) the Commission is taking appropriate actions to monitor the application of the GDPR in EU Member States; and c) there is no evidence that the DPC has failed to respect Irish data protection rules or that the cooperation mechanism has not been applied correctly.
The French government reportedly decided to withdraw the request for authorization for the Health Data Hub, HDH, to host the main national health database. Without the permission of the French regulator CNIL, the HDH cannot function as intended. The platform makes data available to authorized projects, and the most important criticism relates to its choice to host health data on Microsoft Azure. The CNIL had protested against entrusting the hosting of health data to a US-based company, and had expressed the wish that such hosting be reserved for entities coming under the exclusive jurisdiction of the EU. However, there is currently no designated “cloud of trust” for French public services, as the “Bleu” initiative, led by Orange and Capgemini, does not yet exist.
Official guidance: ex officio data erasure, reuse of data by subcontractors, debtor’s data
The EDPB published an opinion on whether Article 58(2)(g) of the GDPR could serve as a legal basis for a supervisory authority to order ex officio the erasure of unlawfully processed personal data, in a situation where no such request was submitted by the data subject. The Board noted that some of the cases set out in Art. 17 (‘Right to erasure’) of the GDPR clearly refer to scenarios that controllers must detect on their own as part of their compliance obligations. The EDPB thus concludes that Article 58(2)(g) GDPR is a valid legal basis for a supervisory authority to ensure the enforcement of the principles enshrined in the GDPR even in cases where the data subjects are not informed or aware of the processing, or where not all of the data subjects concerned have submitted a request for erasure.
The French regulator CNIL published new guidance for processors on the reuse of data entrusted to them by a data controller (in French). A processor handles personal data on behalf of the controller: in this context it only follows the controller’s instructions and cannot, in principle, use the data for its own account. Sometimes, however, a processor wishes to reuse the data, often with the aim of improving its services or products or designing new ones. Such reuse is only possible under several conditions:
- national or European law may require them to do so;
- the controller may authorize its processor to reuse the personal data for its own account; the processor then becomes the data controller for this new processing;
- the subsequent processing must be compatible with the purpose for which the data was initially collected – the “compatibility test” (applicable where the processing is not based on the data subject’s consent; e.g. a former processor may be allowed to reuse data to improve its cloud computing services, but must not use it for commercial prospecting);
- the existence of appropriate safeguards, which may include encryption or pseudonymisation (a minimal pseudonymisation sketch follows this list);
- the authorization of the initial controller must be established in writing, including in electronic format;
- the initial controller must inform data subjects;
- the former processor must ensure the compliance of the new processing (encryption, pseudonymisation, minimisation, retention periods, legal basis, data subject rights, etc.).
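To illustrate the kind of safeguard mentioned in the list above, the sketch below shows one common pseudonymisation pattern: replacing a direct identifier with a keyed hash so that records remain linkable while re-identification requires a secret key held separately from the data. This is a generic example assuming a Node.js environment; it is not taken from the CNIL guidance, and the key variable name is hypothetical.

```typescript
// Minimal pseudonymisation sketch (generic, not from the CNIL guidance):
// a keyed HMAC turns a direct identifier into a stable pseudonym. The secret
// key must be stored separately from the pseudonymised dataset, otherwise
// re-identification is trivial for whoever holds both.
import { createHmac } from "node:crypto";

function pseudonymise(identifier: string, secretKey: string): string {
  // Same identifier + same key => same pseudonym, so records stay linkable
  // without exposing the underlying value.
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

// Hypothetical usage: PSEUDONYMISATION_KEY is an assumed environment variable.
const key = process.env.PSEUDONYMISATION_KEY ?? "example-key-for-local-testing";
console.log(pseudonymise("jane.doe@example.com", key));
```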
The Lithuanian data regulator VDAI has issued a recommendation on the processing of debtors’ personal data. The personal data usually processed in the administration of debts includes name, surname, payer’s code, date of birth, address and other details. Debt recovery procedures carry financial consequences for individuals, and such processing of personal data is often very sensitive. The cases investigated by the VDAI show that there are sometimes misunderstandings between debtors, creditors and debt collection companies. A number of complaints are declared unfounded and terminated, for instance over the transfer of a debtor’s personal data to a processor for legal recovery, for which consent is not required. The VDAI also noted that the exercise of data subject rights does not imply a review of the debt, nor does it affect the debtor’s contractual obligations to the creditor (the VDAI does not have the power to decide on debt calculation methods, the existence or absence of a debt, etc.).
Data breaches, investigations and enforcement actions: DPO role, Europol data, IT security, credit default information, outsourced marketing
The Luxembourg data protection authority (CNPD) fined an unnamed company for multiple violations of the GDPR, including violations relating to the role of the Data Protection Officer. The company failed to provide evidence that the DPO was appropriately involved in all matters relating to the protection of personal data (Art. 38 and 39 of the GDPR), DataGuidance reports. Although the DPO reported to company management:
- there were two hierarchical layers between the DPO and the management, so direct access was not guaranteed;
- there was no proof that the quarterly reports on the DPO’s activities referred to in company statements had actually been issued;
- the company did not have a formalised control plan specific to data protection, which meant that the DPO could not fulfil their task of monitoring the controller’s compliance.
Read the full decision (available in French), which includes 11 control objectives for a valid DPO position.
The Finnish data protection ombudsman ordered Bisnode Finland, a provider of digital business information, credit and risk management services, to rectify its credit information register. The investigation concerned the processing of data on payment defaults, following an individual’s complaint that the company had refused to remove from its credit register default entries based on judgments in civil cases, DataGuidance reports. In particular, the regulator stated that data based on final judgments in civil cases should not have been included as a default entry in the credit information register, since only information that adequately reflects a person’s ability or willingness to pay may be used as credit information. The regulator found the company in breach of Art. 25 of the GDPR (‘Data protection by design and by default’), as well as the Credit Information Act.
A municipality in Norway was fined more than 500,000 euros over a lack of security measures after it was subjected to a serious attack in 2021. As a consequence of the attack, employees no longer had access to most of the municipality’s IT systems, the data had been encrypted and backups had been deleted. Approximately 30,000 documents were lost, containing some very sensitive information about the municipality’s residents and employees. The deficiencies related to logging and log analysis, the securing of backups, and the lack of two-factor authentication or similar security measures. The firewall was inadequately configured for logging, and much internal traffic was never logged. Servers were not configured to send logs to a central log collector and also lacked logging of important events. Furthermore, the municipality lacked protection of backup copies against intentional and unintentional deletion, manipulation and reading.
The Italian regulator Garante fined a telecommunications company (OMNIA24) 100,000 euros for multiple violations of the GDPR. The infringements concerned outsourced marketing activities, the methods used to collect consent and the source of the data, DataGuidance reports. It also turned out that OMNIA24’s inadequate response to individuals’ requests to access their personal data constituted a further violation of the GDPR. The investigation determined that the main cause was the failure to define the controller and processor roles between the business partners, which made it impossible to guarantee that data subjects could exercise their rights.
Europol was ordered to erase data concerning individuals with no established link to criminal activity. The EDPS admonished Europol in 2020 for the continued storage of large volumes of data with no Data Subject Categorisation (DSC), which poses a risk to individuals’ fundamental rights. While Europol has put some measures in place since then, it has not complied with the EDPS’ requests to define an appropriate data retention period for filtering and extracting the personal data permitted for analysis under the Europol Regulation. Europol said the decision impacts its ability to analyze complex and large datasets at the request of EU law enforcement. The current Europol Regulation does not contain an explicit provision on a maximum time period for determining the DSC; in its decision the EDPS sets this period at six months. However, Europol’s work frequently takes longer than six months, as do the police investigations it supports.
Individual rights: Covid data in police investigations
Police in Germany are being slammed for using COVID-19 tracking data to identify witnesses as part of an investigation, IAPP news reports. Police and local prosecutors in Mainz successfully appealed to the civic health authorities and used data from the Luca contact tracing health application. The police used app logs of individuals’ length of stay at a location, along with their names, addresses and phone numbers, to gather information about 21 people who may have been witnesses to a death at a local restaurant. The company that developed the Luca app, culture4life, condemned the abuse of Luca data collected to protect against infections, adding that it had received regular requests for its data from the authorities, which it routinely rejected.
Big Tech: Clearview AI for FBI, YouTube fake news, Facebook/Meta competition lawsuit
In the US, the FBI has signed a contract to subscribe to the controversial facial recognition technology developed by Clearview AI. The company has been criticised for its policy of trawling social media platforms for pictures of people and storing them without their knowledge. The report by CyberScoop identifies more than 20 other federal agencies that currently hold facial recognition technology contracts. Last year Clearview was found in breach of privacy rules in Canada, Australia and the UK, and last month the French regulator CNIL slapped the company with an order to delete French users’ data.
A global coalition of fact-checking organisations has fired a broadside at YouTube for being a “major conduit” of fake news. More than 80 groups signed an open letter saying YouTube allowed the “weaponization” of extremism and was not doing enough to filter out disinformation. The letter suggested four remedial steps: a commitment to funding independent research into disinformation campaigns on the platform; providing links to rebuttals inside videos distributing disinformation and misinformation; stopping its algorithms from promoting repeat offenders; and doing more to tackle falsehoods in non-English-language videos.
Facebook/Meta is facing the first class action lawsuit of its kind in the UK for breach of competition rules. The plaintiffs, a competition lawyer and a litigation fund, are seeking more than three billion dollars on behalf of the millions of UK Facebook users in compensation for paying an “unfair price”, i.e. surrendering unfettered use of their personal and private data, in exchange for Facebook’s market-dominant services. If you were domiciled in the UK between 1 October 2015 and 31 December 2019, you could be in for a windfall even if you used Facebook just once, unless you opt out of the lawsuit.