Weekly digest 18 – 24 July 2022: personal data breaches, web hosting, targeted ads, smart video devices, geolocation & privacy

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes: personal data breaches, EU Commission’s data transfers, non-implementation of the GDPR by a country, US-UK data access, targeted ads

In Poland, an administrative court upheld the decision of the personal data protection office UODO on the fine imposed on Bank Millennium. A personal data breach occurred when courier services lost bank correspondence containing client names, surnames, registration addresses, bank account numbers, etc. The UODO learned about the incident from a complaint against the bank. The controller decided there was a medium risk of negative consequences for the persons affected by the breach, and therefore did not report it to the supervisory authority and did not fully comply with the obligation to notify the data subjects. 

In its decision the court clarified that a personal data breach occurs not only when personal data has been read by an unauthorised person, but also when the data controller cannot exclude such a situation due to a lack of information in this regard. According to the court, the supervisory authority also correctly recognised that the bank is the controller of the personal data concerned by the breach: it was the bank, and not the postal operator, that defined the purposes and means of data processing. Postal operators and courier service providers are controllers in their own right, but only for the data needed for correct delivery.

The European Commission urged Slovenia to fulfil its obligations under the GDPR, as well as make it possible for its data protection authority to use all the corrective powers under the legislation. The Commission considers that Slovenia has failed to fulfil its obligations stemming from the GDPR due to its persistent failure to reform its pre-GDPR national data protection framework. Slovenia now has two months to reply to the Commission’s reasoned opinion. If the reply is not satisfactory, the Commission may decide to bring this matter before the Court of Justice of the European Union. 

Conversely, according to the euractiv.com news website, the Commission may face a lawsuit for violating its own data protection rules by transferring EU users’ personal data from one of its websites to the US. Reportedly, the action was initiated by a German citizen with regard to the Conference on the Future of Europe’s website, meant to engage EU citizens in deciding the future of the bloc and its member states. Amazon Web Services hosts the website, so when registering for the event, personal data such as the IP address is transferred to the US. Moreover, the Commission’s website also allows users to log in via their Facebook account; Facebook is likewise US-based and faces an investigation by the Irish data regulator over similar allegations. In parallel, a complaint was filed before the European Data Protection Supervisor, which has jurisdiction over the application of the data protection rules by EU institutions. However, the EDPS has put its investigation on hold because a lawsuit is pending and the decision might take up to 18 months. 

The US-UK Data Access Agreement will come into effect in October, according to a joint statement shared by the US Department of Justice. It will be the first agreement of its kind, allowing each country’s investigators better access to vital data to combat serious crime. Specifically, it will allow information and evidence held by service providers and big tech companies relating to the prevention, detection, investigation or prosecution of serious crime to be accessed more quickly than ever before. This will, for example, help law enforcement agencies gain more effective access to the evidence they need to bring offenders to justice, including terrorists and child abuse offenders, thereby preventing further victims.

According to Privacy International, the UK Department for Culture, Media and Sport (DCMS) recently ran a consultation to review the regulatory framework for paid-for online advertising. The aim, according to DCMS, is “to tackle the evident lack of transparency and accountability across the whole supply chain.” While PI agrees with the rationale for intervention, as a starting point it would like to see existing regulation, (such as the UK GDPR), properly and regularly enforced. PI would rather resources were focused on enforcing existing data protection standards, with more investigations opened into intermediaries and platforms such as data brokers, data suppliers, data management platforms, measurement and verification providers, third-party software development kits, etc. The risks to privacy do not stem from ad targeting alone, or from the content of adverts. There are many steps in the process before adverts are served in a targeted manner:

  • Data collection, (hidden means such as trackers placed on the websites you visit)
  • Profiling, (dividing users into small groups or “segments” based on previous online behaviour)
  • Personalisation, (designing personalised content for each segment), and
  • Targeting, (delivering tailor-made, targeted messages)

Through each of these stages, users still have very little understanding of where the data came from, who uses the resulting profiles and for what purposes, or the level of detail of the profiling practices. PI concludes it is impossible to address the problem without tackling the whole supply chain, (eg, real-time bidding technology), and creating accountability at each stage.

Official guidance: smart video devices, geographical indications for EU producers

The French privacy regulator CNIL has published its position on the conditions for the deployment of smart video devices in places open to the public, (excluding offices, warehouses, and domestic use). For several years, says CNIL, new types of cameras equipped with artificial intelligence software have been evolving. The CNIL’s position concerns “augmented” video devices that differ from biometric recognition devices such as facial recognition devices. Two criteria make it possible to distinguish these devices:

  • the nature of the data processed: physical, physiological or behavioural characteristics;
  • the purpose of the device: to uniquely identify or authenticate a person.

A biometric recognition device will always combine these two criteria, while an “augmented” camera will meet neither, (eg, an “augmented” camera that films the street to classify the different uses: cars, bicycles, etc.), or only one of the two, (eg, an “augmented” camera that detects fights in a crowd). This distinction has legal consequences: biometric recognition devices involve the processing of so-called “sensitive” data, which is, in principle, prohibited by the GDPR, with some exceptions. 

The CNIL considers that any actor who wishes to deploy an “augmented” video device will have to rely on a legal basis determined on a case-by-case basis. While none is excluded or privileged in principle, the legal basis of “legitimate interest” must not lead to a manifest imbalance between the interests pursued by the user of an “augmented” video device and the reasonable expectations of individuals, (eg, a store that analyses the mood of customers in order to display appropriate advertisements to them). More generally, it is necessary from the outset to demonstrate the proportionality of the envisaged device, (that is to say, the conditions for implementing it in relation to the objectives pursued). Even the police are not authorised by law to connect automatic analysis devices to video protection cameras to detect conduct contrary to public order or offences, says the CNIL. 

As such, effective data protection and privacy-by-design mechanisms must be implemented to help reduce the risks to data subjects. Strong safeguards include, for example, measures allowing the almost immediate deletion of source images or the production of only anonymous information. Finally, the CNIL states that people generally cannot object to the analysis of their images when, for example, the algorithms do not keep the images, or when the conditions for exercising this right are not practicable, (signalling one’s objection would require pressing a button, making a particular gesture in front of a camera, etc). You can read the full opinion by the CNIL, (in French), here. 
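The privacy-by-design safeguard described above can be illustrated with a minimal sketch (all names here are hypothetical and not taken from the CNIL opinion): the device derives only anonymous aggregate counts from each frame and retains no source image.

```python
from collections import Counter

def analyse_frame(frame, detect):
    """Derive anonymous aggregates from one frame, retaining no image data.

    `detect` stands in for a computer-vision model that returns object labels.
    """
    counts = Counter(detect(frame))  # eg {'car': 3, 'bicycle': 1}
    del frame                        # drop the reference: no source image is kept
    return dict(counts)

# Stub detector in place of a real model, for illustration only.
stub_detect = lambda frame: ["car", "car", "bicycle"]
print(analyse_frame(b"<raw image bytes>", stub_detect))  # {'car': 2, 'bicycle': 1}
```

The key design point is that only the aggregate leaves the function; the image itself is never stored or transmitted.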

The EDPS meanwhile published an opinion on protecting the personal data of EU foodstuff producers. While supporting the proposal for a regulation on geographical indications for wine, spirits, agricultural products, and quality schemes for agricultural products, the EDPS recommends that a number of measures related to the processing of personal data are clarified and added:

  • explicitly indicating the role of the European Union Intellectual Property Office as joint controller together with the European Commission;
  • identifying in the proposal itself the different categories of personal data to be included in the supporting documentation accompanying the applications for registration, oppositions and official comments, extracts from the Union register and the single document;
  • indicating in which circumstances and/or conditions it is necessary to make which categories of personal data publicly available and clearly define for which objectives;
  • assessing whether it would be appropriate to put in place a procedure whereby only individuals who demonstrate a legitimate interest have access to additional categories of personal data, such as contact details;
  • the chosen data retention period for the documentation related to the cancellation of geographical indications should be further justified or reduced.

Enforcement actions: passwords in clear text, wrongful emails, membership and consent, web hosting, vehicle geolocation, healthcare data, Google Workspace

The Danish data protection authority Datatilsynet has expressed serious criticism of Salling Group for having stored a number of customers’ passwords in clear text in a log file from one of the grocery group’s websites. The error persisted for more than a year. Salling Group uses a common login – the Salling Group profile – so that the same username and password can be used on all the services to which the profile provides access. In 2021, Salling Group implemented a monitoring tool to register incidents and events. Due to a human error, customers’ passwords were not encrypted before being stored in the tool’s log file when customers logged in to the website. 


As a result, up to 146 internal users in the Salling Group had technical access to read both usernames and passwords for a number of customers who had logged in on the website. Had this access been used, it would have been possible to gain access to the name, address, email address, telephone number, masked payment card information and purchase history of a number of Salling Group’s customers. The regulator also ordered the company to notify the customers whose passwords had been stored unencrypted in the monitoring tool’s log. 
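A common safeguard against this class of error is to redact credential fields before a monitoring tool writes an event out. A minimal sketch (the field names and helper are hypothetical, not Salling Group’s actual system):

```python
# Keys whose values must never reach a log file (illustrative list).
SENSITIVE_KEYS = {"password", "passwd", "secret", "token"}

def redact(event: dict) -> dict:
    """Return a copy of a log event with credential values masked."""
    return {key: "***" if key.lower() in SENSITIVE_KEYS else value
            for key, value in event.items()}

# A login event as a monitoring tool might capture it (synthetic data).
event = {"user": "alice@example.com", "password": "s3cret!", "action": "login"}
print(redact(event))  # {'user': 'alice@example.com', 'password': '***', 'action': 'login'}
```

Applying such a filter at the logging boundary means a single human error upstream cannot put clear-text passwords within reach of internal users.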

In a separate ruling Datatilsynet also assessed the benefits of membership, (of Magasin’s customer club Goodie), given in return for consent to marketing. Consumers are not prevented from buying certain products or services simply because consent is not given – they will simply pay regular prices, with only the general discounts that apply at Magasin. In other words, it is voluntary whether a customer gives marketing consent in exchange for benefits or buys products and services on normal market terms. Members can revoke their consent to marketing at any time, with the consequence that membership of the customer club ends. There are no costs associated with revoking consent, and on registration for the customer club it is clearly stated that revoking consent results in the termination of membership. On this basis, the Danish regulator found that Magasin’s processing of personal data had taken place in accordance with data protection regulations. The full decision, (in Danish), is available here.

The Spanish privacy regulator AEPD fined DKV Seguros y Reaseguros, (health insurance for individuals), 220,000 euros for confidentiality and security violations, (Art 5, 32, 33 GDPR), Data Guidance reports. According to the individual plaintiff, between 2020 and 2021 they received dozens of emails from the company containing medical clearances of unknown individuals, including those individuals’ names, surnames, and test data. Further, the AEPD specified that the plaintiff had repeatedly brought the situation to the attention of DKV Seguros y Reaseguros, but the company did not act until receiving notice from the regulator. The investigation found that:

  • the company’s technical and organisational security measures were inadequate, taking into consideration that the data in question was of a sensitive nature; 
  • the company had failed to notify the AEPD that it had suffered a personal data security breach since it had become aware of it back in 2020. 

However, the AEPD noted that due to an admission of guilt and a voluntary payment on the part of the defendant, the fine was reduced by 20%.

Meanwhile the Berlin data protection commissioner is examining data processing contracts between web hosting providers and their customers. Many organisations operate their websites or online shops via a third-party service provider. As a rule, the related data processing takes place on behalf of the responsible party, the site operator. This means that the web hoster is technically a processor, and a specific data processing agreement needs to be signed. In order to support responsible parties and protect them from future sanctions and enforcement actions, the Berlin data protection commissioner is examining the agreements of selected large web hosters in the area. Many organisations in Berlin have complained about standard form contracts offered by web hosting companies that are not willing to change them. The regulator therefore encourages all IT service providers to check their standard contracts independently and adapt them to the law.

The HIPAA Journal has published the latest statistics on healthcare data breaches in the US. Reportedly, there were 31 reported breaches of 10,000 or more healthcare records in June – the same number as in May 2022 – two of which, (at the Texas Tech University Health Sciences Center and Baptist Medical Center), affected more than 1.2 million individuals. Healthcare providers were the worst affected HIPAA-covered entities, along with business associates. Several healthcare providers submitted breach reports in June 2022 due to a ransomware attack on a HIPAA business associate, Eye Care Leaders. At least 37 healthcare providers are now known to have been affected by that attack, and more than 3 million records are known to have been exposed. 

The French CNIL has imposed a penalty of 175,000 euros on the company UBEEQO International, (short-term vehicle rentals), for having disproportionately infringed the privacy of its customers by geolocating them almost permanently. The checks covered in particular the data collected, the retention periods defined, the information provided to individuals and the security measures implemented. The CNIL found in particular that, during the rental of a vehicle by an individual, the company collected geolocation data for the rented vehicle every 500 meters while the vehicle was in motion, when the engine was turned on and off, and when the doors opened and closed. In addition, the company kept a history of some of the collected geolocation data for an excessive period of time. The company argued that vehicle geolocation data was collected for different reasons:

  • ensure the maintenance and performance of the service, (eg, check that the vehicle is in the right place, monitor the state of the fleet);
  • find the vehicle in case of theft;
  • assist customers in the event of an accident.

The CNIL considers that none of these purposes justifies collecting geolocation data as fine-grained as that gathered by the company. Such a practice is very intrusive into users’ privacy insofar as it is likely to reveal their movements, the places they frequent, and all the stops made along a route.
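As an illustration of the kind of data minimisation the CNIL expects (a hypothetical sketch, not UBEEQO’s system): once a trip ends, a service could retain only coarsened start and end coordinates rather than the full fine-grained trace.

```python
def minimise_trace(points, decimals=2):
    """Keep only start/end coordinates, rounded to ~1 km precision (2 decimals)."""
    coarse = [(round(lat, decimals), round(lon, decimals)) for lat, lon in points]
    return [coarse[0], coarse[-1]] if coarse else []

# Fine-grained trace sampled during a rental (synthetic data).
trace = [(48.85661, 2.35222), (48.85812, 2.34703), (48.86011, 2.33901)]
print(minimise_trace(trace))  # [(48.86, 2.35), (48.86, 2.34)]
```

Retaining only coarse endpoints still supports fleet purposes such as checking that a vehicle was returned to roughly the right place, without reconstructing the customer’s full route.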

Finally, the Danish data protection agency has made a final decision in the case concerning the use of Google Chromebooks in Elsinore municipality, the EDPB reports. Last year the municipality of Elsinore was ordered to carry out a risk assessment of its processing of personal data in primary schools using Google Chromebooks and Workspace. Based on the documentation and risk assessment the municipality has prepared, the regulator has now found that the processing does not meet the requirements of the GDPR on several points. The municipality, as controller, has not assessed certain specific risks in the data processor setup with regard to the processing activities it is allowed to carry out as a public authority. In addition, the data processing agreement states that information can be transferred to third countries for technical support without the required level of security and protection. The regulator has now made a new decision. It contains, among other things:

  • A suspension of the municipality of Elsinore’s data processing where information is transferred to third countries without the necessary level of protection.
  • A general ban on processing of personal data with Google Workspace until adequate documentation and impact assessment has been carried out and until the processing operations have been brought into line with the GDPR.  

Many of the specific conclusions in this decision will probably apply to other Danish municipalities that use the same data processor setup as Elsinore. 

Data security: private correspondence for a government

The UK Information Commissioner called for a government review into the systemic risks and areas for improvement around the use of private correspondence channels – including private email, WhatsApp and other similar messaging apps. The investigation found that the lack of clear controls and the rapid increase in the use of messaging apps and technologies had the potential to lead to important information around the government’s response to the pandemic being lost or insecurely handled. 

An example of this included some protectively marked information being located in non-corporate or private accounts outside of the Department of Health and Social Care’s official systems. This information, stored on outside servers, points to a failure to properly consider the storage and retention of information and the risks this could bring. Although the use of private channels brought some real operational benefits at a time when the UK was facing exceptional pressures during the COVID-19 pandemic, it is of concern that such practices continued without any review of their appropriateness or the risks they present.

Big Tech: Microsoft cloud for governments, DiDi Global privacy fine, UBER massive data breach

Microsoft is beefing up its cloud offering, in partnership with Italy’s Leonardo and Belgium’s Proximus, by launching a public cloud to serve government customers. Dubbed the “Cloud for Sovereignty”, it will, Microsoft says, offer greater control over data, be cheaper, and be closer to developing technology. Rivals Amazon and Google are doing good cloud business in the US and elsewhere, but the EU’s privacy watchdog is currently checking whether private cloud operators are doing enough to ensure the safety of public data.

Chinese ride-hailing service DiDi Global has been hit with a billion-dollar fine by the national cybersecurity regulator for going public on the NYSE before a Chinese probe into the company’s data practices had been completed. The probe found user data had been illegally collected for years, and that DiDi had endangered national cybersecurity with their data processing methods. The inquiry forced the New York delisting of the company, which says it will review and change its practices.

Uber has admitted to failing to report a massive 2016 data breach and covering it up from regulators for a year, as part of a non-prosecution agreement in an ongoing federal criminal case in California. Data from over fifty million users was stolen, but the company points to a complete overhaul of data protection and privacy practices and a change of top management since then. The company also fully cooperated with prosecutors. Uber has already paid out nearly 150 million dollars across all 50 US states in civil litigation related to the breach, Reuters reports.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
