Weekly digest April 4 – 10, 2022: EU data governance, digital products security, US law enforcement outreach & privacy

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress: EU data governance, traffic and location data, consumer rights, hospitals

The EU Data Governance Act, approved by the Parliament on April 6, promises to boost data sharing in the EU so that companies and start-ups will have access to more data they can use to develop new products and services. The new draft rules also aim to build trust in data sharing, making it safer and easier as well as ensuring it is in line with data protection legislation. This will be achieved through a range of tools, from technical solutions such as anonymisation and pooling of data to legally binding agreements by re-users. The rules will enable:

  • data collected in some public sector areas to be better used;
  • the creation of common European data spaces for important areas: health, environment, energy, agriculture, mobility, finance, manufacturing, public administration, and skills;
  • new intermediaries to be recognized as trustworthy data organizers under new rules for data marketplaces – usually online platforms where users can buy or sell data;
  • new rules for companies, individuals, and public organizations that wish to share data for the benefit of society (data altruism).

The Data Governance Act must be formally adopted by the EU countries in the Council before it becomes law. To further encourage data sharing, the Commission also proposed a Data Act in February, which the Parliament is now working on.

The European Court of Justice has confirmed that EU law precludes the general and indiscriminate retention of traffic and location data relating to electronic communications for the purposes of combating serious crime. In the underlying long-running case in Ireland, a man sentenced to life imprisonment for murder appealed, arguing that the court of first instance had wrongly admitted traffic and location data of telephone calls as evidence. “The privacy and electronic communications directive does not merely create a framework for access to such data through safeguards to prevent abuse, but enshrines, in particular, the principle of the prohibition of the storage of traffic and location data”, the highest EU court stated. However, it held that EU law does not preclude legislative measures which, for the purposes of combating serious crime and preventing serious threats to public security, provide for: 

  • targeted retention of traffic and location data, limited according to the categories of persons concerned or on the basis of a geographical criterion; 
  • general and indiscriminate retention of IP addresses assigned to the source of an internet connection; 
  • general and indiscriminate retention of data relating to the civil identity of users of electronic communications systems; and 
  • the expedited retention (“quick freeze”) of traffic and location data in the possession of those service providers. Read the full decision by the ECJ here.

The Irish government has approved a draft bill – the General Scheme of Representative Actions for the Protection of the Collective Interests of Consumers. The aim is to permit qualified and designated entities to represent consumers in a representative action (a civil claim) where a trader has infringed consumer rights under one or more of the legislative provisions listed, including the major data protection legislation at EU and national levels – the GDPR, the ePrivacy Directive, and the Irish Data Protection Act 2018. You can examine the full draft bill here.

Utah has followed California, Virginia, and Colorado in adopting a comprehensive consumer data privacy law, JD Supra News reports. Utah’s Governor signed the Utah Consumer Privacy Act, which will take effect on December 31, 2023. Covered consumers are Utah residents acting in an individual or household context, not in an employment or commercial context. Under the Act, data controllers (broadly, entities that conduct business in Utah or target Utah consumers at scale) are obliged to, among other things: 

  • disclose in a privacy notice various processing activities;
  • provide consumers with clear notice and an opportunity to opt out of the processing of sensitive data, including biometric and geolocation data;
  • provide consumers with a right to opt out of targeted advertising or the sale of personal data;
  • comply with requests from consumers to exercise their other rights to access, obtain a copy of, or delete personal data, and confirm whether a controller processes personal data; and
  • maintain reasonable administrative, technical, and physical data security practices. 

However, the law does not create a private right of action and grants exclusive enforcement authority to the Attorney General. 

The Czech Supreme Administrative Court upheld a fine imposed by the national data protection authority on a hospital for insufficient security in the processing of personal data (Art. 32 of the GDPR). In this landmark decision, the court stated that the hospital in question is a joint-stock company, not a public entity, even though it is financed mainly from public health insurance funds and provides its healthcare services in the public interest.

Thus, it cannot enjoy the exemption deriving from Art. 83(7) of the GDPR: “each Member State may lay down the rules on whether and to what extent administrative fines may be imposed on public authorities and bodies established in that Member State”. In particular, the court rejected the application of the national data protection rules that do not allow the imposition of a sanction on a public entity. The full text of the judgment (in Czech) can be found here.

Official guidance: data processing agreements, digital products security, AI knowledge base

The Danish data protection authority Datatilsynet has answered several questions regarding data transfer provisions in processing agreements, Data Guidance reports. In the case at hand, a company (KOMBIT) supplies IT systems to Danish municipalities and uses a subcontractor/processor (Netcompany), which in turn uses Amazon Web Services (AWS). According to KOMBIT, the information is generally processed within the EU/EEA, but the data processing agreement between Netcompany and AWS allows deviations from this if necessary to comply with legislation or a binding decision of a public authority in a third country. The questions were:

  • whether there is an intentional or unintentional transfer to third countries;
  • whether the municipalities must comply with the requirements for transfers to third countries; and
  • whether this gives rise to a question of adequate security of processing.

In the eyes of the Danish regulator, this constitutes an intentional third-country transfer. Municipalities must therefore ensure that the rules on transfers to third countries are complied with if and when AWS makes such transfers in accordance with the instructions set out in the data processing agreement.

The EU Commission is holding an open public consultation on the establishment of new horizontal rules for digital products and associated services placed on the internal market, in view of the planned European Cyber Resilience Act (CRA), Bird&Bird Insights reports. The consultation and call for evidence will be open for stakeholders’ feedback until May 25. The future CRA aims to create:

  • baseline cybersecurity requirements, covering the whole product life cycle, for manufacturers and vendors of a wide range of digital products (wireless and wired, embedded and non-embedded software) and of ancillary services whose absence would prevent the tangible product from performing its functions;
  • obligations on economic operators; and 
  • provisions on conformity assessment, the notification of conformity assessment bodies, and market surveillance.

The CRA would complement the existing cybersecurity framework, including the NIS Directive and the EU Cybersecurity Act. The consultation questionnaire and its outcome can be found here.

The French regulator CNIL has presented a knowledge base (in French) dedicated to artificial intelligence. Through various tools and publications, the CNIL explains the data protection challenges involved and the way in which it acts to support the deployment of solutions that respect the rights of individuals. The project includes:

  • a short glossary of AI;
  • accessible resources for everyone (books, films, factsheets, articles);
  • guidance for data protection specialists on the application of the GDPR to AI systems (impact assessment questionnaires, rules on assigning responsibilities, documentation requirements, etc.).

Investigations and enforcement actions: unsecured visa applications, failed data deletion, unauthorised disclosure, accidental alterations of customer data

The Dutch data protection authority (AP) has fined the Ministry of Foreign Affairs 565,000 euros for potentially breaching the privacy of people making visa applications over a number of years, DutchNews.nl reports. The AP identified the ministry as a data controller and found that its visa information system is not secure enough, leaving a risk of unauthorised access to and changes in files. Sensitive information, such as fingerprints, name, address, the purpose of the trip, nationality, and photo, could have been accessed because of inadequate physical and digital security. People applying for visas were also not given proper information about the way their data is shared with third parties. In addition, the AP imposed orders subject to periodic penalty payments: 50,000 euros every two weeks until the security shortcomings are remedied, and 10,000 euros per week regarding the information obligation.

The Irish supervisory authority fined Bank of Ireland Group 463,000 euros for violating Art. 32-34 of the GDPR. This inquiry was opened after 22 personal data breach notifications in 2018-2019. The notifications related to the corruption of information in the Group’s data feed to the Central Credit Register, a centralised system that collects and securely stores information about loans. The incidents included unauthorised disclosures and accidental alterations of customer personal data. The decision considered as a preliminary issue whether the incidents met the definition of a “personal data breach” under the GDPR, and found that 19 of the incidents reported did meet the definition. Additionally:

  • the group failed to issue communications to data subjects without undue delay in circumstances where the personal data breaches were likely to result in a high risk to data subjects’ rights and freedoms; and
  • the group failed to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk presented by its processing of customer data in the centralised register. 

Meanwhile, the Danish data protection agency Datatilsynet has assessed that Danske Bank was unable to document that it had deleted personal information in accordance with the data protection rules, and has therefore proposed a fine of approximately 1.3 million euros. The regulator opened the case in 2020 after the bank itself reported that it had identified a problem with the deletion of unneeded personal data. It emerged that in more than 400 systems there were no rules laid down for the deletion and storage of personal data, and that no manual deletion of personal data had been carried out. These systems process the personal data of millions of people. At the same time, the regulator emphasized Danske Bank’s active participation in the disclosure of the case and its continuous efforts to align its practices with legal requirements and minimize the risks for data subjects.

Data security: UK cybersecurity survey, US law enforcement outreach

The UK Department for Digital, Culture, Media & Sport published the latest cyber security breaches survey. It is an annual survey detailing the cost and impact of cyber breaches and attacks on businesses, charities, and educational institutions. Here are some key findings:

  • Cyberattacks are becoming more frequent, with organizations (businesses and charities) reporting more breaches over the last 12 months.
  • Almost one in three businesses and a quarter of charities suffering attacks said they now experience breaches or attacks at least once a week.
  • Data shows two in five businesses use a managed IT provider but only 13 percent review the security risks posed by their immediate suppliers.

Four out of five senior managers in UK businesses now see cyber security as a ‘very high’ or ‘fairly high’ priority, a significant rise since 2021. Read the full survey here.

A Guardian article reveals that very little data is secret from US law enforcement, which has multiple ways to obtain personal data, either openly or covertly. It was reported last week that hackers obtained the information of some Apple and Meta users by forging an emergency legal request (explained in the previous digest), one of several mechanisms by which law enforcement agencies can demand that tech companies hand over data such as location and subscriber information. US law enforcement requests can include gag orders, meaning the company cannot notify users for six months or more that their information has been requested. Several types of legal requests and other legal avenues have recently sparked concern among activists and experts:

  • geofence warrants,
  • keyword search warrants,
  • administrative subpoenas,
  • cell-tower dumps, 
  • inter-agency data sharing at the local, state, and federal levels, or from companies like Palantir, 
  • location and purchase history data from data brokers,
  • surveillance tech companies like Clearview AI and Voyager, etc.

Big Tech: Google complaint in Germany, China surveillance, Clearview expansion, Mailchimp data breach, banned apps on Google Play

Google is facing a legal complaint in Germany, in which the North Rhine-Westphalia consumer protection office says Google’s cookie banners violate data protection rules, Reuters reports. The office maintains that refusing cookies on Google’s search engine websites requires more steps than consenting to them. The company says it will soon change its consent banner and cookie policy Europe-wide to comply with regulations.

Using publicly available documents, Reuters has identified an explosion in AI-powered software used in China to crunch massive amounts of surveillance data, and rising demand for such equipment from police and civil authorities around the country. Vast quantities of data that once required human input to organize can now be processed automatically. The new software is built around the “one person, one file” concept, which facilitates the tracking of individuals. Since the first patent application in 2016, at least 28 firms have entered the market, offering file archiving and image clustering algorithms for facial recognition, the extraction of data from social media, and details on relatives, social circles, vehicle records, marital status, and shopping habits.

Google has banned dozens of apps from its Google Play store after finding embedded software that secretly harvested users’ data, including location and personal identifiers, IAPP News reports. The code, developed for Android and used on millions of devices worldwide, was created by Measurement Systems, which reportedly has links to a Virginia defense contractor.

Major email marketer Mailchimp has reported a data breach after hackers exploited a weakness in an internal customer support and account administration tool, TechCrunch says. A social engineering attack led to 300 client accounts being compromised, 102 of which had audience data stolen, with customers in the cryptocurrency and finance sectors being targeted. Mailchimp says it detected the breach quickly and has taken steps to ensure it won’t happen again.

Controversial facial recognition startup Clearview AI is looking to expand beyond providing services to police forces, AP News reports. In March it reportedly offered its services for free to the Ukrainian military to help identify casualties and prisoners using images scraped from the Russian social media website VKontakte. It now plans to offer a new “consent-based” product, which uses its algorithms but not its 20-billion-image library, to banks and other private businesses for identity verification purposes.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
