Weekly digest February 21 – 27, 2022: the EU Data Act to facilitate the use of data in the digital economy

The Commission proposed new rules on who can use and access data generated in the EU across all economic sectors. The EU Data Act will “ensure fairness in the digital environment, stimulate a competitive data market, open opportunities for data-driven innovation and make data more accessible for all”. In particular, the Act will:

  • allow users of connected devices to access the data those devices generate, which is often harvested exclusively by manufacturers;
  • enable consumers and businesses to access the data of their devices and use it for aftermarket and value-added services (eg, farmers, airlines or construction companies will be able to make better decisions, such as buying higher-quality products and services);
  • rebalance negotiating power for SMEs by preventing the abuse of contractual imbalances in data-sharing contracts;
  • provide means for public sector bodies to access and use privately held data where necessary in exceptional circumstances, such as a public emergency;
  • introduce new rules allowing customers to switch effectively between cloud data-processing service providers, and put in place safeguards against unlawful data transfers.

In addition, the Data Act reviews certain aspects of the Database Directive, which protects investments in the structured presentation of data. Notably, it clarifies that databases containing data from IoT devices and objects should not be subject to separate legal protection, ensuring they can be accessed and used. The volume of industrial data is constantly growing, and the Commission reports that 80% of it is never used.

The EDPB sent a letter to the Commission on its initiative on adapting liability rules to the digital age and Artificial Intelligence. It considers that the revision of the legal framework should ensure consistency with, and complement, the EU acquis in the field of personal data protection, in particular when it comes to the security of personal data processing and the use of AI systems. While, under the GDPR, only controllers and processors are liable, in a personal data breach case, for example, it is essential to consider the role and potential liability of providers of AI systems developed and made available to secure personal data processing. However, because of the nature of AI, assigning responsibility to a party in a claim involving an AI system might be particularly difficult, especially when the burden of proof lies with the individual: the individual could be unaware that AI is being used at all and, in the majority of cases, would lack the information necessary to prove the liability of the AI system. For that purpose, the EDPB wishes to stress the positive effects of:

  • including systematic human supervision;
  • ensuring transparency for the end user on the use and operation of the AI system and on the methods and algorithms deployed;
  • clarifying the limitations of, and liability risks in, the use of AI systems exposed to different types of attacks;
  • making providers of AI systems responsible for supplying users with mitigation tools for known and new types of attacks, and for embedding security by design throughout the entire lifecycle of the AI;
  • making users of AI systems responsible for ensuring the safe operation of the system, etc.

Additionally, specific liabilities might be triggered by the ineffective application of data protection principles by AI providers and users. A lack of data accuracy, or scant attention paid to the fairness of algorithmic decisions, might translate into harm to individuals’ rights and freedoms as well as economic losses.

Official guidance: video surveillance

The UK Information Commissioner’s Office has published a guide on the use of video surveillance. As video surveillance technology becomes more mainstream and affordable, it is now more common to see technologies such as smart doorbells and wireless cameras. Traditional CCTV also continues to evolve into more complex AI-based surveillance systems, which can process more sensitive categories of personal data. The ways in which the technology is used also continue to develop. The guide’s provisions include:

  • adopting a data protection by design and default approach;
  • performing a legitimate interests assessment (LIA) to demonstrate the lawfulness of the processing, which can naturally feed into a DPIA for any processing likely to result in a high risk to individuals;
  • maintaining a record of the processing activities taking place;
  • determining the necessary data retention periods;
  • notifying and paying a data protection fee to the ICO, unless exempt, etc.

The guidance covers UK GDPR and Data Protection Act 2018 requirements. It applies where personal data is processed by video surveillance systems in the public and private sectors. It also outlines considerations for the use of Automatic Number Plate Recognition, Body Worn Video, Unmanned Aerial Vehicles (also known as drones), Facial Recognition Technology, commercial products such as smart doorbells, surveillance in vehicles, workplace monitoring, live streaming, and other commercially available surveillance systems that have the potential to process personal data.

Investigations and enforcement actions: proof of identity, satisfaction survey, cooperation with the regulator, data breach notification

The Netherlands’ data protection authority fined Belgium-based DPG Media 525,000 euros for GDPR violations. The regulator found that individuals who wanted to view the data the company held, or have it removed, first had to provide proof of identity. The regulator received several complaints about the way Sanoma Media Netherlands BV (before it was acquired by DPG Media in 2020) dealt with these types of requests. In particular:

  • Subscribers received unwanted advertising from the company.
  • Anyone who wanted to unsubscribe, know what personal data was kept, or wanted to have data deleted, first had to upload proof of identity. 
  • When proof of identity was sent digitally, the company did not inform these people that they were allowed to shield parts of their data.
  • For customers who had not created an online account with DPG Media it was more difficult to access or change their data. 

DPG Media has changed its working methods, and now sends a verification email to establish the identity of a requester. DPG Media has objected to the decision.

The EDPB analyzed a recent enforcement case in which the Hungarian supervisory authority fined a car importer for unlawful data processing practices related to satisfaction measurement. After the applicant had his car inspected and serviced by the respondent, a specialist car garage, he provided his email address at the respondent’s request. The applicant subsequently received an unsolicited email asking him to complete a satisfaction questionnaire about the service provided, and then another email asking him to complete the questionnaire again due to his lack of response. The applicant’s consent to the transfer of his data to the importer had not been requested. Throughout the investigation, the importer company could not demonstrate how the following processed data related to the stated purposes of satisfaction measurement and complaint management: the customer’s name, email address, home address, telephone number, age, gender, chassis number, registration number, technical data of the vehicle, the name of the dealer partner used, the date of the service used, and the content of the feedback.

The EDPB also looked at another fine, imposed by the Polish regulator for lack of cooperation. The regulator had requested that a company respond to the content of a complaint and answer detailed questions regarding the case. The regulator sent four requests to the company (the data controller); it accepted delivery of only one of them and did not reply. Disregarding the obligations related to cooperation with the regulator constitutes a breach of great gravity and, as such, is subject to financial sanctions. The supervisory authority therefore imposed an administrative fine of approx. 4,000 euros, which will not only be effective, proportionate and dissuasive in this individual case but will also serve as a signal to other entities.

The Spanish regulator AEPD fined Worldwide Classic Cars Network 1,500 euros, and imposed corrective measures, for operating video surveillance without just cause and without information signs, Data Guidance reports. The complaint was filed by an individual over the installation of two video surveillance cameras that captured images of public areas. Moreover, the cameras did not display signage in accordance with the GDPR. The AEPD ordered Worldwide Classic Cars, within 10 business days, to provide proof of the following measures: a) removing the cameras from their current location, or redirecting them towards its own premises; b) placing information signs in the video-monitored areas; and c) making the stored information referred to in the GDPR available to those affected.

The Italian regulator ‘Garante’ ordered Minelli S.p.A. to notify a data breach to the data subjects affected, Data Guidance reports. The company became aware of the breach following a report by an employee. The breach consisted of the temporary loss of availability of data (bank details, health data, authentication credentials) contained in a number of servers and PCs owned by the company, and the probable loss of confidentiality of the same data, as a result of a ransomware attack. The breach involved around 800 data subjects, including employees, consultants, customers, and suppliers. However, Minelli had notified the breach only to the employee who had initially detected the incident, and failed to notify all the data subjects involved.

DPIA: Microsoft Teams

The Dutch government released a public version of its DPIA on Microsoft Teams. The document assesses the data protection risks of professional use of the tool in combination with OneDrive, SharePoint Online, and the Azure Active Directory, applications commonly used to access and store files shared via Teams. As a precondition to using Microsoft’s online services, end users and admins, including guest users, must be authenticated through the online cloud service Azure Active Directory. The DPIA concludes that Microsoft has implemented many legal, technical, and organizational measures to mitigate the risks for data subjects. In reply to the initial findings of the DPIA, Microsoft has also committed to remedying some shortcomings and has provided important assurances.

However, in view of the ‘Schrems II’ ruling and the technical findings described in the report, Microsoft has to make further adjustments to address one identified high risk and a couple of low risks. It is uncertain how the transfer risks will be assessed by the national data protection authorities this year (in their joint investigation into the use of cloud services by public sector organizations). For this DPIA the transfer risks were rigorously assessed, including in a separate data transfer impact assessment (DTIA). Download the full DPIA document here.

Big Tech: TikTok’s child privacy, Meta-EU data transfer row, AI-based privacy compliance tool

The Texas Attorney General has launched an investigation into TikTok, demanding a wealth of documentary proof that the company has not been violating children’s privacy or enabling unlawful conduct and human trafficking. Two Civil Investigative Demands (CIDs) require TikTok to explain its privacy policies, procedures and review practices, and how it identifies and removes content to protect child safety. TikTok must also provide copies of policies, guidance, manuals, training materials and the like relating to children’s use of TikTok. The company has until March 18 to reply to the CIDs.

Ireland’s data protection regulator is reportedly inching towards banning Meta’s Facebook and Instagram from transferring data to the US, after Data Protection Commissioner Helen Dixon issued a draft ruling on which Meta has 28 days to make legal submissions. The submissions will likely focus on Meta’s claim that the transfer ban, the result of the Schrems privacy campaign and the 2020 ECJ decision scrapping the existing transatlantic data transfer agreement, would damage its business and that of thousands of other companies. The decision could be shared with fellow EU regulators in April, and if none of them lodges an objection, “the earliest time we could have a final decision could be the end of May,” Helen Dixon told Reuters. Any objection could add some months to the timeline.

Mobile app developers have a new AI-based tool to help identify potential privacy and compliance issues within their apps. Called Checks, it comes out of Google’s Area 120 incubator and is available on a freemium basis to all Android and iOS developers. Via Google Play, developers will be able to have their apps scanned for potential privacy and compliance problems and receive a report offering applicable solutions and resources.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
