Data protection & privacy digest 18 Apr – 2 May 2023: draft AI legislation finalised, and employers’ compliance in focus

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes

Draft AI Act: The long-discussed AI legislation is expected to go to a full European Parliament vote in mid-June. Reportedly, MEPs have finally reached a common political position after two years of discussions with stakeholders. However, it will take a few years before the act is enforced: the EU interinstitutional ‘trilogue’ that follows parliamentary approval may take a while. 

The most rigorous regulations will apply to high-risk systems that could be used for biometric identification, critical infrastructure management, or by large online platforms and search engines, if they create threats to individuals’ health, safety or fundamental rights. The framework includes testing, proper documentation, data quality and human oversight requirements. Extra safeguards are promised when such systems are intended to process special categories of personal data, prioritising synthetic, anonymised, pseudonymised or encrypted data instead. 

MEPs also support the idea of putting stricter data governance obligations on foundation models (like ChatGPT), distinguishing them from general-purpose AI. 

MiCA: Meanwhile, the Parliament endorsed EU rules to trace crypto-asset transfers and prevent money laundering, as well as common rules on supervision and customer protection. The “travel rule”, already used in traditional finance, will in future cover transfers of crypto assets: information on the source of the asset and its beneficiary will have to accompany the transaction and be stored on both sides of the transfer. The rules will not apply to person-to-person transfers conducted without a provider, or to transfers among providers acting on their own behalf. The framework is expected to be fully implemented by the end of 2024 or early 2025. 

America’s innovative tech: Existing legal authorities apply to the use of automated systems and innovative new technologies just as they apply to other practices, states the US Justice Department together with its federal partners. The US Constitution and federal statutes prohibit discrimination across many facets of life, including education, criminal justice, housing, lending, and voting. It is illegal for an employer to discriminate against an applicant or employee because of their race, religion, gender, age, pregnancy, disability, or genetic information. Firms can also be required to destroy algorithms or other work products that were trained on illegally collected data. 

Case law

Apartment surveillance: The Estonian supreme court has explained when surveillance cameras may be installed in an apartment building even if some owners do not agree. In the given case, drug gang activity had been spotted in the building, but one owner contested the cooperative’s decision to install the cameras as an intrusion into his privacy and a risk of monitoring. As CCTV processes personal data, a legal basis is required under the GDPR. If the owners cannot reach an agreement, the decision can be taken by a majority vote. In that case, there must be a legitimate interest that outweighs the interests or fundamental rights of the apartment owners (e.g., the security threat in the given case).

However, the court stated, if the installation of cameras is decided by a majority vote at the general meeting, then all apartment owners must be given the opportunity to familiarise themselves with the planned conditions, including a privacy notice for the use of the cameras, before the meeting. If this requirement is violated, the decision of the general meeting is null and void.

Official guidance

SMEs guide: An organisation not only has to process personal data in accordance with the GDPR; it also needs to be able to demonstrate its compliance. For this purpose, the EDPB published its Guide for SMEs. It applies whenever you process personal data about your staff, consumers, and business partners. Transparency, data minimisation, respect for individual rights and good security practices are basic precautions for both data controllers and processors. The guide contains visual tools and other practical materials, as well as an overview of handy materials developed for SMEs by the national data protection authorities.

Employer’s guide: The Irish data protection regulator, meanwhile, has published Data Protection in the Workplace guidance. Employers collect and process significant amounts of personal data on prospective, current and former employees. Although not all organisations are required to have a data protection officer, they might still find it useful to designate an individual to oversee recruitment data processing. The guide includes explanations and examples of appropriate legal bases, storage periods, fulfilment of data subject requests, employee monitoring technologies, the status of emails, and much more. 

Employees’ photos: The Slovenian data protection agency published its opinion on the revocation of consent for the publication of employees’ photos on the employer’s social networks. The processing of an employee’s personal data based on their consent is permissible only in exceptional cases, due to the obviously unequal position of the employer and the employee. 

Nonetheless, if the circumstances of the employment relationship do not require the production, publication and continued storage of a photograph, the employer should obtain consent (and provide all the information stipulated in Art. 13 of the GDPR). In that case, the fact that the photos have been made public has no effect on the possibility of revoking consent to their publication. If the employer refuses or remains silent, the employee may lodge a complaint with the data protection authority. 

RoPA: A new guide on records of processing activities, with some practical examples, was issued by the Irish data protection agency. The RoPA should not be a ‘catch-all’ document that merely refers to other documents; all processing activities should be recorded in sufficient detail, it states, so that an external reader or auditor can fully comprehend the document. Smaller organisations may not be required to maintain a full RoPA because of their size, but most will still need to record processing activities such as HR and payroll functions, and a simple spreadsheet may be sufficient. For more complex organisations, the data controller may opt for a relational database or one of the RoPA tools available from third-party data protection service providers. 

Online training: During the planning stage of a seminar, explains the Latvian data protection regulator, best practice is to write down and evaluate what data about the event’s visitors is to be processed, and for what purposes. Beyond registration data, this can include participants’ technical data from their devices and the broadcast and recording of the seminar. The next questions are the applicable legal basis, the types of personal data, and the storage periods necessary to achieve the goal. 

Where other (joint) controllers or processors are involved, they must agree among themselves, determine their specific responsibilities and inform the workshop participants. The organiser(s) can include such information in a general privacy policy or develop it separately for each seminar. The information must be provided in a concise, transparent, understandable and easily accessible way (it is considered good practice to have the privacy policy no more than two clicks away from the website’s front page). 

Enforcement decisions

ChatGPT: The temporary ban against OpenAI and its ChatGPT has been dropped by the Italian data protection authority, after the platform introduced the required opt-out option for the processing of users’ data. A number of European regulators are also moving into action: the French data protection authority has announced it is investigating the complaints it has received, and the German regulators want to know whether a data protection impact assessment has been conducted. At the same time, Ireland’s regulator advises against rushing into ChatGPT prohibitions that “really aren’t going to stand up”, stressing that it is necessary first to understand a bit more about the technology. 

Record number of cases: The Spanish data protection agency has published its 2022 report. 15,128 claims were filed, an increase of 9% over 2021 and 47% over 2020. The figure rises to 15,822 when cross-border cases from other European authorities and cases the agency opened on its own initiative are included. The areas of activity with the most fines imposed were internet services, advertising, labour matters, personal data breaches, fraudulent contracting and telecommunications. The main way of resolving claims is to transfer them to the data controller, which obtains a satisfactory response for the citizen in an average of less than three months, the report states.

Employee’s dismissal: The Danish data protection authority has criticised an employer who informed the entire workplace that an employee had been dismissed due to, among other things, cooperation difficulties. The employer’s briefing emails went further than was necessary for the purpose, namely informing the relevant persons about the departure. The employer stated that making the reason for the dismissal public was meant to prevent rumours. However, the Danish regulator found that consideration for the dismissed employee weighed more heavily.

Security clearance: The Danish authority also decided against a former security guard who complained that his employer (Securitas) had passed on information about him to the intelligence services in connection with a security clearance without obtaining consent. However, Securitas insisted that all on-call employees are informed of the security clearance requirement, and the complainant had completed an employment form including a declaration of consent; his application for security approval would have been rejected had he not completed, signed and consented to it.

Dark patterns: In Italy, a company offering digital marketing services was found to have unlawfully processed personal data. On some of the portals owned by the company, “dark patterns” were used which, through specially designed graphical interfaces and other potentially misleading methods, enticed users to consent to the processing of their data for marketing purposes and to its communication to third parties. In addition, an invitation to click a link leading to another site to download an e-book presented the user’s profile data already filled in and the consent already selected. 

Security evidence logs: For a careless response to a data access request, the Spanish data protection authority fined Securitas Direct España 50,000 euros, according to Data Guidance. The complainant exercised their right of access after their vacation home, for which they had signed a security service contract, was robbed. Securitas Direct did not provide the data logs from the alarm system, and those that were sent to the complainant were incomplete, out of chronological order, and missing the decryption keys. The logs produced by the alarm system installed in the complainant’s home, the regulator stated, are considered personal data and are thus subject to the right of access.

Data security

Consumers’ personal data: New York’s Attorney General released a guide to help businesses adopt effective data security measures to better protect personal information. The guide offers a series of recommendations intended to help companies prevent breaches and secure their data, including:

  • maintaining controls for secure authentication,
  • encrypting sensitive customer information,
  • ensuring your service providers use reasonable security measures,
  • knowing where you keep consumer information,
  • guarding against automated attacks, and
  • notifying consumers quickly and accurately of a data breach.

Cybersecurity of AI: The European Union Agency for Cybersecurity published an assessment of standards for the cybersecurity of AI and issued recommendations to support the implementation of the upcoming AI legislation. AI mainly comprises machine learning, which resorts to methods such as deep learning, as well as logic- and knowledge-based and statistical approaches. However, the exact scope of an AI system is constantly evolving, both in the legislative debate on the draft AI Act and in the scientific and standardisation communities. 

The assessment starts from an observation about the software layer of AI: what is applicable to software could be applicable to AI. However, the work does not end there. Other aspects still need to be considered, such as a system-specific analysis to cater for security requirements deriving from the domain of application, and standards covering aspects specific to AI, such as the traceability of data and testing procedures. Meanwhile, some key recommendations include:

  • establishing a standardised AI terminology for cybersecurity;
  • developing technical guidance on how existing standards related to the cybersecurity of software apply to AI;
  • reflecting the inherent features of machine learning in AI standards;
  • considering risk mitigation through the software components associated with AI, reliable metrics, and testing;
  • promoting cooperation and coordination across standards organisations’ technical committees.

Big Tech

VLOPs: The first designations of ‘Very Large Online Platforms and Online Search Engines’ under the Digital Services Act (and the Digital Markets Act) were made public by the European Commission. As the 19 designated entities each reach 45 million monthly active users, they will be subject to additional regulatory requirements: user rights offerings, targeted advertising opt-outs, restrictions on the use of sensitive data and on profiling of minors, as well as improved transparency and risk assessment measures. Within four months of notification, the platforms will have to redesign their services, including their interfaces, recommender systems, and terms and conditions.

Salesforce Community leaks: A large number of businesses, including banks and healthcare providers, are leaking information from their open Salesforce Community websites, a KrebsOnSecurity analysis has discovered. Customers can access a Salesforce Community website in two different ways: through authenticated access (which requires logging in) and through guest user access (which doesn’t). It appears that Salesforce administrators may inadvertently grant guest users access to internal resources (payroll, loan amounts, bank account information combined with other data), which could allow unauthorised users to access a company’s confidential information and result in data leaks.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
