FRIA

Data protection digest 3-17 Mar 2025: Combining FRIA with DPIA is possible, but not once the development of an AI system has begun

FRIA and DPIA: Before deploying a high-risk AI system, organisations must assess the impact that the use of such a system may have on fundamental rights, explains the Croatian data protection regulator AZOP. For this purpose, private and public entities must carry out an assessment containing:

  • a description of the implementing entity’s processes in which the high-risk AI system will be used for its intended purpose;
  • a description of the period and frequency of intended use;
  • the categories of natural persons and groups likely to be affected in the specific context;
  • the specific risks of harm likely to affect the categories of natural persons or groups of persons identified;
  • a description of the implementation of human control measures;
  • measures to be taken in the event of the materialisation of those risks, including internal governance arrangements and complaints mechanisms.

If both a FRIA and a DPIA need to be conducted, the regulator recommends combining the two analyses so that they complement each other. Note, however, that a FRIA is mandatory before putting a high-risk AI system into use, while a DPIA must be carried out at the very beginning, before the development of an AI system. A DPIA is also required where the AI system itself is not high-risk, but the processing of personal data within the AI system is considered high-risk.

The regulator provides an example: a chatbot will in most cases be considered a medium-risk AI system. However, if the chatbot is used in a sensitive context, it may involve processing activities that would be classified as high-risk, even though the system itself is not. In that case, a FRIA may not be required, but a DPIA is.
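The regulator's decision logic can be sketched as a simple rule. This is a hypothetical helper written for illustration only, not part of the AZOP guidance; the function name and flags are invented:

```python
def required_assessments(high_risk_ai_system: bool,
                         high_risk_processing: bool) -> set:
    """Simplified reading of the guidance: a FRIA is mandatory for
    high-risk AI systems (with a DPIA carried out before development),
    and a DPIA is also needed whenever the personal-data processing
    itself is high-risk, even if the AI system is not."""
    needed = set()
    if high_risk_ai_system:
        needed |= {"FRIA", "DPIA"}
    elif high_risk_processing:
        needed.add("DPIA")  # e.g. a chatbot used in a sensitive context
    return needed

# The chatbot example: not a high-risk AI system, but high-risk processing.
print(required_assessments(False, True))  # {'DPIA'}
```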

Stay up to date! Sign up to receive our fortnightly digest via email.

EHDS

On 5 March, the European Health Data Space Regulation was officially published in the EU Official Journal. It enters into force on 26 March, marking the beginning of the transition phase towards its application in the next decade. The law is designed to benefit all EU residents, including patients, healthcare professionals, researchers, policymakers, and industry players.


EHDS aims to establish fast and free access to electronic health data across systems and countries; security and privacy protections by default; opt-out rights from secondary use; and more cost-efficient access to high-quality health data for research, innovation and public health monitoring.

Parental control in app stores

According to CNN, Utah approved a first-of-its-kind law in the US mandating that app stores confirm users’ ages and obtain parental approval before allowing children to download apps to their devices. The legislation, which is pending the Utah Governor’s signature, is a victory for Meta and other platforms that have been under pressure to do more to protect minors online. It may significantly change how all users, not just the young, use app stores. Similar legislation has been introduced in at least eight other states. Apple and Google, however, have proposed alternative approaches, including having app stores and app developers share accountability for age verification.

AI Code of Practice

The third draft of the General-Purpose AI Code of Practice was published by the European Commission. It is relevant only for the small number of providers of the most advanced general-purpose AI models that could pose systemic risks, under the classification criteria in Art. 51 of the AI Act. The first two sections of the draft Code detail transparency and copyright obligations for all providers of general-purpose AI models, with notable exemptions from the transparency obligations for providers of certain open-source models. The final Code should be ready in May as a tool for general-purpose AI model providers to demonstrate compliance with the AI Act.

More legal updates

Whistleblowing rules in the EU: Five EU Member States (Germany, Luxembourg, the Czech Republic, Estonia and Hungary) have been ordered to pay financial penalties for failing to transpose the Whistleblower Directive. Persons who work for a public or private organisation, or are in contact with such an organisation in the context of their work-related activities, are often the first to know about threats or harm to the public interest.

By reporting breaches of Union law that are harmful to the public interest, such persons act as ‘whistleblowers’ and thereby play a key role in exposing and preventing such breaches and safeguarding society’s welfare. However, potential whistleblowers are often discouraged from reporting their concerns or suspicions for fear of retaliation. Respect for privacy and the protection of personal data are among the many areas in which whistleblowers can help to disclose violations of the law.

The Data Act implementation: In Germany, with few exceptions, supervision of the processing of personal data by controllers in the non-public sector is the responsibility of the respective state data protection authorities. In contrast, responsibility for monitoring the application of the GDPR within the framework of the Data Act is to be transferred to the Federal Commissioner for Data Protection (BfDI). This results in the opposite of the intended simplification of responsibilities for companies, authorities, and data subjects. There is also a risk of dual supervision by a federal and a state authority for the same matter. 

Union digital access rights

The Ius Laboris law blog examines the limits of unions’ freedom of association in Germany via the digital communication tools of the employer. In a groundbreaking 2009 decision, the Federal Labour Court granted unions a digital right of access to the employer for the first time: unions may use company email addresses as a means of communication for information and advertising purposes, and the employer must tolerate this as long as it does not impair operational processes or disturb industrial peace.

Later, following GDPR implementation and for data privacy and security reasons, important prohibitions were set, including:

  • receiving all company email addresses of employees;  
  • accessing the group-wide communication platform; and 
  • receiving a link on the homepage of the company’s intranet. 

Video surveillance in Sweden

Since 2018, certain businesses have had to apply for a permit from the data protection regulator IMY for camera surveillance. The Riksdag has now decided that the permit requirement will cease. This should make it easier for those who want to use camera surveillance to prevent, deter or investigate crime. At the same time, a great responsibility is placed on those who want to monitor: they must ensure that the surveillance is permitted under the GDPR, identify the legal basis, properly document the activity, and investigate whether other measures may be sufficient to create safety and security.

More from supervisory authorities

What are the data processing operations that do not require a DPIA? The Latvian data protection authority offers some suggestions: 

  • Processing of employees’ data only within the country, if no processing, profiling or systematic monitoring of biometric or genetic data is carried out.
  • Processing of personal data of customers by companies for the provision of services and advertising within the country, if the company’s core business is not related to large-scale processing or special categories of data.
  • Processing of member and donor data by associations and foundations.
  • Processing carried out by apartment owners’ associations and cooperatives related to the management of residential buildings, if it is not carried out on a large scale.
  • Processing of collective applications by local governments, for example, when residents submit a collective application to the local government, etc.

Differential privacy guide: America’s NIST meanwhile finalized guidelines for evaluating differential privacy guarantees to de-identify data. Differential privacy works by adding random “noise” to the data in a way that obscures the identity of the individuals but keeps the database useful overall as a source of statistical information. However, noise applied in the wrong way can jeopardize privacy or render the data less useful. To help users avoid these pitfalls, the document includes interactive tools, flow charts, and even sample computer code that can aid in decision-making and show how varying noise levels can affect privacy and data usability. 
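As a rough illustration of the mechanism the NIST guidance evaluates, here is a minimal sketch of the textbook Laplace mechanism for numeric queries. This is not the NIST sample code; the function name and parameters are illustrative:

```python
import math
import random

def laplace_mechanism(true_value: float, sensitivity: float,
                      epsilon: float) -> float:
    """Release a numeric statistic with epsilon-differential privacy by
    adding Laplace noise of scale sensitivity/epsilon. A smaller epsilon
    means more noise and therefore stronger privacy."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# A counting query has sensitivity 1: one person joining or leaving the
# dataset changes the count by at most 1.
noisy_count = laplace_mechanism(true_value=1284, sensitivity=1.0, epsilon=0.5)
```

Getting the sensitivity or scale wrong, so that too little noise is added, is exactly the kind of pitfall the document warns can jeopardize privacy; too much noise, and the data becomes less useful.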

AI human oversight: The Dutch AP initiated a public consultation on tools for meaningful human intervention in algorithmic decision-making, which will be open until 6 April. The document focuses on meaningful human intervention in automated decision-making, distinguishing between substantive and symbolic human oversight under the GDPR and the Law Enforcement Directive. The consultation process is open to contributions from data protection officers, data controllers, and other relevant stakeholders.


Weather camera


The Austrian data protection authority in a recent case observed that the operation of a weather camera violated a homeowner’s fundamental right to data protection. The recordings could be viewed by anyone online. The camera was mounted on a roof and offered an overview of the town. The owner of a house, which is visible in the images, complained.

The operator of the weather camera argued that the recordings were for tourism purposes so that people could find out about the weather. This was countered by the homeowner’s interest in not having their presence and absence visible to everyone online. The decisive factor for the regulator was that this purpose could also be achieved without the (worldwide accessible) recording of the house.

Online retailers and guest access

In a complaint-independent review, the Hamburg data protection regulator HmbBfDI examined relevant online shops in Hamburg and found that a large online clothing retailer did not offer the option to order as a guest. Purchases were therefore only possible after creating a permanent customer account. The HmbBfDI requested that the company allow guest orders in the future to comply with data protection requirements.

In principle, it is incompatible with data protection law to create permanent customer profiles if customers may only wish to place a one-time order. The principle of data minimisation stipulates that only as little data as necessary should be processed – customer accounts, on the other hand, often contain more extensive information. Creating password-protected access via the internet also exposes the entered data to the risk of hacker attacks – a risk that not all customers are willing to take.

Right to be forgotten

The EDPB has launched another coordinated action for 2025. Following the action on the right to information in 2024, this year’s focus is on implementing another key data protection right: the right to erasure (the “right to be forgotten”) under Art. 17 of the GDPR. 32 data protection authorities from across Europe will participate in the initiative. They will soon contact several companies and organisations from various sectors, either by initiating formal inspections or by collecting information. In the latter cases, further follow-up measures could also be taken if necessary.

Swiss cyberattacks

Reporting cyberattacks on critical infrastructure in Switzerland will be mandatory from 1 April. Operators of critical infrastructures will be required to report cyberattacks to the National Cyber Security Centre within 24 hours of discovery. Under certain circumstances, this reporting obligation is also relevant for non-Swiss entities. The Federal Council has decided to bring the provisions on fines into force on 1 October, giving those concerned sufficient time to prepare for the new reporting obligation.

The regulator recommends entities check if they fall under the rather broad term of “critical infrastructures” before the deadline.

More enforcement decisions

Wrong recipient fine: The Vitallaw.com legal blog reports on a case in which the Spanish data protection authority AEPD fined Ibermutua Mutua Colaboradora 600,000 euros. Over 3,395 people’s data was impacted by the breach, and 354 recipients, including businesses and consultants working with Ibermutua, received the data.

The fine came after people complained that they had received a notification from Ibermutua’s data protection officer stating that their data, including health data, had been transferred to other organisations because of a computer fault. Ibermutua contacted the companies to request that the personal data be deleted and took technical and organisational measures, including:

  • correcting the programming error and running a series of exhaustive tests to ensure correct functioning; 
  • restricting attachments so that multiple attachments cannot be sent in a single email; 
  • verifying that each attachment matches its intended recipient;
  • testing before sending email remittances;
  • implementing training for staff; and 
  • launching an external audit.

Finally, Telenor ASA, a telecommunications company in Norway, has been sanctioned approx. 342,000 euros for deficiencies in its data protection officer scheme and internal controls. In particular, the company had not carried out all necessary assessments and documentation of the DPO’s role, including their independence and possible conflicts of interest. There was also no established and documented direct reporting line from the DPO to the highest management level.

In case you missed it

Device code phishing: A recent Microsoft cyber security blog explains the malicious technique behind device code phishing attacks targeting governments, NGOs, and a wide range of industries in multiple regions. In device code phishing, threat actors exploit the device code authentication flow to capture authentication tokens, which they then use to access target accounts and, from there, data and other services the compromised account can reach.

In one example, the phishing attack masquerades as Microsoft Teams meeting invitations delivered through email. When targets click the meeting invitation, they are prompted to authenticate using a threat actor-generated device code. The actor then captures the valid access token resulting from the user’s interaction, stealing the authenticated session. Read more about queries to detect phishing and email exfiltration attempts in the original article.
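To see why the attack works, here is a toy model of the OAuth 2.0 device authorization grant (RFC 8628) that the technique abuses. This is an illustrative in-memory sketch, not Microsoft’s implementation; all class and method names are invented:

```python
import secrets

class MockAuthServer:
    """Toy device-code flow: the server issues a (device_code, user_code)
    pair, the user approves by entering the user_code, and a token is
    issued to whoever polls with the matching device_code. The flaw the
    phishers exploit: the token goes to the party that STARTED the flow,
    not to the user who approved it."""

    def __init__(self):
        self.pending = {}     # user_code -> device_code
        self.approved = set() # device_codes the user has approved

    def start_device_flow(self):
        device_code = secrets.token_hex(16)
        user_code = secrets.token_hex(4).upper()
        self.pending[user_code] = device_code
        return device_code, user_code

    def user_approves(self, user_code):
        # The victim signs in and types the code the attacker sent them.
        self.approved.add(self.pending[user_code])

    def poll_for_token(self, device_code):
        if device_code in self.approved:
            return "access-token-" + device_code[:8]
        return None  # "authorization_pending" in the real flow

# Attack sequence: the attacker starts the flow on their own machine and
# phishes the user_code to the victim inside a fake meeting invitation.
server = MockAuthServer()
device_code, user_code = server.start_device_flow()  # attacker's session
server.user_approves(user_code)                      # victim authenticates
token = server.poll_for_token(device_code)           # attacker gets the token
```

The victim performs a legitimate sign-in at the legitimate provider, which is why the attack bypasses many phishing defences: only the short user code ever passes through the attacker’s message.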

‘Verify you are a human’ malware deployment: The Krebs on Security blog describes another ‘clever’ malware deployment scheme, first spotted in targeted attacks last year, that has now gone mainstream. In this scam, dubbed “ClickFix,” the visitor to a hacked or malicious website is asked to distinguish themselves from bots by pressing a combination of keyboard keys that causes Microsoft Windows to download password-stealing malware. ClickFix attacks mimic the “Verify You are a Human” pop-up tests that many websites use to separate real visitors from content-scraping bots.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
