Information systems, their security, and personal data gaps are the focus of our latest digest. Also requiring your attention are invalid consent in cookie walls, the ‘pay or okay’ subscription model, OpenAI “Sora” data practices, and the crackdown on mass data collectors.
Stay tuned! Sign up to receive our fortnightly digest via email.
Personal data gaps in information systems
The Spanish data protection agency AEPD examines the distinction between addressing security by focusing exclusively on information systems and addressing it from the perspective of the processing activities carried out. Under the GDPR, a data controller must evaluate the risks to the rights and freedoms of the natural persons whose data is being processed and apply measures to mitigate them. Security focused on processing activities is therefore a broader concept than security focused exclusively on systems. The scope of application of the GDPR is the processing of personal data, understood as processes with an ultimate and specific purpose, while the scope of application of other regulations, such as those on cybersecurity or artificial intelligence, is oriented towards information and communications systems.
An example that illustrates this difference is access control in personal data processing – when third parties use compromised credentials to log into a service or application. Some controllers may incorrectly claim that a breach within the meaning of the GDPR has not occurred because, in their view, the information systems have not been compromised. These controllers reason that the use of valid credentials to log into the system has not led to a personal data breach in the processing, since the system has functioned correctly.
“Consent or Pay” initial guidance
Some businesses are considering giving people a choice between accessing online services without payment if they consent to their personal information being used for personalised advertising or, if they refuse this consent, having to pay to access that service. In principle, data protection law does not prohibit business models that involve “consent or pay”, states the UK ICO. However, some types of access mechanisms are unlikely to comply with the expectation in data protection law that consent be ‘freely given’. Relevant factors may include power imbalance, equivalence, appropriate fees, privacy by design, and information obligations:
“Being upfront and honest with people about what happens to their personal information when they use the service is a good thing.”
More official guidance
Data obtained as part of work duties: The Latvian regulator DVI explains the lawfulness of data processing through information systems that hold personal information and to which access is granted through employment. While carrying out our job duties, we may directly or indirectly come into contact with other people’s data, including that of customers, coworkers, and residents.
The organisation that grants its employees access to the systems must ensure, where technically possible, that the employee accesses only the information necessary to perform the duties of their position. Personal interest or curiosity is no longer an adequate basis for looking into a database. In the case of a data processing infringement, the organisation should anticipate that, as the data controller, it would bear the main responsibility.
Automated decisions: The Spanish AEPD has updated its guidance on the degree of human intervention in automated decisions, (Art. 22 of the GDPR). Many automated decisions involve some degree of human intervention. However, for that intervention to count as such, it has to be active and not merely a symbolic gesture; that is, it has to have a certain degree of relevance and capacity. Evaluating whether human supervision is possible and effective involves assessing both the system used and the processing and its context. To carry out this evaluation systematically, it is recommended to objectively assess a person’s participation in the decision-making process. More details in the original publication (in Spanish).
Public affairs: As part of their activity, public affairs professionals, (public affairs or lobbying consulting firms, internal departments), collect personal data relating to individuals in sectors such as government, administration, associations, parliament, and the media. To help them comply with the GDPR, several associations representing business and public relations professionals have jointly developed a guide, drafted in consultation with the CNIL, (in French).
Legal processes
EU AI Act: The Guardian analyses the practical implications of the upcoming regulation for customers and businesses. The act will soon become law and take effect gradually over the following three years. As a result, customers will have more confidence that AI technologies are configured for safe use. Like the GDPR before it, the legislation will also have an impact outside the EU. However, the EU’s proposed cap on the computing power used to train AI models is far lower than that in equivalent US rules. Consequently, some tech businesses warn, European companies could even decide to relocate westward to get around EU regulations.
European Health Data Space: EU legislators have struck a provisional agreement on the exchange of, and access to, health data at the EU level. Currently, the level of digitalisation of health data in the EU varies from one member state to another. The proposed regulation requires all electronic health record systems to comply with the specifications of the European electronic health record exchange format, ensuring that they are interoperable at the EU level.
Patients will still have the right to opt out of primary and secondary use of their data or to restrict access to it, with some exceptions, (eg, scientific research, public interest, vital interests).
IAB Europe: The CJEU holds, as argued by the Belgian data protection regulator, that a structured character string capturing internet users’ preferences, such as IAB Europe’s TC String, can constitute personal data. The TC String constitutes personal data, in particular, because its purpose is to link advertising preferences to a specific individual. As a sectoral organisation which standardises and prescribes the method for capturing and transmitting user preferences, IAB Europe can indeed be considered a (joint) controller with respect to the processing carried out following this method.
Data erasure request
Another ruling by the CJEU states that the supervisory authority of a Member State may order the erasure of unlawfully processed data even in the absence of a prior request by the data subject. Such erasure may cover both data collected from that person and data originating from another source, if such a measure is necessary to fulfil the authority’s responsibility for ensuring that the GDPR is fully enforced. The case relates to the provision of financial support to persons made vulnerable by the COVID-19 pandemic, (in Hungary), and to the data protection infringements committed by a local administration, affecting eligible persons who had not applied for the support.
Bank security failed
The Italian data protection authority Garante fined UniCredit 2.8 million euros and the company responsible for carrying out its security tests 800,000 euros. The violation stemmed from a massive cyber attack on the mobile banking portal. The attack caused the illicit acquisition of the name, surname, and other identifiers of approximately 778,000 customers and former customers and, for over 6,800 of them, also led to the disclosure of their portal access PIN. The data was made available in the HTTP response returned by the bank’s systems to the browser of anyone who attempted, even unsuccessfully, to access the mobile banking portal.
More enforcement decisions
Invalid consent in cookie walls: The Danish data protection authority Datatilsynet ruled that the use of cookie walls on Berlingske.dk must take place within the framework of the data protection rules. Berlingske’s specific approach is to greet users with a cookie wall when they try to access embedded content, (eg, video players or blog posts). This means that the content is unavailable unless the user accepts the processing of their data for statistical and marketing purposes through the use of cookies.
European Commission’s use of Microsoft 365: Following its investigation, the EDPS found that the European Commission infringed several key data protection rules when using Microsoft 365. The Commission failed to provide appropriate safeguards to ensure that personal data transferred outside the EU/EEA are afforded an essentially equivalent level of protection. Furthermore, in its contract with Microsoft, the Commission did not sufficiently specify what types of personal data are to be collected and for which explicit and specified purposes when using Microsoft 365. More details of the case can be read here.
Commercial prospecting: The French CNIL fined the company Foriou 310,000 euros for using data provided by data brokers for commercial prospecting purposes. Foriou conducts telephone canvassing campaigns to promote the loyalty programs and cards it sells. The misleading design of the collection forms used by the brokers who originally collected the data made it impossible to obtain valid consent from the persons concerned. The size of the fine, which represents approximately 1% of the company’s turnover, was decided in light of the seriousness of the breach.
Information security audit
Moorfields Eye Hospital NHS Foundation Trust has undergone a consensual data protection audit conducted by the UK’s ICO. The scope areas were determined following a risk-based analysis of the trust’s processing of personal data. The suggestions for improvement covered information security and data sharing, and included the following advice:
- The permanent roles which make up the Information Security function should be filled quickly to ensure that operational responsibility is clearly in place.
- A template letter should be in place to notify data subjects of a data breach, including all appropriate information such as details of the DPO, a description of the likely consequences of the breach, and the measures which have been taken.
- Appropriate reviewing processes should be in place for all data-sharing agreements, which include review schedules and review logs.
- The trust should have measures in place to ensure that relevant staff receive appropriate training, and ensure this is periodically refreshed.
Among best practices, the ICO recognised that the trust tests its physical security on-site, with police officers being shown around and then returning at a later date in plain clothes to assess the security, for example by seeing if they can get into secure areas or move around unchallenged without appropriate ID.
When user login data is made public
The Lithuanian data protection authority VDAI reminds us that upon receiving information about potentially leaked login names and passwords, an organisation, (the data controller), should conduct a preliminary investigation and determine whether there has been a violation of the confidentiality, integrity or availability of personal data. For example, it should establish whether the personal data processed in the organisation’s information systems has been compromised.
- If the processed personal data has not been accessed by unauthorised persons, the data controller must still assess the risks, prevent possible negative consequences, and let users know what action they can take in this situation, (eg, block user accounts whose login data matches the leaked data, generate new temporary passwords and send them to affected data subjects, activate two-factor authentication, etc.); a minimal sketch of this matching step follows the list.
- If the processed personal data has been accessed by unauthorised persons, (eg, illegal logins to user accounts are detected or it is not possible to unequivocally determine that there were no such logins, illegal actions on accounts are detected, etc.), the organisation must conduct a full investigation, take immediate measures, notify the data subjects, and report to the regulator within 72 hours of becoming aware of the breach.
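To picture the credential-matching step mentioned above, here is a minimal Python sketch. It is only an illustration under simple assumptions (plaintext passwords in the leak, SHA-256 hashes in the controller’s own store); the function and variable names are ours, not from the VDAI publication, and a real system would rely on its existing password-hashing and account-management tooling.

```python
# Illustrative sketch only; names are hypothetical, not from the VDAI guidance.
import hashlib
import secrets

def find_compromised_accounts(leaked_credentials, stored_hashes):
    """Return logins whose leaked password matches the locally stored hash.

    leaked_credentials: iterable of (login, plaintext_password) pairs from the leak.
    stored_hashes: dict mapping login -> SHA-256 hex digest of the current password
                   (a production system would use its existing scheme, eg bcrypt).
    """
    compromised = []
    for login, leaked_password in leaked_credentials:
        stored = stored_hashes.get(login)
        if stored is None:
            continue  # the leaked login does not exist in our systems
        if hashlib.sha256(leaked_password.encode()).hexdigest() == stored:
            compromised.append(login)
    return compromised

# Example: one leaked pair matches a local account, so that account would be
# blocked, given a fresh temporary password, and the user notified.
stored = {"alice": hashlib.sha256(b"Winter2023!").hexdigest()}
leak = [("alice", "Winter2023!"), ("bob", "qwerty123")]
for login in find_compromised_accounts(leak, stored):
    temporary_password = secrets.token_urlsafe(16)  # would be sent to the data subject
    print(f"Block account '{login}', issue a temporary password, suggest enabling 2FA")
```

The outcome of such a check feeds the risk assessment described above and, where unauthorised access is confirmed or cannot be ruled out, the 72-hour notification to the regulator.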
VDAI also advises individuals to take the following precautions in similar situations:
- Change your password to a new and unique one. If you have used the same password on other systems, change it there as well.
- The new password should consist of at least 12 characters: letters, numbers, at least one capital letter and a special character, (an illustrative check of these criteria follows this list).
- Do not store your passwords in browsers.
- Watch for news or announcements from your service provider or the authorities.
- Install and regularly update antivirus software on your devices.
- If you notice any suspicious activity in your account or related systems, notify your service provider immediately.
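For illustration, the composition criteria listed above (length, letters, numbers, a capital letter, a special character) can be expressed as a simple check. This is a sketch of that rule only; the function name is ours, and passing it says nothing about whether a password is otherwise strong or has already appeared in a leak.

```python
import re

def meets_password_advice(password: str) -> bool:
    """Check a password against the criteria listed above: at least 12 characters,
    containing letters, numbers, at least one capital letter and a special character."""
    return (
        len(password) >= 12
        and re.search(r"[A-Za-z]", password) is not None       # letters
        and re.search(r"\d", password) is not None              # numbers
        and re.search(r"[A-Z]", password) is not None            # a capital letter
        and re.search(r"[^A-Za-z0-9]", password) is not None     # a special character
    )

print(meets_password_advice("sunflower"))          # False: too short, no digits
print(meets_password_advice("Blue-Kettle-2024!"))  # True: meets all listed criteria
```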
Big Tech
OpenAI “Sora”: The Italian regulator Garante has opened an investigation into OpenAI, which in recent weeks announced the launch of a new AI model, ‘Sora’, that, according to the announcement, can create dynamic, realistic and imaginative video sequences from short text instructions. OpenAI will have to clarify several issues:
- how the algorithm is trained;
- what data is collected and processed to train the algorithm, especially whether it is personal data;
- whether special categories of data, (religious or philosophical beliefs, political opinions, genetic data, health, sexual life), are collected, and
- which sources are used.
Crackdown on mass data collectors: Several recent FTC enforcement actions reflect a heightened focus on pervasive extraction and mishandling of consumers’ sensitive personal data, states an FTC blog post. Taken together, browsing and location data paint an intimate picture of a person’s life, including their religious affiliations, health and medical conditions, financial status, and sexual orientation. None of the underlying datasets at issue in the FTC’s proposed complaints, (against Avast, X-Mode, or InMarket), are alleged to have contained people’s names, social security numbers, or other traditional standalone elements of personally identifiable information.
What makes the underlying data sensitive is the insights it reveals, (eg, through proprietary algorithms), and the ease with which those insights can be attributed to particular people. People also have no way to object to how their data is collected, retained, used, and disclosed when these practices are hidden from them. Moreover, any safeguards used to maintain people’s privacy are often outstripped by companies’ incentives and abilities to match data to particular people.