TechGDPR’s review of international data-related stories from press and analytical reports.
Financial data: The EDPS discussed recommendations to encourage data sharing to extend the range of available financial services and products, while also giving people and organisations control over the processing of their financial data. Under the proposals, individuals and organisations would govern access to their financial data using dashboards offered by financial institutions, and would be able to monitor, limit, or authorise access to their information. Users should be supplied with comprehensive, accurate, and unambiguous information about the financial service provider asking for access to their data. The provider should also disclose the type of product, payment, or service for which an individual’s data will be used, as well as the categories of data required.
Digital Services Act: The Digital Services Act took effect for large online operators serving the EU on 25 August. Nineteen platforms and search engines with at least 45 million monthly active users must comply with stricter rules concerning data collection, privacy, disinformation, dark patterns, online hate speech and more. This includes a ban on targeted advertising to minors based on profiling, and a ban on targeted advertising using special categories of personal data, such as sexual orientation or religion. Online platforms will be required to redesign their systems and prove to the European Commission that they have done so, including publishing their risk assessments. Additionally, vetted researchers can access the data of those services to conduct analyses of systemic risks in the EU. Smaller platforms will be subject to the same regulation beginning in 2024; they will, however, be supervised by national agencies rather than Brussels.
Cybersecurity and risk assessment in California: The California Privacy Protection Agency (CPPA) has published its proposed Cybersecurity and Risk Assessment Audit Regulations. According to the CPPA, formal rulemaking for cybersecurity audits, data protection risk assessments, and automated decision-making technologies has yet to begin; these drafts are intended to facilitate board deliberations and public participation. They provide standards for service providers and contractors, assisting organisations in meeting audit compliance. The regulations state that every business whose processing of personal information potentially poses a serious risk to consumers’ security must conduct an annual audit. They also describe the components to be evaluated and the measures to be taken, as summarised by digitalpolicyalert.org.
EU-US Data Privacy Framework: Almost all transfers of personal data to US-based companies that have committed themselves to the certification mechanism are covered by the EU-US Data Privacy Framework, explains the Bavarian state data protection commissioner. However, for transfers of personal data collected in the context of an employment relationship (‘HR data’), the US business must explicitly state this in its certification. Particular attention must also be paid to onward transfers, for example, if a US processor working for the EU data exporter transmits the personal data to a sub-processor in another third country. The US adequacy decision cannot apply in this situation.
‘Freedom of Information’ and data protection: Guernsey’s data protection commissioner discusses Freedom of Information requests that caused some of the most extraordinary data breaches recently (e.g., when details of thousands of police and civilian personnel employed by the Police Service of Northern Ireland were released in error). Freedom of Information generally refers to the right of citizens to access information held by public authorities. In reality, this information will often include personal data about individuals, whether staff, citizens or others with whom the public authorities are in contact. The rights of all individuals must be considered before any disclosure. If you are a data controller, you must understand your legal obligations concerning data subjects’ rights and have appropriate policies and procedures in place to ensure they are dealt with properly.
Biometric data: Meanwhile, the UK Information Commissioner’s Office is currently consulting on draft guidance on biometric data. The guidance explains how data protection law applies to organisations that use, or are considering using, biometric recognition systems, as well as to vendors of these systems. At a glance:
- You must take a data protection by design approach when using biometric data.
- You should do a data protection impact assessment before you use a biometric recognition system. This is because using special category biometric data is likely to result in a high risk.
- Explicit consent is likely to be the only valid condition for processing available to you to process special category biometric data.
- If you can’t identify a valid condition, you must not use special category biometric data.
Employees’ digital monitoring rules: Digital work tools can record large amounts of data about employees, and such monitoring is therefore heavily restricted, states the Norwegian privacy regulator. In most cases, the employer does not have the right to monitor an employee’s use of work tools, including use of the Internet, unless the purpose of the monitoring is to manage the company’s computer network or to uncover or investigate security breaches. At the same time, it can be difficult for employers to introduce such measures in particular cases, as many regulations govern different aspects of the working environment and may involve trade union approval, transparency obligations, data protection implications, and information security.
Privacy by default: This means that products and services are designed so that a person’s privacy is protected from the outset and they do not need to take any additional steps to protect their data, explains the Latvian data protection regulator. This approach is designed to minimise possible violations in the process of data acquisition and usage, as well as unauthorised access and the risks that could arise if personal data comes into the possession of a third party. It may include: collecting only the minimum necessary data; setting user accounts to “private mode” by default; limiting data retention, with automatic anonymisation or deletion if an account is inactive for a certain period; user control tools, such as whether a profile can be found via search engines; clear information notices, including all third parties with whom the data may be shared; and security measures such as encryption and regular security audits.
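As a minimal sketch of what privacy-protective defaults and inactivity-based retention could look like in practice (the field names and the one-year retention period are illustrative assumptions, not drawn from the regulator’s guidance):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AccountSettings:
    # Privacy-by-default: the user must opt IN to any wider sharing.
    profile_searchable: bool = False      # not discoverable via search engines
    activity_visible_to: str = "private"  # "private" | "contacts" | "public"
    analytics_opt_in: bool = False        # no tracking unless explicitly enabled

# Hypothetical retention period for inactive accounts.
INACTIVITY_LIMIT = timedelta(days=365)

def should_anonymise(last_active: datetime, now: datetime) -> bool:
    """Flag an account for anonymisation/deletion once it has been
    inactive longer than the retention limit."""
    return now - last_active > INACTIVITY_LIMIT
```

The key design point is that every constructor default is the most restrictive option, so a user who never touches the settings still gets the protective configuration.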
UiPath data leak: The Romanian data protection authority has fined the learning platform operator UiPath SRL approx. 70,000 euros for a massive data leak. The company did not implement adequate technical and organisational measures to ensure that, by default, personal data could not be accessed without human intervention, including the ability to ensure the ongoing confidentiality and resilience of processing systems and services, as well as a process for regularly testing, assessing and evaluating the effectiveness of implemented measures. This led to the unauthorised disclosure of, and access to, the personal data (name and surname, unique identifier, e-mail address, name of the employing company, country, and details of the level of knowledge attained in courses) of about 600,000 users of the Academy Platform for about 10 days. Such a violation is likely to cause physical, material or moral harm to the data subjects, such as loss of control over their data or loss of data confidentiality.
Misconfigured cloud storage: The UK Information Commissioner issued a reprimand to a recruitment company: the organisation misconfigured a storage container, with 12,000 records relating to 3,000 workers, to be publicly accessible without any requirement to authenticate. The personal data consisted of a variety of different data sets, including names, addresses, dates of birth, passports, ID documents and national insurance numbers. The company has since committed to periodically audit the configuration of cloud services as part of a wider security assessment including access rights, appropriate identity and access controls, event logging and security monitoring.
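The periodic configuration audit the company committed to could be sketched as a simple check over container settings. This is a hedged illustration only: the field names (`public_access`, `auth_required`, `logging_enabled`) are hypothetical and do not correspond to any particular cloud provider’s API.

```python
def audit_container(config: dict) -> list[str]:
    """Return a list of findings for one storage container's settings.

    The flags checked mirror the failures in the reprimand: public
    access, no authentication, and no event logging/monitoring.
    """
    findings = []
    if config.get("public_access", False):
        findings.append("container is publicly accessible")
    if not config.get("auth_required", True):
        findings.append("no authentication required for access")
    if not config.get("logging_enabled", False):
        findings.append("event logging/monitoring disabled")
    return findings
```

In a real deployment this check would run on a schedule against the provider’s configuration API, with any non-empty findings list raising an alert.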
Vklass data leak: The Swedish privacy regulator has reprimanded the learning platform Vklass for being unable to detect abnormal user behaviour on its platform or to track what happened in the system. Multiple complainants alleged that an unauthorised person obtained personal data about teachers and students from the platform; the reports came from municipal committees and private businesses that conduct school and educational activities. The incident probably occurred because a student wrote a script that automatically saved information from the learning platform to a database, and the information was then published openly on a website, which has since been taken down.
Edmodo and minors’ consent: Meanwhile in the US, the Federal Trade Commission obtained an order against education technology provider Edmodo for collecting personal data from children without obtaining their parents’ consent and using that data for advertising, in violation of the Children’s Online Privacy Protection Act Rule (COPPA), and for unlawfully outsourcing its COPPA compliance responsibilities to schools. Among other requirements, the provider is obliged to identify the accounts in question and delete or destroy certain data (from students under 13 years of age), periodically provide compliance reports to the Commission, and permanently refrain from collecting more personal information than is reasonably necessary for a child to participate in any activity offered on the online platform.
High-risk systems: For some so-called “critical processing” IT systems, a data breach would create particularly high risks for people, and they therefore require a commensurate level of security. To best support the professionals concerned, the French regulator CNIL has submitted a recommendation for public consultation (in French). It specifically targets “critical” processing, defined by two cumulative criteria: a) the processing is large-scale within the meaning of the GDPR, and b) a personal data breach could have very significant consequences for the data subjects, for state security, or for society as a whole.
This includes customer databases and other processing covering a large part of the population, such as in energy, transport, banking, large-scale dematerialised public services, or health data processing. Risk scenarios may include attacks by organised crime groups or “supply chain attacks”, likely to take place over a long period; the compromise of third-party service providers responsible for IT development, maintenance or support operations; the exploitation of unknown vulnerabilities in software or hardware components; and the compromise of persons authorised to access the processing.
Email security guidance: Guidance from the UK Information Commissioner explains what organisations should, and could, do to keep email communications secure, including several case studies and a checklist. Even if the content of an email contains nothing sensitive, revealing who received it could disclose sensitive or confidential information about them. In brief:
- You must assess what technical and organisational security measures are appropriate to protect personal information when sending bulk emails.
- You should train staff about security measures when sending bulk communications.
- You should include in your assessment consideration of whether using secure methods, such as bulk email services or mail merge services, is more appropriate, rather than just relying on a process that uses Blind Carbon Copy.
- If you are only sending an email to a small number of recipients, you could consider sending each one separately, rather than one bulk email.
OpenAI for organisations: OpenAI is offering its most powerful version of ChatGPT to enterprises. It has longer context windows for processing longer inputs, advanced data analysis capabilities, customisation options and more. According to the company, 80 per cent of Fortune 500 companies (the largest US corporations) have registered ChatGPT accounts, as determined by accounts associated with corporate email domains. Businesses have expressed concerns about privacy and security, fearing that their data may be used to train ChatGPT and that the application could mistakenly reveal sensitive customer information. According to OpenAI, ChatGPT Enterprise users will have full rights and ownership over their data, which will not be used for model training.
‘Algorithmic disgorgement’: At the same time, the US Federal Trade Commission reminds companies of certain obligations when using generative AI. When offering a generative AI product, companies need to inform customers whether, and to what extent, the AI training data includes copyrighted or otherwise protected material. Companies should not try to “fool people” into thinking that AI-generated works were created by humans, and must ensure that customers understand the material terms and conditions associated with digital products. The regulator also noted that unilaterally changing terms or undermining reasonable ownership expectations can be problematic. Finally, in its enforcement of data protection regulations, the Commission has lately begun to compel “algorithmic disgorgement” – the destruction not just of the illegally obtained data itself, but also of the artificial intelligence models and algorithms constructed using that data.