Data protection & privacy digest 4 – 17 Feb 2023: synthetic data for fintech, MS Excel guide, Palantir technology ban

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress: synthetic data for fintech, draft Data Act, DPO dismissals

The UK Financial Conduct Authority (FCA) issued a statement on synthetic data for beneficial innovation in UK financial markets. It pointed to fraud and anti-money laundering as a key use case for synthetic data, in part due to its ability to augment rare patterns of behavior in a dataset. Whilst data protection legislation places conditions on such data processing, the FCA emphasizes that data sharing between different entities (eg, access to real datasets, as well as to synthetic transactional datasets with embedded fraud typologies) is possible under the current regulatory framework if at least one lawful basis is met, accompanied by built-in privacy by design, data protection impact assessments, data sharing agreements, and other legal requirements.

The European Parliament adopted the draft Data Act – new rules for fair access to and use of industrial data. The act would contribute to the development of new services, in particular in the AI sector, where huge amounts of data are needed for algorithm training. It could also lead to better prices for after-sales services and repairs of connected devices. When companies draft their data-sharing contracts, the law would rebalance negotiating power in favour of SMEs by shielding them from unfair contractual terms imposed by companies in a significantly stronger bargaining position. Finally, the proposed act would facilitate switching between providers of cloud and other data processing services, and introduce safeguards against unlawful international data transfers by cloud service providers.

The CJEU rendered two decisions on the procedures for dismissing data protection officers and their potential conflicts of interest (under the German Federal Data Protection Law), insideprivacy.com reports. In the relevant cases, the DPO also handled other organisational duties in a professional capacity. The data controllers argued that because those positions were incompatible (chair of the works council in one of the cases), the DPO’s dismissal was appropriate. The dismissed DPOs brought legal actions that ended up before the EU’s top court.

However, the CJEU determined that, as long as national laws do not undermine the goals set for DPOs under the GDPR, EU member states may require that DPOs be dismissed only for “just cause”. It is also for the national courts to decide whether a conflict of interest existed, taking into account “all the relevant circumstances, in particular the organisational structure of the controller or its processor and in light of all the applicable rules, including any policies of the controller or its processor.”

Official guidance: MS Excel, research projects, free data protection tool, game developers

Bavaria’s data protection authority explains how to avoid data breaches when using Microsoft Excel. Many users approach the program intuitively; contrary to its primary purpose, Excel is often used simply because Word does not offer enough columns. However, if an Excel workbook contains personal data, improper handling of the application can easily trigger a data breach. Excel workbooks can contain multiple worksheets (the number is limited only by the available memory), even if you don’t regularly work with such “multi-sheet” workbooks yourself. Be especially careful with Excel files created by others, as workbooks can contain invisible worksheets, as well as hidden columns, rows or even individual cells, comments, and metadata. It is worth remembering:

  • before sharing an Excel workbook with personal information, especially before attaching it to an email, make sure that you really want to share everything;
  • consider whether the recipient actually needs to process the file further; if not, send a PDF version instead, which can be checked for hidden data before sending;
  • if possible, consistently delete the worksheets that are no longer required;
  • before creating a new workbook with multiple worksheets, consider whether you can complete the task with multiple single-sheet workbooks;
  • consider whether you need Excel for the task to be completed or whether a “simple” resource, (eg, a word processing program), will suffice.

If handled carelessly, an Excel data breach can trigger the reporting obligation under Art. 33 GDPR and the notification obligation under Art. 34 GDPR.
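The hidden-worksheet risk above can also be checked programmatically before a workbook leaves the organisation. An .xlsx file is simply a ZIP archive whose xl/workbook.xml part records each sheet’s visibility state, so a standard-library Python sketch (the function name is our own; libraries such as openpyxl expose the same information via a worksheet’s sheet_state attribute) can flag sheets marked “hidden” or “veryHidden”:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# SpreadsheetML namespace used inside xl/workbook.xml
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def hidden_sheets(xlsx_file):
    """Return the names of worksheets whose state is 'hidden' or 'veryHidden'.

    Accepts a path or a file-like object pointing at an .xlsx file.
    """
    with zipfile.ZipFile(xlsx_file) as zf:
        root = ET.fromstring(zf.read("xl/workbook.xml"))
    return [
        sheet.get("name")
        for sheet in root.iter(f"{NS}sheet")
        if sheet.get("state") in ("hidden", "veryHidden")
    ]
```

Note this only covers hidden worksheets; hidden rows, columns, comments, and document metadata live in other parts of the archive and need separate checks.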

Meanwhile, the Danish data protection authority has amended rules for deleting personal data at the end of research projects. Data controllers may have a legitimate need to process information for a period after the end of the investigation, (eg, for the purposes of peer review or countering accusations of scientific misconduct), so data should not always be deleted, anonymised, destroyed or returned at the end of a research project. Personal data can be transferred for storage in an archive in accordance with the rules in archive legislation. In addition, in some research areas, work is done with ongoing coverage of research fields, and building of relationships or data material, where it is not meaningful to talk about a project being “finished”. 

The Finnish data protection authority is promoting its data protection tool, available as open source code, to increase the data protection expertise of SMEs; the tool is available in English. With the initial-level test, respondents can first check how well they have mastered the basics of data protection regulation. The role-mapping test helps respondents define what role the company plays in the processing of personal data; each role also has its own tests. The tool’s source code and content are free to use, whether to develop a company- or industry-specific privacy tool, to produce new language versions, or even in commercial applications.

Finally, the UK Information Commissioner’s Office offers new guidance to game developers on protecting minors. The recommendations are based on experiences and findings from a series of voluntary audits (eg, of Yubo, Facepunch) of game developers, studios and publishers within the gaming industry:

  • The age range of the players and the different needs of children at different ages and stages of development should be at the heart of how you design your games. 
  • Designing games to promote meaningful parent/guardian–child interactions, while setting a high level of privacy by default and appropriate parental controls, is key.
  • It is important to only process children’s personal data in ways that are not detrimental to their health or wellbeing. 
  • It is crucial that games do not use nudge techniques to lead children to make poor privacy decisions.
  • Poor privacy information design obscures risks, undermines good player experiences, and sows mistrust between children, parents, and game providers.

Investigations and enforcement actions: employee emails monitoring, failed data subject requests at a sports center, HBNR and BIPA violations in the US, student data management

In Austria, the data protection authority found an employer’s monitoring of employee emails unlawful. Several complainants argued that the company, without their consent or knowledge, checked the technical mail server logs of all 6,000 employees for a specific recipient domain. The reason for this control measure was a suspected breach of trade secrets. The authority concluded that the measure, which took place only six months after the incident that prompted it, was not proportionate, given the lack of a temporal connection and the measure’s loss of topicality. In addition, there was no valid consent from the works council.

The Norwegian data protection authority confirmed its fine of over 900,000 euros against Sats for breaches of several GDPR provisions. The complaints related to the company’s failure to comply with customers’ requests for access and deletion. Furthermore, the fitness centre chain lacked a legal basis for processing data about customers’ training history. Sats is the Nordic region’s largest fitness centre chain and has its head office in Norway, so the Norwegian regulator dealt with the case in collaboration with other supervisory authorities under the so-called one-stop-shop mechanism.

In the US, the Illinois Supreme Court ruled that fast food chain White Castle System must face claims that it repeatedly scanned the fingerprints of nearly 9,500 employees without their consent (to grant access to a company computer system), which the company says could cost it more than 17 billion dollars. The Illinois Biometric Information Privacy Act (BIPA) imposes penalties of 1,000 dollars per negligent violation and 5,000 dollars per reckless or intentional violation. The law requires companies to obtain permission before collecting fingerprints, retinal scans, and other biometric information from workers and consumers.
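The eye-catching exposure figure follows from simple per-scan accrual: the court held that a separate claim accrues with each scan, not just the first. A back-of-the-envelope sketch, using an illustrative 360 scans per employee (our assumption for the example, not a figure from the case record):

```python
EMPLOYEES = 9_500            # approximate class size reported in the case
NEGLIGENT_PENALTY = 1_000    # dollars per negligent violation under BIPA
RECKLESS_PENALTY = 5_000     # dollars per reckless/intentional violation

def exposure(scans_per_employee: int, penalty: int) -> int:
    """Total statutory exposure if every individual scan is a separate violation."""
    return EMPLOYEES * scans_per_employee * penalty

# At the higher rate, 360 scans per employee already exceeds the
# 17-billion-dollar figure cited by the company:
print(exposure(360, RECKLESS_PENALTY))  # 17100000000
```

The point of the illustration is that under per-scan accrual the multiplier is the number of scans, so routine daily use of a fingerprint time clock compounds quickly.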

Also in the US, the Federal Trade Commission has taken enforcement action for the first time under its Health Breach Notification (HBN) Rule, against the telehealth and prescription drug discount provider GoodRx Holdings, for failing to notify consumers and others of its unauthorized disclosures of consumers’ personal health information to Facebook, Google, and other companies. The company collects personal and health information about its users, including information from users themselves and from pharmacy benefit managers confirming when a consumer purchases a medication using a GoodRx coupon.

Since 2021, US health apps and smart products that collect or use consumers’ health information must comply with the HBN Rule. It ensures that entities not covered by the Health Insurance Portability and Accountability Act (HIPAA) face accountability when consumers’ sensitive health information is breached. In the above case, GoodRx also displayed a seal at the bottom of its telehealth services homepage falsely suggesting to consumers that it complied with HIPAA.

The French privacy regulator CNIL gave formal notice to two higher education institutions to comply with the GDPR concerning files used for administrative and pedagogical management. Areas of non-compliance include data retention period, student information, use of subcontractors, and data security:

  • they had not set a precise retention period for all processing of students’ personal data, nor provided for a purge and archiving system;
  • they did not properly inform students about the collection of their data via the various forms filled out during their schooling;
  • they were not able to send the CNIL the duly signed data processing agreements with subcontractors;
  • they had no password policy to guarantee a minimum level of security in this area.

Data security: messaging apps

Privacy International issued a guide on communicating with others via messaging apps. There are two main aspects to consider: a) whether the app offers end-to-end encryption that protects the content of your communication; and b) whether it collects any information beyond the content of the message, such as location, who you communicate with, and other details referred to as ‘metadata’. For sensitive conversations, it may be sensible to use disappearing messages if your app offers them (although it is unclear whether self-destructing messages remain recoverable by mobile phone extraction technology).

End-to-end encrypted (E2EE) messaging should always be preferred over text messages, which are completely unencrypted, meaning they can easily be read, manipulated in transit, or spoofed. They may also be stored by your telecommunications provider, which may be subject to access requests from governments and law enforcement. For example, Signal uses E2EE not only to encrypt the contents of messages but also to obscure metadata even from itself. In contrast, both WhatsApp and Telegram store, and can access, IP addresses, profile photos, “social graphs”, and more.
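The content-versus-metadata distinction above can be made concrete with a small sketch. The field names below are hypothetical, chosen only to illustrate a typical message envelope: even when the payload is end-to-end encrypted and therefore opaque, everything else in the envelope remains observable to the relaying server.

```python
from dataclasses import dataclass

@dataclass
class Envelope:
    """A hypothetical message envelope as seen by a relay server.

    Under E2EE, only `ciphertext` is unreadable; the rest is metadata.
    """
    sender: str
    recipient: str
    timestamp: float
    sender_ip: str
    ciphertext: bytes  # opaque to the server: only the endpoints hold the keys

def metadata_visible_to_server(env: Envelope) -> dict:
    """Everything a server can log without breaking the encryption."""
    return {
        "sender": env.sender,
        "recipient": env.recipient,
        "timestamp": env.timestamp,
        "sender_ip": env.sender_ip,
        "size_bytes": len(env.ciphertext),  # even message length leaks
    }
```

The takeaway matches Privacy International’s advice: an app’s encryption guarantees say nothing about what it does with these surrounding fields, which is why metadata collection practices deserve separate scrutiny.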

Big Tech: Palantir technology ban in Germany, more TikTok data centers in Europe

A top German court ruled against the use of software developed by Palantir Technologies, saying that police use of automated data analysis to prevent crime in some German states was unconstitutional, as it infringes on the right to informational self-determination. The US company’s technology has so far been employed, among other things, to investigate the criminal organisation accused of plotting to overthrow the German government in December, Reuters reports. Palantir says it only offers software for processing data. However, the German Society for Civil Rights, which brought the lawsuit, claimed the software used data from innocent people to form suspicions and could produce errors.

TikTok plans to open two more data centers in Europe (Ireland), hoping to lessen regulatory pressure on the business. Data migration for TikTok users in Europe will start this year and last until 2024. TikTok hasn’t been subject to the same hefty fines as Google and Meta in the EU. It is now attempting to reassure governments and privacy regulators that users’ personal information cannot be accessed, and its content cannot be altered, by the Chinese government or anyone else working for Beijing.

The company also reported an average of 125 million monthly active users in the EU under the brand-new online content rules known as the Digital Services Act. For comparison, Twitter says it has 100.9 million. Alphabet reports 278.6 million on Google Maps, 274.6 million on Google Play, 332 million on Google Search, 74.9 million on Shopping, and 401.7 million on YouTube. Meta Platforms claims 255 million on Facebook and about 250 million on Instagram.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
