Data protection digest 3-16 Feb 2024: Sneakily changing terms of service and privacy policy won’t help your business

In this issue: America’s FTC warns against retroactively changing terms of service or privacy policies. Also in focus are Palantir running the NHS’s new data platform in the UK, envisaged changes to the EU GDPR enforcement framework, and new dispute resolution mechanisms.

Sign up to receive our fortnightly digest via email.

Terms of Service and User Privacy

America’s FTC warns AI developers and other companies that quietly changing terms of service could be unfair or deceptive. Businesses creating AI products have strong financial incentives to use consumer data as fuel for their systems, but many also have established policies in place to safeguard users’ privacy. A business that collects user data based on one set of privacy commitments cannot then unilaterally renege on those commitments after collecting the data. Some companies may attempt such changes covertly, informing users only through retroactive amendments to their terms of service or privacy policy (e.g. to use that data for AI training).

Last summer, the FTC alleged that a genetic testing company violated the law when it changed its privacy policy to retroactively expand the kinds of third parties with which it could share consumers’ sensitive data (adding supermarket chains and nutrition and supplement manufacturers), without notifying consumers who had previously shared personal data or obtaining their consent. According to the complaints, the company also did not encrypt that data, restrict access to it, log or monitor access to it, or inventory it. Despite promising users that its security practices would exceed industry standards, it stored thousands of health reports and raw genetic data, sometimes accompanied by a first name, in publicly accessible “buckets” on a cloud storage service.

Other official guidance

Employment data: The Italian privacy regulator launched a Code of Conduct for employment agencies. Agencies that adhere to the code undertake to process only data strictly necessary for establishing the employment relationship. They must therefore not investigate jobseekers’ political, religious or trade union opinions, or carry out pre-selections based on information regarding marital status, pregnancy or disability, even if candidates have given their consent.

Agencies must not obtain information by consulting social profiles intended for interpersonal communication; online information can be collected only if it is made available on professional social channels. Furthermore, employment agencies may not acquire a candidate’s professional references from previous employers and communicate them to their clients without “prior explicit authorization from the candidate”.

Camera systems: The Czech data protection authority has published a new methodology (in Czech) for the design and operation of camera systems. It covers both camera systems that record (including security cameras) and camera systems operating in online mode, sets out minimum technical and organisational measures for them, and describes use cases. The methodology is not a legally binding document, and personal data controllers remain obliged to comply with the GDPR and EDPB Guidelines No. 3/2019.

New procedures for GDPR enforcement

MEPs have adopted a draft position laying down additional procedural rules for enforcing the GDPR. It deals with the GDPR’s cooperation and dispute resolution mechanisms and introduces deadlines for cross-border procedures and disputes. Amicable settlements should require the parties’ explicit consent and should not prevent a supervisory authority from starting an own-initiative investigation into the matter. The MEPs’ position also ensures that all parties to complaint procedures have the right to effective judicial remedies, for example when a regulator fails to take necessary action or to comply with deadlines.

Digital Services Act is now fully applicable 

The DSA has applied to online platforms and search engines with more than 45 million users in the EU since 25 August 2023. From 17 February, it also applies to smaller platforms and online intermediaries (for goods, content or services) on the European market. Its main goal is to prevent illegal and harmful activities online and the spread of disinformation. For instance, if you complain about what you suspect is illegal content, the service provider must handle the matter and inform you of the outcome.

Compliance will be supervised by specialised agencies in the Member States, with certain obligations overseen by consumer protection and data protection authorities. To avoid disproportionate burdens, small companies (with fewer than 50 employees and an annual turnover below EUR 10 million) and micro-enterprises are exempted from various measures (transparency reports, internal complaint-handling systems, etc.). More details on the enforcement framework under the DSA are here.

More legal updates

Main establishment in the EU: The EDPB clarified the notion of the main establishment under the GDPR rules. A controller’s “place of central administration” in the EU can be considered as a main establishment under Art. 4(16)(a) GDPR only if: 

  • it makes the decisions on the purposes and means of the processing of personal data, and
  • it has the power to have such decisions implemented.

Furthermore, the One-Stop-Shop mechanism can only apply if there is evidence that one of the establishments of the controller in the Union takes decisions on the purposes and means for the relevant processing operations and has the power to have these decisions implemented. This means that, when the decisions on the purposes and means of the processing are taken outside of the EU, there is considered to be no main establishment of the controller in the Union, and therefore the One-Stop-Shop should not apply.

CPRA enforcement: California’s Third District Court of Appeal held that the California Privacy Protection Agency’s authority to enforce its amended privacy regulations should have been effective on July 1, 2023. The decision restores the CPPA’s authority and overturns a lower court ruling. The agency has been vigorously enforcing the statutory rights approved by Californians – Proposition 24, the California Privacy Rights Act of 2020 (CPRA). Some of the new and amended regulations implementing the CPRA, which largely define and clarify how businesses must honour those rights, were previously deemed unenforceable by the lower court.

Video gaming and children’s data

The ICO has carried out an age-appropriate design code audit of Gameforge’s processing of UK children’s data. The majority of its games are rated as suitable for children aged 0-12 years. Gameforge does not collect any user data to confirm ages or identify child users; instead it has chosen to apply safeguards to all users, pseudonymising all user account data and avoiding higher-risk processing activities such as location tracking or profiling. Gameforge does not use personal data to promote or market third-party products or services, and its online services do not include any third-party advertising.
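Pseudonymising account data, as Gameforge applies to all users, can be sketched as a keyed one-way mapping from a real identifier to a stable token: the same account always yields the same token, but the mapping cannot be reversed without the key. The function and key below are illustrative assumptions, not Gameforge’s actual implementation:

```python
import hmac
import hashlib

def pseudonymise(user_id: str, secret_key: bytes) -> str:
    """Map a real identifier to a stable, non-reversible token.

    A keyed hash (HMAC-SHA256) means the token is consistent for the
    same user, yet recovering the original identifier requires the key,
    which is held separately -- the core idea of GDPR pseudonymisation.
    """
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()
```

Analytics and support workflows can then operate on tokens, while the key stays under separate access controls so re-identification remains a deliberate, auditable step.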

As notably good practice, the ICO underlined the high level of qualification and involvement of the data protection team; in particular, Gameforge has made two DPO-certified staff members key signatories for the company accounts and for new or changed contracts. Opportunities for improvement were also identified, such as a clearer privacy policy and a DPIA that records consultation with, and feedback and approval from, key stakeholders. Gameforge should also assess and document the potential ages of its users, which can be done non-intrusively using anonymous or aggregated data such as market research.

Cookie-banners supervision

The Dutch regulator promised to intensify its checks of websites and explained, once more, how organisations should set up cookie banners to request permission properly:

  • provide clear information about the purpose;
  • do not pre-tick checkboxes;
  • offer all choices in the first layer (don’t hide certain choices or force extra clicks);
  • do not hide the refusal option behind a discreet link in the text;
  • be clear about how consent can be withdrawn;
  • choose the legal basis carefully (do not confuse consent with legitimate interest).

The Bavarian data protection authority meanwhile checked the cookie banners of hundreds of websites and apps and found numerous violations; around 350 website operators now have to change their pages. The regulator has developed a tool that automatically checks websites to see whether, in addition to the “Accept all” option, there is an equivalent option for not granting consent. The test is initially based on one very common consent management platform (CMP) but will be expanded to other CMP providers, and thus an even larger number of websites, in future iterations.
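The internals of the Bavarian authority’s tool are not public. A toy version of the core check it performs (does the banner’s first layer offer refusal on a par with “Accept all”?) might look like the following, assuming the banner’s button labels have already been extracted from the page; the label patterns are illustrative:

```python
# Label fragments that typically signal a first-layer refusal option
# (illustrative list, English and German).
REJECT_PATTERNS = ("reject all", "decline", "refuse",
                   "alle ablehnen", "only necessary", "nur notwendige")

def has_equivalent_reject(first_layer_buttons: list[str]) -> bool:
    """Return True if the first layer pairs an accept option with an
    equivalent reject option, as the Bavarian DPA's check requires."""
    labels = [b.strip().lower() for b in first_layer_buttons]
    has_accept = any("accept" in l or "akzeptieren" in l for l in labels)
    has_reject = any(any(p in l for p in REJECT_PATTERNS) for l in labels)
    return has_accept and has_reject
```

A real scanner would additionally need to drive a browser, recognise the CMP’s markup, and compare the visual prominence of the two options, not just their presence.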


Enforcement decisions

Data storage periods: The French CNIL fined the publisher of the pap.fr website, which lets individuals view and publish real estate ads, 100,000 euros. The company had set a ten-year retention period for customer accounts using the site’s paid services, contrary to the consumer code on which it relied, and informed individuals through an incomplete and unclear privacy policy. Its password complexity rules were insufficiently robust, passwords and related data were stored unencrypted, and all data relating to inactive user accounts was kept unsorted.

Online dating site: The Italian data protection authority has fined the operator of a well-known online dating site 200,000 euros over violations involving the personal data of about 1 million members. Registration on the platform, which has about 5 million members worldwide, required entering numerous data points (meeting interest, country, region, city of residence, date of birth, email) and photos, which customers uploaded to their public profile or a reserved area, without being given adequate information on how that data would be used. The privacy notice also contained no indication of the possibility for data subjects to exercise their rights under privacy legislation.

The owner of the site had no specific privacy policy on data storage, limiting itself to ad-hoc deletion of accounts that were no longer active, together with the information they contained, and of unsuccessful registration requests. Finally, although required to do so, the company had not drawn up a register of processing activities, had not appointed a DPO, and had not prepared an impact assessment (DPIA).

Viamedis and Almerys data breach

The French CNIL is conducting investigations into a data breach affecting Viamedis and Almerys, operators managing third-party payment for numerous complementary health insurance and mutual insurance companies. More than 33 million people are affected. The compromised data includes civil status, date of birth, social security number, and the name of the health insurer. Banking information, medical data, health reimbursements, postal addresses, telephone numbers and email addresses were not affected by the breach.

Shoplifter identity

The Dutch data protection authority has granted 500 permits for a collective shopping ban. Shopkeepers with such a permit can warn each other in a defined area about shoplifters and people who cause nuisance, sharing their names and photos. Shopkeepers may only share such a ‘blacklist’ with each other under strict conditions. For example, someone from the police, the municipality or the public prosecution service must always be involved.

Big Data

UK health care data: The Good Law Project NGO has raised concerns about the lack of transparency in the contract allowing Palantir to run the NHS’s new system, the Federated Data Platform, and has now taken legal action to challenge the NHS’s data governance. Despite the massive scale of redactions in Palantir’s 500+ page contract, the NGO says the public bodies have given no reasons for the secrecy. The NHS has also signed a contract with the biotech IQVIA to provide “Privacy Enhancing Technology” for the platform; around three-quarters of that contract is also completely redacted, including a section on personal data protection.

Pupil surveillance: Privacy International reports that some UK schools have bought and installed sensors in toilets that “actively listen” to pupils’ conversations to try to detect spoken keywords. The sensors do not record or save any conversations but send alerts to staff when triggered. Some schools are also pairing them with surveillance cameras, so that when a vaping sensor is triggered the cameras capture students leaving the bathrooms.

Ulez fines: Italy is investigating allegations that Italian police accessed thousands of EU drivers’ data and shared it with firms collecting fines on behalf of Transport for London (TfL). Other Member States have also claimed that an unnamed police department abused its authority by providing personal information about EU drivers to Euro Parking Collections, the company TfL uses to levy fines enforcing low and ultra-low emission zones (Ulez). Because national regulations permit the UK to access EU individuals’ data only for criminal offences, and breaking Ulez rules is considered a civil violation, the fines are believed to have been unlawfully levied since Brexit.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
