
Data protection digest 18 Jan – 2 Feb 2024: social media industry grilled over child safety and mental health

Child safety online was the subject of a sometimes heated US Congressional hearing, forcing CEOs of the biggest American social media giants to apologise to parents of victims. While legislators are struggling to find a legal solution to the crisis, police are finding evidence of children as young as seven being at risk of harm.

Sign up to receive our fortnightly digest via email.

Children at risk

Last week, the CEOs of Meta, X, TikTok, Snap and Discord were questioned before the US Congress over alleged harms to young users on their platforms – access to drugs and subsequent overdoses, harassment, grooming and trafficking – leading in some cases to death. Legislators stated that the industry, in its constant pursuit of engagement and profit, had failed to invest adequately in trust and child safety. Executives highlighted the controls and tools they have introduced to mitigate harm.

US legislators are pushing forward legal solutions to the crisis through the much-debated Kids Online Safety Act and anti-CSAM legislation, as well as changes to the COPPA rule. Meanwhile in neighbouring Canada, similar measures have just come into force in the province of British Columbia.

In the EU, the LIBE Committee adopted a draft Parliament position at the end of last year, which now awaits the next steps in the legislative process. Privacy regulators, meanwhile, warn about present risks to children and their personal information online. For instance, the Guernsey data protection authority recently identified a local Snapchat group including children as young as seven, in which members were possibly being encouraged to share explicit images of themselves. The police now advise parents:

  • to have conversations with their children about the reputational and long-term risks of sharing personal information via such networks, and
  • to ensure children are not using social networks or apps if they are under the minimum age for those services (13 for Snapchat).

In the UK, the Information Commissioner’s Office also created a toolkit of free resources to promote responsible data sharing that safeguards children, and updated its age-assurance opinion, an important part of its world-leading Children’s code, to reflect developments over the past two years. A similar age-assurance design code was passed into law in California in 2022.

Legal updates

Draft AI Act: The draft legislation received unanimous endorsement from all 27 European Union member states. Negotiations over the shape of the law concluded last December, with the main focus on safeguards for foundation models and the use of facial recognition software. According to Euractiv’s analysis, the primary opponent of the political agreement was France, which, together with Germany and Italy, had asked for a lighter regulatory regime for the powerful AI models that underpin general-purpose AI systems, in order to protect domestic start-ups. Nonetheless, the Parliament insisted on strict rules for these models. Parliament will hold its final vote on the law in April.

German employee data protection: DLA Piper’s legal analysis looks at the data protection provisions relating to employees and other workers in Germany. The area is currently governed largely by case law, and national legislators have been very cautious about using Art. 88 of the GDPR, the opening clause that allows member states to adopt more specific provisions on data protection in the employment context. More problematically, the relevant provision of the Federal Data Protection Act (BDSG), as clarified by the CJEU last year, does not meet the conditions set out in the GDPR. Read more on the envisaged Single Employee Data Protection Act in Germany in the original analysis.

Automated decisions

The Isle of Man data protection commissioner reminds the public of Art. 22 of the GDPR, which gives individuals the right not to be subject to a decision based solely on automated processing, including profiling, that produces legal effects concerning them or similarly significantly affects them. Such methods may be used only: a) with the explicit consent of the individual; b) if necessary for entering into, or performing, a contract between the individual and the data controller; or c) where authorised by law. The controller must also have safeguards in place allowing individuals to obtain human intervention in the decision, to express their point of view and, in certain cases, to contest it.
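
Purely as an illustration, and not part of the commissioner’s guidance, the rule can be read as a small decision procedure. The Python sketch below models it with hypothetical names and deliberately simplifies (for example, where processing is authorised by law, the safeguards must come from that law itself):

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """Hypothetical model of a solely automated decision (illustrative only)."""
    has_legal_or_similar_effect: bool
    explicit_consent: bool           # ground (a)
    necessary_for_contract: bool     # ground (b)
    authorised_by_law: bool          # ground (c)
    human_intervention_available: bool
    can_express_view_and_contest: bool

def art22_permits(d: AutomatedDecision) -> bool:
    # Decisions without legal or similarly significant effects fall
    # outside the scope of Art. 22 altogether.
    if not d.has_legal_or_similar_effect:
        return True
    # At least one of the three grounds must apply...
    ground = d.explicit_consent or d.necessary_for_contract or d.authorised_by_law
    # ...and safeguards must be in place.
    safeguards = d.human_intervention_available and d.can_express_view_and_contest
    return ground and safeguards

# Example: automated credit scoring based on contract necessity, with safeguards.
decision = AutomatedDecision(True, False, True, False, True, True)
print(art22_permits(decision))  # True
```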

AI checklist

The Bavarian data protection authority for the private sector published a draft ‘Data Protection and AI’ checklist (in German). In addition to establishing a legal basis for the creation of AI models and for the operation and use of AI applications, organisations must implement data subjects’ rights and the other compliance requirements of the GDPR. The data protection risk model must be documented and regularly checked to ensure it remains up to date and complete. Where necessary, the test points (see them here) can be reviewed by the data protection officer as part of control activities.

Software for schools


The Danish supervisory authority has investigated the use of Google Workspace in Danish schools in 53 municipalities. The report finds that the municipalities have had no legal basis to pass students’ data to Google for the development and performance measurement of its services, ChromeOS and the Chrome browser. The data protection authority also reminds the municipalities that they should have established how Google processes the transmitted personal data before implementing the tools. Municipalities must now bring the processing in line with the rules, which in practice requires one of the following:

  • municipalities no longer pass on personal data to Google for these purposes, which will likely require Google to develop a technical option for intercepting the data streams in question;
  • Google itself refrains from processing the information for these purposes; or
  • the Danish Parliament provides a sufficiently clear legal basis for disclosure for these purposes.

A similar investigation into the use of Google’s teaching platform in schools was conducted in Finland in 2021. The decision does not prohibit the use of the educational platform but states that a legal basis must be defined for the processing of students’ data in Google services.

Purpose limitation

How do you comply with the principle of purpose limitation? The Latvian data protection authority explains that when your data is transferred to someone else, this is usually done in the confidence that the data will be used for a specific purpose that you clearly understand. The principle of purpose limitation is closely related to other principles established in the GDPR, such as transparency: only by knowing the specific purpose of data processing can a person understand what to expect from the processing of their data.

Likewise, determining the exact purpose is tied to the principles of data minimisation and storage limitation, because the purpose determines how much data is needed to achieve it and how long the data needs to be stored. The principle is also connected to lawfulness, because an appropriate legal basis can only be established for data that is intended to achieve a clearly defined purpose. Before processing data for a different purpose, the controller must first assess whether that purpose is compatible with the initial processing, considering the following aspects:

  • the connection between the purposes;
  • the context in which the data was collected;
  • the nature of the data;
  • the consequences that further processing would have for the data subject;
  • the existence of adequate safeguards in both the initial and the intended further processing operations.


EDPB documentation

The EDPB published a One-Stop-Shop case digest on Security of Processing and Data Breach Notification. The relevant decisions were initially filtered using Art. 32 of the GDPR (security of processing) as the main legal reference. This article obliges both data controllers and data processors to implement “appropriate technical and organisational measures to ensure a level of security appropriate to the risk”. The analysis of the decisions provides insight into how regulators interpret these obligations in concrete situations, such as protecting organisations against hacking, ensuring meaningful and robust encryption, and enforcing strong passwords.

The EDPB has also launched a website auditing tool that can help analyse whether websites comply with the law. It can be used both by legal and technical auditors at data protection authorities and by controllers and processors who wish to test their own websites. The tool is Free and Open Source Software under the EUPL 1.2 licence, and the source code is available for download on code.europa.eu.

Enforcement decisions

Prospect data: The French CNIL fined TAGADAMEDIA (which runs online competition and product-testing websites) 75,000 euros. The prospect data the company collects is sent to its partners for commercial prospecting, but the prospect questionnaire did not allow free, informed and unambiguous consent to be obtained: the button allowing users to give consent was prominently highlighted, while the option to refuse appeared as incomplete text in a reduced size, alongside strong encouragement for users to agree to the transmission of their data to partners.

Insurance companies: An administrative court in Finland upheld the data protection commissioner’s decisions on the handling of health data by insurance companies. In some situations, insurance companies request personal health information directly from healthcare providers. However, such requests must be identified and precisely defined: only the information necessary for assessing the insurance company’s liability, and only for the relevant period, may be requested. In addition, an insurance applicant’s data from health services may not be processed before the contract is concluded.

Intrusive scientific research: The Italian regulator sanctioned a municipality for conducting two scientific studies using cameras, microphones and social networks. The projects, financed with European funds, aimed to develop technological solutions to improve safety in urban areas. They involved footage from video surveillance cameras already installed in the municipal area, as well as audio obtained from microphones specifically placed on the street. One of the projects also analysed hateful messages and comments published on social media, detecting negative emotions and processing information of interest to the police. The municipality could not demonstrate any legal framework for the processing: the data was unlawfully shared with third parties and partners. Furthermore, the anonymisation techniques proved insufficient.

Data breaches

Undetected attacker: The US FTC’s proposed action against Blackbaud alleges that the company’s failure to implement basic safeguards resulted in the theft of highly sensitive data about millions of consumers, including Social Security numbers and bank account information. South Carolina-based Blackbaud provides a wide variety of data, fundraising and financial services to more than 45,000 companies, including nonprofits, foundations, educational institutions and healthcare organisations.

In 2020, an attacker purportedly used a Blackbaud customer’s login and password to access certain Blackbaud databases, and rummaged around undetected for three months until Blackbaud finally spotted a suspicious login on a backup server. By then, the attacker had stolen data from tens of thousands of Blackbaud’s customers, compromising the personal information of millions of consumers. Blackbaud eventually agreed to pay 24 Bitcoin (valued at about 250,000 dollars) in exchange for the attacker’s promise to delete the stolen data, but it has not been able to verify that the attacker followed through.

Data processor supervision: The Danish data protection authority reported Capio A/S to the police for failing to supervise its data processors. The private hospital may face a fine of approximately 200,000 euros. In particular, over several years the hospital was unable to ensure and demonstrate that personal data was processed for lawful and reasonable purposes and in a way that ensured sufficient security for the sensitive personal data of the large number of data subjects in question.

Data security

TOMs: The Swiss data protection authority has revised its guide on technical and organisational security measures (in English). The guide is primarily intended for people in charge of information systems, whether technicians or not, who deal directly with the management of personal data.

Cloud: The French CNIL published factsheets on encryption and data security (in French). They offer a detailed analysis of the different types of encryption applied to a cloud computing service: encryption at rest, in transit and in use, as well as end-to-end encryption. The guide also covers various tools for securing cloud services (anti-DDoS, WAF, CDN, load balancer) and key points of vigilance.
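
As a toy illustration of the first of these, encryption at rest means data is stored only as ciphertext and decrypted when actually needed. Below is a minimal sketch using the widely used Python cryptography library; the file name and data are invented, and in a real cloud setting the key would live in a KMS or HSM, not beside the data:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Key management is the hard part in practice; the key is generated
# inline here purely for illustration.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt before writing to storage ("encryption at rest").
ciphertext = fernet.encrypt(b"special category data: patient record 42")
with open("record.enc", "wb") as f:
    f.write(ciphertext)

# Decrypt only when the data is actually needed.
with open("record.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
print(plaintext.decode())
```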

Login: What to do if you detect a credential-stuffing attack? The Lithuanian data protection authority recommends responding quickly and proactively:

  • determining whether the attacker managed to use the compromised credentials,
  • blocking potential malicious activity,
  • notifying users of the attack and encouraging them to change their passwords,
  • notifying the regulator of the personal data breach that has occurred,
  • conducting a thorough incident investigation and implementing additional security measures to prevent similar attacks in the future (2FA, automatic attack detection systems, a password policy).

Finally, if the attack is systemic or involves multiple platforms, it is recommended to collaborate with other data controllers in analysing the incident.
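
To make the automatic-attack-detection point concrete: a classic credential-stuffing signature is a burst of failed logins against many distinct accounts from a single source. The sketch below is not from the regulator’s guidance; the thresholds and names are hypothetical:

```python
from collections import defaultdict, deque
import time

# Hypothetical thresholds, purely for illustration.
WINDOW_SECONDS = 60
MAX_FAILURES = 20
MAX_DISTINCT_USERS = 10

failed_logins = defaultdict(deque)  # source IP -> deque of (timestamp, username)

def record_failed_login(ip: str, username: str, now: float | None = None) -> bool:
    """Record a failed login; return True if the IP now looks like a
    credential-stuffing source under the thresholds above."""
    now = time.time() if now is None else now
    events = failed_logins[ip]
    events.append((now, username))
    # Drop events that have fallen out of the sliding window.
    while events and now - events[0][0] > WINDOW_SECONDS:
        events.popleft()
    distinct_users = {user for _, user in events}
    return len(events) >= MAX_FAILURES or len(distinct_users) >= MAX_DISTINCT_USERS

# Example: a burst of failures against different accounts from one IP.
for i in range(12):
    flagged = record_failed_login("203.0.113.7", f"user{i}", now=1_000_000.0 + i)
print(flagged)  # True -> block the source, notify users, investigate
```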

Cybersecurity program: As cybersecurity threats continue to mount, you need to show improvement over time to your CEO and customers. How do you measure your progress and present it in meaningful, numerical detail? America’s NIST offers a Draft Guidance on Measuring and Improving Your Company’s Cybersecurity Program. It is aimed at different audiences within an organisation, from security specialists to the C-suite, and can help organisations move from general statements about risk levels towards a more coherent picture founded on hard data.
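
As a sketch of what such hard data might look like (the guidance does not prescribe this code; the incident records and metric choice are invented for illustration), one common measure is mean time to detect, tracked quarter over quarter:

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records (quarter, occurred, detected); a real
# programme would pull these from a SIEM or ticketing system.
incidents = [
    ("2023-Q3", datetime(2023, 7, 3, 9), datetime(2023, 7, 5, 14)),
    ("2023-Q3", datetime(2023, 8, 11, 8), datetime(2023, 8, 12, 8)),
    ("2023-Q4", datetime(2023, 10, 2, 10), datetime(2023, 10, 2, 22)),
    ("2023-Q4", datetime(2023, 11, 20, 7), datetime(2023, 11, 21, 1)),
]

# Group detection delays (in hours) by quarter.
mttd: dict[str, list[float]] = {}
for quarter, occurred, detected in incidents:
    mttd.setdefault(quarter, []).append((detected - occurred).total_seconds() / 3600)

# A falling mean time to detect is the kind of trend-based number
# worth reporting upward.
for quarter in sorted(mttd):
    print(f"{quarter}: mean time to detect = {mean(mttd[quarter]):.1f} h")
```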

Big Tech 

Amazon “stalking” employees: The French data protection authority fined Amazon France Logistique 32 million euros for putting employees under constant surveillance. The company manages the Amazon group’s large warehouses in France, where it receives and stores items and then prepares parcels for customer delivery. Each warehouse employee is given a scanner to document the performance of certain tasks in real time. Each scan results in the recording and prolonged storage of data used to calculate employee quality, productivity and periods of inactivity (for instance, flagging items scanned less than 1.25 seconds apart or idle periods longer than 10 minutes). The company was also fined for video surveillance conducted without adequate information or sufficient security.

Uber: The Dutch data protection authority fined Uber 10 million euros for violating privacy rules relating to its drivers’ data. Uber failed to specify in its terms and conditions how long drivers’ data is retained and which security measures are in place, particularly when transferring data to non-European countries. The fine follows a complaint by over 170 French drivers, which was submitted to the French data protection authority and then forwarded to the Dutch regulator, as Uber’s European headquarters is in the Netherlands.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
