Data protection & privacy digest 19 Jan – 3 Feb 2023: threshold for cookies, spy pixels, consent evidence, data storage and deletion

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes: threshold for cookies, advertising claims’ mediation, China’s outbound transfers

The EDPB approved a minimum threshold for the use of cookies and the subsequent processing of the data collected. No cookies that require consent can be set without a positive action by the user, nor purely on the grounds of the data controller’s legitimate interest. The absence of a refuse option, visible and accessible at any time on any layer of the banner, constitutes an infringement. Exceptions, such as for strictly necessary technical cookies, must be indicated. Confusing information, designs and colours are not acceptable.
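To make the threshold concrete, here is a minimal, hypothetical sketch, (Python with Flask; the cookie names and the ‘granted’ value are illustrative, not taken from the EDPB report), of gating a non-essential cookie on a recorded positive action:

```python
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<html>...</html>")
    # Strictly necessary technical cookies may be set unconditionally.
    resp.set_cookie("session_id", "abc123", httponly=True)
    # Non-essential cookies only after a recorded positive action;
    # silence, pre-ticked boxes or "legitimate interest" do not suffice.
    if request.cookies.get("consent_analytics") == "granted":
        resp.set_cookie("analytics_id", "xyz789")
    return resp
```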

The Spanish data protection agency AEPD announced a mediation system to expedite the resolution of advertising claims, (in Spanish). It has approved the modification of the Autocontrol Code of Conduct ‘Data processing in advertising activity’, which includes out-of-court procedures to resolve individuals’ complaints more quickly. Advertisers must respond within a maximum period of 15 days, proposing the actions they deem pertinent for mediation. The maximum duration of the procedure will be 30 days.

The Cybersecurity Administration of China has published guidelines on outbound transfers of personal and important data from China to other jurisdictions, whitecase.com reports. Organisations must comply with these guidelines by 1 March or risk administrative, civil and criminal penalties. In certain cases the measures include security assessments and state approval before engaging in outbound transfers. Outbound data transfers in this case include:

  • an entity in China actively sending data to a recipient in another jurisdiction, or permitting a person or entity outside China to access data generated in the course of the data processor’s operations in China;
  • multinational intragroup transfers of data; and
  • operating centralised document management systems for global operations, with servers hosted outside China.

Official guidance: consent evidence, data storage periods and deletion, TOMs, training, recruitment data

Denmark’s privacy regulator explained the balance between consent evidence requirements and data minimisation. The data controller must be able to demonstrate that the data subject has given consent. However, this rule only applies while the data processing is ongoing. After the processing activity ends, (eg, because the data subject has withdrawn their consent), there is no obligation to retain that evidence. Moreover, the data controller has a duty to delete the personal and related data without undue delay after consent is withdrawn, (unless it is needed to establish or defend legal claims, and then only for a short period of time).
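As a rough illustration of that lifecycle, the sketch below, (Python; the record layout and the 30-day claims window are assumptions for illustration, since the actual retention must be justified case by case), keeps consent evidence only while it is still needed:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None  # set when consent is withdrawn

# Assumed short window in which evidence may still be needed for legal claims.
CLAIMS_WINDOW = timedelta(days=30)

def purge_evidence(records: list[ConsentRecord]) -> list[ConsentRecord]:
    """Keep consent evidence only while the processing is ongoing.

    Once consent is withdrawn, both the personal data and the consent
    evidence must be deleted without undue delay, unless briefly retained
    to establish or defend legal claims.
    """
    now = datetime.now()
    return [
        r for r in records
        if r.withdrawn_at is None or now - r.withdrawn_at < CLAIMS_WINDOW
    ]
```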

The Portuguese privacy regulator CNPD published guidance on technical and organisational security measures, (TOMs), aimed at data controllers and processors. The CNPD lists a set of TOMs that must be considered by organisations in their risk prevention and minimisation plans, (in Portuguese). The list is dynamic and not exhaustive, given rapid technological change, and is therefore subject to updates whenever necessary. The increasing number of security incidents in the past year showed that, had organisations been equipped with adequate security measures, the risks would have been lower and the impact on the rights of data subjects smaller.

The GDPR obliges the organisation, (the controller), to limit the storage of personal data so that it is kept no longer than is necessary to achieve its purpose. The Latvian privacy regulator DVI explains how to determine the data storage period, and what to do when it expires. The organisation must have internal procedures in place in order to determine:

  • that the purpose has been achieved and the data cannot be further used for any other, unrelated purpose, (eg, the deadline specified in a regulatory act has been reached, or the legal basis has lapsed);
  • the frequency with which the purposes of the data processing and their justifications will be reviewed;
  • how to receive a signal that a storage period has expired, and
  • how to inform data subjects of these periods, (or the criteria that were taken into account to determine them), in the privacy policy. 

In the end, the data must be deleted completely, without any possibility of recovery. The deletion procedure must cover identifying the persons responsible, locating the data, following up on the deletion, and informing processors, other controllers and the data subjects.
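A minimal sketch of such a retention check, (Python; the purposes and storage periods are purely illustrative, since each period must be justified by the organisation), might look like this:

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative schedule only: each purpose maps to a justified storage period.
RETENTION = {
    "invoicing": timedelta(days=5 * 365),  # eg a statutory accounting deadline
    "marketing": timedelta(days=2 * 365),  # eg until the legal basis lapses
}

def storage_expired(purpose: str, collected_on: date,
                    today: Optional[date] = None) -> bool:
    """Signal that a record has outlived the storage period for its purpose."""
    today = today or date.today()
    return today - collected_on > RETENTION[purpose]

# A periodic review job would flag expired records, trigger irrecoverable
# deletion, and inform processors, other controllers and data subjects.
```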

The Latvian regulator also issued a reminder of the importance of data protection training. Employees must be familiarised with the framework the organisation has created for data protection and processing: cyber security, industry-specific regulations, employee liability for violations, data breach responses, and review procedures. A desired outcome would be: a customer is asked to provide personal data for identification; if the customer asks why this is necessary, the employee should be able to give a reasoned answer and point out that more detailed information is available in the privacy policy.

A recruitment process necessarily involves the processing of a significant amount of personal data about candidates. The rise of new technologies has multiplied recruitment channels, (social networks, personalised advertising, specialised search engines), and the communication tools used, (videoconferencing, chatbots, mobile applications). It has also led to the creation of large databases enabling the use of artificial intelligence and of tools to assess candidates’ “soft skills”. In this context, the French regulator CNIL offers a guide and a set of practical sheets and Q&As to support recruitment stakeholders in their compliance, (in French).

Investigations and enforcement actions: game developers, spy pixels, psychometric tests, unwanted membership, Covid-related algorithms, email security

The UK’s ICO published an Age Appropriate Design Code, (AADC), audit of Facepunch Studios, a games developer. Facepunch does not require a user account, although some gameplay data and device information is collected in-game. Facepunch also shares some personal data of users with third parties in order to operate parts of, or functions within, its games and services. The audit concluded that the age assurance measures in place should be improved by assessing and reliably determining the actual ages of current UK child users, regularly monitoring the effectiveness of the third-party age gate used, and assessing which elements of an online service are appealing to, or likely to be accessed by, children. Where actual user ages are not established with certainty, the AADC standards should be applied to all users.

The Danish data protection authority criticised Vækstfonden, (Denmark’s investment fund), for using spy pixels in its newsletters. As with the processing of personal data using cookies on websites, the use of spy pixels requires a processing basis under the GDPR. The spy pixels were used to analyse which articles recipients clicked on, in order to optimise the organisation and sending of the newsletters, but Vækstfonden had not complied with the obligation to provide information about the processing. Vækstfonden has stated that it has changed suppliers for sending out newsletters and has updated its privacy policy.
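Mechanically, a spy pixel is just a tiny remote image whose URL identifies the recipient, so that merely rendering the newsletter triggers a logged request. The sketch below, (Python with Flask; the endpoint, parameter name and domain are hypothetical), shows the pattern:

```python
from flask import Flask, Response, request

app = Flask(__name__)

# A 1x1 transparent GIF: the classic "spy pixel" payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

@app.route("/open.gif")
def newsletter_open():
    # The newsletter embeds <img src="https://news.example.com/open.gif?rid=...">.
    # When the mail client fetches the image, the open is logged per recipient,
    # which is processing of personal data and needs a legal basis plus notice.
    recipient_id = request.args.get("rid")
    app.logger.info("newsletter opened by recipient %s", recipient_id)
    return Response(PIXEL, mimetype="image/gif")
```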

Spain’s AEPD fined Thomas International 40,000 euros for the processing of sensitive data, Data Guidance reports. The complaint concerned a psychometric test provided by Agroxarxa and run by Thomas International. Although Agroxarxa stated that candidates were not required to provide sensitive personal data, the psychometric test requested it, indicating that its provision was required by Agroxarxa’s HR department. Thomas International provided the same questionnaire to all clients that used its services, allowing for the processing of sensitive personal data even when the client had not requested it.

In the US, the Federal Trade Commission is sending payments totaling more than 973,000 dollars to 17,064 people who lost money after NutraClick automatically enrolled them in unwanted membership programs for supplements and beauty products and misled consumers about when they had to cancel trial memberships to avoid monthly charges.

The Italian privacy authority has sanctioned three local health authorities which had used algorithms to classify patients according to their risk of Covid-related complications. The patients’ data had been processed in the absence of a suitable regulatory basis, without providing the data subjects with all the necessary information, (in particular on the methods and purposes of the processing), and without a prior impact assessment.

Ireland’s privacy regulator fined an operator of nursing homes. The credentials of a user account at a nursing home were captured on a fake website via a phishing email, which allowed the attacker to set up forwarding of all inbound emails to a third-party email account. Adequate technical and organisational measures could have included appropriate encryption of data transferred over external networks, suitable phishing training, and regular testing of the safeguards.

Meanwhile, the Swedish privacy regulator fined an insurance company for sending sensitive personal data via email without sufficient protection. The email was encrypted only in transit: the encryption ended before the message reached the final recipient, so there was a risk that unauthorised persons could read the message in plain text after the encrypted transmission had ended.
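The distinction matters in practice. Standard SMTP submission with STARTTLS, as in the stdlib sketch below, (the addresses, server and credentials are hypothetical), encrypts only the hop to the submission server, not the message end to end:

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "insurer@example.com"   # hypothetical addresses
msg["To"] = "customer@example.org"
msg["Subject"] = "Your policy documents"
msg.set_content("Sensitive data should not travel in plain text.")

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()                   # encrypts this hop only
    smtp.login("user", "password")
    smtp.send_message(msg)

# Beyond this server, delivery may continue unencrypted, and the message
# rests in plain text in mailboxes; sensitive data needs end-to-end
# protection, eg S/MIME, PGP, or an encrypted attachment with the key
# shared out of band.
```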

Data security: ISO 31700 Privacy by Design, AI Risk Management Framework by NIST, taxonomy of ICT incidents, mobile data

The International Organisation for Standardisation has finally published the long-awaited ISO 31700. It establishes high-level requirements, (and use cases), for privacy by design to protect privacy throughout the lifecycle of a consumer product, including the data processed by the consumer. This covers consumers’ personally identifiable information and other data that is processed, (collected, used, accessed, stored and deleted), or intentionally not collected or processed, by the organisation and by digital goods and services within the digital economy. The preview document is available here.

America’s NIST published an AI Risk Management Framework. AI systems, and the contexts in which they are deployed, are frequently complex, making it difficult to detect and respond to failures when they occur. AI risk management can drive responsible use by prompting the organisations and internal teams that design, develop and deploy AI to think more critically about context and about potential negative and positive impacts, expected or not. The core concepts are human centricity, social responsibility, and sustainability.

In Italy, the National Cybersecurity Agency offered a new taxonomy of incidents affecting ICT assets that are subject to mandatory notification. After initial access, execution, installation and lateral movement, the taxonomy describes “actions on objectives”, which refer, among other things, to: collecting confidential and sensitive data from within the network, or detecting their presence outside the systems authorised to process them; exfiltrating data from within the network to external resources; and manipulating, degrading, disrupting or destroying systems, services or data.

“Could your phone be leaking data that you are not aware of?” asks the US NIST. It goes on to explain how control of data may be lost through unauthorised or unwarranted transmission to an external source. Mobile data leaks can also occur when mobile device privacy settings or applications are misconfigured. The data at stake includes personally identifiable information, financial and health data, video and audio files, information about the way an individual uses the internet, and location tracking data. To mitigate these risks, organisations should:

  • manage mobile device settings;
  • preserve confidentiality by protecting data in transit;
  • keep the mobile operating system and applications up to date;
  • apply zero-trust principles;
  • separate work data from personal data, eg by deploying a bring-your-own-device, (BYOD), solution;
  • apply app vetting to identify security and privacy risks; and
  • deploy mobile threat defence solutions that monitor for device-, app- and network-based attacks.

Big Tech: the Digital Services Act’s deadline, Replika AI chatbot ban

The European Commission has published non-binding guidance to help very large online platforms and search engines within the scope of the Digital Services Act, (DSA), comply with their obligation to report user numbers in the EU by 17 February at the latest, and at least once every six months thereafter, (small businesses and start-ups must provide the information at the request of authorities). In the near future, very large online platforms and search engines will be subject to additional obligations, such as carrying out risk assessments and taking corresponding measures to mitigate risks to users’ rights online.

Replika, an AI chatbot company, is not allowed to use the personal information of Italian users, according to Italy’s data protection agency, which cites risks to children and emotionally fragile individuals. The US-based start-up offers users personalised avatars that talk and listen to them. The lack of an age-verification mechanism, such as filters for minors or a block when users do not explicitly state their age, was one of the many issues the Italian regulator highlighted. Additionally, the company’s processing of personal data is unlawful because it cannot be based on a contract, which a minor is unable to conclude.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
