TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes: cloud in the EU, cookie consent, AI standards, children’s data protection in California
The EDPB has announced a coordinated investigation and enforcement probe into the use of cloud services by the public sector in the EU. Reportedly, cloud uptake by enterprises doubled across Europe over the last six years. The COVID-19 pandemic sparked a digital transformation of organisations, with many turning to cloud technology. However, public bodies at the national and EU levels may face difficulties in obtaining information and communication technology products and services that comply with EU data protection rules. 22 national supervisory authorities, in coordination with the EDPS, will examine public bodies’ challenges with GDPR compliance when using cloud-based services, including the process and safeguards implemented when acquiring cloud services, challenges related to international transfers, and provisions governing the controller-processor relationship. The probe, followed by an end-of-year report, will cover a wide range of sectors – health, finance, tax, education, and central buyers or providers of IT services.
The Norwegian data protection authority Datatilsynet has asked the government to tighten national rules on the cookie consent mechanism. Datatilsynet compares the Norwegian and French approaches to cookie opt-out options. In France, as in the rest of the EU, consent to the use of cookies must meet the requirements of the GDPR. The reason for the latest multimillion-euro fines imposed on Google and Facebook by the French regulator CNIL was that the two companies allowed users to consent to the use of cookies through a single click, while the procedure for refusing consent was more cumbersome and time-consuming. By contrast, the practice for which the tech giants have now been fined in France would hardly have been considered problematic under Norway’s cookie regulations, where consent may be given through preset browser settings. In the view of Datatilsynet, these cases illustrate how unsustainable the current Norwegian regulation of cookies and similar tracking technologies is, and it asks that the government grant Datatilsynet supervisory powers.
The EU’s effort to set a standard for AI will likely take more than a year before it can become legislation. The main debate focuses on whether facial recognition should be banned and who should enforce the rules, Reuters reports. The initiative gained momentum last year due to the pandemic and the spread of algorithm-based gadgets and services in daily life. Reportedly, the European Commission wants to allow the use of facial recognition by law enforcement in cases of terror attacks and serious crimes. But civil rights activists fear it could facilitate discrimination and surveillance by governments and companies. A balanced enforcement approach would also be needed, with basic implementation handled at the national level by national regulators, while certain applications and certain impacts would be left to the Commission.
In California, legislators have proposed a new bipartisan bill to protect children online. The California Age-Appropriate Design Code Act is modeled on the UK Children’s Code and contains provisions for children’s data protection and limits on online exposure for minors under age 18, IAPP News reports. Existing law, the Parent’s Accountability and Child Protection Act, requires a person or legal entity that conducts business in California and seeks to sell specified products or services to take reasonable steps to ensure that the purchaser is of legal age. They are prohibited from reusing data obtained during the verification process for any other purpose. Commencing July 1, 2024, this bill would also require a business that creates goods, services, or product features likely to be accessed by children to comply with specified standards, including considering the best interests of children (eg, using clear language suited to the age of children likely to access that good, service, or product feature).
Official guidance: data for research purposes, DPIA checklist, CNIL’s 2022 strategy
The UK Information Commissioner’s Office is seeking feedback on draft guidance on the research provisions in the UK GDPR and the Data Protection Act 2018. Both pieces of legislation contain a number of provisions for processing personal data for research purposes, namely: a) archiving in the public interest; b) scientific or historical research; and c) statistical purposes. However, these provisions are scattered across numerous articles and paragraphs of both pieces of legislation, creating a complicated area of data protection. The draft guidance helps those engaged in research to carry out their processing in compliance with the existing law. Adhering to it, data controllers should be able to demonstrate that their processing is necessary for one of these research purposes and that it meets a set of indicative criteria for each of the three types of research. The provisions cover the following broad areas of data protection:
- the data protection principles, (purpose limitation, storage limitation);
- conditions for processing special category data and criminal offence data;
- exemptions from data subjects’ rights; and
- appropriate safeguards (data minimisation, pseudonymisation, anonymisation).
Interested parties can submit their responses by 22 April via this page.
The Spanish regulator AEPD published a checklist (in Spanish only) to help data controllers carry out data protection impact assessments, Data Guidance reports. The checklist allows a quick check, ahead of any prior consultation, that all the necessary aspects have been taken into account when carrying out and documenting an impact assessment. In particular:
- controllers planning to carry out a prior consultation must complete and submit the checklist to the AEPD so it can verify that the assessment contains the minimum required content;
- if, after carrying out the DPIA and adopting mitigating measures, the risk remains high, the controller must consult the AEPD before carrying out the processing of personal data, etc.
You can download the full list here. The document also complements AEPD’s risk management and DPIA guide.
The French regulator CNIL published its strategic plan for 2022-2024. The new orientations are divided into three priority areas: a) promoting control of and respect for the rights of individuals; b) promoting the GDPR as a trusted asset for organisations; c) prioritising targeted regulatory actions on subjects with high privacy stakes. The CNIL also specifies its priority inspection topics for 2022: commercial prospecting, use of cloud computing, and monitoring of remote working. Each year the CNIL conducts several hundred checks (384 in 2021), and the three themes chosen as priorities usually represent approximately one-third of the checks carried out:
- Unsolicited commercial prospecting is one of the irritants of French daily life and is a recurring subject of complaints and calls to the CNIL hotline.
- The massive use of teleworking during the Covid-19 pandemic has led to the development of specific tools allowing employers to monitor employees’ daily tasks and activities more closely. Many believe such monitoring will become widespread and continue even once the health situation has returned to normal.
- The use of the cloud is constantly growing in the private and public sectors, accompanied by massive transfers of data outside the EU to countries that do not provide an adequate level of protection, and by exposure to data breaches in the event of incorrect configuration.
Data breaches, investigations and enforcement actions: cookie audit, Grindr’s appeal, Danish Capital Region’s data breaches, criminal conviction data
Latvia’s data inspectorate announced the results of cookie audits of websites belonging to 26 companies, IAPP News reports. Auditors checked whether users were given comprehensive information and whether appropriate consent was obtained from website users, including for the use of marketing, statistical and analytical cookies. In total, one or more non-compliances with the requirements of the GDPR and Latvia’s Information Society Services Act were found on the websites inspected. The highest number of non-compliances concerned obtaining appropriate consent from website users in cases where it is mandatory to obtain it:
- none of the websites examined provided adequate consent,
- in most cases only partial consent was obtained from the website user,
- in 4 cases it was considered that no consent was obtained at all.
The fewest non-compliances were found in the evaluation of the cookie policy/terms of use available on the websites, regarding the inclusion of the minimum required information. Official notices were sent out: three organizations must evaluate and eliminate the identified non-compliances by April, and the rest by August.
Grindr has appealed against the 6.5 million euro fine imposed by the Norwegian data protection authority Datatilsynet. Grindr is a location-based social networking app marketed towards gay, bi, trans, and queer people. In 2020, the Norwegian Consumer Council filed a complaint against Grindr claiming unlawful sharing of personal data with third parties for marketing purposes. The data shared included GPS location, IP address, advertising ID, age, gender, and the fact that the user in question was on Grindr. Datatilsynet concluded that Grindr disclosed user data to third parties for behavioural advertising without a valid legal basis. Datatilsynet will now assess Grindr’s appeal and consider whether there are grounds to rescind or alter the decision. The Norwegian Consumer Council will also be given the opportunity to express an opinion. If the decision is not rescinded or altered, the case will be sent to the Privacy Appeals Board for processing. Decisions from the Privacy Appeals Board cannot be further appealed, but depending on the circumstances, the parties can file a lawsuit before the courts challenging the validity of such a decision.
The Danish data protection authority has issued criticism, injunctions, and warnings to the Capital Region of Denmark after two security breaches. Both incidents were reported by the Danish Health and Medicines Authority in 2020 and 2021. In both cases, a data exchange service from the health platform (for which the Capital Region of Denmark was the data controller) was involved, and a couple of thousand medication prescriptions for patients were affected. The breaches arose because integrations between two systems allowed an update in one to affect the integrity of the information displayed in the other. After reviewing both reported breaches, the Danish data protection agency expressed serious criticism of the Capital Region for:
- not having qualified relevant test scenarios in order to better identify dependencies on other IT systems,
- not having carried out the necessary tests before the changes were made,
- not informing the Danish Health and Medicines Authority about the security breaches when the incidents were established.
The Danish data protection agency has ordered the region to prepare and introduce a process ensuring that known integrations with other systems do not create incorrect information in those systems, and to map in detail the internal IT architecture and IT environment in collaboration with the parties involved.
The Spanish regulator AEPD fined Amazon Road Transport 2 million euros for unlawful processing of criminal conviction data, Data Guidance reports. A union representative filed a claim with the AEPD that, when hiring self-employed contractors, Amazon Road Transport Spain requested certificates of the absence of a criminal record and specifically required the candidates’ consent so that this data could be transferred to the group companies and their supplier located outside the EEA. Amazon Road Transport claimed that in obtaining a negative certificate it did not process data relating to criminal convictions or offences, since the certificate did not contain any data relating to the commission of crimes and as such did not fall under Art. 10 of the GDPR. The regulator refused to accept this interpretation of the GDPR. The AEPD found that Amazon Road Transport had not been diligent, as it failed to implement adequate procedures for the collection and processing of personal data relating to criminal convictions. The company must also cease requiring the above certificates, delete all the information from the certificates already provided, and bring its processing into compliance with Arts. 6 and 10 of the GDPR. At the same time, it was not found in violation of Arts. 7 and 49(1) of the GDPR (as explicit consent of a data subject can be used as a derogation for restricted international transfers).
Data security: best practices
The European Union Agency for Cybersecurity (ENISA) and CERT-EU published a joint set of cybersecurity best practices for public and private organisations. There is a substantial increase in cybersecurity threats to organisations in the EU. Three factors are at play in this trend: a) ransomware remains a prime threat, putting millions of organisations at risk; b) criminals are increasingly motivated by the monetisation of their activities; c) attacks against critical infrastructure are rising exponentially, and other economic sectors, as well as society at large, can be exposed. The publication is mainly intended for decision-makers (both in IT and general management) and security officers (CISOs). It is also aimed at entities that support organisational risk management. The recommendations are provided in no particular order; organisations should prioritise their actions according to their specific business needs:
- Ensure remotely accessible services require multi-factor authentication (MFA).
- Ensure users do not re-use passwords, and encourage users to use MFA whenever supported by an application (eg, on social media).
- Ensure all software is up-to-date.
- Tightly control third-party access to your internal networks and systems.
- Pay special attention to hardening your cloud environments before moving critical loads to the cloud.
- Review your data backup strategy and follow the so-called 3-2-1 rule: keep three copies of your data, on two different media, with one copy off-site.
- Change all default credentials, employ appropriate network segmentation.
- Conduct regular training.
- Create a resilient email security environment.
- Protect your web assets from denial-of-service attacks.
- Block or severely limit internet access for servers, etc.
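The 3-2-1 backup rule mentioned above (at least three copies of the data, on at least two different storage media, with at least one copy off-site) lends itself to a simple automated check. The sketch below is illustrative only; the data model and labels are hypothetical and not taken from the ENISA/CERT-EU publication:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # hypothetical label, e.g. "local-nas" or "cloud-bucket"
    medium: str     # storage medium category, e.g. "disk", "tape", "cloud"
    offsite: bool   # is the copy stored at a different physical site?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """Check the 3-2-1 rule: >=3 copies, >=2 distinct media, >=1 off-site."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

copies = [
    BackupCopy("primary-server", "disk", offsite=False),
    BackupCopy("local-nas", "disk", offsite=False),
    BackupCopy("cloud-bucket", "cloud", offsite=True),
]
print(satisfies_3_2_1(copies))  # True: 3 copies, 2 media, 1 off-site
```

A real backup audit would of course also verify that each copy is recent and restorable; the point here is only that the rule itself reduces to three countable conditions.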
Big Tech: Texans’ biometric data, employee spying software, Clearview AI image collection expansion
Texas Attorney General Ken Paxton is suing Meta over its use of facial recognition technology to harvest the biometric data of millions of Texans without their consent, Reuters reports. The lawsuit claims 20.5 million Texans use Facebook and that data was captured illegally “billions” of times. The plaintiffs are reportedly seeking hundreds of billions of dollars in civil damages. In 2020 Facebook settled a similar suit in Illinois for 650 million dollars, and last November it announced in a blog post that the facial recognition system was being shut down and any data collected destroyed.
Controversial facial recognition specialist Clearview AI is going the other way, according to the Washington Post. The paper revealed Clearview had asked investors for 50 million dollars to collect “100 billion” facial images within a year and make “every person on earth identifiable”. Clearview, which collects images from social media and other websites without the sites’ or the subjects’ consent, works mainly for law enforcement but is seeking to expand into monitoring gig-economy workers. Facebook, Google, Twitter and YouTube have all demanded that Clearview stop, to no avail. The French, Australian, and UK privacy regulators have already ruled against its practices.
China’s Sangfor Technologies has come under scrutiny for software that spies on company employees and attempts to predict when they will quit, IAPP News reports. The Shenzhen-listed company’s “resignation analysis system” monitors employee browsers for job ads, recruitment emails, and visits to social media websites. Former employees have gone public about how their employers fired them when they job-hunted online, knowing exactly what they had been doing on their computers. The story has found an echo on Chinese social media and forums, with many considering the software an infringement of personal privacy.