Data protection & privacy digest 16 Dec 2022 – 2 Jan 2023: US signals intelligence redress mechanism, “dormant” privacy risk assessment, data brokerage

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress: US signals intelligence redress mechanism, Google search results removal, California consumer privacy rights, Australia Privacy Act review

The US Office of the Director of National Intelligence, (ODNI), published a directive for implementing the signals intelligence redress mechanism created under the proposed EU-US Data Privacy Framework. It is a prerequisite for the EU adequacy decision for the US, a draft of which received a green light from the European Commission just before the end of 2022. The directive governs the handling of redress complaints regarding certain signals intelligence activities and outlines the process by which qualifying complaints may be transmitted by an appropriate public authority in a qualifying state. Additionally, the directive outlines the role of the ODNI Civil Liberties Protection Officer in handling a given complaint.

In Sweden, the Supreme Administrative Court rejected the appeal in a case between Google and the Swedish privacy regulator IMY. The judgment therefore gains legal force, and Google must pay a 4.5 million euro fine. In 2020, the IMY had sanctioned Google for violating the right to have search results removed. When Google delisted search results, it notified the site owner of the webpage and data subject concerned via Search Console, previously Webmaster Tools. Informing the site owner meant that the personal data was used beyond its original purpose, and the notice misled users and deterred them from exercising their right to request removal.

California consumer privacy rights expanded on 1 January, (but will be enforced in July). In 2020, California voters approved Proposition 24, known as the CPRA, amending some of the older CCPA’s consumer protections and thereby expanding businesses’ obligations. For example, employees, job applicants, owners, directors, officers, and contractors were previously excluded from the definition of “consumer” and had limited data subject access rights. These rights now include the ability to opt out of profiling, targeted/cross-context advertising, and automated decision-making, and to limit the use and disclosure of sensitive information. The new law also establishes annual privacy risk assessments and cybersecurity audits. Civil lawsuits will be allowed against companies that fail to take appropriate measures, with potential damages of between 100 and 750 dollars per consumer, per incident.

Australian Attorney-General Mark Dreyfus confirmed that the Privacy Act Review has been completed and a final report received by his department. The announcement came shortly after a wave of high-profile data breaches in the Australian corporate sector. The new privacy regime could include a broader definition of personal data, expanded information obligations for organisations, opt-in consent for users, the right to erasure, and increased penalties for serious or repeated data breaches.

Official guidance: special categories of data, global cookie review, data brokerage, age-appropriate design tests

The Latvian data protection agency DVI issued a reminder of the rules for the legal processing of special categories of personal data. In addition to complying with the general data protection conditions, controllers must observe that processing of special categories of personal data is prohibited by default unless one of the following exceptions or justifications applies:

  • a person’s consent, (eg, to receive commercial notices about price discounts for specific goods or services in a pharmacy);
  • social protection rights, (eg, when terminating the employment of a unionised employee, the employer must contact the trade union); 
  • vital interests of a person, (eg, in cases where a person is unconscious and it is necessary to find out their blood group, allergies, etc.);
  • non-profit activity for political, philosophical, religious, or trade union-related purposes, (the personal data is not disclosed outside the said organisation without the consent of the individual);
  • data deliberately made public, (eg, the person has expressed on social networks that they are vegetarian);
  • essential public interests, (eg, information about political party donors must be made public);
  • preventive or occupational medicine, (eg, assessment of the employee’s work capacity, health or social care, or treatment);
  • public health, (eg, to limit the spread of COVID-19);
  • archiving in the public interest, for scientific, historical or statistical purposes.

The French privacy regulator CNIL published guidelines on the commercial use of customer files – data brokerage. Data controllers need to pay attention to the types of data that can be transferred, (only data relating to active customers can be shared), and to obtaining consent from data subjects for the intended transfer, (eg, via an electronic form). The purchaser must also inform the data subjects of the transfer and the source of the data, (the name of the company that sold the customer files), and obtain the data subjects’ consent if it wishes to use their data for electronic commercial prospecting.

Bird & Bird offers the latest Global Cookie Review – the legal and regulatory landscape relating to the expanding use of cookies and similar technologies, country by country. Such regulations often follow a path set by the EU GDPR and ePrivacy Directive. The report also contains Asia Pacific, Latin American, and South African overviews, where similar regulations are often lacking or can even diverge on transparency and consent requirements.

The UK Information Commissioner’s Office has published design tests to support designers of products or services that are likely to be accessed by children or young people. Each test provides a report detailing areas of good practice as well as ways to improve conformity with the Age-Appropriate Design Code. This includes “best interests of the child” standards like age authentication, safe default settings, parental controls, enforcement, and data protection impact assessments.

Investigations and enforcement actions: credit rating by mistake, “dormant” risk assessment, “defaulting” customers error, employees’ email metadata, mass grocery purchases monitoring, and workers’ fingerprinting

The Norwegian data protection authority has notified Recover of its decision to fine the company 20,000 euros. The matter concerns a credit rating performed without a legal basis. The background to the fine is a complaint from a private individual who was subjected to a credit assessment without any form of customer relationship or other connection to the company. A credit rating is established by compiling personal data from many different sources, including a person’s overall financial situation, any payment remarks, debt-to-income ratio, and whether the person has any mortgages or liens.
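For illustration, one of those inputs, the debt-to-income ratio, is simple arithmetic. The sketch below is a hypothetical toy example, not Recover’s or any credit bureau’s actual model:

```python
# Toy illustration of a single credit-rating input: the debt-to-income (DTI)
# ratio. Figures are hypothetical; real ratings weigh many more data sources.

def debt_to_income(monthly_debt_payments: float, gross_monthly_income: float) -> float:
    """Share of gross monthly income consumed by debt payments."""
    if gross_monthly_income <= 0:
        raise ValueError("income must be positive")
    return monthly_debt_payments / gross_monthly_income

ratio = debt_to_income(monthly_debt_payments=1_200.0, gross_monthly_income=4_000.0)
print(f"DTI: {ratio:.0%}")  # DTI: 30%
```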

The Norwegian regulator has also given Statistics Norway notice of a decision banning its planned collection of data on the Norwegian population’s grocery purchases. Through the collection of bank transaction data, the organisation planned to obtain information on what the population buys, and then link that to socio-economic data such as household type, income, and education level. The regulator believes that the legal basis invoked, (the societal benefit of consumption and diet statistics), is not clear and predictable enough for this planned processing of personal data. Even if the purpose is to produce anonymous statistics, intrusion into individuals’ privacy will occur.

Italian regulator Garante fined Areti 1 million euros: thousands of users were mistakenly classified as “defaulting” customers and were unable to switch to other suppliers. The misalignment of the company’s internal systems led to incorrect data migration to the integrated information database consulted by suppliers before signing a new contract. As a result, more than 47,000 Areti customers wanting to change energy supplier were denied account activation, and any potential savings deriving from market advantages, because they were incorrectly red-flagged.

Additionally, Garante fined the Lazio Region 100,000 euros for unlawful monitoring of employees’ email metadata. An internal audit was launched by the region on suspicion of a possible unauthorised disclosure to third parties of information protected by official secrecy. Metadata was collected in advance and stored for 180 days: the date, time, sender, recipient, subject, and size of each email. This allowed the region to obtain information relating to employees’ private lives, such as their opinions or contacts.
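To see why header metadata alone is so revealing, consider a minimal sketch using Python’s standard library email parser. The message file and the selection of fields are illustrative assumptions, not the region’s actual tooling:

```python
# Minimal sketch: the metadata at issue (date, sender, recipient, subject,
# size) can be read from a stored email without ever opening its body.
from email import policy
from email.parser import BytesParser
from pathlib import Path

raw = Path("message.eml").read_bytes()  # hypothetical archived message
msg = BytesParser(policy=policy.default).parsebytes(raw)

metadata = {
    "date": msg["Date"],
    "from": msg["From"],
    "to": msg["To"],
    "subject": msg["Subject"],
    "size_bytes": len(raw),
}
print(metadata)
```

Collected at scale and retained for 180 days, such records are enough to map who communicates with whom, how often, and about what, which is precisely the intrusion into private life the Garante objected to.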

No workplace fingerprinting without specific requirements is the ruling from Garante, which fined a sports club 20,000 euros. The authority intervened following a report from a trade union, which complained about the company’s introduction of the biometric system despite the union’s request to adopt less invasive means of authentication. The company had, for almost four years, carried out the fingerprinting of 132 employees, violating the principles of minimisation and proportionality. It also provided workers with very little information on the characteristics of the biometric processing.

The Romanian data protection authority completed an investigation at leading retailer Kaufland and issued a fine of 3,000 euros. A video recording containing images of a complainant in the parking lot of one of the chain’s stores appeared on the website of a local newspaper. It turned out that the store manager had allowed an employee access to the monitoring room, where the employee captured, with a personal mobile phone, images of the video recordings being played back and sent them via WhatsApp to a third party. The images were later posted by an online publication. As a result, the complainant’s image and car registration number were revealed, with two persons affected by the incident.

The EDPB published a summary on risk assessment and acting in accordance with established procedures. A controller, (in Poland), was notified of a personal data breach that occurred as a result of a break-in at an employee’s apartment and the theft of a laptop. The confidentiality of the personal data was at risk because the stolen computer was only password protected. The controller had kept adequate documentation since the beginning of the application of the GDPR and had performed a risk assessment, but it was only after the data breach occurred that the controller complied with the results of its own risk assessment by encrypting laptop hard drives.
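As a minimal sketch of acting on such an assessment, assuming a Linux laptop fleet protected with LUKS full-disk encryption, a controller could verify a machine’s encryption status before issuing it. The check below is an illustrative assumption, not the EDPB’s or the controller’s actual procedure:

```python
# Sketch: confirm that block devices are LUKS-encrypted before a laptop
# holding personal data is handed to an employee (Linux-only illustration).
import subprocess

def luks_devices() -> set[str]:
    """Return names of block devices carrying a crypto_LUKS signature."""
    out = subprocess.run(
        ["lsblk", "--list", "--noheadings", "--output", "NAME,FSTYPE"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {
        fields[0]
        for fields in (line.split() for line in out.splitlines())
        if fields[1:] == ["crypto_LUKS"]
    }

if __name__ == "__main__":
    devices = luks_devices()
    print("LUKS-encrypted devices:", devices or "none - do not issue this laptop")
```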

Data security: zero trust architecture, IoT onboarding, and lifecycle management

The US NIST’s National Cybersecurity Center of Excellence has published a draft practice guide on implementing a zero trust architecture and is seeking the public’s comments on its contents. As an enterprise’s data and resources have become distributed across the on-premises environment and multiple clouds, protecting them has become increasingly challenging. Many users need access from anywhere, at any time, from any device, both on-premises and in the cloud. Comments from industry participants are welcomed on or before 6 February.
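The core zero-trust idea, evaluating every request on verified identity and device posture rather than on network location, can be sketched in a few lines. The attributes and policy below are illustrative assumptions, not NIST’s reference architecture:

```python
# Minimal sketch of a zero-trust policy decision point: no request is
# trusted merely because it originates inside the corporate network.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool    # e.g. verified via MFA
    device_compliant: bool      # e.g. patched and disk-encrypted
    on_corporate_network: bool  # deliberately ignored by the policy

def authorize(req: AccessRequest) -> bool:
    """Grant access only on identity and device posture, never on location."""
    return req.user_authenticated and req.device_compliant

# Being on-premises grants nothing by itself:
print(authorize(AccessRequest(False, True, on_corporate_network=True)))   # False
print(authorize(AccessRequest(True, True, on_corporate_network=False)))   # True
```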

In parallel, the NIST is also seeking comments on draft guidance on Trusted IoT Onboarding and Lifecycle Management. Scalable mechanisms are needed to safely manage IoT devices throughout their lifecycles, beginning with secure ways to provision devices with their network credentials—a process known as trusted network-layer onboarding. In combination with additional device security capabilities such as device attestation, application-layer onboarding, secure lifecycle management, and device intent enforcement, this could improve the security of networks and IoT devices from unauthorised connections.
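As a rough sketch of the network-layer onboarding idea, an onboarding service challenges a device to prove possession of a factory-provisioned bootstrap secret before releasing network credentials. The HMAC challenge-response below is an illustrative stand-in, not a real onboarding protocol such as Wi-Fi Easy Connect, and all names and values are assumptions:

```python
# Sketch of trusted network-layer onboarding: credentials are released only
# to a device that proves possession of its bootstrap secret. Illustrative only.
import hashlib
import hmac
import secrets

BOOTSTRAP_KEY = secrets.token_bytes(32)  # installed at manufacture time
NETWORK_CREDENTIALS = {"ssid": "plant-iot", "psk": "example-psk"}  # hypothetical

def device_response(challenge: bytes, key: bytes) -> bytes:
    """Device side: answer the challenge with the bootstrap key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def onboard(response: bytes, challenge: bytes) -> dict | None:
    """Service side: release credentials only on a correct answer."""
    expected = hmac.new(BOOTSTRAP_KEY, challenge, hashlib.sha256).digest()
    return NETWORK_CREDENTIALS if hmac.compare_digest(response, expected) else None

challenge = secrets.token_bytes(16)
print(onboard(device_response(challenge, BOOTSTRAP_KEY), challenge))  # credentials
print(onboard(device_response(challenge, bytes(32)), challenge))      # None
```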

Big Tech: face recognition practices by PimEyes, Epic Games’ COPPA violations, TikTok app age ratings

The Baden-Württemberg data protection authority announced proceedings against PimEyes, (a face recognition and reverse image search service), DataGuidance reports. Recent media reports stated that PimEyes scans faces on the internet for individual characteristics and stores biometric data without a proper legal basis, an identified data-sharing model, or valid opt-out options. A data subject should be able to agree to the processing of personal data relating to them in an informed and unambiguous manner; in the case of automated retrieval of images on the internet, these requirements cannot be met. Equally, as a private company, PimEyes cannot undertake police investigative work in the public interest or interfere with the rights of data subjects. Read the original statement here.

US video game maker Epic will pay more than half a billion dollars over allegations of children’s privacy law, (COPPA), violations and of tricking users into making unwanted charges for in-game items, (eg, costumes and dance moves). Epic’s Fortnite game has more than 400 million users worldwide. The company will be required to adopt strong privacy default settings for children and teens, (with parental notice and consent requirements), ensuring that voice and text communications are turned off by default. This is the Federal Trade Commission’s largest refund award in a gaming case and the largest administrative order in its history.

Finally, the Virginia Attorney General joined 14 other state attorneys general in calling on Apple and Google to take immediate action and correct their app store age ratings for TikTok. The change would help parents protect their children from being force-fed harmful content online. The current ratings of “T” for “Teen” on the Google Play store and “12+” on Apple’s App Store misrepresent the objectionable content found and served to children on TikTok. While TikTok does have a “restricted mode” available, the company is also aware that many of its users are under 13 and have lied about their age to create a profile.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
