Weekly digest, October 25–31, 2021: “Privacy, DP, and Compliance news in focus”

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress

The Administrative Court of Düsseldorf clarified the non-retroactive applicability of the GDPR. In 2016, charges for tax evasion were brought against the plaintiff, a long-serving civil servant in the police and secret services, followed by an alleged disclosure by the court of details of the investigation to the press. The plaintiff filed a complaint with the local data protection authority, the DSG NRW, which explained that existing data protection laws applied to courts only where they perform administrative tasks; the allegedly inadmissible disclosure of court files therefore fell within the courts’ judicial activity. In 2019 the plaintiff brought an action before the court seeking enforcement of the GDPR, based on Art. 78 – Right to an effective judicial remedy against a supervisory authority. The court upheld the DSG NRW decision, adding that even though a data protection breach was manifestly present, legal redress was time-barred: the plaintiff’s data protection proceedings were no longer pending when the GDPR entered into force, and neither the GDPR nor the old law contains transitional provisions, so retroactive application would require specific legislative validation.

Quebec’s Bill 64 and the new requirements for cross-border transfers of personal information are explained in McCarthy Tétrault’s latest blog series. The previous Private Sector Act allowed transferring personal information to third parties without prior consent if it was essential for the original business purposes. The new rules include conducting a prior privacy impact assessment (PIA) and establishing through a written contract the scope of the mandate, the purposes for which the third party will use the information, the categories of persons who will have access, and data subjects’ right to object. The definition of Bill 64’s “adequate protection” in the country of destination remains ambiguous in comparison to PIPEDA’s “comparable level of protection” and the GDPR’s “adequacy decision”. The document also makes no distinction between international and inter-provincial transfers, and does not clarify how often businesses should conduct PIAs.

The US Court of Appeals for the 2nd Circuit decided when trivial breaches of personally identifiable information (PII) are not actionable. To have standing, the plaintiff must first establish an “injury in fact.” The court identified three factors courts should consider: whether the PII was exposed as the result of a targeted attempt to obtain that data, whether any portion of the dataset had already been misused, and whether the type of data exposed is so sensitive that there is a high risk of identity theft or fraud. The decision arose from McMorris v. Carlos Lopez & Associates, in which former employees brought a class action after the employer accidentally emailed 65 employees a spreadsheet containing social security numbers, home addresses, dates of birth, telephone numbers, educational degrees, and dates of hire for approximately 130 current and former employees. The spreadsheet was not shared with anyone outside the company or otherwise taken or misused by third parties. Read more details in the analysis by Thompson Coburn.

A similar dismissed case of a trivial, low-level data breach in the UK was explained by Blake Morgan. In Rolfe & Ors -v- Veale Wasbrough Vizards LLP, the court confirmed that it is not sufficient for claimants merely to establish that there has been a data breach; they must go further and establish that they have suffered a material or non-material loss as a result of the breach which is more than merely trivial. The claim arose from solicitors sending a letter containing some personal information to an incorrect recipient, who immediately notified the solicitors and subsequently deleted the email.

In Australia, a draft bill increasing privacy breach penalties was released, inviting industry submissions within the next month. Under the draft bill, the maximum penalty applicable to companies for serious or repeated privacy breaches will increase to whichever is greatest: AU$10 million, three times the value of any benefit obtained through the misuse of the information, or 10% of the corporate group’s annual Australian turnover. It also enables the introduction of an online privacy code covering a wide scope of organisations, to regulate social media services, large online platforms and data brokerage services.
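As a rough illustration of how that “whichever is greatest” formula would work, here is a minimal Python sketch. The figures in the example are invented, and the calculation ignores any further rules the final bill may contain.

```python
# Hypothetical illustration of the proposed Australian penalty formula:
# the greatest of AU$10 million, three times the benefit obtained from the
# misuse, or 10% of the corporate group's annual Australian turnover.

def max_penalty_aud(benefit_from_misuse: float, annual_au_turnover: float) -> float:
    """Return the proposed maximum penalty in Australian dollars."""
    return max(
        10_000_000,                  # fixed AU$10 million floor
        3 * benefit_from_misuse,     # three times the benefit obtained
        0.10 * annual_au_turnover,   # 10% of annual Australian turnover
    )

# Made-up example: AU$5m benefit from the misuse, AU$2bn Australian turnover.
print(max_penalty_aud(5_000_000, 2_000_000_000))  # -> 200000000.0 (10% of turnover wins)
```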

The US Federal Trade Commission announced a newly updated rule that strengthens financial institutions’ data security safeguards, following recent data breaches that caused significant harm to consumers, including monetary loss, identity theft, and other forms of financial distress. The updated Safeguards Rule requires non-banking financial institutions, such as mortgage brokers, motor vehicle dealers, and payday lenders, to develop, implement, and maintain a comprehensive security system to keep their customers’ information safe. Institutions must also explain their information-sharing practices, specifically the administrative, technical, and physical safeguards they use to access, collect, distribute, process, protect, store, transmit, dispose of, or otherwise handle customers’ information. In addition, financial institutions will be required to designate a single qualified individual to oversee their information security program and report periodically to the organization’s board of directors or a senior officer in charge of information security.

The Danish Business Authority announced that in future it will not prioritise supervision of the consent rules for simple statistics cookies. It justifies the change by recognising that cookies are a practical necessity for websites, and that the current negotiations on a new e-privacy regulation indicate that simple statistics cookies used for traffic measurement will be exempt from consent requirements.

The Danish data protection agency concurs that there may be a need for data controllers to collect and use information for statistical purposes in order to improve their website. However, the rules of the GDPR still apply whenever personal data about website visitors is collected and processed – for statistical or any other purposes. This means that the data controller – e.g. the owner of the website – must ensure that there is a legal basis for the processing of personal data. This also applies to any subsequent processing of data that takes place either at a data processor or when transferred to other independent data controllers.

Official guidance

The German federal data protection authority, the BfDI, clarified how the COVID-19 vaccination status of employees should be processed by employers. Employers generally may not process their employees’ vaccination status data without express statutory authorization – not even in the context of the pandemic. Vaccination status is a special category of data pursuant to Art. 9 of the GDPR. Processing is possible only in individual cases on the basis of legal requirements, namely in the health care sector, in daycare facilities for children, in the event of a possible infection and subsequent quarantine due to state-mandated pandemic control requirements, or on the basis of freely given and recorded consent. If the vaccination status is to be stored, no vaccination cards or comparable certificates (original or copy) may be included in the personnel file; it is sufficient to note that these have been presented.

There were clarifications on CCTV use on private property from Cyprus’s privacy commissioner. While the GDPR does not apply to personal or household activities, the scope of any recording should not go further than the perimeter of said private property. Also any complaints should be made to the police, as the data protection office does not have the power to enter a private property to examine any footage. Visible signs should state that CCTV is in use, explain why, and include a contact number for an operator. If CCTV is installed by a building’s management committee, then it becomes the principal data controller. CCTV may be installed in building entrances and exits, outside lift doors, and over tills and payment points only as long as the camera is only pointed towards them. Cameras can also be installed in building parking areas if the management committee deems it necessary. Finally, CCTV is not allowed in toilets, corridors, lobbies, inside lifts, and indoor or outdoor areas of cafes, bars and restaurants.

Denmark’s data protection agency has published guidance on the use of personal data for testing IT systems, available in Danish. Depending on the circumstances, it may be reasonable and necessary to use personal information when developing and testing IT systems. For example, it may be acceptable to use personal information in connection with final integration tests with other (external) IT systems, or where there is significant difficulty in creating accurate anonymised test data, in particular because it can be hard to reflect all the errors and irregularities that occur in a production environment. In addition, it may be reasonable to use a limited amount of personal information in connection with troubleshooting and error correction. Sometimes it may even be unsafe to put a system into production without having first tested it with production data, including personal data. However, such testing requires a risk assessment for the data subjects (e.g. employees, customers and citizens) and appropriate security measures in accordance with that assessment.
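Where production data does have to be reused for testing, a common mitigation is to pseudonymise or mask direct identifiers before the data reaches the test environment. The snippet below is only a simplified sketch of that idea, with invented field names and masking rules; it is not a substitute for the risk assessment and security measures the guidance calls for.

```python
import hashlib

# Simplified sketch: mask direct identifiers in a production record before
# loading it into a test environment. Field names and rules are illustrative.

def pseudonymise_record(record: dict, secret_salt: str) -> dict:
    masked = dict(record)
    # Replace the name with a stable pseudonym so links between records survive.
    masked["name"] = hashlib.sha256(
        (secret_salt + record["name"]).encode()
    ).hexdigest()[:12]
    # Blank out fields the tests do not need at all (data minimisation).
    masked["email"] = "test@example.invalid"
    masked["phone"] = None
    return masked

prod_row = {"name": "Jane Doe", "email": "jane@example.com", "phone": "+45 1234"}
print(pseudonymise_record(prod_row, secret_salt="rotate-me"))
```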

Some other important guidance published by regulators in the EU and abroad includes:

  • The most common mistakes made by communities working on draft codes of conduct, by the Polish data protection authority, the UODO. These include a lack of clear justification of the code’s purpose, an applicant entity that does not represent the majority of the sector, or a consultation scope that is too narrow, excluding, for example, data subjects.
  • Guidelines on political campaigns were set by Malta’s IDPC, including the legal bases for door-to-door canvassing, postal and telephone communication, online canvassing, and opting out from direct advertising.
  • China’s draft guidance on identifying important data sets out identification principles as well as a list of important data. One principle divides data into three classes – public data, personal information, and legal person data – and five levels according to importance: public, internal, sensitive, important and core. Entities in the industrial and telecom sectors are also required to first divide data into different types – research data, production operation data, management data, operation maintenance data, business service data and personal information – and then assign those types to levels and classes.
  • The European Data Protection Supervisor offers ever-so-simple guidance on protecting your personal information from phishing attacks. Suitable even for a young audience, it encourages you to STOP if you receive a suspicious message or email, THINK before you click on any links or attachments contained in the message, and LOOK for clues such as how the email or message is phrased, the time at which the email or message was sent, the list of recipients of the email, the sender’s number or email address, or the tone of the message if there is a sense of urgency.
  • California’s Attorney General has provided consumers and businesses with tips on how to defend against cyber threats. The recommendations range from creating strong passwords, limiting the personal information shared online and checking the privacy settings on your devices, to encryption, employee training and wifi network security.

Enforcement actions

Spain’s data protection authority, the AEPD, has issued its third-largest fine after finding flaws in the consent acquisition language used by CaixaBank. The investigation also uncovered that CaixaBank requested information about an individual from a solvency file even though the individual had no ongoing contracts with the bank. The individual was also included in the bank’s marketing campaigns for pre-granted credit without a proper legal basis or consent, and without adequate information about the data processing, including profiling. The aggravating factors behind the significant fine were the volume of the business and the duration and severity of the negligence.

The AEPD also fined a data controller, Servicios Logísticos Martorell, 16,000 euros for implementing a biometric identification system without carrying out a DPIA beforehand. A workers’ union complained that the company had implemented a biometric identification system to control its 520 workers’ access using their fingerprints, used alongside an existing card reader system. The union explained that the workplace was so big that employees had a 20-minute walk to reach their workstations, so an additional control system was introduced to determine when they actually reached their posts. The company argued that the biometric system is more reliable than cards, since people could use another worker’s card.

The Dutch data protection authority, the AP, has rejected the license application of a Dutch association of small and medium-sized enterprises to keep a blacklist of possible fraudsters and share it with companies from different sectors. The AP may grant such licenses only when sharing the data is necessary and sufficient safeguards have been put in place, such as an implemented protocol for collecting and sharing the data. Similarly, the AP rejected a license application from Fraudehelpdesk, a governmental initiative that helps victims of fraud find their way to the right authorities, for not having such a protocol in place. “In the event of a data breach, telephone numbers, e-mail addresses and other personal data of suspected perpetrators, whose crime was not proven, can roam the internet. If you are known as a fraudster, even if this is unjustified, you could be fired, for example. Then it may be difficult to get a loan or to rent a home.”

The Czech data protection authority, the UOOU, has published an overview of data breach inspections for the first half of 2021. In one complaint, a former insurance company employee stated that the IT department did not fill out an exit checklist, which includes revoking data access, at the end of employees’ contracts. This infringed Art. 32(2) of the GDPR by failing to sufficiently consider the risks of unauthorized access to the data, which could have led to unauthorized disclosure of personal information. In another case, a company operating an online store used cookies unlawfully: when a user clicked the “Personal data” link to learn more about the processing of personal data before granting consent, the click itself triggered consent to the processing of personal data through cookies, without the user being informed.
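The compliance point in the cookie case is that consent must be a separate, informed, affirmative act; merely opening an information page is not consent. A minimal sketch of that separation of concerns might look like the following, where all class and function names are invented for illustration.

```python
from dataclasses import dataclass, field

# Minimal sketch: consent to cookies is recorded only on an explicit,
# affirmative action, never as a side effect of viewing information.

@dataclass
class ConsentState:
    analytics: bool = False
    marketing: bool = False
    events: list = field(default_factory=list)

def view_privacy_information(state: ConsentState) -> str:
    # Reading the "Personal data" page must NOT change the consent state.
    state.events.append("viewed_privacy_information")
    return "Here is how we process personal data via cookies..."

def record_consent(state: ConsentState, analytics: bool, marketing: bool) -> None:
    # Consent is set only when the user explicitly submits their choices.
    state.analytics = analytics
    state.marketing = marketing
    state.events.append(f"consent_given(analytics={analytics}, marketing={marketing})")

state = ConsentState()
view_privacy_information(state)                        # no consent recorded yet
record_consent(state, analytics=True, marketing=False) # explicit user choice
print(state)
```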

Individual rights

A group of 850 professional footballers in the UK has challenged the use of their personal data. In the opinion of Herrington Carmichael, “Professional athletes’ performance statistics and attributes have become intrinsic to the sports industry. This information is passed through a multitude of platforms, giving information to clubs on potential player transfers and opponents and it is widely published in the media sphere.” The footballers argue that the unchallenged use of their personal data by these firms contravenes their data protection rights under the UK GDPR. They do not consent to the sharing of their data, which may be used for illegitimate purposes by betting companies, scouting platforms or even video game manufacturers. Moreover, it can be damaging if the data being shared about them is inaccurate: they could miss out on transfers, which matter not only for their personal careers but for the sports industry as a whole. Collectively, the group has claimed compensation from dozens of firms for the misuse of their personal data and demands an annual fee for any future use of it.

Opinion

Telemedicine and personal health data exploitation are analysed by Privacy International. Real-time, video-based health consultations, as well as health monitoring software with machine learning capabilities, wireless sensors, etc., have become widely used by health professionals and patients. For example, during the pandemic everyday communications technologies, such as FaceTime or Skype, were widely accepted and used by nationwide public health services in the US and the EU. The data collected by these applications varies, ranging from concrete data points (e.g. heart rate, glucose, blood oxygen levels) to video footage. One of the biggest security concerns stems from the fact that the tools’ design, functionality and security are controlled by a third party, not the healthcare actors.

European legal challenges for manufacturers of connected vehicles regarding personal data are explained in a nutshell by Bird&Bird:

“It could be that different pieces of information, such as vehicle service information, which on the surface don’t appear to constitute personal data, can be collated and linked to an individual via, for example, a Vehicle Identification Number. The consequence of this is that the CV manufacturer as the data controller might be under an obligation to divulge this data in response to data access requests which can be time consuming. There is a solution known as “tokenisation” which involves anonymising the data irreversibly.”
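As a very rough sketch of what such tokenisation could look like in practice, the snippet below replaces a Vehicle Identification Number with a keyed hash before service records are shared further. Whether this counts as irreversible anonymisation or merely pseudonymisation depends on how the key and any mapping are handled; the field names and key handling here are purely illustrative assumptions, not Bird&Bird’s or any manufacturer’s actual design.

```python
import hashlib
import hmac

# Illustrative sketch: replace the VIN in a vehicle service record with a keyed
# token so downstream data is no longer directly linked to an identifiable vehicle.
# Key management and rotation are deliberately left out of this toy example.

SECRET_KEY = b"replace-with-a-managed-secret"

def tokenise_vin(vin: str) -> str:
    """Return an opaque, deterministic token derived from the VIN."""
    return hmac.new(SECRET_KEY, vin.encode(), hashlib.sha256).hexdigest()[:16]

service_record = {"vin": "WVWZZZ1JZXW000001", "service": "brake pads replaced"}
service_record["vin"] = tokenise_vin(service_record["vin"])
print(service_record)  # VIN replaced by an opaque token
```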

The EU regulator, the EDPB, has recently published draft guidelines on the processing of personal data in the context of CVs and mobility-related applications. CV manufacturers must abide by the GDPR obligations in full, including privacy notices to car users and guarantees of data security and minimisation during repair or when performing data-driven after-sales services.

Big Tech

Canada’s Office of the Privacy Commissioner published observations following the joint statement by a number of data protection authorities on global privacy expectations of video teleconferencing companies, such as Microsoft, Google, Cisco and Zoom. These should include multilayered, contextual and timely visual and audible privacy notices, the ability to opt out of attendance or engagement reports, virtual and blurred backgrounds, user consent before a host unmutes audio or activates video, and transparency about third-party contractors and data centre locations. Wherever possible, users should be able to choose which locations and jurisdictions their personal information is routed through and stored in, and contractual measures should ensure that information is adequately protected when shared with third parties, including in foreign jurisdictions, along with end-to-end encryption and limits on the secondary use of data.

China’s market regulator proposed a long list of responsibilities it wants the country’s internet platforms to uphold, in the latest effort by Beijing to establish an oversight framework for its technology sector. Super-large platforms are defined as those having more than 500 million users, a wide range of business types, and a market value of more than 1 trillion yuan (roughly 135 billion euros), a description that would apply to the likes of Alibaba Group, Tencent Holdings and Meituan. Customer data should not be obtained without users’ consent, and platforms should be transparent when using big data to recommend products. China’s top internet regulator also published draft guidelines that would subject companies with more than 1 million users in the country to a security review before they can send user-related data abroad. Companies that have already sent abroad, or intend to send abroad, the personal information of more than 100,000 users, or “sensitive” personal information belonging to 10,000 users, would also be bound by the requirement.

Meanwhile in the US, an executive at TikTok, owned by Beijing-based internet technology company ByteDance, faced tough questions during the video-sharing app’s first appearance at a congressional hearing, saying it does not give information to the Chinese government and has sought to safeguard U.S. data. Lawmakers were concerned about TikTok’s data collection, including audio and a user’s location, and the potential for the Chinese government to gain access to that information. The executive testified that TikTok’s U.S. user data is stored in the United States, with backups in Singapore. Senators also voiced concerns that TikTok and its rivals YouTube and Snapchat have algorithms that can be harmful to young people.

Apple’s privacy updates, which began rolling out in April and prevent advertisers from tracking iPhone users without their consent, have had investors in digital ad companies on edge for fear that reduced access to data would upend the nearly 100 billion dollar mobile ad market. Ad businesses such as Snap’s or Facebook’s rely on direct response advertising, an industry term for ad sellers and buyers who use information such as what devices consumers are using and what they are searching for to place ads in front of interested audiences, with the aim of quickly generating sales or website visits. Twitter is likely to be spared because the social networking site is mainly used for brand advertising, and Google is also shielded from the iPhone privacy changes because much of its usage comes from desktops, and promoted results placed in Google searches do not depend on iPhone data.

While everyone is buzzing about Facebook’s rebranding and transition to the future Metaverse, last week privacy experts once again reminded us of the increasing regulatory backlash against Meta: “Regulators the world over are seeking to exercise greater restrictions on what the FB platform can do, with a UK watchdog fining it 70 mln dollars for withholding information related to an ongoing antitrust oversight of its acquisition of GIF-sharing platform Giphy. In Ireland, regulators want to fine the company 38 mln dollars for breaching GDPR data collection policies. And in the US, Congress is increasingly discussing the prospect of amending protections given to social media platforms and reforming antitrust laws and data privacy regulations that affect Facebook.”

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
