Weekly digest May 2 – 8, 2022: DPO dismissals, shareholders, athletes' privacy, passwordless future & more

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal Processes and Redress: DPO dismissals, Connecticut privacy draft law, EU Health Data

An Ius Laboris blogpost explains when data protection officers have special protection from dismissal. Art. 38(3) of the GDPR expressly states that the data protection officer shall not be dismissed or penalised by the controller or the processor for performing their tasks. It establishes an additional guarantee for DPOs, who cannot be dismissed for the mere performance of their duties. An additional safeguard must therefore be put in place for this category of employees, (as in the comparison drawn here between DPOs and employees appointed as members of an organisation’s Workers’ Representatives). Spanish law does not specifically provide this option to DPOs. However, in 2021, the Labour Chamber of the High Court of Justice of Madrid analysed the remedies available to DPOs in the event of unfair dismissal, in particular whether they are entitled to choose between reinstatement in their job and an unfair dismissal severance payment where no valid grounds support their dismissal. In the end, the Spanish court not only confirmed the DPO was unfairly dismissed, but also authorised both remedies. Read more here.

Meanwhile in the US, Connecticut legislators from both chambers passed a major act on personal data privacy and online monitoring, (SB 6). It is now under consideration by the State Governor, Ned Lamont. If the bill becomes law, it will go into effect on July 1, 2023, making Connecticut the fifth state to enact a comprehensive data privacy law, JD Supra News&Insights reports. SB 6 would apply to individuals or entities that conduct business in Connecticut and, during the preceding year, controlled or processed the personal data of at least either: a) 100,000 consumers, excluding personal data controlled or processed solely for completing a payment transaction, or b) 25,000 consumers, where the entity derived more than 25% of its gross revenue from selling personal data. It also protects sensitive data, such as data concerning minors, ethnic origin, citizenship and immigration status, but with a number of exceptions under HIPAA or the FCRA.
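The two applicability thresholds form a simple disjunction, with the payment-transaction exclusion applying only to the first test. The sketch below is an illustrative model only; the function and parameter names are ours, not the bill's, and real applicability turns on legal analysis, not arithmetic alone:

```python
def sb6_applies(consumers: int,
                payment_only_consumers: int,
                data_sale_revenue_share: float) -> bool:
    """Illustrative model of the SB 6 applicability thresholds.

    consumers: Connecticut consumers whose personal data the entity
        controlled or processed during the preceding year.
    payment_only_consumers: subset whose data was controlled or processed
        solely for completing a payment transaction (excluded from test a).
    data_sale_revenue_share: fraction of gross revenue derived from
        selling personal data.
    """
    # Threshold (a): 100,000+ consumers, excluding payment-only data
    if consumers - payment_only_consumers >= 100_000:
        return True
    # Threshold (b): 25,000+ consumers AND >25% of revenue from data sales
    if consumers >= 25_000 and data_sale_revenue_share > 0.25:
        return True
    return False
```

For example, an entity with 120,000 consumers of which 30,000 are payment-only falls below threshold (a) at 90,000, and would then be covered only if it also meets threshold (b).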

Its main principles and obligations on data controllers include: 

  • Data Minimization 
  • Duty to Avoid Secondary Use
  • Security Practices
  • Consent
  • Privacy Notices
  • Non Discrimination 
  • Data Protection Assessments

And for data processors: 

  • Data Processing Agreements
  • Data Subject Requests
  • Duty of Care (assisting the controller)
  • Data Protection Assessments
  • Confidentiality
  • Subcontractors

According to Reuters, the European Commission wants to make health data easier to access by 2025 for patients, doctors, regulators and researchers in a bid to improve diagnoses, cut unnecessary costs from duplication of medical tests and boost medical research. Electronic prescriptions are also estimated to lead to large savings by reducing errors in dispensing medicines, as many member states still use paper prescriptions. Under the plan:

  • Healthcare providers would be required to produce electronic health data that are interoperable.
  • Data generated from patients’ health records and wellness apps would be pooled in compatible formats and made accessible to patients, regulators and researchers under strict rules to protect privacy, (e.g. anonymised health records for analysts and data professionals).
  • Stronger cybersecurity is also planned.

In parallel, last week the European Commission announced that it had launched the European Health Data Space, (EHDS), one of the central building blocks of a strong European Health Union. The EHDS builds further on the GDPR, the proposed Data Governance Act, the draft Data Act and the NIS Directive. It complements these initiatives and provides more tailor-made rules for the health sector. The EHDS will make use of the ongoing and forthcoming deployment of public digital goods in the EU, such as Artificial Intelligence, High Performance Computing, cloud and smart middleware. In addition, frameworks for AI, e-Identity and cybersecurity will support the space.

Official Guidance: UK regulators’ work plan, opinion on Data Act, athletes’ data, treatment of health data

The UK government promises to bring together the major regulators tasked with regulating digital services in 2022-2023: the Competition and Markets Authority (CMA), the Financial Conduct Authority (FCA), the Information Commissioner’s Office (ICO), and the Office of Communications (Ofcom). Their key priorities, among many, will be:

  • Protecting children online: This includes a joint working framework to support the oversight of Ofcom’s Video Sharing Platform (VSP) regulatory framework and the ICO’s Age Appropriate Design Code (AADC) regime, as well as joint research on age assurance.
  • Promoting competition and privacy in online advertising: This includes the CMA and ICO working together to review: Google’s emerging proposals to phase out third-party cookies; and Apple’s App Tracking Transparency and Intelligent Tracking Prevention features.
  • Developing a clear articulation of the relationships between competition and online safety policy.
  • Continuing to develop the understanding of end-to-end encryption, etc. Read the full workplan here.

The EDPS and EDPB published their joint opinion on the proposed Data Act. The draft law aims to establish harmonised rules on the access to, and use of, data generated from a broad range of products and services, including connected objects, (‘Internet of Things’), medical or health devices and virtual assistants. It also aims to enhance data subjects’ right to data portability under Art. 20 of the GDPR. The EDPB and EDPS urged legislators to ensure that data subjects’ rights are duly protected, namely:

  • The access, use and sharing of personal data by entities other than data subjects should occur in full compliance with all data protection principles.
  • Products should be designed in such a way that data subjects are offered the possibility to use devices anonymously or in the least privacy-intrusive way possible. 
  • Clear limitations should be placed on the use of the relevant data for purposes of direct marketing or advertising; employee monitoring; calculating or modifying insurance premiums; and credit scoring.
  • Limitations on the use of data should also be provided to protect vulnerable data subjects, in particular minors.
  • Defining the legal basis for the cases of emergency or “exceptional need” in which public sector bodies and EUIs should be able to request data.
  • Designating national data protection authorities as coordinating competent authorities under the Data Act.

Meanwhile, the EU Parliament adopted a set of proposals to develop AI in the long term. The report warns that the EU needs to act fast to set clear standards based on EU values, otherwise the standards will be set elsewhere. As AI technologies depend on available data, sharing of data in the EU needs to be revised and extended. Full integration and harmonisation of the EU digital single market will help cross-border exchange and innovation. Other measures include: 

  • Digital infrastructure should be strengthened, ensuring access to services for everyone. 
  • The deployment of broadband, fibre and 5G should be supported and key emerging technologies such as quantum computing should be a priority. 
  • The EU should support the development of AI skills so that people have the skills needed for life and work. 
  • The military and security aspects of AI also need to be tackled: the EU should cooperate internationally with like-minded partners to promote its human-centric, EU-value based vision, says the report. Learn more about the AI roadmap and the special committee report here.

Data protection in sport and the legal implications of collecting athletes’ data were analysed by Australian lawyers from Holding Redlich. Data collection in sport is not new. It has long been commonplace to record athletes’ data, particularly things like heart rate, to understand the body and ultimately increase performance. “What is changing though is the type of data that can be collected, the technological advances, the ease at which it can be collected and the ways in which the data can be stored and manipulated” states the article. Additionally, data collection is no longer limited to the time an athlete is actually training, with a variety of sources and data types proliferating. It is therefore important to oblige sporting organisations to:

  • account for and govern the collection and use, (including disclosure), of personal information;
  • base collection on the principle of ‘reasonably necessary’, (this depends on whether there is a clear connection between the information collected and the organisation’s functions or activities);
  • ensure the integrity of athletes’ personal information and their ability to correct it;
  • provide individuals with the rights to access their personal information and to make a complaint;
  • require a higher level of privacy consideration for sensitive athlete data;
  • include in contracts with athletes clauses, or a well-drafted privacy policy, governing the collection and use of data, and ensure these clauses are sufficiently broad, etc.

The Spanish data protection authority AEPD has added a new section to its website on health and data protection, (in Spanish). The knowledge base is made up of seven sections, ranging from general information on the treatment of health data and how to exercise the right of access to medical records, to issues related to medical research, clinical trials and personal data breaches. The objective is to provide a systematised compendium of legislation, criteria, doctrine and precedents. In 2021, AEPD registered 680 health-related claims, an increase of 75% compared to 2020. Additionally, in the second half of 2021, 15% of the breach notifications received by the regulator were made by data controllers whose main activity is in healthcare or the field of health.

Data Breaches, Investigations and Enforcement actions: abortion clinic visits, shareholders data, alarm services footage

US data broker company SafeGraph may be selling the location data of people who have visited health clinics that provide abortion services, according to IAPP News reports. The data sets, (location data from ordinary apps installed on people’s phones), reportedly show where groups of patients came from, how long they stayed at the clinic and where they went afterwards. Sometimes app users don’t even know that their phone, be it via a prayer app or a weather app, is collecting and sending location data to third parties. The company then calculates where it believes a visitor lives by their US Census block. Additionally, there are concerns that vigilante activity and harassment of patients by anti-abortion activists could increase due to the availability of such location data. Read the full investigation on the topic by Vice here.

The Norwegian Data Protection Authority has reprimanded seafood company Mowi for failing to disclose to its shareholders all the information required by the country’s privacy legislation. This concerns personal data that Mowi collected directly from the managers of the company’s shares. In Norway and other European countries, you can buy shares in listed companies via a bank that acts as the manager of the shareholding, which means the company does not necessarily know who its shareholders are. However, the Public Limited Liability Companies Act gives the company the right to be informed by the nominee who the underlying owner of the shares is. When the company obtains such information from the manager, personal data is processed. The company must therefore provide the relevant shareholders with all the required information, (so that anyone who buys shares via their bank is aware that their data can be shared with the company they bought shares in).

The Swedish privacy regulator IMY has initiated an inspection of the alarm company Verisure. Media reports claim that, in connection with incoming alarms, employees at the company shared security footage and images among themselves in various ways without justification. The pictures were saved on employees’ own hard drives, and IMY has also received complaints from customers regarding Verisure’s processing of personal data.

The inspection will establish what has happened, but will also examine what technical security measures the company has in place, in the form of authorisation controls and logs, and what instructions employees are given on how images may be handled. It will establish what routines are followed when alarms are received, in which situations the customers’ cameras are activated, what rules and routines exist for taking pictures and saving them on employees’ hard drives, and finally, whether the information that has appeared in the media is correct.

Data Security: passwordless standards

‘Your Phone May Soon Replace Many of Your Passwords’, says US cybersecurity guru Brian Krebs in his latest blogpost. Apple, Google and Microsoft announced they will soon support an approach to authentication that avoids passwords altogether, and instead requires users to merely unlock their smartphones to sign in to websites or online services: “Apple, Google and Microsoft already support these passwordless standards, (e.g. “Sign in with Google”), but users need to sign in at every website to use the passwordless functionality. Under this new system, users will be able to automatically access their passkey on many of their devices — without having to re-enroll every account — and use their mobile device to sign into an app or website on a nearby device”. Experts predict the changes should help repel many types of phishing attacks and ease the overall password burden on Internet users, says the article.

Big Tech: bank consumer data, competition and privacy on digital platforms

The Bank for International Settlements, the central bankers’ umbrella organisation, has published a paper calling for consumers and companies to control their own digital data. The paper notes consumers are mostly unaware of the value of their data and should be freely able to opt in or out of data collection at will, within a transparent, safeguarded data governance system. Citing the experience of India’s Data Empowerment Protection Architecture, the paper says such a system need not be expensive and can operate at scale. Read the full text here.

How much does competition trump privacy where personal data is concerned? How much does this issue figure in the minds of regulators, keen to support business, and civil society groups, (CSGs), concerned with protecting freedoms? This is particularly true for digital platforms, such as social media platforms, search engines, digital entertainment, or online retailers. The way in which market dominance is traditionally measured does not always capture the extent of these companies’ market power, as their products and services are often ‘free’ to consumers. This trend is fuelled by the increasing reliance of many sectors of the economy on data, particularly personal data. Privacy International took input from 10 international regulatory authorities and around three times that number from civil society groups, and has published the findings in a report.

Access to personal data is perceived as an increasingly valuable capability in the digital economy, and its acquisition at vast scale is what allows big tech companies to make billions of dollars each year via targeted advertising. Among the report’s main conclusions is that competition and personal data considerations are part and parcel of the way both regulators and CSGs work, and this is not specific to any legal jurisdiction or location. You can read the full report here.

Book a free consultation to discuss your DPO needs and the most suitable package

Request your free consultation