Data protection & privacy digest 2 – 15 Aug 2022: commercial surveillance, sensitive data “by comparison”, worker electronic monitoring

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes: commercial surveillance, sensitive data “by comparison”, worker electronic monitoring, farming data

The CJEU’s recent decision may have a major impact on digital services that use background tracking and profiling to target users with behavioral ads, TechCrunch reports. The decision of the EU’s top court concerned Lithuania’s anticorruption law. It found that the country’s rules on the online disclosure of data contained in the declarations of private interests of directors of institutions receiving public funds, (data concerning the declarant’s spouse, cohabitee, partner, etc.), are contrary to the fundamental rights to privacy and data protection in the EU. The court held that the online disclosure of the names of relatives and associates, and of their significant financial transactions, is not strictly necessary for the objective pursued and may constitute highly sensitive data “by comparison”.

Such disclosure is likely to reveal information on sensitive aspects of the private life of the persons concerned and to make it possible to draw up a particularly detailed portrait of them, including details of their sex life or sexual orientation, (Art. 9 of the GDPR). Finally, such processing results in this data being freely accessible on the internet to a potentially unlimited number of people. Thus, some privacy law experts suggest that the judgement’s broad definition of what constitutes sensitive data, (involving an act of comparison or deduction), potentially covers a wide range of online processing, including online ads, dating and health apps, location tracking and more, concludes TechCrunch.

In the US, the Federal Trade Commission, (FTC), is seeking public comment ahead of potential rulemaking on the prevalence of commercial surveillance and data security practices that harm consumers. The Commission invites comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies a) collect, aggregate, protect, use, analyze, and retain consumer data, and b) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive. The permissions that consumers give may not always be meaningful or informed. Studies have shown that most people do not generally understand the market for consumer data that operates beyond their monitors and displays, the FTC states. Many privacy notices that acknowledge such risks are reportedly not readable to the average consumer, let alone a minor. Ultimately, these practices, which now rely heavily on automated systems, may have significant consequences for consumers’ wallets, safety, and mental health.

The EDPS published its opinion on the proposal for a regulation converting the Farm Accountancy Data Network into a Farm Sustainability Data Network (FSDN). The proposal aims to regulate the processing of personal data in the context of the collection of individual farms’ economic, environmental and social data, as well as the further management and use of such data. The EDPS notes positively that, where individual data is shared by the Commission or liaison agencies, the farmers’ data and all other individual details obtained would be anonymised or pseudonymised. However, the EDPS considers that the proposal does not provide a specific reason of public interest justifying the publication of personal data in identifiable form, even if the data were pseudonymised prior to publication.

The EDPS therefore recommended specifying that only duly anonymised FSDN data may be made publicly available. The regulator considered it important to preserve a clear distinction between anonymisation and pseudonymisation, as pseudonymous data can still be related to an identifiable individual and therefore qualifies as personal data. Moreover, the EDPS considered it unclear whether the proposal refers only to the exchange of data between the national liaison agencies and the Commission, or also extends to sharing data with the general public or otherwise making it available for reuse. Finally, the interoperability provisions include the need to identify all the IT tools and linked databases, the data protection roles and responsibilities, and the relevant applicable safeguards. Read the full opinion here.

Meanwhile, Ontario has provided updated guidance on new legislation that requires employers to have a written electronic monitoring policy for workers. “Electronic monitoring” may include using GPS systems to track employee movement, using sensors to track how quickly an employee performs a task, or tracking the websites an employee visits during working hours. The policy must include:

  • A statement as to whether or not the employer electronically monitors employees.
  • How the employer may electronically monitor employees.
  • The circumstances in which the employer may electronically monitor employees.
  • The purposes for which information obtained through electronic monitoring may be used by the employer.
  • The date the policy was prepared, and the date any revisions were made.

Any employer that employs 25 or more people in total across all of its locations in Ontario will be required to have a written policy. When determining whether the 25-employee threshold has been met, an employer must count all employees across all of its Ontario locations, regardless of the number of hours worked or whether they are full- or part-time, and including probationary employees, employees on layoff, leave of absence or strike, and trainees.

Official guidance: use of cloud, sports associations, DPO, government data, customer research

The Danish data protection authority has published a questionnaire, (in Danish only), following its recent inspections of the use of cloud services by public authorities and private companies. The questionnaire covers most of the points that data controllers must be aware of if they use cloud solutions. It is divided into four parts:

  • know your services,
  • know your suppliers,
  • supervision of suppliers,
  • transfer to third countries.

Furthermore, each part is subdivided into two sections: a) the first concerns the organisation’s general rules, policies, procedures, etc. that enable it to comply with the relevant data protection rules; b) the second looks at whether the organisation has followed these policies, etc. with regard to the specific cloud service and provider, and if not, how it ensures compliance with the relevant data protection rules. The questionnaire can be downloaded via this link.

The French regulator CNIL offers amateur sports associations a self-assessment tool to test their compliance with the GDPR. The data subjects in this case include member athletes, athletes of an opposing team, paid or volunteer sports educators, referees, etc. The information collected serves very different purposes: maintaining the membership file, organizing competitions and tournaments, managing the club’s website, etc. The life cycle of the personal information contained in the files created by sports structures is likely to include four stages:

  • collection,
  • sharing and exchange, 
  • reuse, 
  • retention and destruction. (You can access the original questionnaire here).

The Dutch data protection authority recommends adjusting the proposal for an amendment to the Reuse of Government Information Act. The proposal, in which the government encourages government institutions to make government data, including personal data, available for reuse, does not set sufficient limits, raising the risk that personal data is shared without the permission or knowledge of the people involved. Under the proposal, that data must also be searchable with software and can be combined with other data. Personal data in the country’s Trade Register and the Land Registry is already public, and that is already causing problems: by running an algorithm on it and combining the personal data with other sources, companies can, for example, create profiles of people and sell them.

The Latvian privacy regulator published guidance on the mandatory appointment of a data protection officer. In particular, where a company’s economic activity is directly related to the large-scale processing of personal data, it is obliged to involve a data protection specialist in the organisation of specific processes, for example:

  • for a company whose main activity is related to the profiling of natural persons, with the intention of carrying out an assessment of their creditworthiness;
  • for a security company that uses video surveillance of publicly accessible areas as part of its core service;
  • for a company that performs customer behavior analysis, (products a customer has viewed, purchased, etc.), in order to send targeted marketing communications;
  • for a person who conducts customer research for the purpose of preventing money laundering;
  • for the maintainer of a mobile app that processes users’ geolocation data;
  • for companies that collect customer data as part of loyalty programs;
  • for persons who monitor clients’ well-being, physical fitness and health data through wearable devices;
  • for companies that process information obtained from devices connected to the IoT, (smart meters, connected cars, home automation devices, etc.).

Another guidance document from the Latvian privacy regulator concerns the prevention of money laundering and the financing of terrorism and arms proliferation. According to the country’s legislation, customer research must be conducted before starting a business relationship, as well as while maintaining it. Given that customer research applies not only to legal entities but also to natural persons, the regulator explains new procedures governing the licensing of common customer research tools for service providers, as well as the monitoring of their activities. Since personal data will be processed in the customer research tool, the privacy regulator has the following rights:

  • re-registration, suspension or cancellation of the service provider’s license;
  • inspections of the customer research tool service;
  • receiving information and documents free of charge from the service provider, which are necessary for the verification of the operation or for the consideration of the customer complaint received about its operation;
  • requiring that information erroneously or illegally included in the shared customer research tool be corrected or deleted;
  • requiring the service provider of the customer research tool to review its information systems, facilities and procedures and appoint an independent expert.

Investigations and enforcement actions: profiling, video surveillance and geolocation, access codes, privacy notice, reused mailbox

sensitive data "by comparison"

The Lower Saxony data protection commissioner has imposed a fine of 900,000 euros on a bank for profiling customers for advertising purposes. The company had evaluated data from active and former customers without their consent. To do this, it analysed digital usage behaviour and the total volume of purchases in app stores, the frequency of use of account statement printers, and the total amount of transfers made in online banking compared with the use of branch counters, using a service provider. In addition, the results of the analysis were matched against data from a credit agency and enriched with it. The aim was to identify customers with an increased affinity for digital media and to prioritise electronic channels for contacting them. Information was sent to most customers in advance along with other documents; however, such notices do not replace the necessary consent. The fine is not yet final.

The Luxembourg data protection authority recently issued a 3000 euro fine to an unnamed company for intrusive use of CCTV cameras and for failing in its obligation to inform its workers and third-party visitors. The company neither justified nor demonstrated how the video surveillance of the interior of the premises using door cameras, (installed and operated by subcontractor firms), was appropriate and necessary to protect the property, (fencing in this case could have been a replacement measure), and in particular to prevent burglary. The regulator also considered the psychological pressure that the cameras exerted on employees and third-party visitors, who felt observed at their workstations or meeting tables because of the cameras, which did not indicate whether they were operating or not.

In another recent case the Luxembourg regulator fined an unnamed company 1500 euros for geolocating its employees while they used vehicles to travel to customers. The data controller stated the following purposes for the geolocation: geographical tracking, asset protection, optimal fleet management, optimisation of work processes, and the provision of responses to customer complaints. Further investigation revealed other undisclosed purposes, such as combatting theft, reducing the number of kilometres driven, justification in the event of a dispute, monitoring and invoicing of services and, finally, monitoring working time and setting remuneration.

In the regulator’s opinion, the lack of a clear policy, the absence of an identified legal basis for all the above-mentioned processing, and a one-year data retention period violated the requirements of Art. 5, (lawfulness, fairness and transparency), and Art. 13, (information obligation), of the GDPR. Finally, the employees were unaware that their data could be transferred to the parent company, situated in a third country.

In Denmark, citizens’ information was exposed to unnecessary risk because Lolland Municipality’s employees were able to disable the access codes on phones and tablets. The Danish data protection authority issued a fine of approx. 6000 euros. In 2020 an employee of the municipality had a work phone stolen. The phone gave access to the employee’s work email account, which contained information about several citizens, including names, social security numbers, health information and sensitive events. The phone was not protected by an access code, as the code had been switched off, so access to its information was unrestricted. The municipality stated that over a number of years it had been possible for employees to remove the otherwise mandatory access codes, so that telephones could be used without a code. It had immediately initiated remedial measures in the form of new precautions and changes to the technical set-up of the telephones handed out.

The Romanian data protection authority has fined CDI Transport Intern si Internazionale, (among the largest passenger transport companies in Romania), 7000 euros after a complaint that the company’s website contained no information about how personal data is collected. The website also failed to inform users of the rights data subjects benefit from under Art. 15-22 of the GDPR, of details such as the purpose and legal basis of the processing, the identity and contact details of the operator, and the period for which the data will be stored or the criteria used to establish this period, or of the fact that the operator is obliged to inform data subjects in the event of a breach of personal data security.

Finally, the Spanish data protection authority AEPD punished an online teaching institution to the tune of 3000 euros after a claimant, a newly hired tutor, was given a corporate mailbox that had belonged to the person they were replacing. The organisation stated that the plaintiff had been hired to replace another worker, on sick leave, in the same field and with the same tasks, so that their work was a continuation of that worker’s teaching and tutoring activities with students, for which knowledge of all the background and communications between teacher and pupil was necessary. It argued that the data to which the plaintiff had access was needed for the exercise of their duties. The data in the mailbox included pupils’ personal information, but also tax documentation, banking details, invoices, etc. The new tutor was told they could access and delete folders in the inbox if needed. The regulator decided that basic security measures were not respected in this case.

Data security: email aliases, IoT devices

According to the US cybersecurity guru Brian Krebs, one way to protect your email inbox is to get into the habit of using unique email aliases when signing up for new accounts online. You can create an endless number of different email addresses linked to the same account by adding a “+” character after the username section of your email address, followed by a notation relevant to the website you’re signing up at. Many threat actors reportedly remove aliased addresses from their distribution lists, because they believe these consumers are more concerned with security and privacy than other users and are therefore more likely to report spam sent to their aliased addresses. Finally, email aliases are so uncommon that finding just a few addresses sharing the same alias in a breached database can make it easy to determine which organization was probably hacked and which database was released.
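As a rough illustration of how “+” aliasing works in practice, the sketch below, (in TypeScript, with a made-up mailbox and site labels), builds a per-site alias and recovers the label from an incoming address, which is how you might spot where a leaked address was originally registered.

```typescript
// Minimal sketch of "+" aliasing; the mailbox and labels are made-up examples.

/** Build a plus-addressed alias such as "jane+shop-a@example.com". */
function plusAlias(address: string, label: string): string {
  const [local, domain] = address.split("@");
  // Keep the label to characters that are broadly accepted in the local part.
  const safeLabel = label.toLowerCase().replace(/[^a-z0-9._-]/g, "");
  return `${local}+${safeLabel}@${domain}`;
}

/** Recover the label from a received alias, e.g. to see which sign-up leaked it. */
function aliasLabel(address: string): string | null {
  const match = address.match(/^[^+@]+\+([^@]+)@/);
  return match ? match[1] : null;
}

console.log(plusAlias("jane@example.com", "Shop-A")); // jane+shop-a@example.com
console.log(aliasLabel("jane+shop-a@example.com"));   // shop-a
```

Note that some sign-up forms reject addresses containing “+”, and a recipient can trivially strip the alias to recover the base address, so this is a convenience for tracing and filtering rather than a strong privacy guarantee.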

The US Health Sector Cybersecurity Coordination Center published an advisory note for the healthcare sector on the risks posed by Internet of Things (IoT) devices. Since these devices can collect data that includes personally identifiable information, it is important to secure them. Ultimately, the goal is to protect the entire system, and there are steps that can be taken to help accomplish this: a) securely store, process, and transfer data, b) keep devices safeguarded, and c) update devices to reduce vulnerabilities. To minimize risks from IoT devices you need to:

  • Change default router settings: Most people do not rename their router or change the manufacturer’s default settings, which typically benefit the manufacturer more than the user.
  • Pick a strong password: Make sure to use a secure password for each device. 
  • Avoid using Universal Plug and Play (UPnP): It makes it easier to network devices without additional configuration, but it can also let devices open ports and connections on your network without review, (a quick way to check whether UPnP discovery is active is sketched after this list).
  • Keep your software and firmware updated: Firmware updates deliver the latest security patches and reduce the chances of cyber-attacks.
  • Implement a Zero Trust Model: A zero trust model assumes that nothing inside or outside the network can be trusted. Only a limited number of people require access to certain resources to accomplish their jobs. For this strategy to be effective, administrators must determine who the users are and what role they play.
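Following on from the UPnP point above, the sketch below, (a TypeScript example assuming a Node.js machine on the local network; the five-second timeout is an arbitrary choice), sends a standard SSDP discovery probe and prints any device that replies, which is a simple way to see whether UPnP discovery is answering on your LAN.

```typescript
// Minimal SSDP (UPnP discovery) probe: any reply means a device on the LAN is
// advertising UPnP services. Run with Node.js.
import dgram from "node:dgram";

const SSDP_ADDRESS = "239.255.255.250";
const SSDP_PORT = 1900;

// Standard M-SEARCH request asking all UPnP devices to identify themselves.
const probe = Buffer.from(
  "M-SEARCH * HTTP/1.1\r\n" +
    `HOST: ${SSDP_ADDRESS}:${SSDP_PORT}\r\n` +
    'MAN: "ssdp:discover"\r\n' +
    "MX: 2\r\n" +
    "ST: ssdp:all\r\n\r\n"
);

const socket = dgram.createSocket("udp4");

socket.on("message", (msg, rinfo) => {
  // Print the responding address and the first few header lines of the reply.
  console.log(`UPnP response from ${rinfo.address}:${rinfo.port}`);
  console.log(msg.toString().split("\r\n").slice(0, 4).join("\n"), "\n");
});

socket.bind(() => {
  socket.send(probe, SSDP_PORT, SSDP_ADDRESS);
  setTimeout(() => socket.close(), 5000); // listen for replies for 5 seconds
});
```

The probe only shows whether anything is currently answering discovery requests; actually turning UPnP off is done in the router’s admin interface and differs by vendor.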

Big Tech: drivers’ data, cyberattack on NHS software, Meta’s tracking code

Only 28% of drivers have any idea what sort of data they generate, and what is collected, when they drive, and they may never have heard of the at least 37 companies leading a growing vehicle data market, says a report in The Markup. It’s a market with vast amounts of personal data all for sale: by whom, for whom, and with what aim? With the growth of third-party vehicle data hubs concentrating data, and the range of data putting anonymisation at risk, the report notes a lack of regulation that High Mobility’s CEO and founder Risto Vahtra warns could be a “privacy hell”. The report also criticises car manufacturers for failing to develop clear on-screen interfaces, like those on mobile phones, for drivers to choose privacy settings, which in some cases are entirely lacking. Legislation tackling this is currently at the committee stage in the US Congress.

UK government agencies, along with the National Cyber Security Centre, are investigating whether patient data was stolen in a severe cyberattack on NHS software supplier Advanced. The supplier was hit by ransomware on August 4th, taking offline several urgent treatment centres, some mental health facilities, and the 111 phone line used for, among other things, booking doctor’s appointments. The hack could take nearly two weeks to resolve, and updates on the status of the data are awaited, although Advanced says it has “contained” the breach.

When you click on a link in Facebook or Instagram, owner Meta has been inserting code into the website you visit, allowing your navigation to be tracked. That’s according to former Google engineer and privacy activist Felix Krause, who has published new research. It is unknown how long Meta has been using the tracking code in its in-app browser. Krause built a tool to see how many extra instructions were added to a website by a browser. In most cases none were added, but navigation via Facebook or Instagram added as many as 18 lines of code. This so-called “Javascript injection” is often classified as a “malicious attack”, but there is no suggestion Meta has used it for anything beyond monitoring user interactions, such as every button and link tapped, text selections, or screenshots.
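As a rough, hypothetical illustration of how injected code can be spotted from the page side, (this is not Krause’s actual tool), the sketch below uses a standard MutationObserver to log any script element added to the document after this code runs; in an ordinary browser it would typically log nothing, consistent with the report’s finding that most browsers add no extra code.

```typescript
// Hypothetical page-side check (not the research tool described above):
// log every <script> element that appears in the DOM after this code runs.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    mutation.addedNodes.forEach((node) => {
      if (node instanceof HTMLScriptElement) {
        // Report the external source URL, or a snippet of any inline code.
        const summary = node.src || (node.textContent ?? "").slice(0, 80);
        console.log("Script added after page load:", summary);
      }
    });
  }
});

// Watch the whole document tree for added nodes.
observer.observe(document.documentElement, { childList: true, subtree: true });
```

A check like this only sees scripts added after it starts observing, so code injected before the page’s own scripts run would need a different approach.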

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
