Data protection digest 18 Feb – 2 Mar 2024: web browsing data for sale, banking sector outsourcing, cybersecurity core 2.0

This issue covers the Avast/Jumpshot case, in which web browsing data that was not properly anonymised, according to America’s FTC, was sold worldwide; the EDPB’s new enforcement action on the right of access; cloud outsourcing in the banking sector; NIST’s new cybersecurity framework for all organisations; and an analysis of federated learning.

Stay tuned! Sign up to receive our fortnightly digest via email.

Web browsing data for sale

The UK software provider Avast will pay 16.5 million dollars to settle charges brought by the US Federal Trade Commission, and the business will be banned from selling or licensing any web browsing data for advertising purposes. According to the FTC, Avast unfairly collected consumers’ browsing data through its antivirus software and browser extensions, retained it indefinitely, and sold it without providing consumers with sufficient notice or obtaining their consent. The company also did this through its Czech subsidiary.

After acquiring rival antivirus software supplier Jumpshot, Avast rebranded it as an analytics firm. Jumpshot sold browsing data that Avast had gathered from users between 2014 and 2020 to a range of customers, including marketing, advertising and data analytics firms, as well as data brokers. The company claimed that it used an algorithm to remove identifying information before sending the data to its clients.

However, according to the FTC, the company did not adequately anonymise the user web browsing data that it sold, through a variety of products, in non-aggregated form. The FTC also says the company did not prohibit some of its data purchasers from using Jumpshot’s data to re-identify Avast users. For instance, Jumpshot allegedly signed a deal with advertising giant Omnicom to supply an “All Clicks Feed” covering 50% of its customers in the US, UK, Mexico, Australia, Canada, and Germany.

Americans’ sensitive data

The US is tightening restrictions on cross-border data transfers over national security concerns.

President Biden issued an Executive Order to protect Americans’ sensitive personal data. It aims to prevent the large-scale transfer of Americans’ sensitive and US government-related data to countries of concern (reportedly China, Cuba, Iran, North Korea, Russia and Venezuela), and to prohibit commercial data brokers and other companies from selling biometric, healthcare, geolocation, financial and other sensitive data to those countries, or to entities controlled by their governments, intelligence services and militaries.

The US Justice Department’s National Security Division has already published an Advance Notice of Proposed Rulemaking to provide transparency and clarity about the intended scope of the program. It would cover six defined categories of bulk US sensitive data: US persons’ covered personal identifiers, personal financial data, health data, precise geolocation data, biometric identifiers, human genomic data, and combinations of those data. The security requirements for certain classes of data transactions would include:

  • basic organisational cybersecurity posture,
  • measures against unauthorised disclosure, 
  • data minimisation and masking (a minimal sketch follows this list),
  • use of privacy-preserving technologies,
  • compliance requirements and audits.
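
To make the minimisation-and-masking item above more concrete: as a purely illustrative sketch (the rulemaking itself does not prescribe any implementation, and all names here are hypothetical), a record could be stripped to the fields a transaction needs, with direct identifiers replaced by keyed pseudonyms:

```python
import hashlib, hmac, os

SECRET_KEY = os.urandom(32)   # in practice, managed in a key vault

def pseudonymise(identifier: str) -> str:
    # HMAC rather than a plain hash, so pseudonyms cannot be
    # brute-forced without the key, yet stay consistent across records.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"name": "Jane Doe", "email": "jane@example.com",
          "country": "US", "purchase_total": 42.50}
masked = {"user": pseudonymise(record["email"]),       # masked identifier
          "country": record["country"],                # minimised: keep only
          "purchase_total": record["purchase_total"]}  # what the task needs
print(masked)
```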

The Department of Justice is also considering identifying three classes of restricted data transactions: a) vendor agreements (including for technology services and cloud services), b) employment agreements, and c) investment agreements. Nonetheless, the program established by the order is without prejudice to the free flow of data necessary for the substantial consumer, economic, scientific, and trade relationships that the US has with other countries.

Other official guidance

The EDPB’s new enforcement action: 31 data protection authorities (DPAs) across the EEA, including 7 German state-level regulators, will participate in the 2024 coordinated enforcement action (a mixture of surveys and formal investigations) on implementing the right of access. It is one of the most frequently exercised data protection rights, and one that DPAs receive many complaints about. In addition, it often enables the exercise of other data protection rights, such as the rights to rectification and erasure. To understand how organisations must respond to access requests from individuals, see the EDPB’s latest guidelines on the right of access.

Generative AI and data protection: In the UK, the House of Lords Communications and Digital Committee has published a report on large language models (LLMs). These may contain personal data in their training sets, drawn from proprietary sources or information online. Safeguards to prevent inappropriate regurgitation are being developed but are not yet robust. Data protection in healthcare attracts particular scrutiny, as some firms are already using the technology on NHS data, which may yield major benefits.

But equally, models cannot easily unlearn data, including protected personal data. There may also be concerns about these businesses being acquired by large overseas corporations involved in, for example, insurance or credit scoring. Clear guidance is needed on how data protection law applies to the complexity of LLM processes, including the extent to which individuals can seek redress if a model has already been trained on their data and released. Data protection provisions also have to be embedded in licensing terms.

Consent principle

A company or an authority does not always need your consent before it can process your data, explains the Danish data protection authority. This is because consent is only one of several legal bases for processing. When you withdraw your consent, storage of your information must cease, but only for the information that is processed on the basis of that consent.

Information processed on another legal basis, for example a commercial contract or an employment relationship, can continue to be processed or stored. Consent is also not required if you, the data subject, are unable to give it, for example to a healthcare facility during a serious illness. Public authorities can likewise process your data for specific tasks, such as handling your tax declarations. Private companies may have legitimate reasons too (such as maintaining user services), but these must not override your interests or rights.

Finally, a withdrawal of consent does not have retroactive effect: it does not affect the processing of information that took place before.

Rise in outsourcing contracts in the banking sector

The European Central Bank urges supervised institutions to tackle vulnerabilities stemming from their increasing operational reliance on third-party providers. Most banks outsource certain services to take advantage of lower costs, more flexibility and greater efficiency. Given the relatively stringent data protection regulations in the EU, it is noteworthy that personal data processing is included in 70% of outsourcing contracts, and that over 70 major banks contract these vital services out to companies headquartered outside the EU (e.g. cloud services in the US, the UK and Switzerland).

The ECB discovered that over 10% of contracts concerning essential tasks do not adhere to the applicable requirements. Furthermore, 20% of these non-compliant contracts have not undergone a rigorous risk assessment in the past three years, and 60% have not undergone an audit.

The Digital Operational Resilience Act, which takes effect in 2025, will offer further tools for monitoring important IT service providers, particularly those that underpin the operational resilience of financial institutions.


Illicit marketing

The Italian privacy regulator imposed a fine of over 79 million euros on Enel Energia for serious shortcomings in the processing of the personal data of numerous users in the electricity and gas sector, carried out for telemarketing purposes. The case originated from a previous investigation, which resulted in a 1.8 million euro privacy fine on four companies and the confiscation of databases used for illicit activities. It emerged that Enel Energia had acquired 978 contracts from those companies, even though they did not belong to the energy company’s sales network.

Furthermore, the information systems the company used for customer management and service activation showed serious security shortcomings. Enel failed to put in place all the necessary measures to prevent the unlawful activities of unauthorised actors who for years fuelled an illicit business of nuisance calls, service promotions, and the signing of contracts with no real economic benefit for customers. Over time this involved the activation of at least 9,300 contracts.

Meanwhile, in California, a company will pay a 375,000 dollar civil penalty for violating multiple consumer privacy laws. DoorDash is a San Francisco-based company that operates a website and mobile app through which consumers may order food delivery. To reach new customers, DoorDash participated in marketing cooperatives and, as part of its membership, disclosed consumers’ personal information without providing notice or an opportunity to opt out. The other businesses participating in the cooperatives also gained the opportunity to market to DoorDash customers.

Data brokerage

Belgium’s data protection regulator recently fined Black Tiger Belgium (formerly Bisnode Belgium), a company specialising in big data and data management, a total of 174,640 euros. At the time the complaints were lodged, Bisnode Belgium operated a consumer database and a company database through which it offered “Data Quality” (improving the quality of its customers’ data) and “Data Delivery” (providing data to its customers, especially for marketing campaigns). These databases consisted of personal data and user profiles drawn from various external sources.

The regulator received a complaint against Bisnode based on the so-called ‘right of access’, which allows anyone to request access to the data a company keeps about them at any time. The investigation found that the company, relying on its legitimate interest, indirectly collected and processed personal data on a large scale and over a long period (15 years), without data subjects being informed individually, clearly and proactively about the processing. The company also lacked records of its processing activities.

Other enforcement decisions

Student privacy vs teachers’ authority: The Icelandic data protection authority ruled on personal data processing by the University of Iceland. According to the complaint, a teacher had monitored a student through the teaching site in the Canvas learning management system. However, the supervisory authority concluded that there was no electronic monitoring, as the teacher’s assessment of the complainant’s activity in the learning management system was not sustained or repeated regularly. The processing of personal information was also considered necessary for the university in connection with tasks entrusted to it by law.

However, the complainant was not sufficiently informed of the teacher’s ability to examine their use of the Canvas learning management system and make it a basis for grading. Peer assessment by the complainant’s fellow students in a group project was likewise one of the factors that formed the basis of the grade for the assessment component. The university’s processing therefore failed to comply with the transparency requirements of privacy legislation.

Biometric scanning abuse: In the UK, Serco Leisure, Serco Jersey and seven associated community leisure trusts have been issued enforcement notices ordering them to stop using facial recognition technology and fingerprint scanning to monitor employee attendance. The investigation found that Serco and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities. Serco had to record employee attendance to pay workers under its contractual duties, but it rejected less intrusive options available, such as timesheets or electronic cards. Although Serco had indicated that these options could be abused, it showed no proof of real, widespread misuse.

Data security

Password retention guide: Identity theft is too often caused by the use of authentication credentials stored in databases that are not adequately protected with cryptographic functions. Stolen credentials are used to illicitly enter entertainment sites (35.6%), social media (21.9%) and e-commerce portals (21.2%). In other cases, they allow access to forums and websites of paid services (18.8%) and financial services (1.3%). As a result, the Italian data protection authority recently published an FAQ and more detailed guidelines on password storage, setting out the cryptographic functions currently considered the most secure (in Italian only).
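
The guidelines themselves are in Italian and specify their own parameter choices, which we do not reproduce here. As a minimal illustration of the general principle, the sketch below stores passwords with scrypt, one of the memory-hard key-derivation functions commonly recommended for this purpose, rather than a plain hash:

```python
import hashlib, hmac, os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a password record with scrypt, a memory-hard KDF."""
    salt = os.urandom(16)                      # unique salt per credential
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1, dklen=32)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1, dklen=32)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
```

The per-credential salt ensures identical passwords produce different records, and the memory-hard derivation makes bulk brute-forcing of a stolen database far more expensive than with a fast hash such as SHA-256.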

Cybersecurity core 2.0: America’s NIST has meanwhile released version 2.0 of its landmark Cybersecurity Framework, the framework’s first major update since its creation in 2014. It now explicitly aims to help all organisations, not just the critical infrastructure operators that were its original target audience, to manage and reduce risks. The framework’s core is now organised around six key functions: Identify, Protect, Detect, Respond and Recover, along with the newly added Govern function. The CSF is widely used internationally: versions 1.1 and 1.0 have been translated into 13 languages, and NIST expects that CSF 2.0 will also be translated by volunteers around the world.

Federated Learning

The UK Responsible Technology Adoption Unit, in cooperation with NIST, published a series of analyses of Privacy-Preserving Federated Learning, a machine learning approach in which a model is trained without the centralised collection of training data. Organisations often struggle to articulate its benefits. It can lower infrastructure and network overheads, but bespoke privacy infrastructure can introduce additional costs, and fewer people have the skills and experience required to design and deploy it.
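
As a minimal sketch of the underlying idea (a toy linear model on synthetic client data, not the specific techniques analysed in the series), federated averaging lets each participant train locally and share only model weights with a central server:

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])

# Each "client" holds a private dataset that never leaves it.
clients = []
for _ in range(5):
    x = rng.normal(size=(20, 3))
    y = x @ true_w + rng.normal(scale=0.1, size=20)
    clients.append((x, y))

w = np.zeros(3)                          # global model shared with clients
for _ in range(50):                      # communication rounds
    local_ws = []
    for x, y in clients:
        w_local = w.copy()
        for _ in range(5):               # local gradient-descent steps
            grad = 2 * x.T @ (x @ w_local - y) / len(y)
            w_local -= 0.05 * grad
        local_ws.append(w_local)         # only the weights are sent back
    w = np.mean(local_ws, axis=0)        # server averages updates (FedAvg)

print(np.round(w, 3))                    # ≈ true_w; raw data stayed local
```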

On the other hand, federated learning allows organisations to use and monetise data assets that would not previously have been accessible. By removing the need for access to the full data, it protects the value of the data for the data owner. Finally, legal consultation is a necessary cost, but in principle PETs can significantly reduce data protection risks: when used appropriately, differentially private data can be considered anonymised.
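
For illustration of the differential privacy point, and not as a reproduction of the analyses themselves, the sketch below applies the classic Laplace mechanism to a counting query; the data and parameter values are hypothetical:

```python
import numpy as np

def dp_count(values, threshold, epsilon, rng=np.random.default_rng()):
    """Release a count via the Laplace mechanism.

    A count query has sensitivity 1 (adding or removing one person
    changes it by at most 1), so Laplace noise with scale 1/epsilon
    yields an epsilon-differentially private release.
    """
    true_count = sum(v > threshold for v in values)
    return true_count + rng.laplace(scale=1.0 / epsilon)

salaries = [31_000, 45_000, 52_000, 78_000, 101_000]
print(dp_count(salaries, threshold=50_000, epsilon=0.5))
```

A smaller epsilon means more noise and stronger privacy; the released statistic reveals little about any single individual while remaining useful in aggregate.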

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
