Weekly digest April 18 – 24, 2022: business and human rights in the activities of tech companies

TechGDPR’s review of international data-related stories from press and analytical reports.


Official Guidance: business and human rights in the activities of tech companies, relaxed covid measures, regulators’ annual analytics

Privacy International (PI) submitted its input to the forthcoming report by the UN High Commissioner for Human Rights on the practical application of the UN Guiding Principles on Business and Human Rights to the activities of technology companies. In summary, the PI report highlights the industry’s systemic lack of accountability, national authorities’ slow or nonexistent enforcement of privacy laws against its exploitative practices, and its relations with governments. Among many things, it:

  • asserts the need for tech companies to provide transparency over their technologies and to make their algorithms auditable, and for states to mandate such transparency when these technologies are used to deliver public functions; 
  • reasserts that contracts between public authorities and tech companies must point to redress mechanisms for complaints handling and enforcement of sanctions for abuses or violations of human rights;
  • calls for public authorities to conduct individual human rights risk and impact assessments, as well as data protection impact assessments, during any surveillance technology procurement process, in addition to companies conducting human rights due diligence on any prospective state client’s end-use of their technology;
  • asserts that public authorities should not systematically use surveillance and data processing systems deployed for private purposes and/or data derived from these systems, etc.

As COVID-19 measures relaxed across the UK, the ICO set out some key things organisations need to consider around the use of personal information. You should check the government guidance for where you live, as it varies between England, Northern Ireland, Scotland, and Wales. In general, organisations should ask themselves a few questions: a) How will continuing to collect extra personal information help keep our workplace safe? b) Do we still need the information previously collected? c) Could we achieve our desired result without collecting personal information? Data protection is only one of a number of factors to consider when deciding whether to collect this information. Organisations should also take into account:

  • employment law and their contracts with employees;
  • health and safety requirements; and
  • equalities and human rights, including privacy rights.

For further information, the ICO has previously outlined some practical methods for destroying documents, as well as guidance on storage limitation.

Meanwhile, the EDPS published its annual report for 2021. It highlights the EDPS’ achievements regarding EU institutions’ compliance with the data protection framework. The report also underscores the EDPS’ increasing role in advocating for the respect of privacy and data protection in EU legislation. The EDPS increased its use of corrective powers (e.g. the decision to order Europol to delete datasets with no established links to criminal activity). The year was also unprecedented in terms of EDPS advice given to the EU legislator, with 88 opinions, including formal comments, issued in 2021, compared to 27 in 2020. The EDPS also continued its active participation in the EDPB’s work, and furthered its efforts to raise awareness about personal data breaches to assist EU institutions in preventing and handling them. You can consult the full report here.

For those who can read Hungarian, the country’s data protection regulator NAIH has similarly prepared its annual activities wrap-up for 2021. It looks at a) the authority’s experience over its first ten years, b) statistical characteristics of cases, c) tutorials for data protection officers, d) data-related procedures in law enforcement, national defence, and national security, e) important court decisions, f) data protection issues concerning business secrets, g) minors’ data protection, and much more.

Legal Processes and Redress: lawful data scraping, law firm nonliability for data breach

A decision in the US Ninth Circuit Court of Appeals offers insight into the conflicting positions of Europe and America on data protection, and offers relief for data scrapers who feared a shutdown of their industry. In the case, business networker LinkedIn sought to prevent hiQ Labs, a “people analytics” company, from taking data from LinkedIn for its own business purposes. hiQ successfully argued that the information was publicly available, so no criminal act had taken place. Another point raised was that finding in LinkedIn’s favour would give big tech companies a monopoly on ‘big data’ in the future. The decision may mean problems ahead for key articles of the GDPR, as privacy policy, competition and criminal law are all pulling in different directions.

A federal jury in Kansas City cleared a law firm, Warden Grier, of liability to one of its clients, Hiscox Insurance, after the firm suffered a data breach, the Hogan Lovells blog reports. The plaintiff claimed that the defendant failed to meet its standard of care by not sufficiently analyzing its breached server, leaving the plaintiff responsible for approximately 1.3 million US dollars in data analysis and related legal bills. Warden Grier’s counsel argued to the jury that Hiscox was confusing the roles of “service providers” and “data owners.” Warden Grier argued it was a “service provider” under applicable data breach laws and industry norms, and thus its role was to provide Hiscox with access to the impacted data, which it had done. Read the full article here.

Data Breaches: the leak of health data


The French regulator CNIL issued a 1.5 million euro fine against the company DEDALUS BIOLOGY after a massive data leak concerning nearly 500,000 people was publicly revealed. The leaked data, disseminated on the internet, included surnames, first names, social security numbers, the names of prescribing doctors, examination dates and, above all, medical information (HIV status, cancers, genetic diseases, pregnancies, drug treatments followed by patients, and even genetic data). In its decision the CNIL stated:

  • As part of a migration from one software tool to another, requested by two laboratories using the services of DEDALUS BIOLOGY, the latter extracted a larger volume of data than required.
  • The company therefore processed data beyond the instructions given by the data controllers.

Numerous technical and organisational security shortcomings were also upheld against the company in the context of the software migration operations:

  • lack of specific procedure for data migration operations;
  • lack of encryption of personal data stored on the problematic server;
  • absence of automatic deletion of data after migration to the other software;
  • lack of authentication required to access the public area of the server from the internet;
  • use of user accounts shared between several employees on the private zone of the server;
  • absence of a supervision procedure and security alert escalation on the server.

The full decision in French can be read here.

Crypto-asset industry: EU crypto firms appeal against new draft rules

According to Reuters, more than 40 crypto business leaders have asked the EU not to require crypto firms to disclose transaction details, and to dial down attempts to bring rapidly growing decentralised finance platforms to heel (the draft legislation was explained in one of our previous digests). In a letter sent to EU finance ministers, crypto businesses asked policymakers to ensure their regulations did not go beyond rules already in place under the global Financial Action Task Force, which sets standards for combating money laundering. In their opinion, broader disclosure requirements would reduce crypto holders’ privacy and safety. The letter also asked that the EU exclude decentralised projects, including decentralised finance (DeFi), from the requirements to register as legal entities, and said that certain decentralised “stablecoins” should not be subject to the wider MiCA regulation.

Artificial Intelligence: ISO new guide and EP recommendations on AI Act

The ISO published guidance for members of the governing body of an organization to enable and govern the use of Artificial Intelligence, in order to ensure its effective, efficient, and acceptable use. The document also provides guidance to a wider community, including executive managers; external businesses or technical specialists, such as legal or accounting specialists, retail or industrial associations, professional bodies; public authorities and policymakers; internal and external service providers (including consultants); assessors and auditors. The guide is applicable:

  • to the governance of current and future uses of AI as well as the implications of such use for the organization itself;
  • to any organization, including public and private companies, government entities, and not-for-profit organizations;
  • to an organization of any size, irrespective of its dependence on data or information technologies.

Similarly, the European Parliament’s Committee on the Internal Market and Consumer Protection and Committee on Civil Liberties, Justice and Home Affairs released a joint report with their recommendations for the proposed Artificial Intelligence Act. Proposed amendments from the committees include a ban on predictive policing, a public AI technology registration requirement and further alignment with the GDPR, IAPP News reports. Advocacy group Access Now has already examined the recommendations from the committees. According to the group, the draft report contains significant improvements for the protection of fundamental rights. These include rights for people affected by AI systems to lodge a complaint or seek judicial remedies, a requirement for public authorities to register their use of high-risk AI systems in a public database, and numerous improvements to procedures and enforcement. At the same time, the recommendations “have missed an important opportunity to protect people’s rights by completely banning remote biometric identification in publicly accessible spaces.”

Big Tech: GPS data, Google’s “Deny All button”, Pegasus spyware, new Microsoft Purview

Data broker Otonomo is facing a California class-action lawsuit for allegedly secretly collecting and selling GPS data from 50 million vehicle owners worldwide, IAPP News reports. The company, originally founded in Israel, claims it has systems to protect customer privacy, but in 2021 investigative journalists discovered that Otonomo data could reveal customers’ home addresses, where they worked, and where they drove. At the time, legal opinion held that the company could face problems down the road. The company has deals with several car manufacturers to include its systems onboard, but the lead plaintiff says he was never informed of this, nor was his consent sought.

Beginning with YouTube France, but due to be rolled out Europe-wide across Google, the giant search engine is updating its cookie consent banner, over which it was hit a few months ago with a hefty 150-million-euro fine by the French data regulator, the CNIL. The familiar ‘Accept All’ and ‘Customise’ buttons will be joined by a ‘Deny all’ button disabling cookies altogether. Previously, multiple clicks over several pages were needed to opt out of tracking, in violation of the principle that opting out should be as simple for users as opting in.

More high-profile scrutiny of NSO Group’s Pegasus spyware is on the way, as the European Parliament launched an inquiry committee into the potential use of the Israeli company’s software on EU member states’ governments, or its use by those governments. Pegasus software was last week reportedly discovered on UK government computer networks, infecting systems even within the Prime Minister’s office, and in Spain it was found infecting pro-Catalan independence networks.

Microsoft has bundled its Azure Purview and Microsoft 365 Compliance data governance and risk management services into a new package with enhanced and new features to beef up data security and privacy. Christened Microsoft Purview, the new platform should simplify life for administrators, and the integration of functions allows for new capabilities Microsoft says it will extend with time. A key feature will allow admins to apply sensitivity labels to data consistently, across platforms and data types. Labels will now travel with data and be recognised by all services it extends to, says Microsoft.

Book a free consultation to discuss your DPO needs and the most suitable package