Weekly digest November 1 – November 7, 2021 “Privacy, DP, and Compliance news in focus”

TechGDPR’s review of international data-related stories from press and analytical reports.


Legal Processes

China’s Personal Information Protection Law, the PIPL, came into effect on November 1. It largely blends the EU (GDPR) and California (CCPA) privacy rules for the handling of personal and sensitive information. It sets out distinct legal bases and general principles for data processors, including regular audits, training and data management programmes and the appointment of personal data protection officers, and places restrictions on the cross-border transfer of personal information. The PIPL provides penalties for breaches of its provisions, including fines of up to 6.7 mln euros or up to 5% of the preceding year’s business income, whichever is higher. Finally, foreign companies, even with no presence in China, that process the personal information of individuals in the country are required to establish a dedicated entity or appoint an agent or designated representative in China to be responsible for related matters. The name and contact details of such local agents or representatives must be provided to the relevant authority. Read more analysis by WhiteCastle.

The UK government has entered the active phase of consultations on reform of the national data protection regime; organisations have until November 19 to respond. The proposed reforms aim to establish a “pro-growth and innovation friendly” data protection regime, shifting away from a “one size fits all” approach to compliance with data regulations. The consultation concentrates on six key areas:

  • Reducing barriers to responsible innovation by relaxing the rules around organisations’ reliance on legitimate interests, and on automated decision-making under Art. 22 of the GDPR.
  • Reducing burdens on businesses by amending the requirements around privacy management programmes, DPIAs, the appointment of DPOs and the detailed records of processing under Article 30 of the GDPR, raising the threshold for reportable data breaches, etc.
  • Reworking rules on cookies and direct marketing by allowing the use of analytics cookies and similar technologies without users’ consent, and permitting information to be collected from a user’s device without consent for other limited purposes.
  • Boosting trade and reducing barriers to data flows including the use of alternative transfer mechanisms and a “risk-based” approach to granting adequacy decisions to other jurisdictions.
  • Allowing the processing of personal data for public health and emergency situations.
  • Reforming the Information Commissioner’s Office by refocusing its statutory commitments away from handling a high volume of low-level complaints. 

In Germany, the Federal Ministry of the Interior, Building and Community, the BMI, has evaluated the new Federal Data Protection Act, the BDSG, which came into force in 2018. Both public and private users, including the data protection supervisory authorities as well as leading business associations and other institutions, were interviewed. The new BDSG, which supplements the GDPR in Germany, has proven to be appropriate and practical, with clear standards, despite various criticisms. In addition, the Federal Statistical Office carried out a cost re-measurement as part of the evaluation and found that the BDSG’s compliance burden on the economy has been reduced by about one million euros. The complete evaluation, in German, can be accessed here.

The latest insight from EURACTIV, an independent pan-European media network specialised in EU affairs, examines the upcoming EU Data Act. The aim is to make more data in the EU usable to support sustainable growth and innovation across all sectors (B2B and B2G). However, independent quality checks have so far led to the proposal’s rejection, reportedly for not providing sufficient information on the conditions under which public bodies may access data, on compensation for businesses, and on integration with other legislative measures. Data-sharing arrangements would be ‘encouraged’ via smart contracts and application programming interfaces. However, the text also refers to the introduction of ‘essential’ technical measures for interoperability, raising the question of whether these measures would be mandatory. Transparency obligations would require service providers to specify in the agreement what type of data is likely to be generated and how customers can access it, with SMEs exempted. Machine-generated data may also be excluded from the scope, making this type of data more accessible. The adoption of the Data Act is expected by the first quarter of 2022.

Official Guidance

The US National Institute of Standards and Technology, NIST, explains the role of privacy-enhancing cryptography, PEC, and differential privacy techniques. Broadly, the PEC and differential privacy paradigms can be composed to enable better privacy protection, notably in scenarios where sensitive data should remain confidential with each individual original source. Differential privacy turns the query result into a noisy approximation of the accurate answer, which PEC can compute without exfiltrating additional information to any party. For more practical guidance, covering secure multiparty computation (SMPC), private set intersection (PSI), private information retrieval (PIR), zero-knowledge proofs (ZKP) and fully homomorphic encryption (FHE), followed by a case study on private medical data research, see the full article.
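To give a feel for the “noisy approximation” idea, here is a minimal sketch of the classic Laplace mechanism for a counting query, written from the standard differential privacy definition (it is an illustration of the general technique, not code from the NIST article; the function name `dp_count` and the parameter choices are ours):

```python
import math
import random

def dp_count(values, epsilon: float) -> float:
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one
    individual changes the count by at most 1), so zero-mean
    Laplace noise with scale 1/epsilon is added to the true count.
    """
    true_count = len(values)
    # Sample Laplace(0, 1/epsilon) by inverse-CDF from u in (-0.5, 0.5).
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return true_count + noise
```

A smaller `epsilon` means more noise and stronger privacy; over many repetitions the noisy answers average out near the true count, which is why analysts can still extract useful aggregate statistics.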


The Luxembourg data protection authority, the CNPD, published a comprehensive update of its guidelines, in French, on cookies and similar technologies, such as “fingerprinting”, “web beacons” and “flash cookies”, used for excessive tracking, profiling and targeting of users and customers. The guidance clearly distinguishes essential from non-essential cookies, draws the line where data controllers are obliged to obtain consent, explains the risks of using consent management platforms set up by third parties, and provides plenty of visual examples of what a “cookie banner” should and should not look like.

The Italian Data Protection Authority, Garante, provided clarification on direct marketing through social media platforms. A data subject complained of receiving a marketing communication sent by a company through LinkedIn, offering real estate services for a specific property owned by the claimant. The company justified this practice on the grounds that the claimant’s LinkedIn profile was set to allow communications from any other LinkedIn user. Garante did not accept the company’s arguments: LinkedIn is a platform whose purpose is to connect users who share professional interests or are seeking job opportunities, not the sale of products and services. Garante also found that acquiring the claimant’s personal data from the public real estate register breached Art. 5 of the GDPR: the register may be accessed only to verify ownership of a property, not for direct marketing purposes. Garante did not sanction the company for this, as it is a micro-enterprise whose business has been strongly impacted by the pandemic, but imposed a 5,000 euro fine for failing to respond to its requests during the investigation.

The Polish Data Protection Authority, UODO, continued a series of blog posts, in Polish, on creating a successful code of conduct, this time focusing on effective mechanisms for monitoring compliance with a code’s provisions by private entities (Art. 41 of the GDPR). First of all, the code of conduct must designate the entity that monitors compliance with the document by the organisations that accede to it. To be accredited, the monitoring body must demonstrate independence from the code’s creator and have appropriate financial, human, organisational and technical resources. It is then responsible for all preliminary audits and regular checks, as well as ad hoc audits in case of data breach complaints. Its further duties include issuing comments and post-inspection recommendations and overseeing their implementation; imposing sanctions, suspensions and exclusions; handling appeals; cooperating with the supervisory authority and the code’s authors; participating in the code review mechanism; educating on and promoting data protection principles; and ongoing cooperation with members of the code, e.g. in the event of a data breach notification, clarifying doubts and helping to ensure an adequate level of personal data protection.

The Dutch Data Protection Authority, the AP, has mapped out the trends and risks for the protection of personal data in education. “It is a challenge for many educational institutions to keep an overview of all processing of personal data. Due to the autonomous position of teachers and the ‘proliferation’ of apps and software in education, it is difficult for educational institutions to keep control over the data processing for which they are responsible”, the AP states. The regulator identifies three key trends and risks: excessive monitoring of pupils and students and their learning performance; dependence on major suppliers; and the growing exchange of data in partnerships. The AP’s recommendations focus on setting up the basics of privacy management programmes, such as keeping up-to-date records of processing activities, running risk self-assessments and training employees. The AP has also called on ministers to table a package of measures to help institutions with this task.


Data Breaches and Enforcement Actions

The Romanian Supervisory Authority, the ANSPDCP, found IKEA Romania in violation of Art. 32 of the GDPR. The company organised a drawing contest for the children of IKEA Family members. The participants uploaded their drawings to the online platform dedicated to members, together with participation forms containing their personal data and that of their parents or legal guardians. To allow voting for the best drawing, the drawings were published on the platform, but the participation forms were mistakenly published alongside them. The disclosed data included the names, surnames and ages of minors, plus names, surnames, cities, countries, e-mail addresses, IKEA Family membership numbers and handwritten signatures. The exposure lasted about 40 hours and affected 114 individuals, so a minor fine of 1,000 euros was issued.

A British firm – Huq – which sells people’s location data has admitted that some of its information was gained without seeking permission from users. Huq takes location data from apps on people’s phones and sells it on to clients, which include dozens of English and Scottish city councils. The apps in question measured wi-fi strength and scanned barcodes, so a council could use the data they provided to estimate, for example, how many people visited a high street within a given timeframe. Huq claimed it was aware of two “technical breaches”, and had asked for code revisions and for the apps to be republished. Firms that collect location data from apps and then sell it on are under increased scrutiny in the EU. The Danish data authority is currently looking into whether there is “a legal basis” for the way Huq has processed personal data. Meanwhile, the UK’s Information Commissioner’s Office has issued a reprimand to another UK-based location data collection firm, Tamoco, for failing to provide sufficient user privacy information.

The Danish Data Protection Agency, Datatilsynet, received a data breach notification from a company, Coop Danmark A/S, concerning personal information located on the company’s shared drive without adequate access control. The information concerned a total of 477 employees and external consultants. Coop discovered the breach while testing a new scanning tool. The regulator found that Coop had not complied with the requirement for necessary security measures: the company should have been aware that employees could incorrectly place personal data on the shared drive, and should have checked and cleaned up the drive and introduced relevant security measures at an earlier stage. However, Coop reported the breach to the authority within the 72-hour time limit, so no fine was issued.

The French regulator, the CNIL, sanctioned the RATP, Paris’s public transportation company, with a fine of 400,000 euros after noting that several bus centres had recorded the number of strike days taken by workers in evaluation files used to prepare promotion offers. It also noted an excessive data retention period and data security failures. The RATP had failed in its obligations, particularly because only data strictly necessary for assessing agents should have appeared in the promotion files: the number of days of absence was sufficient, without specifying that the absence was linked to the exercise of the right to strike. The CNIL thus imposed a fine and decided to make its decision public.

Opinion

Challenges with anonymising genetic data are analysed in Herbert Smith Freehills’ blog series. “As soon as one dataset is merged with another relating to the same set of data subjects, it becomes more likely that the information could be used to re-identify a data subject. For example, it was reported last year that the British National Health Service had sold medical records to pharmaceutical companies that could be used to re-identify “anonymised” genetic information collected for diagnostic purposes.” Advances in AI are also making it harder to anonymise data, because it is increasingly easy to match up various pieces of data and link them to one individual. And sometimes anonymisation just isn’t desirable: the more identifiable information that is collated, the more valuable the dataset is for research. As a result, an attempt to anonymise genetic data might fall short, resulting in pseudonymisation only. Unlike anonymised data, pseudonymised data does fall within the scope of the GDPR. For this reason, it could be risky for an entity offering a diagnostic test to rely on anonymisation alone as a basis for the legitimate processing of genetic data, in case the data is in fact only pseudonymised.


Data Security

Brian Krebs’s cybersecurity blog shows how the holiday shopping season is a perfect attack vector for phishers. Krebs analyses a fairly elaborate SMS-based phishing scam that spoofs FedEx delivery notices in a bid to extract personal and financial information from unwary recipients. The phishing link typically suggests that the recipient can reschedule a delivery. Clicking “Schedule new delivery” brings up a page that requests your name, address, phone number and date of birth. Those who click “Next Step” are asked to add a payment card to cover the “redelivery fee”. After clicking “Pay Now”, the visitor is prompted to verify their identity by providing their Social Security number, driver’s license number, email address and email password. The main rule, then, is to avoid clicking on links or attachments that arrive unbidden in emails, text messages and other mediums, and instead visit the site or service in question manually. Note also that most phishing scams invoke a temporal element, warning of negative consequences should you fail to respond or act quickly.

Big Tech

The Federal Trade Commission has found that internet service providers accounting for 98% of the US mobile market collect and share more data than their customers might be aware of, and those same customers are ill-informed or even misdirected when trying to exercise choice about how their data is used. Customers were sometimes grouped by sensitive data such as race or sexual orientation, and real-time location was shared with third parties. The staff report notes that the scope of such data collection is expanding, in line with similar trends in other industries, which strengthens the argument for restricting data collection and use.

Meta announced last week that it is ending its use of facial recognition on its platforms, shutting down a feature that has sparked privacy concerns and multiple lawsuits in the US. The Facebook platform will delete the face scans of over a billion people and will no longer automatically recognise people’s faces, meaning users who opted in to the service won’t receive alerts when a photo or video of them may have been added to the social network. This is a setback for blind users, as the Automatic Alt-Text tool will no longer name friends in photos. According to AI VP Jerome Pesenti, the company would still consider facial recognition technology for instances where people need to verify their identity or to prevent fraud and impersonation.

China’s regulatory crackdown continues, with 38 apps from a number of companies told to stop excessively gathering personal data immediately or face penalties. The companies include a news app and a music streaming service owned by social media behemoth Tencent. The order arrived days after China’s Personal Information Protection Law, the PIPL, came into effect. Meanwhile, internet company Yahoo has announced its withdrawal from the Chinese market, in the latest retreat by foreign technology firms responding to Beijing’s tightening control over the industry. However, analysts say Yahoo’s withdrawal from China is largely symbolic, as at least some of Yahoo’s services, including its web portal, were already blocked. China has also blocked other US internet services, such as Facebook, LinkedIn and Google. Mainland users who wish to access these websites use a virtual private network, VPN, to circumvent the block, the Guardian reports.

Book a free consultation to discuss your DPO needs and the most suitable package
