TechGDPR’s review of international data-related stories from press and analytical reports.
Official guidance: data subject complaints, new business, ID cards, traffic licenses, TCF and OpenRTB, education platforms, insolvency claims, data sent by mistake, DPO
The UK Information Commissioner’s Office offers a brief guide for small businesses on how to deal with data subject complaints (from your staff, contractors or customers). The main steps are as follows:
- Respond as soon as possible, in plain language, to let the customer know you’ve received their data protection complaint and are looking into it.
- Let them know when they can expect further information from you and give them a point of contact. Include information about what you’ll do at each stage.
- Send them a link to a complaints procedure (if there is one).
- Check the complaint has come from an appropriate person.
- Check all the details of their complaint against the information you hold.
- Ask for additional information if necessary.
- Update them so they know you’re working to resolve the issue.
- Record all your actions and due dates, and
- Keep copies of relevant documents and conversations.
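The ICO’s record-keeping steps lend themselves to a simple complaint log. The sketch below is a hypothetical illustration of such a log, not an ICO tool; all names and addresses in it are invented:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record for one data protection complaint, mirroring the
# ICO steps: acknowledge promptly, name a contact, log every action.
@dataclass
class Complaint:
    complainant: str
    received: date
    acknowledged: bool = False                      # plain-language acknowledgement sent
    contact_point: str = ""                         # named point of contact for updates
    actions: list = field(default_factory=list)     # (date, action) pairs with due dates
    documents: list = field(default_factory=list)   # copies of relevant documents

    def acknowledge(self, contact: str, today: date) -> None:
        # Step 1-2: confirm receipt and give the complainant a contact.
        self.acknowledged = True
        self.contact_point = contact
        self.actions.append((today, "acknowledgement sent"))

    def log_action(self, today: date, action: str) -> None:
        # Final steps: record all actions so the handling is auditable.
        self.actions.append((today, action))

c = Complaint("customer@example.com", date(2022, 8, 1))
c.acknowledge("dpo@example.com", date(2022, 8, 2))
c.log_action(date(2022, 8, 5), "requested additional information")
print(len(c.actions))  # 2 logged actions
```

However the log is implemented, the point is the same: every acknowledgement, update and document should leave a dated trace.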
Financial institutions make copies of our identity documents for a range of services, such as setting up and maintaining a bank account, electronic banking services, granting a loan or even executing a transfer order. The Polish data protection authority UODO stresses that such copying is not permitted in every situation. For instance, the country’s banking law allows processing of the information contained in identity documents, but this does not confer a right to make copies of them; in many cases it is enough to present an identity document for inspection. On the other hand, anti-money laundering and counter-terrorist financing legislation does entitle financial institutions to make copies of identity documents.
The UODO states that before applying financial security measures, institutions must assess whether it is necessary to process the personal data of a natural person contained in the copy of the identity card for these purposes. According to the principles of purpose limitation and data minimisation referred to in Art. 5 of the GDPR, personal data must be collected for specific, explicit and legitimate purposes, using relevant criteria and limited to what is necessary for the purposes for which they are processed.
The Hungarian data protection authority NAIH issued a notice on data processing related to the reading of the bar code on vehicle registration certificates at filling stations. According to submissions received by the regulator, in order to sell fuel at the official price, a fuel provider reads the bar code on the vehicle registration certificate (or records the vehicle’s registration number) and stores it in its system. The data is then forwarded for tax control purposes. No information about this processing was available to customers at the filling stations, and employees were not able to provide any meaningful information. The NAIH started an ex officio investigation into the lawfulness of the processing and into whether the tax authority and fuel providers had complied with Art. 13 of the GDPR.
The Latvian data protection authority DVI recently issued a series of recommendations (in Latvian), including:
- To evaluate the use of the TCF and OpenRTB systems. Following the Belgian regulator’s decision, the transparency and consent system created by IAB Europe and the real-time bidding system were recognised as non-compliant. The decision stipulates that personal data obtained through the TCF must be deleted immediately. This means that organisations using these tools (website/app operators, advertisers and online ad technology companies) must stop using them (unless they process only non-personal data).
- What to do if you receive another person’s data by mistake (do not open or publish it, do only the minimal research needed to identify the sender, notify the sender, and let them resolve the situation themselves, etc.).
- Safe use of online platforms used during the educational process.
- The processing of personal data by insolvency administrators in the register of creditors’ claims, and
- Functions and tasks of a data protection specialist.
Legal processes: EU Data Act, Quebec Bill 64, California privacy laws, China cross-border transfers
The Czech Presidency of the EU Council brought more clarity to the proposed Data Act, namely the part that refers to public sector bodies’ access to privately held data, Euractiv.com reports. Public authorities might request data, including the relevant metadata, if timely access to it is necessary to fulfil a specific task in the public interest (eg, local transportation, city planning and infrastructural services). At the same time, safeguards for requests involving personal data have been added: the public body will have to explain why the personal data is needed and what measures are taken to protect it. The top priority should be anonymisation, or at least aggregation and pseudonymisation, of the collected data.
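As a rough illustration of the pseudonymisation the Council text prioritises, direct identifiers can be replaced with keyed hashes before data is handed over. This is a minimal sketch of the general technique, not anything the Data Act prescribes; the field names and the key are invented:

```python
import hashlib
import hmac

# Keyed hashing (HMAC-SHA256) maps an identifier to a stable pseudonym;
# without the secret key, the original value cannot be recomputed.
SECRET_KEY = b"hypothetical-secret-key"  # would stay with the data holder

def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"passenger_id": "AB-12345", "trips": 17}
shared = {
    "passenger_id": pseudonymise(record["passenger_id"]),  # pseudonymised
    "trips": record["trips"],                              # non-identifying field kept
}
print(shared["trips"])  # 17
```

Note that under the GDPR pseudonymised data is still personal data for whoever holds the key, which is why the Council text ranks full anonymisation above it.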
In Quebec, the first amendments from Bill 64 (which modernises data protection provisions) to the Quebec Privacy Act and the Quebec IT Act will come into force on 22 September. They create an obligation for a person carrying on an enterprise to protect personal information, and automatically designate the person exercising the highest authority within the enterprise as the one responsible for it. Other provisions introduce mandatory reporting of confidentiality incidents, registration of biometric information databases no later than 60 days before they are put into service, notification of any processes used to verify or confirm an individual’s identity based on biometric data, and allow disclosure of personal data necessary for commercial transactions (eg, mergers, leasing).
In California, a new privacy rights act, the CPRA, will take effect on 1 January 2023, while the new California Privacy Protection Agency is consulting on draft regulations, with special attention to the draft American Data Privacy and Protection Act and its possible preemptive effect on California privacy laws. Other key regulatory issues include data processing agreements, programs for exercising data subjects’ rights, data minimisation and valid consent requirements, and the prohibition of “dark patterns”.
China will enforce cross-border data transfer rules starting from 1 September. Consequently, many critical industries such as communications, finance and transportation will face additional checks under the country’s latest cybersecurity, data security and personal information protection legislation. Security reviews will apply to companies seeking to transfer the personal data of 100,000 or more people (10,000 or more for sensitive data), companies handling the personal data of 1 million or more people, and operators that cumulatively transfer the personal information of at least 100,000 individuals a year. Businesses will have to explain to government investigators the purpose of the transfer, the security measures in place, and the laws and regulations of the destination country. More details on the new regulatory framework can be found in this guidance by KPMG China.
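The reported volume thresholds amount to a simple gating rule. The sketch below encodes that reading for illustration only; it simplifies the actual measures, and the function and parameter names are invented:

```python
def security_review_required(
    total_subjects: int,        # people whose personal data the company handles
    transferred_subjects: int,  # people whose data is transferred abroad (cumulative)
    sensitive_transferred: int, # of those, people whose sensitive data is transferred
) -> bool:
    # Simplified reading of the thresholds reported for China's rules:
    # handling data of 1M+ people, transferring data of 100k+ people,
    # or transferring sensitive data of 10k+ people triggers a review.
    return (
        total_subjects >= 1_000_000
        or transferred_subjects >= 100_000
        or sensitive_transferred >= 10_000
    )

print(security_review_required(500_000, 120_000, 0))  # True: transfer volume threshold
print(security_review_required(200_000, 5_000, 0))    # False: below all thresholds
```

In practice the measures contain further triggers (for example for critical information infrastructure operators), so any real assessment needs the full text, not a threshold check.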
Enforcement actions: commercial prospecting, employee’s consent, smart TV reset, Chromebook ban, PHI disposal, medical results without encryption
A famous French hotel group was fined 600,000 euros by the privacy regulator CNIL for carrying out commercial prospecting without the consent of customers who made a reservation directly with hotel staff or on the website. The consent box to receive the newsletter was prechecked by default. A technical glitch also prevented a significant number of people from opting out of prospecting messages for several weeks. As the processing in question was implemented in many EU countries, the EDPB was asked to rule on the dispute concerning the amount of the fine, and the CNIL was then instructed to increase the sum so that the penalty would be more dissuasive.
Guernsey’s data protection authority has issued a reprimand (a formal recognition of wrongdoing) to HSBC Bank’s local branch for inappropriate reliance on consent. An employee felt obliged to consent to providing sensitive information about themselves in connection with what they believed was a possible internal disciplinary matter, and then made a formal complaint. In the authority’s opinion, reliance on “consent” where a clear imbalance of power exists is inappropriate, as it is difficult for employers to demonstrate that consent was freely given. While in this case the controller ceased processing as soon as concerns were raised, it nonetheless continued to use consent as the justification for the processing. For advice on managing data protection in employment, see Guernsey’s latest guide.
The Danish data protection authority expressed serious criticism of retailer Elgiganten A/S after a returned television, which had not been factory reset to erase the previous owner’s personal data, was stolen during a break-in at its warehouse. This meant that a third party gained access to the TV and thus to information from streaming services the complainant was still logged into, as well as their browsing history. Before the break-in, the company had carried out a risk assessment for theft of its products and assessed the risk as high, so the warehouse was secured by locks, a high wall, surveillance cameras and motion sensors. The burglar gained access by simply punching a hole in the wall.
The Danish data protection authority is maintaining its ban on Chromebook use in Helsingør Municipality, on the grounds of high risks for individuals. The regulator stated that the decision does not prohibit the use of Google Workspace in schools – but the municipality’s specific use of certain tools is not justifiable with regard to children’s information. The municipality assessed that Google acts only as a data processor, but in the regulator’s opinion it acts in several areas as an independent data controller, processing personal data for its own purposes in the US.
The Danish regulator ruled that the municipality cannot reduce the risk to an acceptable level without changes to the contract basis and the technology the municipality has chosen to use. Although the decision specifically relates to the processing of personal data in Helsingør Municipality, the regulator encourages other municipalities to look at the same areas in relation to unauthorised disclosure and transfers to unsafe third countries.
A recent HIPAA settlement (over 300,000 dollars) offers lessons on data disposal and the meaning of Protected Health Information (PHI), workplaceprivacyreport.com reports. A dermatology practice reported a breach last year when empty specimen containers with PHI labels were placed in a garbage bin in the practice’s parking lot. The labels included patient names and dates of birth, dates of sample collection, and the name of the provider who took the specimen. The workforce should have been trained to follow disposal policies and procedures. These can include: shredding, burning, pulping, or pulverizing records so that PHI is rendered essentially unreadable; storing labelled prescription bottles in opaque bags in a secure area; and using a disposal vendor, as a business associate, to pick up and shred or otherwise destroy the PHI.
Another enforcement action concerned a medical laboratory, where the regulator found that:
- the laboratory’s webpage allowed doctors to remotely consult patients’ medical results without employing any encryption;
- the laboratory failed to conduct a DPIA for the large-scale processing of health data;
- while denying that the health data had been processed on a large scale, the laboratory failed to clarify what criteria it used to determine this.
Data security: cyber security breaches landscape, Americans’ personal data bought by FBI, social engineering on healthcare
The UK government published an in-depth qualitative study of a range of businesses and organisations that have been affected by cyber security breaches. The findings help businesses and organisations understand the nature and significance of the cyber security threats they face, and what others are doing to stay secure. They also support the government in shaping future policy in this area. The guide also contains 10 practical case studies covering: understanding the level of existing cyber security before a breach, determining the type of cyber attack, understanding how businesses and organisations act in the immediate, medium, and long-term aftermath of a breach, and more.
Top US Democrats in Congress are demanding that the FBI and Department of Homeland Security detail their alleged purchases of Americans’ personal data, Gizmodo.com reports. They suspect federal law enforcement agencies of using commercial dealings with data brokers and location aggregators to sidestep warrant requirements in obtaining Americans’ private data. Reportedly, the data points may include, among others, records of internet browsing activity and precise locations. The demand includes the release of documents and communications between the agencies and data brokers with whom they may have dealings or contracts.
The US Health Sector Cybersecurity Coordination Center published guidance on the impact of social engineering on healthcare. Social engineering is the manipulation of human psychology for one’s own gain. “A social engineer can manipulate staff members into giving access to their computers, routers, or Wi-Fi; the social engineer can then steal Protected Health Information (PHI) or Personally Identifiable Information (PII), or install malware, posing a significant threat to the health sector”, says the study. It also covers the phases and types of social engineering attacks (eg, tailgating, vishing, deepfake software, smishing, baiting and more), the personality traits of a social engineer, data breaches, and steps to protect your organisation.
Big Tech: US mobile carriers, Google location data, FB’s Cambridge Analytica settlement, TikTok iOS app, Oracle class action
The US Federal Communications Commission will investigate whether mobile carriers adequately disclose to consumers how they are using and sharing location data, Reuters reports. Top mobile carriers such as Verizon, AT&T, T-Mobile, Comcast, Alphabet’s Google Fi and others were requested to detail their data retention and privacy policies and practices. Recent enforcement of anti-abortion legislation in many states has also raised concern that police could obtain warrants for customers’ search histories, location and other information that would reveal pregnancy plans. Last month Google responded by promising to delete location data showing when users visit an abortion clinic.
The Federal Court of Australia ordered Google to pay 60 million dollars for misleading consumers about the collection and use of personal location data. Google was found guilty of misleading and deceptive conduct in breach of Australian Consumer Law. The conduct arose from representations made about two settings on Android devices – “Location History” and “Web & App Activity”. Some users spotted that the Location History default setting changed from “off” to “on”. Another misleading practice was telling some users that having the Web & App Activity setting turned “on” would not allow Google to obtain, retain or use personal data about the user’s location.
Facebook agreed to settle a lawsuit seeking damages for allowing Cambridge Analytica access to the private data of tens of millions of users, The Guardian reports. Facebook users sued the tech giant in 2018 after it emerged that the British data analytics firm, connected to former US president Donald Trump’s successful 2016 campaign for the White House, gained access to the data of as many as 87 million of the social media network’s subscribers. Reportedly, if owner Meta had lost the case it could have been made to pay hundreds of millions of dollars.
Reportedly, when you open any link in the TikTok iOS app, it is opened inside the app’s in-app browser. While you interact with the website, TikTok subscribes to all keyboard inputs (including passwords, credit card information, etc.) and every tap on the screen, such as which buttons and links you click. The discovery was made by software engineer Felix Krause. You can read a more technical analysis of the most popular iOS apps that have their own in-app browser in the original publication.
Finally, the Irish Council for Civil Liberties (ICCL) has started a class action against Oracle in the US over what it calls the company’s worldwide surveillance machine. Oracle is an important part of the tracking and data industry: it claims to have amassed detailed dossiers on billions of people, and it generates over 42 billion dollars in annual revenue. Oracle’s dossiers may include names, addresses, emails, purchases online and in the real world, physical movements, income, interests and political views, and a detailed account of online activity. For example, one database included a record of a man who used a prepaid debit card to place a 10 euro bet online. Oracle also coordinates a global trade in people’s dossiers through the Oracle Data Marketplace, claims the ICCL. You can view the full complaint here.