TechGDPR’s review of international data-related stories from press and analytical reports.
Legal Processes and Redress
The European Commission warned Belgium about failing to ensure the full independence of its data protection authority. The Commission considers that Belgium violates Art. 52 of the GDPR, which states that the data protection supervisory authority shall perform its tasks and exercise its powers independently. The independence of data protection authorities requires that their members be free from any external influence or incompatible occupation. However, some members of the Belgian Data Protection Authority currently cannot be regarded as free from external influence: they either report to a management committee that depends on the Belgian government, have taken part in governmental projects on COVID-19 contact tracing, or are members of the Information Security Committee. Belgium now has two months to take relevant action, failing which the Commission may decide to refer the case to the Court of Justice of the European Union.
The Dutch regulator, the AP, has asked legislators to vote down the proposal for the Data Processing by Partnerships Act (WGS). In its current version it gives government organizations and private parties very broad powers to share personal data with each other, for example, in cases of suspicion of fraud or organized crime. According to the AP, this can have major consequences for people who end up "on the wrong list", and creates a risk of "mass surveillance". The purpose of the partnerships to share, store and analyze personal data on a large scale is not defined clearly enough in the bill, the AP states. According to the government, every partnership concerns "weighty general interests", such as "monitoring the proper functioning of the market". The WGS concerns broad categories of data – social security numbers, living situation, residence status, financial data, police data and even data about sexual behaviour. Moreover, it is not only about people’s personal data, but also that of their family and friends, the AP notes. Read the regulator’s opinion, (in Dutch), here.
A three billion pound class action against Google over tracking millions of iPhone users has been blocked by the UK’s top court. Legal experts said the decision meant the “floodgates” remained closed to US-style representative actions on data breaches and cyber incidents in England and Wales. The Supreme Court has upheld Google’s appeal in Lloyd v Google, limiting the ability of individuals to recover damages for simple loss of control of their personal data. Richard Lloyd, a consumer rights activist, claimed Google illegally misused the data of 4 million iPhone users by tracking and collating their internet usage on their handsets’ Safari browser in 2011 and 2012, even when users were assured they would be opted out of such tracking by default. The Supreme Court found that a claim for damages under the Data Protection Act 1998, (which precedes the UK GDPR), required proof of damage in the form of either material damage, such as financial loss, or mental distress. A claimant would also need evidence of the extent of the unlawful processing in their individual case: the time period, the quantity and nature of the data captured, how that data was used and what commercial benefit Google gained from processing it. In the absence of any such evidence, an individual is not entitled to compensation. Read the full decision here.
A new White Paper on digital payments and data privacy was published by the French regulator, the CNIL (in French). Payment data can make it possible to trace personal activities or to identify the behavior of individuals, creating a complex area of compliance for DP specialists. The Paper distinguishes between the terms “payment data”, “purchase data”, “contextual” (behavioral) data, “silent party” data, and data of a “highly personal nature” (biometric data). The CNIL considers that only authentication, and not identification, is necessary for merchants and other payment recipients. Qualifying the actors could also be key: “Criteria ranging from direct contact with the data subject to subsequent re-use of data for their own account can be used in determining whether an actor should be considered a data controller or data processor.”
Some other criteria include data minimisation, careful selection of third-party recipients, the location of payment data storage and international data transfers, and determining a specific purpose and legal basis for each data processing activity – whether legitimate interest, (eg, for security or fraud prevention), consent of the user, or legal obligations, (eg, compliance with anti-money laundering laws). For the latter, the CNIL stresses that data protection is only part of the regulatory framework applicable to payment data in the EU, which also includes the Payment Services Directive, the Anti-Money Laundering Directive, and the Network and Information Security Directive. Finally, for security reasons, the CNIL promotes “tokenization” – the method of substituting payment data with randomly generated, single-use tokens – on which the regulator will soon publish additional recommendations.
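The tokenization method the CNIL describes can be illustrated with a minimal sketch. This is a hypothetical vault, not the CNIL's forthcoming recommendations or any real payment provider's API: the merchant keeps only a random token, while the mapping back to the card number lives in a separate store and is destroyed on first use.

```python
import secrets


class TokenVault:
    """Hypothetical single-use tokenization vault (illustration only)."""

    def __init__(self):
        # token -> primary account number (PAN); in practice this mapping
        # would live in a hardened, access-controlled service.
        self._store = {}

    def tokenize(self, pan: str) -> str:
        # A randomly generated token carries no information about the PAN,
        # so a leaked token alone reveals nothing about the card.
        token = secrets.token_urlsafe(16)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Single use: the mapping is deleted on first redemption,
        # so a stolen token cannot be replayed.
        return self._store.pop(token)
```

A usage sketch: `vault.tokenize("4111111111111111")` returns an opaque token; `vault.detokenize(token)` yields the PAN exactly once, and a second attempt fails because the mapping no longer exists.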
Last week the CNIL also published an awareness guide, (in French), to the GDPR to support associations in their compliance. Its objectives: to reiterate the main principles to respect, and to propose an adapted action plan. France has a particularly rich network of associations, with more than 1.3 million bodies of varied profiles, both in terms of size and sector of activity, (charitable, political, sporting, social). Most of them collect a lot of information, sometimes sensitive, concerning various audiences – their members, partners, employees, volunteers or even donors. The guidance covers a variety of steps to be taken: keeping records of processing activities, transparent privacy notices, consent mechanisms and lawful cookie banners on websites, compliant direct advertising, (including charitable prospecting), the prohibition on tracking the criminal history of workers and volunteers, running DPIAs, data breach notification, establishing a checklist of basic technical and organisational measures, and much more.
Data Breaches and Enforcement Actions
The Dutch regulator, the AP, has imposed a 400,000 euro fine on the airline Transavia for failing to protect personal data. Poor security allowed a hacker to penetrate Transavia’s systems in 2019, granting access to the data of 25 million people. It has been established that the hacker downloaded the personal data of about 83,000 people – name, date of birth, gender, e-mail address, telephone number and flight and booking details, as well as some medical data. Security was deficient on three points:
- The password was easy to guess and was enough to get into the system.
- There was no multi-factor authentication.
- Once the hacker took control of two compromised accounts, they also had access to many of Transavia’s systems. Access was not limited to only the necessary systems.
The hacker penetrated the system in September 2019. Two months later Transavia closed the leak. The airline reported the data breach in a timely manner and informed those involved.
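The missing second factor in the Transavia case can be made concrete with a short sketch of an RFC 4226 HMAC-based one-time password, the building block behind most authenticator apps. This is a generic illustration of what multi-factor authentication adds on top of a password, not a description of Transavia's actual systems.

```python
import hashlib
import hmac
import struct


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password.

    Even if a password is guessed, the attacker still needs the shared
    secret to produce the current one-time code.
    """
    # The moving factor (counter) is packed as an 8-byte big-endian int.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 4226 test secret `b"12345678901234567890"`, counters 0 and 1 yield the specification's published codes 755224 and 287082, which is a quick way to check an implementation.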
In Italy, the Court of Cassation upheld data protection regulator Garante’s decision to fine C.S. Group 60,000 euros, DataGuidance reports. The C.S. Group, a car-sharing company, lodged a complaint against the decision of the Garante to impose two fines for failure to notify it of the processing of the rented vehicles’ geolocation data and of their profiling of customers. The C.S. Group denied that the use of an algorithm to calculate tailored discounts based on additional information provided by customers could be framed as profiling, and requested the redetermination of the sanctions. The court rejected the complaint and confirmed the fines, highlighting that “processing personal data by means of an algorithm is in itself profiling, even when personal data is not stored indefinitely and is not associated with an individual customer, since it constitutes a screening of the data provided, in order to evaluate personal aspects and possibly to predict future behaviour”. The ruling, (in Italian), is available here.
Luxembourg’s CNPD imposed corrective measures on a company for DPO-related violations (Art. 37-39 of the GDPR). The company violated its obligation to communicate the data protection officer’s contact details to the supervisory authority, and also failed to ensure that other tasks – current or past – carried out by the DPO did not result in a conflict of interests with their role as DPO. The investigation showed that the DPO was also Head of Compliance and Money Laundering Reporting Officer, and in that role could determine the purposes and means of processing of personal data, which contradicts the independent role of the DPO. The CNPD also noted that there were no immediate measures to mitigate the risk, such as the parallel appointment of a deputy DPO, (outside the AML department), who would be in charge of such cases. No administrative fine was imposed in this case.
The Irish data protection authority brought in some changes to its breach notification form. Here are some of the updates for controllers and processors:
- confirming whether the breach is likely to result in a risk to the rights and freedoms of natural persons, (i.e. whether the breach reaches the risk threshold), and whether the breach falls under the Law Enforcement Directive.
- determining whether the breach relates to cross-border processing and related questions including details of the controller’s establishments, location of affected data subjects and whether they are “substantially affected”.
- classifying the controller’s industry sub-sector according to Eurostat NACE criteria.
- choosing the approximate numbers of data subjects from bands (1-10, 11-100).
- detailing existing TOMs and other measures to mitigate the risk.
- uploading supporting documents.
- declaring, (for controllers), an understanding that any information provided in the breach notification may be used at a future date in relation to an inquiry.
UK-based Privacy International continues to investigate data-related issues in the digital health sector. PI and its partners question whether adopting a given digital solution leads to more effective delivery of quality care. One of the negative outcomes occurs in places where digital infrastructure is still developing, (eg, India), where the time lag between data collection and digitisation can take up to 72 days, which negatively impacts patients: “Such delays not only call into question the effectiveness of the system, but also raise serious questions as to the safety of the data awaiting to be digitised, ranging from storage to access – as well as participating staff know-how and awareness of data protection obligations.” However, similar failures may occur even in digitally progressive countries, (eg, non-functional Track and Trace QR code alert systems in the UK, or the NHS England Covid app outage). At the same time, data protection authorities have limited expertise and resources to effectively advise on the deployment of such systems in the health sector. PI also worries about the absence of proper impact assessments of the security of personal health data in centralised digital systems used by government agencies or private-public partnerships in the UK, (eg, between the NHS and Amazon), and worldwide. Read the full analysis by PI here.
Europol has published its Internet Organised Crime Threat Assessment (IOCTA) 2021. The report notes the rise of ransomware crews deploying multi-extortion methods: exfiltrating victims’ data and threatening to publish it. Such modi operandi can include, for example, cold-calling victims’ clients, business partners and employees with the purpose of committing investment fraud. In addition, many ransomware affiliate programs deploy DDoS attacks against their victims to pressure them into complying with the ransom demand. “Personal information and credentials are in high demand as they are instrumental in improving the success rate of all types of social engineering attacks. Unfortunately, the market in personal information flourishes as ransomware and mobile information stealers produce an abundance of marketable material as a by-product of the primary attack.” Criminals have also realised how much potential there is to compromise digital supply chains: organisations need to grant network access to update distributors, which makes these third-party service providers an ideal target. According to Europol, one of the solutions would be to intensify public-private partnerships, (eg, expertise and information sharing with financial institutions can help to obtain data on cybercriminals and rapidly freeze their criminal proceeds).
Constant monitoring of workers and setting performance targets through algorithms is damaging employees’ mental health and needs to be controlled by new legislation, according to a group of UK MPs. Under the proposed legislation, workers such as delivery drivers, (who have to log most of their activity on shifts, sometimes while driving), would be given the right to be involved in the design and use of algorithm-driven systems in which computers make and execute decisions about fundamental aspects of someone’s work – including, in some cases, the allocation of shifts and pay. The parliamentary group’s report also recommended that corporations and public sector employers fill out algorithmic impact assessments, and that the new umbrella body for digital regulation be expanded. Read more analysis of the proposal by the Guardian.
WhatsApp Ireland, owned by Meta, has secured permission from the High Court to challenge the Data Protection Commission’s (DPC’s) decision to fine it 225 million euros. Last August the DPC held that the messaging service had failed to comply with its obligations under the GDPR in several respects, concerning WA’s processing of data of users and non-users of the service, and the sharing of personal data between WA and other Meta companies. WA also seeks declarations from the court, including that certain provisions of the 2018 Data Protection Act are invalid and incompatible with the State’s obligations under the European Convention on Human Rights. In particular, WA argues, the 2018 Act allows the DPC to engage in a form of administration of justice that is not permissible and is contrary to the Irish Constitution. Finally, the size of the fine constitutes an interference with WhatsApp’s constitutional property rights, WA claims.
Meta plans to remove detailed ad-targeting options that refer to “sensitive” topics, such as ads based on interactions with content around race, health, religious practices, political beliefs or sexual orientation. In its blog post, the company gave examples of targeting categories that would no longer be allowed on its platforms, such as “Lung cancer awareness,” “World Diabetes Day”, “LGBT culture”, “Jewish holidays” or political beliefs and social issues. It said the change would take place starting January 19, 2022. However, advertisers, (small businesses, non-profits, and advocacy groups), on Facebook and other platforms, can still target audiences by location, use their own customer lists, reach custom audiences who have engaged with their content and send ads to people with similar characteristics to those users.
Beginning in 2022, Apple and Google will impose new privacy requirements on mobile apps in the Apple App Store and Google Play Store, a publication by the National Law Review reminds consumers. Apple’s new account deletion requirement will apply to all mobile app submissions to the Apple App Store beginning January 31, 2022. Similarly, Google’s new Data Safety section will launch in February 2022, and app developers will be required to submit Data Safety forms and Privacy Policies to the Google Play Store by April 2022. These announcements have encouraged mobile app developers to review any laws that may require them to maintain certain types of data, and to make sure that their apps clearly explain what data they collect, how they collect it, all uses of the data, and the apps’ data retention and deletion policies.