Data protection digest 1 – 14 September 2023:  gatekeeper obligations, synthetic datasets, automotive cybersecurity

In this issue: EU gatekeeper obligations; guidance on ‘sharenting’, online exams, synthetic data, and the right to object; the Meta ban in Norway; automotive cybersecurity; ad-free Facebook and Instagram; and the general availability of the Privacy Sandbox.

Legal processes and redress: gatekeeper obligations, US adequacy decision, Google litigation, UK data protection reform, Quebec privacy laws

Gatekeepers in the EU: The European Commission has designated, for the first time, six gatekeepers – Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft – under the Digital Markets Act. They now have six months to ensure full compliance with the DMA obligations for each of their designated core platform services. These include a list of do’s and don’ts:

  • allowing third parties to inter-operate with the gatekeeper’s own services,
  • enabling end users to unsubscribe from the gatekeeper’s main platform services as simply as they subscribe to them, 
  • giving companies that advertise on a gatekeeper’s platform access to the gatekeeper’s performance measurement tools and information, allowing advertisers and publishers to undertake their independent verification of advertising hosted by the gatekeeper, and
  • a ban on tracking end users outside of the gatekeepers’ core platform service for targeted advertising without effective consent having been granted. 

EU-US DPF application: The German Data Protection Conference has published application instructions for the EU-US Data Privacy Framework. The document contains, on the one hand, information for data exporters – data controllers and processors who transfer data to the US – and, on the other hand, explains what legal protection and complaint options individuals have, with links to numerous materials, for example from the EDPB. For now, the adequacy decision is valid under EU law. However, given the previous US adequacy decisions that were declared invalid, many want to know whether the new adequacy decision will suffer the same fate as Safe Harbor and the Privacy Shield.

In addition to the planned evaluations by the EU Commission, which can result in adjustments or a repeal, there are options for judicial review of the new adequacy decision. For instance, on 6 September a French member of parliament, who is also a member of the data protection authority CNIL, requested that the framework be annulled due to the lack of guarantees of a right to an effective remedy for data subjects against US companies, as well as a violation of the GDPR’s minimisation and proportionality principles arising from the access to and use of EU personal data for US security purposes.

Google taken to court: Alphabet’s Google is facing a class action in the Netherlands brought by non-profit organisations, demanding Google stop its constant surveillance and profiling of consumers and the sharing of data in online ad auctions, and also pay damages to consumers. Allegedly, through its services and products, the tech giant:

  • collects users’ online behaviour and location data on an immense scale, without providing adequate information about it and without users’ consent;
  • continues, through the use of ‘invisible’ third-party cookies, to collect data via others’ websites and apps, even when someone is not using Google’s products or services;
  • continually collects users’ physical locations, even when they are not actively using their devices and believe they are ‘offline’;
  • shares users’ data, including highly sensitive data concerning health, ethnicity and political affiliation, with hundreds of parties through its online advertising platform (a recent study shows that in Europe the real-time bidding industry exposes people’s data 376 times a day).

In total, Alphabet’s Google faces approximately 25 billion euros in damages claims and regulatory administrative fines over its ad tech practices in Europe, Reuters sums up.

UK data protection amendments: By the end of the year, the UK government will amend the UK’s data protection legislation by updating the ‘fundamental rights and freedoms’ definition so that it refers to rights recognised under UK law, rather than retained EU law rights. There is no direct equivalent to the right to the protection of personal data in UK law. However, the protection of personal data falls within the right to respect for private and family life under Article 8 of the European Convention on Human Rights, which is enshrined in UK law by the Human Rights Act 1998. Data protection rights are also protected by the UK GDPR and the Data Protection Act 2018, and will continue to be protected by the Data Protection and Digital Information Bill in the UK’s domestic legislation, states the explanatory memorandum.

Quebec privacy amendments: On 22 September, the latest set of amendments (Bill 64) to Quebec’s Privacy Act comes into force. Major updates include strengthened privacy rights for individuals and several controller requirements, such as a new consent and cookie management framework, privacy policies, risk assessments, rules on automated decisions, cross-border transfers, and monetary penalties. Earlier phases of the law already obliged companies to designate privacy officers, conduct mandatory breach reporting, and register their biometric information systems, while granting some exceptions to the consent requirement (for commercial transactions and for research and statistical purposes).

Official guidance: ‘sharenting’, online exams, smart data sandbox, right to object

‘Sharenting’ children’s data: The Italian data protection authority has prepared tips for parents to limit the online dissemination of content concerning their children. The neologism, coined in the US, derives from the English words “share” and “parenting”. The phenomenon has been under the Garante’s attention for some time, especially due to the risks it poses to a minor’s digital identity and, consequently, to the development of their personality. When something appears on a screen, not only can it be captured and reused without our knowledge by anyone for improper purposes or illicit activities, but it also contains more information than we think, such as geolocation data. If you decide to publish images of your children, it is important to at least try to follow some precautions, such as:

  • make the minor’s face unrecognizable (for example by simply covering it with a smiley emoticon);
  • limit the visibility of images on social networks to people you know and trust, and who will not share them further without consent when images are sent via an instant messaging program;
  • avoid creating a social account dedicated to the minor;
  • read and understand the privacy policies of the social networks to which you upload photographs, videos, etc.

Online proctoring: The use of digital distance learning by public and private higher education institutions is becoming more widespread. With the remote monitoring devices used in this context being intrusive by nature, the French data protection regulator CNIL reiterates the obligations under the GDPR. For instance, institutions organising examinations, as well as any subcontractors (e.g. remote monitoring solution providers), should assure candidates that their data will not be used for any purpose other than taking and proctoring a remote examination. Also, examination modalities allowing remote validation of skills without the use of remote monitoring devices should be given priority where possible.

In general, taking proctored exams remotely should be an opportunity for students, not an obligation. Where remote proctoring is used, a face-to-face alternative should be offered to candidates (except in specific cases, such as a health crisis, or for institutions that have made distance learning the very essence of their organisation). Students should be informed as soon as possible of the conditions for implementing remote monitoring so that they can make their choice with full knowledge of the facts. Institutions and organisations should ensure that devices used for remote monitoring are compatible with the equipment available to students, that they do not pose security risks to students, and that the necessary software can be easily installed and uninstalled. Read the full guidance (in French) here.

Smart Data: The UK Information Commissioner’s Office has published the Regulatory Sandbox Final Report for Smart Data Foundry. The sandbox specifically targets projects operating within challenging areas of data protection. Smart Data Foundry’s product comprises two parts: a research facility, and an innovation service which provides synthetic data for further research opportunities. Broadly speaking, there are two approaches to the creation of these synthetic datasets:

  • Using simulation – known as ‘agent-based modelling’ – where data is generated from approximations and predictions of behaviour, using characteristics assigned to a computer-generated population to understand how it would interact. This processing does not use personal data beyond some aggregate information generated from real data to test and improve parameters. This is the synthetic data approach that Smart Data Foundry already uses.
  • Using ‘learning-based’ synthetic data generation, which creates synthetic doubles of existing datasets using differential privacy and modern machine-learning approaches. These aim to learn the meaningful patterns in the original data and use that learnt knowledge to generate new data exhibiting similar patterns, without recreating any of the input data.
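The agent-based approach can be sketched in a few lines: a computer-generated population is given characteristics and a behavioural rule, and the simulation emits synthetic records without ever touching real personal data. The agent attributes and spending rule below are illustrative assumptions for the sketch, not Smart Data Foundry’s actual model.

```python
# Minimal sketch of agent-based synthetic data generation: no real
# personal data is processed, only characteristics assigned to a
# computer-generated population. All parameters are illustrative.
import random

def simulate_transactions(n_agents=100, n_days=30, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    # Each agent gets characteristics drawn from aggregate-level assumptions.
    agents = [
        {"id": i,
         "income": rng.gauss(2500, 600),          # monthly income (EUR)
         "spend_rate": rng.uniform(0.02, 0.05)}   # daily spend as share of income
        for i in range(n_agents)
    ]
    records = []
    for day in range(n_days):
        for agent in agents:
            # Behavioural rule: daily spend varies around the agent's baseline.
            amount = max(0.0, rng.gauss(agent["income"] * agent["spend_rate"], 10))
            records.append({"agent_id": agent["id"], "day": day,
                            "amount": round(amount, 2)})
    return records

synthetic = simulate_transactions()
```

In a real deployment, aggregate statistics from genuine data would be used only to calibrate parameters such as the income distribution, which is the limited processing the report describes.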

To understand the key data protection considerations in such scenarios, read the full report.

Right to object to data processing: The right to object gives a person the opportunity to request the termination of the processing of their data where it is processed for the following purposes: (a) the legitimate interests of the data controller, including marketing, as well as in cases of automated decision-making; (b) the public interest; and (c) scientific or historical research and statistics. To exercise your right to object, you should:

  • Identify the data controller (it can be a natural person, company, organisation or state administrative body).
  • Contact the controller in writing (recommended) and clearly state that you are exercising your right to object to the processing of your data, specifying which processing operations you object to.
  • State the reason. Your reasons and the particulars of your specific situation oblige the controller to assess whether the data processing needs to change and whether continuing it would infringe your rights as a data subject.
  • Wait for the answer. The controller is obliged to respond to your request within a month, and must either stop the processing of your data to which you have objected or provide a valid reason for continuing it.

Enforcement decisions: fertility apps, Chinese academic database, Meta ban in Norway, waste collection and the GDPR

Fertility apps checks: The Information Commissioner’s Office is reviewing period and fertility apps available in the UK, as new figures show more than half of women have concerns over data security. A poll commissioned by the regulator revealed that transparency over how their data is used, and how secure it is, were bigger concerns for women than cost and ease of use when choosing an app. The poll showed a third of women have used apps to track periods or fertility. The research also showed over half of the people who use the apps believed they had noticed an increase in baby- or fertility-related adverts since signing up. While some found the adverts positive, 17% described receiving them as distressing. The ICO is now urging users to come forward and share their experiences through a survey in a call for evidence.

Chinese academic database: The Cyberspace Administration of China announced that the China National Knowledge Infrastructure (CNKI) has been fined approx. 6 million euros for illegally collecting and processing personal information. The operators collected users’ personal information without consent via 14 CNKI-related apps, which failed to publicly disclose or state collection and usage rules, did not provide an account cancellation function, and illegally kept users’ information after they closed their accounts. CNKI is one of the biggest Chinese academic information gateway websites. It has over 1,600 institutional clients in 60 countries and regions, as well as 32,000 institutional customers from diverse sectors on the Chinese mainland. Top universities, research institutions, government think tanks, corporations, hospitals, and public libraries are among its primary customers.

Waste disposal and the GDPR: The Italian privacy authority imposed a fine of 45,000 euros on a Sicilian municipality for having installed cameras to control the collection of waste. The municipality had appointed two companies, also sanctioned by the Garante, to purchase, install and maintain fixed cameras, and to collect and analyse the videos relating to violations. The authority’s intervention follows reports from a citizen who complained about receiving fines for having disposed of unsorted waste incorrectly.

The monitoring was carried out without the citizens having been adequately informed of the presence of the cameras and the processing of the data. The municipality had placed a sign directly on the dumpster, which was not easily visible and lacked the necessary information. Furthermore, the municipality had not identified the data retention periods and had not appointed, before the start of the processing, the two aforementioned companies as data processors.  

Meta ban confirmed: The Norwegian data protection authority has won against Meta in court. In July, the regulator made an emergency decision temporarily banning behaviour-based marketing on Facebook and Instagram, which involves very intrusive monitoring of users, and imposed a compulsory fine of approx. 90,000 euros per day if the ban was breached, starting on 14 August. Meta petitioned the Oslo District Court for a temporary injunction, but in its ruling the court stated that the Norwegian data protection authority’s decision was valid and that there was no reason to stop it. In addition to this case, Meta has submitted several administrative complaints against the authority’s decision. Those processes are ongoing.

DNA data and transparency obligations: The US Federal Trade Commission finalised an order with 1Health.io that settles charges that the genetic testing firm left sensitive genetic and health data unsecured, deceived consumers about their ability to get their data deleted, and changed its privacy policy retroactively without adequately notifying consumers and obtaining their consent. The company failed to keep its promises to share consumers’ sensitive data only in limited circumstances, to destroy customers’ DNA samples shortly after they had been analyzed, not to store DNA results with a consumer’s name or other identifying information, and to remove such data from its servers upon consumers’ request.

Data security: automotive industry

Automotive cybersecurity: The Federal Office for Information Security in Germany published a report on the status of cybersecurity in the automotive industry. The greatest damage in the automotive industry comes from cybercriminal “double extortion” – ransomware and data leaks. The report contains:

  • Assessments of the cybersecurity of production systems and processes.
  • Analysis of the exploitation of security vulnerabilities for car theft and the unauthorized opening of vehicles.
  • Description of attacks on vulnerabilities in the communication protocol or other security mechanisms used to control charging processes between electric vehicles and their charging stations.
  • Assessments of new legal regulations and standardization activities.
  • Outlook on technological and regulatory developments that will be important in the coming years (the industry is affected by the EU NIS 2 Directive as a critical sector).

According to a recent Associated Press report, automakers are failing the privacy test, and owners have little or no control over the data collected. The nonprofit Mozilla Foundation’s latest “Privacy Not Included” study states that security is a major worry, considering manufacturers’ record of vulnerability to hacking. None of the 25 automobile brands whose privacy notices were assessed in Europe and North America met the minimal privacy criteria – a result that stands out against the more than a dozen other product categories Mozilla has reviewed, including fitness trackers, reproductive health applications, smart speakers, and other connected household products.

Big Tech: ad-free Facebook and Instagram, the Privacy Sandbox

Paid Facebook and Instagram: Meta may allow Facebook and Instagram users in the EU to pay to avoid ads, in response to scrutiny from privacy regulators. Those who pay for the subscriptions would not see ads, while Meta would continue to offer free, ad-supported versions of the apps in the EU. Previously, users had effectively agreed to allow their data to be used in targeted advertising when they signed up to the services’ terms and conditions, until the lead Irish regulator ruled Meta could not process personal information in that way. Meta has therefore also proposed offering EU users a new opt-in consent mechanism for receiving targeted ads; reportedly, it would be updated to offer users a “yes or no” option for opt-ins across its platforms.

Privacy Sandbox ‘availability’: Finally, the Privacy Sandbox for the Web has reached general availability on Chrome for its relevance and measurement APIs. General availability means advertising providers and developers can now scale usage of these new technologies within their products and services, as they are now available to the majority of Chrome users. Google has also rolled out new Ad privacy controls in Chrome that allow people to manage how the Privacy Sandbox technologies may be used to deliver the ads they see. These controls let users tailor their experience by customising which ad topics they are interested in, which relevance and measurement APIs they want enabled, and more. Starting in Q4 of 2023, Google will enable the industry to bolster its testing efforts with the ability to simulate the deprecation of third-party cookies for a percentage of users. Then, in Q1 of 2024, it will turn off third-party cookies for 1 per cent of all Chrome users for effectiveness testing.
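Staged rollouts like the one described above – disabling a feature for a fixed percentage of users – are commonly implemented with deterministic bucketing, where each user id hashes to a stable bucket so the same user always gets the same treatment. The sketch below illustrates that general technique under stated assumptions; it is not Chrome’s actual mechanism, and the function and experiment names are hypothetical.

```python
# Illustrative sketch of deterministic percentage bucketing, the common
# technique behind rollouts such as "turn off third-party cookies for 1%
# of users". This is an assumption for illustration, not Chrome's code.
import hashlib

def in_rollout(user_id: str, percent: float, experiment: str = "3pc-deprecation") -> bool:
    """Return True if this user falls inside the rollout percentage.

    Hashing (experiment + user_id) maps every user to a stable bucket in
    [0, 100), so assignment stays consistent across sessions.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64 * 100
    return bucket < percent

# Roughly `percent` of a large population lands in the rollout.
enabled = sum(in_rollout(f"user-{i}", 1.0) for i in range(100_000))
```

Salting the hash with an experiment name ensures that different experiments select independent slices of the user base rather than always hitting the same 1 per cent.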

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
