Data protection digest 2 – 16 June 2023: rules on electronic evidence, explainable AI, and wildcat telemarketing

TechGDPR’s review of international data-related stories from press and analytical reports.

Legal processes and redress

Electronic evidence: The European Parliament voted to adopt new rules on the exchange of electronic evidence by law enforcement authorities to make cross-border investigations more effective. The new rules will allow national authorities to request evidence directly from service providers in other member states, (“production orders”), or to ask that data be stored for up to 60 days. Evidence can consist of content data, (text, voice, images, video or sound), traffic data, (timestamps, protocol and compression details, and information about recipients), or subscriber data. Currently, the exchange depends on various bilateral and international agreements on mutual legal assistance, resulting in a fragmented landscape and, often, lengthy procedures. Authorities can, however, refuse a request when they have concerns about media freedom or fundamental rights violations in the requesting member state.

From MiCA to MiCAR: The Markets in Crypto-Assets Regulation has been published in the Official Journal of the EU and will start to apply across all EU Member States in the course of 2024. The new rules cover issuers of utility tokens, asset-referenced tokens and so-called ‘stablecoins’. They also cover service providers such as trading venues and the wallets where crypto-assets are held. The regulation ensures that crypto transfers, as is the case with any other financial operation, can always be traced and suspicious transactions blocked. Information on the source of the asset and its beneficiary will have to “travel” with the transaction and be stored on both sides of the transfer.

In addition to the MiCAR, the EU digital finance package contains the Digital Operational Resilience Act, (DORA), which also covers crypto-asset service providers, and a proposal for a distributed ledger technology, (DLT), pilot regime for wholesale uses.

Draft AI Act: The European Parliament also adopted its negotiating position on the Artificial Intelligence Act, and is ready to discuss the final form of the law with the Council and the Commission. MEPs have expanded the list of AI systems that pose an unacceptable level of risk to people’s safety, and that would therefore be prohibited, to include:

  • “real-time” remote biometric identification systems in publicly accessible spaces;
  • “post” remote biometric identification systems, with the only exception being law enforcement for the prosecution of serious crimes;
  • biometric categorisation systems using sensitive data, (gender, race, ethnicity, etc.);
  • predictive policing systems, (based on profiling, location or past criminal behaviour);
  • emotion recognition systems in law enforcement, border management, the workplace, and educational institutions; and
  • untargeted scraping of facial images from the internet or CCTV footage to create facial recognition databases. 

MEPs added exemptions for research activities and AI components provided under open-source licenses. So-called regulatory sandboxes, or real-life environments, will be established by public authorities to test AI before it is deployed, along with an individual’s right to complain and to receive information.

CJEU Opinion

Data subject rights: A CJEU Advocate General’s opinion states that a data subject must have judicial recourse available against an independent supervisory authority when they exercise their rights through that authority. In the related case, an individual was refused a ‘security clearance certificate’ by the Belgian National Security Authority because he had participated in various demonstrations in the past. He asked the national supervisory body for police information, (“OCIP”), to identify the controllers responsible for the data processing at issue and to order them to provide him with access to all the information concerning him. The OCIP replied that it had carried out all necessary checks, without providing any further details. Unsatisfied with that answer, the individual brought an action against the OCIP.

The opinion clarifies that, in the above case, the level of information provided by the supervisory authority to the data subject on the outcome of the check may not always be restricted to the minimum statement that all necessary verifications have been carried out, but may vary depending on the circumstances of the case, in application of the principle of proportionality. Read more of the legal reasoning in the original opinion.

Official guidance

UK Children’s Code: The latest evaluation report shows that a fifth of UK children are familiar with the code and a third are aware of data privacy due to the implementation of the Children’s Code, (a statutory code of practice since 2020). The code applies to any information society service (ISS) provider, (including ed-tech products and services), that processes the data of children in the UK, including some organisations that are not based in the UK. In the supervision and enforcement phase, there were initial resource challenges around integrating Children’s Code activities into ‘business as usual’. There could also have been greater external expectation management around supervision and enforcement activities, as these were only possible once the transition period ended. Key skill gaps identified included technology professionals lacking awareness of:

  • how ISS providers operate, as well as supporting technology (e.g. age assurance technology);
  • the importance of communication and engagement policies, as without them knowledge and experience embedded within the organisation is lost when a project or phase finishes. Read the full report here.

Input data for triage algorithms: The Spanish data protection authority examined how the performance of a running algorithm can be compromised by inaccurate input data. Its analysis looked at the triage algorithms of the emergency health system, which must optimize resources in order to save lives. The authority suggests that the assessment of the algorithm used in triage processing should be just one part of a wider assessment, covering factors such as data gathering operations, data checking, human involvement and the way in which decisions are executed, reviewed and contested.

A lack of definition of the input data could lead to errors or biases that are not part of the algorithm itself. The accuracy principle should therefore be implemented for the input data, the output data, and even the intermediate data of the whole processing activity. The precise definition of each input data item, (gathered both directly and indirectly), and its semantics, must be set up “by design” and properly documented. Even more importantly, the value range, (“yes/no”, “0 to 10” or “high/medium/low”), should be defined and assessed in the context of the processing.
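By way of illustration only, here is a minimal Python sketch of what documenting input fields and their value ranges “by design” could look like, so that inaccurate records are flagged before they ever reach the algorithm. The field names and ranges are hypothetical and not taken from the authority’s analysis.

  from dataclasses import dataclass

  # Hypothetical field definitions for a triage-style algorithm; the names
  # and allowed values are illustrative only.
  @dataclass(frozen=True)
  class FieldSpec:
      name: str
      allowed: object  # a set of categories or a (min, max) numeric range

  FIELDS = [
      FieldSpec("chest_pain", {"yes", "no"}),
      FieldSpec("pain_score", (0, 10)),
      FieldSpec("priority_hint", {"high", "medium", "low"}),
  ]

  def validate(record: dict) -> list:
      """Return a list of accuracy problems found in one input record."""
      problems = []
      for spec in FIELDS:
          if spec.name not in record:
              problems.append("missing field: " + spec.name)
              continue
          value = record[spec.name]
          if isinstance(spec.allowed, tuple):
              low, high = spec.allowed
              if not (isinstance(value, (int, float)) and low <= value <= high):
                  problems.append(f"{spec.name}={value!r} outside {low}..{high}")
          elif value not in spec.allowed:
              problems.append(f"{spec.name}={value!r} not in {sorted(spec.allowed)}")
      return problems

  # A record with an out-of-range pain score is flagged before it is processed.
  print(validate({"chest_pain": "yes", "pain_score": 14, "priority_hint": "low"}))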

Explainable AI: The latest analysis by the EDPS states that modern AI models often work as opaque decision-making engines, true black boxes reaching conclusions with little transparency or explanation of how a given result is obtained. Explainable AI, or XAI, focuses on developing AI systems that not only provide accurate predictions and decisions but can also explain how they were reached. Individuals using XAI would be able to understand the reasoning behind an automated decision and to take the appropriate, and informed, course of action. Obtaining clear information about the behaviour of AI also has an impact on the ability of its users, such as data controllers and processors, to evaluate the risks that this tool may pose to individuals’ rights to data protection and privacy.
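As a rough, purely illustrative sketch of the idea, the fragment below (Python, with made-up feature names and weights) shows the simplest kind of explanation an automated decision system can offer: the contribution each input feature makes to a linear score, ranked by how much it influenced the outcome.

  # Minimal, hypothetical example: per-feature contributions to a linear score,
  # the simplest form of "explanation" for an automated decision.
  weights = {"income": 0.4, "years_at_address": 0.1, "open_credit_lines": -0.3}
  applicant = {"income": 2.0, "years_at_address": 1.5, "open_credit_lines": 4.0}

  contributions = {f: weights[f] * applicant[f] for f in weights}
  score = sum(contributions.values())

  print(f"score = {score:+.2f}")
  for feature, contrib in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
      print(f"  {feature:>18}: {contrib:+.2f}")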

DSARs: Guernsey’s data protection authority has published new guidance on ‘data subject access requests’, (for data controllers and individuals). One of the most commonly used rights is the right of access, also sometimes referred to as a ‘subject access request’ or ‘data subject access request’. This is where individuals ask what personal data a controller holds about them and why. An individual can also request information about the reasoning behind any automated decisions, such as a computer-generated decision to grant or deny credit, or to assess performance at work, (except where this information is a trade secret). In short, a DSAR is when an individual asks you:

  • what do you know about me?
  • what do you think about me?
  • what do you think you know about me?
  • what are you doing with all this information? 

Separate guidance for individuals who may wish to make a DSAR explains how to make one, what you should receive back, and what to do if you’re not happy with what you receive.

CCTV: Further comprehensive guidance from the Guernsey regulator looks at CCTV use by data controllers, (with exceptions for household, journalistic, and artistic activities). It is based on seven principles that require you to do the following:

  • Be clear about how personal information is used, for what purpose and on what legal basis.
  • Use personal information only for specific, explicit and legitimate purposes.
  • Collect no more information than is needed.
  • Make sure personal information is accurate and kept up to date. 
  • Keep information for no longer than necessary. 
  • Keep information secure. 
  • Be responsible and accountable for how personal information is used.

Loyalty programs: What rules should an entrepreneur follow when creating customer loyalty programs? A loyalty program is an additional service, so the initial legal basis, the performance of the contract, does not apply. The customer must give their consent to the processing of their personal data for one or more specific purposes. If the entrepreneur includes the transfer of customer data to other partners as part of the loyalty program, then the customer must not only be informed about it, but their consent must also be obtained.

There should be no direct or indirect pressure on the client. The entrepreneur must also take into account that the customer has the right to withdraw their consent to the processing and demand that it cease, along with the deletion of all their personal data that is no longer necessary for the performance of the contract.

Enforcement decisions

Wildcat telemarketing and confiscated databases: The Italian data protection authority confiscated databases, for the first time, at two call centre companies allegedly conducting illegal and unregulated telemarketing activities. The operation was conducted in collaboration with the financial police’s Special Privacy Protection and Technological Fraud Unit. Four companies were fined between 200,000 and 800,000 euros in the operation. Through the acquisition of illegally produced lists, the sanctioned companies contacted tens of thousands of people who had never given the necessary consent for the processing of their data for marketing purposes, pitching offers from various energy companies.

Clairvoyance consultations: The French privacy regulator has imposed a 150,000 euro fine on KG COM. The company collected data excessively, including sensitive data, without prior and explicit consent, and did not sufficiently ensure data security. KG COM operates several websites offering clairvoyance consultations via an online dialogue interface, (chat), or by telephone. The investigation found that:

  • it systematically recorded all telephone calls between teleoperators and prospects;
  • it kept health data and data relating to sexual orientation without obtaining consent;
  • it kept customers’ banking data beyond the time strictly necessary to carry out the transaction, (while the legal basis for the retention of bank data for anti-fraud purposes is a legitimate interest, this does not apply to retention for subsequent purchases, for which the company should have obtained consent);
  • it systematically recorded all conversations for the purposes of service quality control, proof of contract subscription and potential judicial requisitions;
  • it implemented insufficiently strong passwords for user accounts and failed to secure access to them, allowing connections over HTTP instead of HTTPS;
  • it also used a vulnerable mechanism to encrypt banking data.

Spotify fine: The Swedish privacy authority has reviewed how Spotify handles customers’ right to access their personal data, and sanctioned the company to the tune of around 5 million euros. Spotify has divided customers’ personal data into different layers. One layer contains the customer’s contact and payment details, the artists the customer follows and the listening history for a certain period of time. If the customer wants more detailed information, for example all technical log files relating to them, it has also been possible to request this from another layer.

The regulator believes that although Spotify releases the personal data the company processes when individuals request it, the company does not inform customers clearly enough about how this data is used. Receiving sufficient information is often a prerequisite for the individual to exercise other rights, for example the right to have incorrect information corrected or removed.

Audits

College group: The UK Information Commissioner’s Office has conducted a consensual audit of the Chichester College Group concerning its data protection measures. Various areas requiring improvement were found, as the college group does not have a complete and fully documented information governance, (IG), policy and framework:

  • the flow of information between the senior management team, the data protection office, the audit and risk committee and other key IG committees and groups has not been finalised;
  • a process ensuring that information risks are fully documented and managed throughout the organisation needs to be implemented;
  • there is no ongoing compliance monitoring of staff who are involved in the processing of personal information;
  • the group must ensure that an appropriate written contract is in place with each of its data processors;
  • a central record of data processor contracts and a data processor procurement, due diligence and compliance process need to be finalised.

Data security

Mobile applications: Before installing or starting to use a mobile application, users should familiarise themselves with its privacy notice and rules of use, and carefully evaluate the personal data it requests and the permissions it asks for, states the Lithuanian data protection authority. This information must be available to the user, (on the website that offers the app and in the app itself), before they enter any personal data, grant permissions or create an account. Before using a mobile application, it is also important to assess what goals are being pursued; for example, when using applications for direct communication, it is possible to restrict access to photos and the device’s camera.

It is important to note that the access granted to mobile applications may be restricted during installation or at any other time chosen by the user. For example, restricting access to location data is relevant whenever the location functionality is not needed by the user at that time. Similarly, it is advisable not to grant social networking, dating and messaging apps permission to access the contacts saved on the user’s mobile device, but instead to add specific people to such an application individually.

2FA: The Office of the Privacy Commissioner in New Zealand has recommended that all firms use two-factor authentication to secure the information they store. Any firm should exercise caution by implementing 2FA wherever applicable, as this would be a particularly valuable mitigating argument when defending against regulatory fines and other legal ramifications that may result from a data breach. What is appropriate is determined by the organisation’s size as well as the scope and sensitivity of the personal information it holds.
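For illustration only, below is a minimal Python sketch of the time-based one-time password (TOTP) check that underpins many common 2FA implementations. The secret and parameters are made up for the example; a real deployment would rely on a vetted authentication library rather than hand-rolled code.

  import base64, hashlib, hmac, struct, time

  def totp(secret_b32, interval=30, digits=6, at=None):
      """Compute an RFC 6238 time-based one-time password."""
      key = base64.b32decode(secret_b32, casefold=True)
      counter = int((time.time() if at is None else at) // interval)
      digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
      offset = digest[-1] & 0x0F
      code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
      return str(code).zfill(digits)

  def verify(secret_b32, submitted, window=1, interval=30):
      """Accept the current code and its neighbours to tolerate small clock drift."""
      now = time.time()
      return any(
          hmac.compare_digest(totp(secret_b32, interval, at=now + step * interval), submitted)
          for step in range(-window, window + 1)
      )

  demo_secret = "JBSWY3DPEHPK3PXP"              # example secret; generate one per user in practice
  print(verify(demo_secret, totp(demo_secret)))  # True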

Big Tech

MOVEit cyberattack: According to the Guardian, British Airways, Boots, the BBC, Ofcom, Transport for London and others are probing the potential theft of personal information belonging to employees following a cyber-attack. It targeted MOVEit software used by Zellis, a payroll provider. Zellis stated that a “small” number of its clients were affected by a vulnerability in the company’s file transfer technology. Microsoft’s threat intelligence team blamed the MOVEit attacks on a group known as Lace Tempest. Names, surnames, employee numbers, dates of birth, email addresses, first lines of home addresses, and national insurance numbers might have been among the information compromised in the hack.

AirDrop and Bluetooth restrictions in China: Meanwhile, China is developing new rules to govern file-sharing services such as AirDrop and Bluetooth. Service providers would be required to prevent the spread of harmful and unlawful material, maintain records, and report their discoveries. The Cyberspace Administration of China has produced draft regulations on “close-range mesh network services” and initiated a month-long public consultation. When conducting inspections, service providers would also be required to provide data and technical support to the authorities, including internet regulators and police. Users would also have to register under their real names. Furthermore, features and technologies that have the potential to mobilise public opinion would have to undergo a security evaluation before they can be introduced.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
