TechGDPR’s review of international data-related stories from press and analytical reports.
Official guidance: secure multiparty computation, public procurement, risk analysis, DPIAs
The Spanish privacy regulator AEPD has published a tech-savvy blog post on Privacy by Design: Secure Multiparty Computation. Federated data spaces make it possible to avoid communicating and exposing data to third parties while still giving multiple stakeholders access to the information they need, optimizing networks and processes and, in addition, allowing controlled data-reuse policies to be implemented. All of this is independent of the additional data protection measures, by design and by default, that can be layered on top, together with a governance model, to guarantee rights in the source data.
One such enabling technology is Secure Multiparty Computation (SMPC). This is a cryptographic protocol that, through additive secret sharing, segments secret data into separate parts, so that when the parts are shared, none of the recipients can reconstruct the original data. Consider, for example, three companies that wish to collaborate on a study of the sector to which they belong and jointly benefit from the results, while legal, strategic, and technical constraints prevent them from disclosing their underlying data to one another: SMPC lets them compute the joint result without any of them revealing its inputs.
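To make the idea concrete, here is a minimal sketch of additive secret sharing in Python. The modulus, party count, and revenue figures are illustrative assumptions; real SMPC deployments use hardened protocols and secure channels rather than this toy arithmetic.

```python
import secrets

# Additive secret sharing over the integers modulo a large prime.
# Each company's secret value is split into three random-looking shares;
# any incomplete subset of shares reveals nothing about the original value.
PRIME = 2**61 - 1  # a Mersenne prime, chosen here purely for illustration

def share(secret: int, n_parties: int = 3) -> list[int]:
    """Split `secret` into n additive shares modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (secret - sum(shares)) % PRIME
    return shares + [last]

def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % PRIME

# Three companies each share a confidential figure (e.g. annual revenue).
revenues = [1_200_000, 850_000, 2_400_000]
all_shares = [share(r) for r in revenues]

# Each party locally adds the one share it holds from every company.
# No party ever sees another company's input, yet recombining the
# per-party sums yields the joint sector total.
per_party_sums = [sum(col) % PRIME for col in zip(*all_shares)]
joint_total = reconstruct(per_party_sums)
assert joint_total == sum(revenues)
print(f"Joint sector total: {joint_total}")
```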
To help the professionals concerned identify their responsibilities in different public procurement contexts, the French regulator CNIL clarifies (in French) the elements to take into account and the legal consequences that follow from qualification as “controller”, “joint controller”, or “processor”. Administrations often entrust another body (an economic operator) with the mission of meeting needs in terms of works, supplies, or services, for example the management of extracurricular services, water, transport, or parking. To perform these public contracts, the operators are required to collect and use personal data, which may concern staff or users of the public service: this data processing must comply with the GDPR. The designation of actors as “controller”, “processor”, or “joint controller” must occur as early as possible and be based on factual elements and each contractual context. It establishes who will have to guarantee compliance with the main principles of the GDPR, in particular:
- the existence of an explicit and legitimate objective (purpose) for each use of data;
- collection of relevant and non-excessive data;
- data security;
- a limited data retention period;
- proper consideration of people’s rights.
Dealing with risks: the Bavarian data protection commissioner explains how this works in data protection law. A new guide (in German) makes it even easier to detect and manage risks in the processing of personal data. The paper attaches particular importance to the idea of scaling: risk analyses do not always have to be complex, and depending on the occasion, different “expansion stages” are possible. This is illustrated using several case studies. The new orientation guide and an information package (with a set of forms that guide the implementation of risk analyses and are intended to support proper documentation) can be downloaded free of charge from here and here.
The Latvian data protection authority DVI also explains how to conduct a Data Protection Impact Assessment. A DPIA is the process by which a data controller can carry out an inventory, analysis, and assessment of the possible consequences (in terms of severity and likelihood) of different risks to individuals’ rights and freedoms; a minimal scoring sketch of this framing follows the question list below. Carrying out a DPIA is not a one-off exercise, but a set of data processing assessments that need to be carried out on a regular basis. Additionally, organisations should not assume that data processing remains constant (even if no changes are made), as externalities also pose risks to ongoing data processing. They should consider, for example, the following aspects:
- internal processes and planned activities with personal data;
- how the internal exchange of data takes place and whether the current exchange mechanisms are considered secure;
- the location of the data, who has access to it, and how the data is transferred – on a computer, in folders, physically, etc.;
- employees’ knowledge of how to handle personal data in compliance with data protection requirements;
- internal documentation;
- whether data protection system rules have been developed, taking into account possible risks (e.g. unauthorized access, deletion, etc.).
The following questions will also help to assess the above aspects of processing:
- Does the protection of the organisation’s data system correspond to the risk posed by the data processed in it?
- Are the personal data processed and grouped with appropriate care, taking into account potential risks and high-risk processing in particular?
- What devices are connected to the local network (and do the devices themselves and their connections pose a security risk)?
- What software is used in the organization’s information systems?
- Are computers equipped with security systems and passwords?
- Is employees’ access to processed personal data recorded?
- What more could be done to achieve higher security standards?
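To make the severity-and-likelihood framing above concrete, here is a small risk-register sketch in Python. The 1–4 scales, the thresholds, and the example entries are illustrative assumptions, not part of the DVI guidance.

```python
# A minimal risk register scoring risks as severity x likelihood.
# Scales and thresholds below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    severity: int    # 1 = negligible ... 4 = maximum impact on data subjects
    likelihood: int  # 1 = negligible ... 4 = near certain

    @property
    def score(self) -> int:
        return self.severity * self.likelihood

    @property
    def level(self) -> str:
        if self.score >= 9:
            return "high - mitigate before processing"
        if self.score >= 4:
            return "medium - mitigate and monitor"
        return "low - document and review periodically"

register = [
    Risk("Unauthorised access to HR folders", severity=3, likelihood=2),
    Risk("Accidental deletion of customer records", severity=2, likelihood=3),
    Risk("Unencrypted laptop lost in transit", severity=4, likelihood=2),
]

# Print the register sorted by score, highest risk first.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:>2}  {risk.level:<40}  {risk.description}")
```

Re-running such a register on a schedule, rather than once, matches the DVI's point that a DPIA is a recurring set of assessments, not a one-off exercise.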
Legal processes: no united position on the AI Act, UK data protection reform
Members of the European Parliament have submitted hundreds of amendments to the upcoming AI Act, setting the tone for future discussions, according to the Euractiv news website. Reportedly, one of the most controversial topics is the definition of artificial intelligence itself. Another hot issue is the burden of obligations, data protection obligations included, placed on AI creators, with different requirements proposed for new, former, and original providers of AI technology. At the same time, Green MEPs made major proposals on prohibited practices, extending this category to biometric categorisation, emotion recognition, and any automated monitoring of human behaviour. Finally, conservative lawmakers want to exclude systems designed to assess creditworthiness from the high-risk list. Read more about the opposing proposals for the AI Act from the EP’s left and right political groups in the original publication.
In a pre-emptive strike ahead of the publication of the Data Protection Reform Bill in the UK, Privacy International publishes its response here. It states that the right to privacy and data protection is linked to some of the most important political and existential questions of our time. At the core of the proposal is the suggestion that data protection is a burden on companies. It appears to be driven by the commercial interests of a few companies who may benefit from weaker rights protection, the result being the proposed loss of many important protections for people. The PI report looks at such privacy issues as:
- How can exploitation of the vulnerable be prevented?
- How does the UK treat immigrants who bring key skills and prosperity to the country?
- What safeguards are there against potential corruption of the democratic process by new technologies and their use by political parties and third parties?
In PI’s opinion, the UK proposal is a backward step. Innovation (e.g. in AI) relies on people sharing data, and for people to share their personal information, they need to feel confident about doing so.
Investigations and enforcement actions: public bodies and IT incidents, unauthorized access, absence of a legal basis, DPOs, insufficient testing of software updates
The French regulator CNIL has issued formal notice to twenty-two municipalities to appoint a data protection officer. The GDPR makes the appointment of a data protection officer mandatory in certain cases, in particular when the processing of personal data is carried out by a public authority or a public body (Art. 37 of the GDPR). This obligation therefore concerns all local authorities, regardless of their size. In the case of local authorities, the officer can be an internal agent or an external provider shared between several municipalities. The twenty-two municipalities, in metropolitan France and overseas, have a period of four months to comply by appointing a data protection officer under the conditions set by the GDPR (expertise, independence, sufficient resources, etc.). If they do not comply with the formal notice, the CNIL may use its powers to pronounce sanctions, which can include fines and public reprimands.
The data protection officer, explains CNIL, plays an essential role in the compliance of data processing implemented by public authorities. They are the main point of contact for agents and citizens on all subjects relating to data protection: a) internally, they answer all questions regarding data protection and handle the GDPR “first steps” (in the event of a computer attack, the design of a new digital project, etc.); b) with regard to stakeholders, they oversee the handling of requests to exercise rights and of any requests for clarification from the CNIL in the event of an audit.
Meanwhile the Italian privacy regulator ‘Garante’ fined Inail (a financially independent public body which manages compulsory insurance against accidents at work and occupational diseases on behalf of the state) 50,000 euros. An investigation revealed that at least three IT incidents had resulted in unauthorized access to the data of some workers, in particular details of their health and the injuries they had suffered. The “Workers Virtual Desk” application managed by the authority allowed some users to accidentally consult the accident and occupational disease files of other workers. In one case, the incident occurred following the execution of an outdated version of the “Workers Virtual Desk” due to human error.
‘Garante’ emphasized that a body with such significant institutional functions, which processes particularly sensitive data, including data on vulnerable data subjects, is required to adopt, in line with the accountability principle of the GDPR, technical and organizational measures that permanently ensure the confidentiality of the data processed, as well as the integrity of the related systems and services. The regulator’s judgement took into account the full cooperation offered by the public administration during the investigation and the small number of people involved in the identified data breaches.
In Norway, the regulator Datatilsynet notified NAV (the Norwegian Labour and Welfare Administration) of a fine of approx. 495,000 euros for making CVs available on the service arbeidsplassen.no without a legal basis. In order to receive services and benefits, job seekers have had to provide a quantity of information, including a CV. NAV also made it a condition that the CV be made available to employers on arbeidsplassen.no, a condition NAV itself discovered it had no authority to impose. NAV took immediate action, closing employers’ access to jobseekers’ CVs and notifying those affected.
Denmark’s data protection authority expressed serious criticism of the University of Southern Denmark’s insufficient testing of software updates. The university uses an HR system in which employees can be assigned rights to access job applications. In connection with a software update, however, the system’s rights management was reset, which meant that all employees had access to the applications. This gave 7,011 employees potential access to applications from a total of 417 applicants, even though only some 400 of those employees had even a conditional need to access personal information in the HR system. Furthermore, the university did not keep a log of access to the applicants’ material and therefore could not identify what had been accessed.
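The SDU case shows how a routine update can silently reset an access-control matrix. One mitigation is to re-verify expected rights after every deployment; below is a minimal sketch of such a regression check in Python, where the role names, the resource name, and the can_access() helper are hypothetical placeholders rather than any real HR system’s API.

```python
# A minimal post-deployment regression check for rights management.
# Roles, the resource name, and can_access() are hypothetical placeholders.

EXPECTED_ACCESS = {
    "hr_caseworker": True,    # legitimate need to read applications
    "hiring_manager": True,   # conditional need for own vacancies
    "regular_employee": False,  # must never see applicants' data
}

def can_access(role: str, resource: str) -> bool:
    """Placeholder: query the HR system's live permission check."""
    raise NotImplementedError("wire this to the real permission API")

def verify_rights_after_update(resource: str = "job_applications") -> list[str]:
    """Return the roles whose actual access deviates from the expected matrix."""
    return [
        role
        for role, expected in EXPECTED_ACCESS.items()
        if can_access(role, resource) != expected
    ]

# Run as part of the deployment pipeline; fail the release on any deviation:
# violations = verify_rights_after_update()
# assert not violations, f"rights regression for roles: {violations}"
```

Paired with an access log, a check like this would have caught the reset before 7,011 employees gained access, and made it possible to establish afterwards what was actually viewed.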
Big Tech: voice recognition systems, the UK Labour party’s lost database, the end of Google Assistant’s location reminders
According to Wired, voice recognition systems such as Siri and Alexa are becoming better at understanding people through their voices. Machines can learn a lot more from a voice, inferring your age, gender, ethnicity, socio-economic status, and health conditions. Researchers have even been able to generate images of faces based on the information contained in individuals’ voice data, says the publication. And as the market grows, privacy-focused researchers are increasingly searching for ways to protect people from having their voice data used against them:
- Simple voice-changing hardware allows anyone to quickly change the sound of their voice.
- More advanced speech-to-text-to-speech systems can transcribe what you’re saying and then reverse the process and say it in a new voice.
- Distributed and federated learning, where your data doesn’t leave your device but machine learning models still learn to recognize speech by sharing their training updates with a larger system.
- Encrypted infrastructure to protect people’s voices from snooping, and
- Voice anonymisation (e.g. altering the pitch, replacing segments of speech with information from other voices, and synthesizing the final output), as sketched below.
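The simplest of these techniques, pitch alteration, can be sketched in a few lines of Python with the librosa and soundfile libraries. The file names and the four-semitone shift are illustrative, and pitch shifting alone offers only weak protection; serious anonymisation combines several of the techniques above.

```python
# A minimal pitch-alteration sketch: shift a recorded voice up by
# four semitones and write the result to a new file. File names and
# the shift amount are illustrative assumptions.
import librosa
import soundfile as sf

y, sr = librosa.load("original_voice.wav", sr=None)  # keep native sample rate
y_shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=4)  # up 4 semitones
sf.write("anonymised_voice.wav", y_shifted, sr)
```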
Britain’s Labour party is facing several class-action suits for failing to inform members after its database, hosted by a third party, was hacked with ransomware in 2021. The third party in question, the digital agency Tangent, was responsible for handling party membership data and was reportedly targeted by an unknown ransomware gang that held the information hostage. Tangent refused to pay the ransom, leading the hackers to corrupt the database, rendering it inaccessible: “Labour claims that its own systems have not been affected by the breach, although its membership webpage has been down since it happened and, as a result, the party doesn’t have a complete or up-to-date membership list beyond December 2021”, according to the Byline Times newspaper.
Google wants to end location-reminder capabilities on mobile and smart devices that use Google Assistant, Gizmodo and IAPP News report. The feature reminds users to do tasks when they arrive at specific locations. In just one example, an investigation by Canada’s privacy regulator showed that people who had downloaded the app of a popular coffee chain had their movements tracked every few minutes, even when the app wasn’t in use. Investigators said the app collected information to infer where users lived, worked, and traveled. The tech giant points to its privacy policy to claim it only collects data based on users’ settings, and that the app will only collect data when it is active. However, third-party apps can also share private information with Google when going through Google Assistant, based on user settings, says Gizmodo.