TechGDPR’s review of international data-related stories from press and analytical reports.
Website analytics: There are many website analytics and tracking tools on the market, notes the Norwegian data protection regulator, but that does not necessarily mean it is legal to use them on your website. Be prepared to follow the GDPR even if you do not know the name or identity of those visiting your site. Analytics tools collect a lot of information, which alone or in combination can constitute personal data. If you currently run an analytics tool that collects information you do not use for anything, you are breaking the law:
- You must have a legal basis for processing.
- There are many requirements for user consent to be valid. The mere existence of the cookie banner is not enough.
- Choose tools that promise to only process personal data on your behalf and as you decide.
- On some websites, the visitors’ behaviour can in itself reveal special categories of personal data (eg, mental health care).
- Many service providers have offices or subcontractors in countries outside the EU/EEA. You must check this before using the tool.
- Make sure you provide honest and easily understandable information to the visitors, and respect their data subject rights.
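The checklist above translates directly into application logic: an analytics or tracking script should only be loaded after a valid, purpose-specific, unwithdrawn consent has been recorded. A minimal sketch in Python — the class and function names are hypothetical, and a real implementation would sit inside your web framework and consent-management platform:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # Valid GDPR consent is freely given, specific, informed, and
    # withdrawable — the mere existence of a cookie banner is not enough.
    purpose: str              # eg "analytics"
    granted: bool
    timestamp: datetime
    withdrawn: bool = False

def may_load_analytics(records: list[ConsentRecord]) -> bool:
    """Return True only if the visitor holds an active, explicit
    consent record for the 'analytics' purpose."""
    return any(
        r.purpose == "analytics" and r.granted and not r.withdrawn
        for r in records
    )

# No consent recorded -> the tracking script must not be served.
assert may_load_analytics([]) is False

consent = ConsentRecord("analytics", True, datetime.now(timezone.utc))
assert may_load_analytics([consent]) is True

# Withdrawal must be honoured immediately.
consent.withdrawn = True
assert may_load_analytics([consent]) is False
```

The key design choice is that the default is "no tracking": absence of a consent record blocks the tool, rather than loading it until an objection arrives.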
Health care data aggregation: The French data protection regulator published recommendations for actors in the digital health sector (in French). The sandbox projects included federated learning between several health data warehouses, a diagnostic aid solution in oncology, anonymous statistical indicators of populations in medical research, and a therapeutic game. The GDPR requires that data processing in the field of health be carried out in the public interest, a basis that can only be relied on by public entities or legal entities entrusted with a public service mission.
Thus, commercial projects (start-ups) should rely on their legitimate interests. People’s consent was also ruled out in many cases, as the companies are not in a position to collect it, particularly for the reuse of data from healthcare establishments. Finally, whenever non-anonymous data is exported, an ad hoc risk analysis must be performed to determine the necessary security measures. Continuity of security measures outside the workplace should be ensured as much as possible.
Customer location data: More retailers and companies are moving their loyalty programs to mobile applications. These often demand access to the customer’s location data to personalise offers, taking into account their habits and other information. Regardless of the legal basis the merchant applies to the processing (both consent and legitimate interest are possible), the customer retains all the rights specified in the GDPR. Terminating the loyalty program entirely when a customer withdraws consent only to the processing of geolocation data will not comply with regulatory requirements. Therefore, when developing an application, it is necessary to provide for different levels of the loyalty program, granular consent, and withdrawal.
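The granular-consent model the regulator expects can be sketched in a few lines: each optional purpose carries its own consent flag, and withdrawing the geolocation purpose merely downgrades personalisation without ending the membership. The class and purpose names below are hypothetical illustrations, not any specific merchant's design:

```python
class LoyaltyAccount:
    """Sketch of per-purpose (granular) consent in a loyalty app.

    Withdrawing consent for geolocation only downgrades the program
    level; it must not terminate the membership itself.
    """

    def __init__(self) -> None:
        self.active = True  # core membership, independent of optional purposes
        self.consents = {"geolocation": False, "marketing": False}

    def grant(self, purpose: str) -> None:
        self.consents[purpose] = True

    def withdraw(self, purpose: str) -> None:
        self.consents[purpose] = False

    def offer_level(self) -> str:
        if not self.active:
            return "none"
        return "location-personalised" if self.consents["geolocation"] else "generic"

account = LoyaltyAccount()
account.grant("geolocation")
assert account.offer_level() == "location-personalised"

# Withdrawing only the geolocation consent ...
account.withdraw("geolocation")
# ... must leave the membership itself intact.
assert account.active is True
assert account.offer_level() == "generic"
```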
EdTech development: The French regulator also published a summary of the main recommendations (in French) based on the “sandbox” project in the EdTech sector. The participants included actors developing a portfolio of learning skills, a communication solution in the school context, a warehouse of learning traces with a view to their publication and analysis, and a “personal cloud” for students connected to their digital workspace. During the “sandbox” support, among other things, the technical architecture of the solutions was analysed with the data controllers and their subcontractors. It has to be noted that:
- State establishments (eg, primary schools) do not have a legal personality; teachers and directors act as agents of the national education administration.
- When onboarding a technical solution, the Ministry of national education must be considered the sole data controller (in joint controllership with the municipality).
- The company offering technical solutions would become a subcontractor.
- For processing operations that pursue “school” purposes, the legal basis of the “mission of public interest” has been considered the most appropriate.
- Other processing operations may require individual (eg, parental) consent.
- Only authorised subcontractors and recipients of pupils’ data are allowed.
- Information notices must be adapted to different age groups, and more generally to the degree of maturity of the pupils concerned.
Legal processes and redress
Non-material damage under the GDPR: The Dublin District Court awarded 2000 euros in compensation to a plaintiff over his employer’s use of CCTV footage of him, which led to victimisation by colleagues, serious embarrassment, and loss of sleep. During a meeting on quality control involving managers and supervisors, the CCTV video was shown to various personnel. The plaintiff was not present at the meeting and found out afterwards that the footage had been used. The company’s data protection policies regarding CCTV were not clear or transparent, and no legitimate interest assessment of the remote monitoring of workers was carried out. Read more details of the case in the original analysis by the Irish lawyers.
US state privacy legislation: The most recent comprehensive state consumer data privacy law has been passed in Oregon. While similar to consumer data privacy laws passed in other states, it has some unique provisions: it applies to nonprofit organisations, has broad definitions of covered data (including categories of sensitive and biometric data, as well as derived data), has a smaller HIPAA (protected health information) carveout, and grants Oregon residents the right to request a list of the third parties to whom controllers disclosed their data, opt-out options, and more. Meanwhile, the Colorado Privacy Act has been enforceable since 1 July, making Colorado the third state, after California and Virginia, with an enforceable comprehensive privacy law protecting its residents.
COPPA 2.0: Amendments to the Children’s Online Privacy Protection Act (and the Kids Online Safety Act) have been approved by a Senate Committee. They would close a loophole that allows companies to abuse minors’ data with little accountability and makes it harder for the regulator to prove violations. It would be unlawful for a digital service or connected device directed at children or teens to collect, use, disclose to third parties, or compile their data for profiling and targeted marketing unless the operator has obtained consent from the relevant minor (“verified parental consent”). Operators must also treat each user as a child or minor unless the content is deemed to be directed at mixed audiences.
Security measures: Open Bank was fined 2.5 million euros by Spain’s data protection regulator for failing to implement a framework permitting encrypted communication. In order to comply with anti-money laundering legislation, the complainant was asked to confirm the origin of funds received in their bank account. However, the only way to provide the information was by email (rather than through a secure direct channel). The information requested by Open Bank is classified as ‘financial data’, which requires strengthened safeguards. The regulator found that Open Bank did not implement a data protection strategy from the start, neither before nor during the processing.
In another recent example, the Polish regulator fined a firm almost 9000 euros for losing employees’ and contractors’ personal data in a ransomware attack. The organisation failed to complete a risk assessment, notify the regulator of the breach within 72 hours of becoming aware of it, and notify the data subjects affected by the breach. The regulator also found that the company did not cooperate fully throughout its inquiry; in particular, the company’s communication was frequently inconsistent.
Non-registration with the regulator: Guernsey’s data protection authority is to pursue legal action for failure to register. It is a legal requirement for any organisation (including sole traders) that handles people’s personal information in the course of its business activities – even if this is just names and addresses – to register with the Guernsey regulator. If you are not sure whether you need to register, there are three clear criteria:
- You (whether a sole trader, organisation, business, charity, landlord, business association, etc.) are established in the Bailiwick of Guernsey.
- You are working with personal data (any information that may identify individual people, such as staff members, your clients, your business contacts, your service users, your tenants, etc.), either as a ‘controller’ or a ‘processor’.
- The activity you are performing is not part of your personal/household affairs.
Non-cooperation with the regulator: According to Data Guidance, the Polish data protection authority fined a company 8000 euros for failing to cooperate (Art. 58 of the GDPR). The regulator received a complaint alleging that the firm had improperly shared personal information with a third party. The regulator sent the business several letters demanding further information, including the legal basis and purpose of the processing. The organisation, however, did not respond to any of the letters.
Reimbursement app: A one million euro fine was imposed by the Italian privacy regulator on Autostrade per l’Italia (ASPI) for having illegally processed the data of around 100,000 registered users of the toll reimbursement app Free to X. The critical issues of the service – which allows the total or partial refund of the cost of the motorway ticket for delays due to construction sites – had been reported by a consumer association. The authority ascertained that Autostrade plays the role of data controller, not of data processor, as erroneously indicated in the documentation governing the relationship between ASPI and the company Free to X, which created and manages the app.
Meta behavioural ads: The Norwegian data protection authority has prohibited Meta from targeting advertising based on the monitoring and profiling of users in Norway. The decision comes shortly after the CJEU found that Meta’s data practices remain unlawful. When Meta decides which ads you see, it also decides which content you do not see. This affects freedom of expression and information in society. There is a danger that behaviour-based marketing reinforces existing stereotypes or leads to unfair discrimination between different groups. Behaviour-based targeting of political advertisements is particularly problematic.
Medical data anonymisation for research: The Italian regulator fined a company for processing the health data of numerous patients collected from around 7000 general practitioners without adopting suitable anonymisation techniques. The GPs adhering to the international health research initiative had to add to their management system “Medico 2000” a function (a “data extractor” add-on) aimed at automatically anonymising patient data and transmitting them to the company. In fact, the tool only pseudonymised the patients’ data. There was also an erroneous attribution of the role of data controller to the GPs, and therefore no legal basis for the data processing by the company.
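The distinction at the heart of the decision – pseudonymisation versus anonymisation – is easy to illustrate. Replacing a patient ID with a stable token (eg, a salted hash) still links all of one patient’s records together, so the data remains personal data under the GDPR; only techniques that break that link, such as aggregation, move towards anonymisation. A sketch with made-up data (the salt and field names are illustrative, not from the case):

```python
import hashlib
from collections import Counter

patients = [
    {"id": "P001", "diagnosis": "asthma"},
    {"id": "P001", "diagnosis": "hypertension"},
    {"id": "P002", "diagnosis": "asthma"},
]

def pseudonymise(record, salt=b"illustrative-salt"):
    # A stable hash replaces the identifier, but every record of the
    # same patient carries the same token: the data is only
    # pseudonymised, hence still personal data under the GDPR.
    token = hashlib.sha256(salt + record["id"].encode()).hexdigest()[:12]
    return {"token": token, "diagnosis": record["diagnosis"]}

pseudo = [pseudonymise(r) for r in patients]
# Records of the same patient remain linkable.
assert pseudo[0]["token"] == pseudo[1]["token"]
assert pseudo[0]["token"] != pseudo[2]["token"]

# Aggregated counts drop the per-person link entirely — closer to
# (though not automatically sufficient for) anonymisation.
counts = Counter(r["diagnosis"] for r in patients)
assert counts == {"asthma": 2, "hypertension": 1}
```

Note that even aggregation is not automatically anonymisation: small counts over rare diagnoses can still single individuals out, which is why the regulator treats anonymisation as a property to be demonstrated, not assumed.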
Videoconferencing tool: The EDPS has found that the use of Cisco Webex videoconferencing and related services by the CJEU meets the data protection standards under Regulation 2018/1725 applicable to EU institutions, bodies, offices and agencies. However, the decision is not a general endorsement nor certification of data protection compliance of the videoconferencing services provided by any Cisco Webex entity.
With regard to technical safeguards, the court confirmed that support information is encrypted in transit, while case attachments are encrypted both in transit and at rest, in order to secure personal data from accidental loss and unauthorised access, use, alteration, and disclosure. The keys for encryption are managed by Cisco.
The court also took organisational measures to limit or avoid transfers of personal data outside the EU/EEA: should Cisco need remote access to the court’s Cisco Webex infrastructure, the court’s DPO, in collaboration with the court’s network engineers, shall analyse the possible risks for the data subjects and decide on the legitimacy of that access.
Ryanair facial recognition: Privacy advocacy group NOYB filed a complaint against Ryanair, alleging that the airline is violating customers’ data protection rights by using facial recognition to verify their identity when booking through online travel agents. The airline outsources this process to an external company named GetID. This means that customers have to entrust (by consenting to it) their biometric data to a company they have never heard of or had a contract with. Passengers can avoid it by showing up at the airport at least 2 hours before departure or submitting a form and a picture of their passport or national ID card in advance.
Alexa child accounts and geolocation: The US Federal Trade Commission will require Amazon to overhaul its deletion practices and implement stringent privacy safeguards to settle charges that the company violated the Children’s Online Privacy Protection Act and deceived parents and users of the Alexa voice assistant service about its data practices. Amazon had claimed it retained children’s voice recordings in order to help it respond to voice commands, allow parents to review them, and improve Alexa’s speech recognition algorithm.
Among many requirements, Amazon will have to implement a process to identify inactive Alexa child profiles. Following the identification of any inactive child profile, the company shall delete any personal information (voice recordings and geolocation information) within 90 days, unless the parent requests that it be retained. Misrepresenting the privacy policies related to geolocation and children’s voice information will also be prohibited.
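The retention requirement just described – delete an inactive child profile’s data within 90 days unless a parent asks otherwise – maps onto a simple retention check. A sketch with hypothetical field names (the FTC order does not prescribe any particular data model):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)

def must_delete(profile: dict, now: datetime) -> bool:
    """True if the child profile's personal information (voice
    recordings, geolocation) is due for deletion: the profile has
    been inactive for the retention period and no parent has
    requested that the data be kept."""
    if profile["parent_requested_retention"]:
        return False
    return now - profile["last_active"] >= RETENTION

now = datetime.now(timezone.utc)
stale = {"last_active": now - timedelta(days=120), "parent_requested_retention": False}
fresh = {"last_active": now - timedelta(days=10), "parent_requested_retention": False}
kept = {"last_active": now - timedelta(days=120), "parent_requested_retention": True}

assert must_delete(stale, now) is True   # inactive past 90 days -> delete
assert must_delete(fresh, now) is False  # still within the window
assert must_delete(kept, now) is False   # parental retention request wins
```

Such a check would typically run as a scheduled job, with the actual deletion covering every store holding the recordings and geolocation data, not just the profile record.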
Amazon Go shops: A recent class action against Amazon in New York over its cashier-less Amazon Go shops was voluntarily terminated for unspecified reasons. The complaint had claimed that Amazon acquired biometric data from customers in violation of the New York City Biometric Identifier Information statute. According to the complainant, Amazon scanned customers’ hands and illegally used technologies such as computer vision, deep learning algorithms, and sensor fusion to measure customers’ bodies, identify them, and monitor where they walked in the shop and what they purchased. The lawsuit demanded 500 dollars for each infraction of the legislation.
Worldcoin biometric verifications: Members of the public in selected locations worldwide are being encouraged to have their eyes scanned as part of a cryptocurrency initiative that aims to distinguish humans from AI systems via biometric verification. The Worldcoin protocol provides biometrically verified individuals with a digital identity in the form of a Worldcoin token, which promises to be the first crypto token issued globally and freely to people simply for being genuine individuals. Users also receive access to an app that allows them to make global payments, purchases, and transfers using digital and traditional currencies. The UK Information Commissioner’s Office commented on the situation:
- The organisation must conduct a data protection impact assessment before starting any processing that is likely to result in high risks, such as processing special category biometric data.
- Where they identify high risks that they cannot mitigate, they must consult the regulator.
- The organisation also needs to have a clear lawful basis to process personal data. Where they are relying on consent, this needs to be freely given and capable of being withdrawn without detriment.