TechGDPR’s review of international data-related stories from press and analytical reports.
‘Machine learning is no excuse to break the law’: The US Federal Trade Commission alleged that Amazon (maker of the Alexa voice assistant) kept kids’ data indefinitely to further refine its voice recognition algorithm. If the proposed order is approved by the federal court, on top of a multimillion-dollar fine, Amazon will have to delete inactive child accounts and certain voice recordings and geolocation information, and will be prohibited from using such data to train its algorithms. Reportedly, Amazon is not alone in seeking to amass data to refine its machine-learning models.
Similarly, the FTC proposed enforcement action against Amazon’s subsidiary Ring. The allegations say the company compromised its customers’ privacy by allowing any employee or contractor to access consumers’ private videos without consent, to train algorithms among other purposes, and by failing to implement security safeguards.
China SCCs: On 1 June, China’s new Standard Contractual Clauses for the cross-border transfer of personal data went into force. Entities using the SCCs must meet two requirements: a) the data exporter must perform a data transfer impact assessment, and b) the data exporter must sign SCC-compliant agreements with overseas recipients of the data. In contrast to the EU SCCs, the Chinese SCCs do not distinguish between an exporter or recipient acting as a controller or a processor. Depending on the transfer, organisations may instead be required to undergo a security assessment by the Cyberspace regulator or obtain certification from recognised institutions. Read more analysis by connectontech.com.
Montana’s new privacy law and TikTok ban: Montana became the first US state to ban the use of TikTok and to prohibit mobile application stores from offering the Chinese app within the state by next year. The ban covers state networks, and also prohibits third-party firms conducting business for or on behalf of the state from using applications with ties to foreign adversaries. The state would fine any entity (an app store or TikTok itself) 10,000 dollars per day each time it “offers the ability” to access the platform or download the app. How these prohibitions will be implemented, though, is still unclear.
Montana’s Governor also signed a new Consumer Data Privacy Act, joining California, Colorado, Connecticut, Indiana, Iowa, Tennessee, Utah, and Virginia, which have already enacted comprehensive consumer privacy laws. The law is scheduled to take effect in October 2024.
Health care data: The US Federal Trade Commission is modernising the Health Breach Notification Rule, clarifying the rule’s applicability to health apps and similar technologies, many of which aren’t covered by HIPAA. Changes will be made to the terms “identifiable health information,” “breach of security,” “health care provider,” and “health care services or supplies,” as well as the information that must be included in the consumer notice, and more. In parallel, to bridge the gap between HIPAA safeguards and health data obtained outside conventional medical settings, Washington State enhanced protections for consumers’ identifiable health information by passing the “My Health My Data Act”.
Generative AI: The US Congressional Research Service published a paper on generative AI and data privacy. The term “general-purpose AI” (GPAI) has recently been adopted by academics and policymakers to refer to software programs like ChatGPT that can perform a variety of tasks. Large language models (LLMs), which can detect, predict, translate, summarise, and produce language, are the foundation for many general-purpose AI applications. Duolingo, Snapchat, and other companies have partnered with OpenAI to deploy ChatGPT in their services. However, individuals may not know their data was used to train models that are monetised and deployed across such applications.
SAR guidance: The UK Information Commissioner’s Office has published new guidance for businesses and employers on responding to Subject Access Requests. Individuals can request the personal information held by their employer, or former employer, such as details of their attendance and sickness records, personal development or HR records. This includes where the employer obtained the information, what it is being used for, and who it is shared with.
Organisations must respond to a SAR from a worker without delay, and within one month of receipt of the request. However, the time limit can be extended by up to two months if the SAR is complex or if the worker has sent a number of requests. At the same time, the UK GDPR does not set out formal requirements for a valid request, so a worker can make a SAR verbally or in writing, including via social media.
Right to object and right to erasure: The EDPB summarises the right to object in connection to the right to be forgotten in complaints from data subjects. Requests to stop processing personal data for marketing purposes and to delete already gathered data are frequently linked. Most of the cases show deficiencies in the internal procedure adopted to deal with such requests, including the accuracy of the procedure and internal communication, the timeframe for processing requests, and the accountability of the system for receiving/tracking complaints.
Workforce monitoring: Employers tend to monitor employees’ work performance, keeping track of the duration and frequency of their work, but also their location and other indicators. By default, the systematic monitoring of employees using automated means (cameras, apps) is considered a non-standard solution, states the Latvian data protection authority. It can only be used for short-term employee monitoring, and only if less privacy-intrusive means would not achieve the goal. Such processing must be clearly agreed upon in advance and must be understandable to both parties. Otherwise, it can undermine mutual trust with the employee and may even contribute to a decline in the quality of work.
Meta/Facebook enforcement: The largest GDPR fine to date, of 1.2 billion euros, has been issued by the Irish data protection authority against Meta Ireland. Following the “Schrems II” ruling, Meta carried out data transfers to the US on the basis of the Standard Contractual Clauses in conjunction with additional measures. However, these measures did not prevent fundamental risks to data subjects in view of US state surveillance practices.
Meta must now bring already transferred personal data into compliance and stop other unlawful processing within the next few months. The decision may have similar implications for any digital service provider subject to US surveillance laws and relying on the EU Standard Contractual Clauses, at least until the problems are resolved by the Commission’s adoption of the upcoming EU-US Data Privacy Framework.
Charity organisation: The ICO completed an audit of Age UK Wiltshire (AUKW), a charitable and voluntary sector organisation. AUKW requested the audit in January and submitted an audit questionnaire detailing its data protection compliance concerns. The audit identified the following main areas for improvement:
- Review and update existing data protection policies and create new policies covering records management, data sharing, DPIA, and information security.
- Ensure that data protection training is mandatory for all staff, including annual refreshers and specialised seminars.
- Complete an information audit to help the organisation have an understanding of all of the information that is held and its flows.
- Create an Information Asset Register (IAR) to record the information assets identified by the information audit, and ensure that the IAR is periodically reviewed.
- Review and update the current subject access request (SAR) procedures and policy, including the completion of identity checks, and ensure they are communicated to staff.
- Create and maintain a SAR log as a documented record of all completed and ongoing SARs.
Video surveillance: The Italian privacy regulator ‘Garante’ imposed a 50,000 euro fine on a clothing company with over 160 stores for having installed video surveillance systems in various company outlets. The company had justified the systems by the need to defend against theft, ensure the safety of employees and corporate assets, and prevent unauthorised access. The investigation showed that all the shops were equipped with at least three video cameras, active 24 hours a day, 7 days a week, in the areas reserved for workers and suppliers; larger outlets had up to 27. The fine took into account the significant number of employees involved (over 500) and points of sale, as well as the absence (or violation) of authorisation or agreement with the trade union representatives.
Tax data: The Belgian data protection authority decided to prohibit the transfer of data on Belgian “Accidental Americans” by the Belgian Federal Public Service Finance to the US tax authorities under the intergovernmental FATCA agreement. According to the regulator, the data processing carried out under this agreement does not comply with all the principles of the GDPR, including the rules on data transfers outside the EU. The regulator also ordered the public service to inform data subjects, in a complete and accessible manner, of the data processing carried out as part of the FATCA agreement and of its modalities, and to carry out a DPIA.
Automated rejection of credit card applications: Berlin’s supervisory authority imposed a 300,000 euro fine on a bank for a lack of transparency over the automated rejection of credit card applications, according to the EDPB summary. A Berlin-based bank offered a credit card on its website; using an online form, it requested various data about the applicant’s income, occupation and personal details. Based on the information requested and additional data from external sources, the bank’s algorithm rejected the application without any particular justification. When asked by the complainant, the bank provided only blanket information about the scoring procedure, detached from the individual case, and refused to tell him why it assumed poor creditworthiness in his case.
Biometric ID checks: The Mobile World Congress organiser received a 200,000 euro fine in Spain for inadequate biometric ID checks at the 2021 event. For the “in-person” option, the organiser required a complainant to upload passport details, including photographs, which were transferred to a service provider in a third country for facial recognition security purposes. However, the stated legal basis for this varied from consent to legal obligation across different notices. Moreover, neither the privacy policies nor the email communications provided clear information on data transfers to a third country. Additionally, the organiser’s DPIA failed to assess the risks or the proportionality and necessity of the system implemented (called BREEZZ).
Doctissimo fine: Following a complaint by the Privacy International association, the French privacy regulator fined the doctissimo.fr website 380,000 euros. The site mainly offers articles, tests, quizzes and discussions related to health and well-being for the general public. The regulator noted infringements concerning the duration of data retention, the collection of health data via online tests, the security of data, and the way cookies were deposited on users’ terminals. Additionally, the company processed personal data jointly with other entities, in particular for the marketing of advertising space on the website; these joint-controllership relationships were not governed by any contract.
Google Analytics: The Finnish data protection commissioner has issued a notice to the meteorological institute about the transfer of personal data to the US via website tracking technologies. The institute had not defined or applied the legal basis for the transfer of data in the use of reCAPTCHA and Google Analytics services. Nor had it suspended data transfers without delay after the CJEU’s “Schrems II” decision, even though it no longer had a valid basis. The institute has taken steps to remove the tools and services from its website. The order also includes the deletion of data that had been transferred illegally to the US.
Mobile device management: Mobile devices make it easier for employees to do their jobs from home, at the workplace, or on the road. To reduce an organisation’s risk profile, it is critical to manage security and device health. The US NIST explains the benefits of Mobile Device Management: an employee’s personal or corporate-owned device can be enrolled into an MDM solution to apply enterprise configurations, manage enterprise applications, and enforce compliance. To learn more about how to use standards-based, commercially available products to meet security and privacy needs, you can download the latest NIST guidance here and here.
De-identification: The Government of Canada has published instructions on de-identification as a privacy-preserving technique. Although pseudonymisation is a step toward anonymisation, it still permits re-identification. The acceptable risk level must be determined based on the context, and it is always preferable that privacy experts work together with data specialists. Some activities increase the risk of re-identification, such as integrating datasets or data matching, so it is important to continually assess privacy and re-identification risks, even after applying privacy safeguards.
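The re-identification risk described above can be illustrated with a minimal, hypothetical sketch (all names, keys, and records below are invented for illustration, not taken from the Canadian guidance): pseudonymising a direct identifier with a keyed hash removes it from the dataset, yet matching the remaining quasi-identifiers against an auxiliary dataset can still link a record back to a person.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice it would be stored separately
# from the pseudonymised data and rotated.
SECRET_KEY = b"store-me-separately-and-rotate"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# Invented source record containing a direct identifier (email),
# quasi-identifiers (postcode, birth year) and sensitive data (diagnosis).
records = [
    {"email": "a.smith@example.com", "postcode": "K1A 0B1",
     "birth_year": 1980, "diagnosis": "X"},
]

# Pseudonymisation: drop the direct identifier, keep a keyed token.
pseudonymised = [
    {"id": pseudonymise(r["email"]), "postcode": r["postcode"],
     "birth_year": r["birth_year"], "diagnosis": r["diagnosis"]}
    for r in records
]
assert "email" not in pseudonymised[0]  # direct identifier is gone

# Data matching: an invented auxiliary dataset (e.g. a public registry)
# shares the quasi-identifiers, so the record can be re-linked to a person.
auxiliary = [{"name": "A. Smith", "postcode": "K1A 0B1", "birth_year": 1980}]
reidentified = [
    (aux["name"], rec["diagnosis"])
    for rec in pseudonymised
    for aux in auxiliary
    if (rec["postcode"], rec["birth_year"]) == (aux["postcode"], aux["birth_year"])
]
print(reidentified)  # → [('A. Smith', 'X')]
```

This is why the guidance stresses assessing re-identification risk continually: the pseudonymised table alone looks safe, but its risk depends on what other datasets it can be matched against.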
NHS data sharing: According to the Guardian, NHS trusts are sharing sensitive data about patients’ health conditions, medical appointments, and treatments with Facebook without their knowledge and despite promises never to do so. An Observer investigation revealed a monitoring feature (Meta Pixel) on the websites of 20 NHS trusts that has been collecting medical and patients’ browsing data for years and sharing it with the tech giant. The information contains specific details such as pages viewed, buttons pressed, and keywords searched, matched to the user’s IP address. This included patients who visited hundreds of NHS webpages about HIV, self-harm, gender identity services, sexual health, cancer, children’s treatment and more.
The Privacy Sandbox: Google announced the next stages of the Privacy Sandbox: general availability and support for scaled testing. In Q1 2024, it plans to deprecate third-party cookies for one per cent of Chrome users, allowing developers to conduct real-world experiments that assess the readiness and effectiveness of their products without third-party cookies. This will be preceded, in Q4 2023, by the ability for developers to simulate Chrome third-party cookie deprecation for a configurable percentage of their users.