Data protection & privacy digest 1 – 15 Dec 2022: draft US adequacy decision, Microsoft ‘data boundary’ for the EU, Age-appropriate design code

In this issue, you will find updates on the draft US adequacy decision, Standard Data Protection Model, HIPAA rules, multimedia boxes security, code of practice for app market, Microsoft ‘data boundary’ for the EU, Apple’s E2EE, and more.

Legal processes: draft US adequacy decision, EDPB’s binding decisions, draft AI Act

The European Commission issued a draft adequacy decision for the United States, concluding that the new US safeguards limiting intelligence agencies’ access to personal data are strong enough to address EU concerns about data transfers. Previously, personal data could be sent freely to the US under the Privacy Shield framework, but that framework was invalidated by the CJEU in the Schrems II judgment. Earlier this year, after negotiations with the European Commission, US President Joe Biden announced the new EU-US Data Privacy Framework and signed an executive order implementing it to address the CJEU’s findings.

The Commission will now submit the draft US adequacy decision to the European Data Protection Board, (EDPB), which will give its opinion on whether privacy is adequately safeguarded. The European Parliament will also scrutinise the decision. The Commission must then obtain the approval of a committee composed of representatives of the EU member states before formally adopting the new mechanism, (probably in the first half of 2023). The decision will come into force once the US has fully implemented the new safeguards. Finally, individuals will then be able to challenge the decision via national and European courts. It is worth noting that:

The EU General Court upheld the EDPB’s role and authority to arrive at a collective decision under the GDPR’s consistency mechanism. The court held that the action for annulment brought by WhatsApp Ireland against the EDPB binding decision was inadmissible. That binding decision led to a 225 million euro fine from Ireland’s Data Protection Commission, (DPC). It is now up to the Irish courts to review the legality of the regulator’s final decision. In 2021, the EDPB had resolved a dispute over a draft DPC decision concerning WhatsApp Ireland’s GDPR transparency obligations towards users and non-users of the service.

The European Council has adopted its common position on the Artificial Intelligence Act ahead of official negotiations with the Parliament. It aims to ensure that AI systems placed on the EU market and used in the Union are safe and respect existing law, including data protection rules. Since AI systems are developed and distributed through complex value chains, the text includes changes clarifying the allocation of responsibilities and roles of the various actors in those chains, particularly providers and users of AI systems. Several new provisions have been added:

  • situations where AI systems can be used for many different purposes, (general-purpose AI), and where such a system is subsequently integrated into another high-risk system;
  • in such cases, consultations and detailed impact assessments considering the specific characteristics of general-purpose AI systems and related value chains would be applicable;
  • obligation for users of an emotion recognition system to inform natural persons when they are being exposed to such a system;
  • prohibition on the use of AI for social scoring by private actors;
  • some exclusions for national security, research, and development. 

Certain users of high-risk AI systems that are public entities will also be obliged to register in the EU database for such systems. The future AI act provides for penalties, with proportionate caps on administrative fines for SMEs and start-ups, and a new complaint mechanism.

Official guidance: standard data protection model, use of cookies, wrongful credit information, age-appropriate design code, HIPAA rules

The German Federal data protection commissioner updated the Standard Data Protection Model, (SDM), which provides suitable mechanisms for translating the legal requirements of the EU GDPR into technical and organisational measures. In particular, the new SDM first records the legal requirements of the GDPR and then assigns them to the protection goals of data minimisation, availability, integrity, confidentiality, transparency, and more, alongside the associated risk assessment. You can read the new SDM 3.0 version here.

The Croatian data protection authority AZOP issued a reminder on the use of cookies. Although the e-Privacy Directive stipulates the need for voluntary and informed consent to store or access cookies, the practical application of legal requirements differs in EU member states. Currently, observed implementations are based on one or more of the following practices:

  • an immediately visible notification that the website uses various types of cookies or similar technologies, with layered information, usually a link or a series of links, where the user can learn more about the cookies used,
  • information on how users can indicate and later withdraw their preferences regarding cookies, including information about the action required to express such a preference,
  • the mechanism by which the user can decide to accept all cookies, accept only some, or refuse them, (a minimal sketch follows this list),
  • the possibility for the user to subsequently change the previous preference.
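
Purely by way of illustration, the accept/refuse/withdraw mechanism described above could be wired up roughly as follows. This is a minimal browser-side sketch under our own assumptions, not AZOP guidance; the category names, storage key and analytics URL are hypothetical.

```typescript
// Illustrative consent store for non-essential cookies/scripts (hypothetical names).
type ConsentCategory = "necessary" | "analytics" | "marketing";

interface ConsentState {
  given: Partial<Record<ConsentCategory, boolean>>;
  timestamp: string;
}

const CONSENT_KEY = "cookie-consent";

// Read the visitor's current choice, if any.
function getConsent(): ConsentState | null {
  const raw = localStorage.getItem(CONSENT_KEY);
  return raw ? (JSON.parse(raw) as ConsentState) : null;
}

// Record an explicit accept/refuse decision per category.
function setConsent(given: Partial<Record<ConsentCategory, boolean>>): void {
  const state: ConsentState = { given, timestamp: new Date().toISOString() };
  localStorage.setItem(CONSENT_KEY, JSON.stringify(state));
}

// Withdrawing must be as easy as consenting: forget the choice and stop
// setting non-essential cookies until the visitor decides again.
function withdrawConsent(): void {
  localStorage.removeItem(CONSENT_KEY);
}

// Gate non-essential cookies and scripts on a positive, recorded choice.
function mayUse(category: ConsentCategory): boolean {
  if (category === "necessary") return true; // strictly necessary cookies are exempt
  return getConsent()?.given[category] === true;
}

// Example: only load an analytics script after explicit consent.
if (mayUse("analytics")) {
  const s = document.createElement("script");
  s.src = "https://example.com/analytics.js"; // hypothetical URL
  document.head.appendChild(s);
}
```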

However, some cookies can still be exempted from informed consent under certain conditions, and only if they are not used for additional purposes:

  • cookies for user input, (session ID), for the duration of the session, or persistent cookies limited in some cases to a few hours,
  • authentication cookies, used for authenticated services, for the duration of the session,
  • user-centric security cookies, used to detect authentication abuse, for a limited persistent duration,
  • multimedia content session cookies, (such as flash players), for the duration of the session,
  • load-balancing session cookies, for the duration of the session,
  • cookies for customising the user interface, for the duration of the session, (or a little longer),
  • social network/third-party content-sharing cookies, for logged-in members of those networks.

Finally, third-party marketing cookies cannot be exempted from consent, including for operational purposes related to third-party advertising, such as frequency limiting, financial records, ad matching, click fraud detection, research and market analysis, product improvement, and troubleshooting.

The Latvian data protection authority DVI explained what to do if, as a result of illegal activities, information is included in the database of a credit bureau. In the specific case, the regulator was approached by a person who was refused a loan for the purchase of a home on the basis that the credit information bureau’s database contained information about her outstanding debts: loans she had never applied for.

  • If a person finds that a database contains information about debts that they did not undertake, they can ask the creditor to limit the processing of data, including the transfer of this data to the credit information bureau. 
  • In practice, the restriction means that debt data will not be deleted, but it will also not be made available to other persons.
  • The person must attach to the request evidence that they have tried to resolve the matter on the merits, for example, that criminal proceedings have been initiated.
  • Upon receiving a person’s request, the lender must assess whether it is justified.
  • Until the validity of the loan has been examined, the person can request an interim arrangement from the lenders, with a corresponding note made in the database.

The Future of Privacy Forum released a brief comparing the California and UK Age-appropriate design codes. The California code of practice is a first-of-its-kind privacy-by-design law in the US, set to become enforceable on 1 July 2024. It was modeled on the UK’s version and represents a significant change in the regulation of the technology industry and in how children will experience online products and services. It follows the 15 standards laid down in the UK code, including the “best interests of the child” standard, age assurance, default settings, parental controls, enforcement, and data protection impact assessments. The UK ICO has also published design tests to support designers of products or services that are likely to be accessed by children or young people.

The US Department of Health and Human Services highlighted the obligations that the Health Insurance Portability and Accountability Act, (HIPAA), imposes on covered entities and business associates using online tracking technologies, (Google Analytics, Meta Pixel), on webpages and apps, with or without user authentication. Some entities regularly share electronic protected health information, (PHI), with online tracking technology vendors, and some may be doing so in a manner that violates the HIPAA rules. For instance, (a minimal sketch of this logic follows the list below):

  • The HIPAA rules do not permit disclosures of PHI to a tracking technology vendor based solely on the regulated entity informing individuals, in its privacy policy, notice, or terms and conditions of use, that it plans to make such disclosures.
  • Regulated entities must ensure that all tracking technology vendors have signed a Business Associate agreement and that there is applicable permission before the disclosure of PHI.
  • If the vendor is not a business associate of the regulated entity, then the individuals’ HIPAA-compliant authorisations are required before the PHI is disclosed to the vendor.  
  • Website banners that ask users to accept or reject a website’s use of tracking technologies, such as cookies, do not constitute a valid HIPAA authorisation. Read the full guidance here.
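
For illustration only, the conditions above can be read as a simple gating rule before any tracking snippet is injected on a page that may expose PHI. This is a sketch of the logic as we read the guidance, not an official HHS implementation; the type names and fields are hypothetical, and real compliance decisions belong with counsel.

```typescript
// Hypothetical model of the decision described in the HHS guidance.
interface TrackingVendor {
  name: string;
  hasSignedBAA: boolean; // business associate agreement in place
}

interface PageContext {
  mayExposePHI: boolean; // eg authenticated patient-portal pages
  privacyRulePermissionApplies: boolean; // an applicable Privacy Rule permission for the disclosure
  hipaaAuthorizationOnFile: boolean; // a valid HIPAA authorisation; a cookie-banner click is not one
}

function mayLoadTracker(vendor: TrackingVendor, page: PageContext): boolean {
  if (!page.mayExposePHI) return true; // no PHI involved in this page view
  // PHI could be disclosed: require a business associate relationship plus an
  // applicable permission, or an individual HIPAA authorisation.
  if (vendor.hasSignedBAA && page.privacyRulePermissionApplies) return true;
  return page.hipaaAuthorizationOnFile;
}

// Usage: do not inject the pixel/analytics snippet unless the check passes.
const vendor: TrackingVendor = { name: "ExampleAnalytics", hasSignedBAA: false };
const page: PageContext = {
  mayExposePHI: true,
  privacyRulePermissionApplies: false,
  hipaaAuthorizationOnFile: false,
};
if (mayLoadTracker(vendor, page)) {
  // ...inject the vendor's snippet here...
}
```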

Investigations and enforcement actions: census data, diligence in choosing the subcontractor, social audio app, employee’s health data, multimedia boxes security, WC area surveillance

Portugal’s regulator the CNPD concluded that the National Institute of Statistics committed five administrative offences, for violations of the GDPR, within the scope of the 2021 census operation, and imposed a fine of 4.3 million euros. The CNPD decided that the organisation processed personal data relating to health and religion unlawfully. It failed to fulfil its duty to inform respondents to the census questionnaire, violated its duty of diligence in choosing a subcontractor, infringed the legal provisions on international data transfers, and failed to comply with the obligation to carry out a DPIA for the census operation. In particular, the chosen subcontractor, (Cloudflare, Inc), was contracted as a US-based company under the jurisdiction of the California courts, despite the existence of a company office in Lisbon. The arrangement allowed personal data to transit through any of the company’s roughly 200 servers located outside the European Economic Area. And while the contract contained the standard contractual clauses approved by the European Commission for transfers of personal data to the US, it provided for no additional measures to prevent access to the data by third-country government entities, as required by the CJEU’s Schrems II judgment.

The Finnish data protection authority imposed an administrative fine of 230,000 euros on Viking Line for violations related to the processing of employees’ health data. A former employee complained that he had not received all the personal information requested, which was stored in the company’s systems. The regulator found out that:

  • Viking Line had stored his health information in the personnel management system for 20 years. 
  • Among other things, this included diagnosis information in connection with information about sickness leave. 
  • Some of the stored diagnosis information was incorrect, as it was not possible to enter all existing diagnosis codes into the system. 
  • Storing diagnosis information together with other information related to the employment relationship was against the law.

The French regulator CNIL imposed a penalty of 300,000 euros on the telecoms company FREE, in particular for not having respected the rights of individuals and the security of its users’ data. Checks revealed several shortcomings regarding the rights of the persons concerned, (right of access and right of erasure), and data security, (weak passwords, storage and transmission of passwords in plain text), as well as the recirculation of poorly refurbished “Freebox” multimedia boxes. The technical and organisational measures of the reconditioning process did not prevent around 4,100 Freeboxes previously held by former subscribers from being reallocated to new customers without the data stored on them, such as photos, home videos, or recorded television programmes, having been properly deleted.
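
As a generic illustration of the plain-text password point, and nothing more, storage along the following lines avoids keeping or sending passwords in the clear. This is a minimal sketch using Node’s built-in crypto module; it is unrelated to Free’s actual systems.

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

// Store a salted, slow hash of the password, never the plain text.
function hashPassword(password: string): string {
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 64);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

// Recompute the hash with the stored salt and compare in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const hash = scryptSync(password, Buffer.from(saltHex, "hex"), 64);
  return timingSafeEqual(hash, Buffer.from(hashHex, "hex"));
}

// Usage: persist only the output of hashPassword(); check logins with verifyPassword().
const record = hashPassword("correct horse battery staple");
console.log(verifyPassword("correct horse battery staple", record)); // true
```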

Finally, the Danish data protection agency has reported Danske Shoppingcentre P/S to the police for not having sufficiently restricted TV surveillance in at least one toilet area in a shopping centre. The regulator has recommended a fine of 47,000 euros. Danske Shoppingcentre explained that there had been problems with, among other things, vandalism in the toilets, and that it had therefore set up TV surveillance to prevent vandalism and theft as well as to ensure security for customers. The company had a technical solution with a black marking on the camera to mask the urinal. However, it did not provide sufficient masking, contrary to the principle of data minimisation.

Data security: code of practice for app market, risk-based audit, phishing infographic, EU healthcare sector resilience

The UK ICO has completed a voluntary, risk-based audit of the processing of personal data at the Rowan Learning Trust, (a school-to-school support organisation). The key elements were a desk-based review of selected policies and procedures, remote interviews with selected staff, and a virtual review of evidential documentation. The audit revealed that:

  • Data protection compliance is currently not discussed routinely in any local groups or at the board level across the trust. 
  • Compliance information is not reported to senior management. 
  • The trust should also implement a new data protection policy with supporting documentation and ensure that staff are aware of and understand the contents.
  • There is currently no mandatory data protection training in place for the staff. 
  • The trust does not have a Record of Processing Activity document. 
  • There is currently no oversight of records management, and no operational responsibility has been assigned.
  • The trust has not conducted an information audit, so does not have an understanding of all of the information that is held and how it flows across the trust.
  • There are currently no compliance checks carried out across the trust to ensure that physical and electronic records are destroyed in line with their retention periods.

The UK government has published a voluntary Code of Practice to strengthen consumer protections across the app market. The government will work with the biggest operators and developers to support them in implementing the voluntary code over a nine-month period. Under the code, app store operators and developers will need to:

  • share security and privacy information in a user-friendly way with consumers, (eg, when an app or its updates will be made unavailable on an app store, and where users’ data is located);
  • allow their apps to work even if a user chooses to disable optional permissions, such as preventing the app from accessing a microphone or the user’s location, (a minimal sketch follows this list);
  • provide clear feedback to developers when an app is not published for security or privacy reasons;
  • have a vulnerability disclosure process in place, so software flaws can be reported and resolved without being made publicly known for malicious actors to exploit;
  • ensure developers keep their apps up to date to reduce the number of security vulnerabilities in apps.
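
On the optional-permissions point above, graceful degradation can be as simple as falling back to another input when a permission is refused. Below is a minimal browser-side sketch using the standard Geolocation API; the helper functions are hypothetical stand-ins for real UI code.

```typescript
// The feature keeps working whether or not the user grants location access.
function showNearbyStores(): void {
  if (!("geolocation" in navigator)) {
    promptForPostcode(); // API unavailable: fall back to manual entry
    return;
  }
  navigator.geolocation.getCurrentPosition(
    (pos) => renderStoresAround(pos.coords.latitude, pos.coords.longitude),
    () => promptForPostcode() // permission denied: degrade, do not break
  );
}

// Hypothetical helpers standing in for real UI code.
function promptForPostcode(): void {
  console.log("Location unavailable – please enter a postcode instead.");
}
function renderStoresAround(lat: number, lon: number): void {
  console.log(`Showing stores near ${lat.toFixed(3)}, ${lon.toFixed(3)}`);
}
```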

America’s CISA published a Phishing Infographic to help protect both organisations and individuals from successful phishing operations. This infographic provides a visual summary of how threat actors execute successful phishing operations. Details include metrics that compare the likelihood of certain types of “bait” and how commonly each bait type succeeds in tricking the targeted individual. The infographic also provides detailed actions organisations and individuals can take to prevent successful phishing operations—from blocking phishing attempts to teaching individuals how to report successful phishing operations. 

The European Union Agency for Cybersecurity released the after-action report of the 2022 edition of Cyber Europe, the cybersecurity exercise testing the resilience of the European healthcare sector. It featured a disinformation campaign of manipulated laboratory results and a cyber attack targeting European hospital networks. The scenario provided for the attack to develop into an EU-wide cyber crisis, with the imminent threat of personal medical data being released and another campaign designed to discredit an implantable medical device with claims about a vulnerability.

Big Tech: Microsoft ‘data boundary’ for the EU, Apple’s end-to-end encryption, Amazon buying customer data

Microsoft says its EU cloud customers will be able to process and store their data in the region from January. The ‘data boundary’ will apply to all of its core cloud services – Azure, Microsoft 365, Dynamics 365 and the Power BI platform. For many companies, data storage has become so large and distributed across so many countries that it is difficult for them to understand where their data resides and whether its handling complies with the GDPR. The latest criticism of Microsoft 365 cloud services came from the German data protection regulators, while the French ministry of national education has urged schools in the country to stop using free versions of Microsoft 365, (and Google Workspace), amid privacy concerns.

In the meantime, Apple unveiled a range of security and privacy enhancements. Users will be given the option to encrypt more of the data they back up to their iCloud using end-to-end encryption. The encryption key, or the code used to gain access to that secure data, will be stored on the device. That means that if a user who opts into this protection loses access to their account, they will be responsible for using their key to regain that access – Apple will no longer store the encryption keys in iCloud. The change will not apply to all data – email, contacts, and calendar entries will not be encrypted. Users will have to voluntarily opt into the feature. 
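
As a purely conceptual sketch of what device-held keys mean in practice, the pattern is that data is encrypted locally and only ciphertext ever reaches the cloud, so the provider cannot read it and cannot recover it for a user who loses the key. The example below uses the standard WebCrypto API; it is not Apple’s implementation, and the upload helper is hypothetical.

```typescript
// Generate a key that never leaves the device, encrypt locally, upload ciphertext only.
async function backupEncrypted(plain: Uint8Array): Promise<CryptoKey> {
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false, // not extractable: the raw key cannot be exported off this device
    ["encrypt", "decrypt"]
  );
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plain);

  await uploadToCloud({ iv, ciphertext: new Uint8Array(ciphertext) }); // server sees ciphertext only
  return key; // losing the device (and the key) means losing access to the backup
}

// Hypothetical stand-in for the cloud upload.
async function uploadToCloud(payload: { iv: Uint8Array; ciphertext: Uint8Array }): Promise<void> {
  console.log(`Uploaded ${payload.ciphertext.byteLength} encrypted bytes`);
}
```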

Finally, some Amazon users will now earn 2 dollars per month for agreeing to share their traffic data with the retail giant, Business Insider reports. As part of the company’s new invite-only Ad Verification program, Amazon keeps track of which advertisements participants viewed, where they saw them, and at what time of day. Both Amazon’s own advertisements and those on third-party platforms fall under the program. Only customers who were invited to participate in the program are eligible for the reward; however, those who were not invited can join a waiting list.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
