
Data protection digest 31 July – 14 August 2023: privacy laws development, AI evaluations at school, and security of connected devices

In this issue: China is tightening controls on generative AI, India is finalising its comprehensive privacy law, and California is reviewing the data privacy practices of connected-vehicle manufacturers and related technologies.

Legal processes and redress

China privacy laws updates: The Cyberspace Administration of China has issued administrative measures for personal data compliance audits for public input. In the case of high-risk processing operations or security incidents, the department in charge of personal data protection (under the new PIPL legislation) may order an organisation to delegate the compliance audit to a professional institution. Otherwise, businesses can perform their audits themselves or entrust them to a recognised professional institution; however, the same institution may not perform more than three consecutive compliance audits for the same organisation. Companies that process the personal information of more than one million people must complete a compliance audit at least once a year.

China has considerably tightened controls on information sharing in recent years, particularly data transfers abroad, on the grounds of national security.

China generative AI: In parallel, China has passed innovative legislation to govern generative AI. The Interim Measures for the Management of Generative AI Services go into effect on 15 August. They apply to generative AI services offered to the public in China and hold firms accountable for the output of their platforms. The data used to train the systems will have to fulfil certain stringent conditions not addressed in previous legislation, Deacons lawyers clarify:

  • Providers of generative AI must take responsibility for network information security, personal data protection, and produced content quality. 
  • Service providers are liable for the created material and are obliged to ban and report unlawful and illegally linked information. 

Technology created in research institutes or destined for export will be excluded. 

Swiss privacy law revised: On 1 September, the revised federal data protection act comes into force; the current law remains in force until 31 August. Major innovations include criminal sanctions for breaches of obligations, a reinforced duty for data controllers to provide information to data subjects, data protection impact assessments for high-risk processing in both the public and private sectors, fees for private data processors, additional duties and powers for the regulator, and more.

India comprehensive privacy law: The Digital Personal Data Protection Bill 2023 was passed by parliament before receiving presidential assent. It will apply to the processing of digital personal data within India where such data is collected online, or collected offline and digitised. It will also apply to such processing outside India if it is for offering goods or services in India. Personal data may be processed only for a lawful purpose upon the consent of an individual. Consent may not be required for specified legitimate uses, such as the voluntary sharing of data by the individual or processing by the state. The main criticisms of the bill include:

  • The bill exempts data processing on grounds of national security which may lead to data collection, processing, and retention beyond what is necessary. 
  • The bill also does not grant the right to data portability and the right to be forgotten. 
  • The bill allows the transfer of personal data outside India, except to countries notified by the central government. This mechanism may not ensure adequate evaluation of data protection standards in certain countries.
  • The bill does not regulate risks of harm arising from the processing of personal data.

More analyses by the PRS Legislative Research Institute are available in the original publication.

Official guidance

Google Analytics: The use of tools like Google Analytics requires more than just lawful transfers to the United States (following the announcement of the US adequacy decision by the European Commission), states the Danish data protection authority. Beyond third-country transfers, there are a large number of GDPR requirements that must be complied with: among other things, you need to establish a legal basis for the processing, define data processing roles and conclude data sharing agreements, fulfil data subject rights, and much more.

Rights to data portability and restriction of processing: The wide range of digital services often leads to the desire or need to change service provider, so it is important to be aware of the right to data portability. However, the Latvian data protection agency reminds us that this option is available only if: a) the organisation processes the personal data on the basis of your consent or a concluded contract; b) the information was provided by the person themselves; and c) the data refers to the person requesting the transfer.

Similarly, a person may face a situation where they do not want personal data deleted, but do want its processing limited. This may arise when an organisation holds personal data that is either inaccurate or out of date. If a person believes their data is being processed illegally, they can likewise ask for its deletion or for the restriction of processing. There may also be cases where the company no longer needs your personal data, but you need them to keep it (eg, video surveillance records that a store normally deletes after a certain period of time but agrees to keep separately for police investigation needs).

Finally, you can always ask to limit the processing of your data if you doubt that the legitimate interests of the controller are more important than your right to data protection. 

Harmful online design: The UK Information Commissioner’s Office and Competition and Markets Authority are calling on businesses to stop using harmful website designs that can trick consumers into giving up more of their data than they would like. These include:

  • overly complicated privacy controls, 
  • default settings that give less control over personal information, and
  • bundling privacy choices together in ways that push consumers to share more data.

Where consumers lack effective control over how their data is collected and used, this can harm consumers and also weaken competition. Lack of consumer control over cookies is a common example of harmful design. 

Parental control and connected devices: The French data protection regulator CNIL has issued an opinion on decrees implementing parental control over means of access to the Internet. The decrees specify the functionalities that parental control tools will have to integrate on connected devices (smartphones, computers, video game consoles), such as blocking the download of applications and blocking access to content installed on terminals. Activation must be offered free of charge from the first commissioning of the device, and the tools must integrate the principles of personal data protection by design and by default. The CNIL has recommended two mandatory features, which could be activated according to the maturity of minors, to protect them when browsing the web:

  • blacklists to block access to sites or categories of sites previously determined by parents; and
  • whitelists to limit browsing to only previously authorized sites (for the youngest category). 
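The two modes recommended above amount to a simple filtering rule. A minimal sketch in Python of how such a filter could work (the function and domain names are illustrative, not taken from the CNIL opinion):

```python
from urllib.parse import urlparse

def is_allowed(url, blocklist=None, allowlist=None):
    """Decide whether a URL may be opened under parental control.

    Blacklist mode: block access to domains the parents have listed.
    Whitelist mode (for the youngest users): only listed domains pass.
    """
    host = urlparse(url).hostname or ""

    def matches(domain):
        # Match the domain itself and any of its subdomains.
        return host == domain or host.endswith("." + domain)

    if allowlist is not None:  # the stricter mode takes precedence
        return any(matches(d) for d in allowlist)
    if blocklist is not None:
        return not any(matches(d) for d in blocklist)
    return True

# Blacklist mode: everything passes except the listed sites.
print(is_allowed("https://example-games.com/play",
                 blocklist={"example-games.com"}))      # False
# Whitelist mode: only the listed sites pass.
print(is_allowed("https://kids.example.org/page",
                 allowlist={"kids.example.org"}))       # True
```

Matching subdomains as well as the bare domain matters in practice, since a blocked site is otherwise trivially reachable via `www.` or another subdomain.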

Enforcement decisions

TikTok in the EU: The EDPB has settled a dispute over TikTok’s processing of children’s data. The binding decision addresses objections to the draft decision of the Irish (lead) supervisory authority regarding the processing of personal data of registered minors, including those under 13 years old. The objections centred on whether there had been an infringement of data protection by design and by default in relation to age verification and other design practices. The binding decision may result in a fine and other reprimands for the social media giant, which will become known in the next few weeks.

AI at schools: In Canada, a case detailed by Osler’s lawyers considers the privacy of children in educational institutions when they are exposed to AI tools. In collaboration with a consulting firm, a school district developed an algorithm to identify students at high risk of dropping out: a machine learning model analyses hundreds of types of raw data from a student database to generate a set of predictive indicators. According to the investigation commission, the purpose limitation principle for such data processing was violated.

When the data was initially obtained, students and their parents were not informed and hence did not consent to its use to build predictive indicators of dropout risk. Even though the information was used for a purpose compatible with the school board’s goals of ensuring academic achievement, the regulator ordered the school to delete the tool’s existing output. It also requested that the school board conduct a privacy impact assessment before deploying the tool. More information on the case may be found in the original publication.

Police data leak: According to BBC News, the Police Service of Northern Ireland has apologised for inadvertently disclosing the personal information of all 10,000 of its personnel. In response to a Freedom of Information request asking for a breakdown of all employee levels and grades, the PSNI published not only a table indicating the number of personnel holding jobs such as constable, but also a spreadsheet containing the surnames, initials, locations and functions of over 10,000 officers and civilian staff.

Carbon copy and sensitive data: The UK Commissioner’s Office has reprimanded two Northern Irish organisations for disclosing people’s information inappropriately via email. Both the Patient and Client Council and the Executive Office disclosed personal details by using inappropriate group email options. In the first case, the organisation sent an email to 15 people, each of whom had lived experience of gender dysphoria, using the carbon copy (cc) option. The people who received the email could reasonably infer that the other recipients also had experience of gender dysphoria, given their inclusion in the email. In the second case, following the report of the historical institutional abuse inquiry, the organisation sent an e-newsletter to 251 subscribers using the ‘to’ field. People included in the email were likely to be victims and survivors, as the newsletter content was tailored to survivors who were wishing to engage, or who were already engaging with the compensation scheme.
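Both incidents stem from placing sensitive recipients in a visible ‘to’ or ‘cc’ field, so every recipient can see who else received the message. A minimal sketch (addresses hypothetical) of keeping a mailing list undisclosed is to put recipients only on the SMTP envelope, never in a visible header:

```python
from email.message import EmailMessage

def build_newsletter(sender, recipients, subject, body):
    """Build a message whose recipient list is not disclosed.

    Recipients are returned separately, to be passed as envelope
    addresses at send time; they never appear in a To/Cc header.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # or an "undisclosed-recipients:;" placeholder
    msg["Subject"] = subject
    msg.set_content(body)
    return msg, list(recipients)

msg, envelope = build_newsletter(
    "newsletter@example.org",                  # hypothetical addresses
    ["a@example.com", "b@example.com"],
    "Scheme update",
    "Hello subscribers",
)
assert "a@example.com" not in msg.as_string()  # no recipient is disclosed

# Sending sketch (not executed here):
# import smtplib
# with smtplib.SMTP("smtp.example.org") as s:
#     s.send_message(msg, to_addrs=envelope)
```

Passing the list via `to_addrs` keeps it out of the serialised message entirely, which avoids exactly the inference risk described in both reprimands.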

DDoS attack: The UK Information Commissioner also issued a reprimand to My Media World/Brand New Tube. An unauthorised third party gained access to its systems and exfiltrated the personal data of 345,000 UK data subjects. The company has been unable to determine the specific cause of the incident, concluding on separate occasions that a server misconfiguration and a DDoS attack were responsible for the access to its systems. The company also had no evidence of appropriate technical and organisational measures to protect users’ data. The data affected included users’ names, email addresses and passwords. The organisation must now ensure it has:

  • appropriate contracts in place with any third-party providers, setting out the roles and responsibilities of each party, 
  • maintained records of processing activities, and
  • regular scans and testing of its environment, with outcomes recorded and any issues addressed promptly.

More security best practices recommended to organisations by the ICO can be found in its published guidance.

Data security

Connected beacons: Connected tags, which have been around for several years, make it possible to locate and find the objects to which they are attached. While the technology is useful for finding lost objects, states the French data protection regulator, many media stories show that it can be misused to track people’s location without their knowledge: by default, only the owner can locate the beacon and therefore track its movements. However, manufacturers of connected beacons have put different measures in place to allow you to detect them in case of doubt.

If you have an iPhone, you will receive a notification when an AirTag you don’t own moves with you for a period of time; a feature then allows you to connect to the AirTag and make it ring. If you have the latest version of Android, you will automatically receive a notification when an AirTag separated from its owner moves with you for a while. If you do not have a smartphone, the AirTag will beep to signal its position once it has been away from its owner for a certain time.

The use of a connected beacon to follow a person without their consent is a criminal offence, punishable by one year’s imprisonment and a fine of 45,000 euros. More information on how to detect and disable the tags is in the original publication.

Big Tech

Meta compulsory fine: The Norwegian data protection authority has imposed a compulsory fine on Meta – approx. 90,000 euros per day. The background is that Meta does not comply with the Norwegian data protection authority’s ban on behaviour-based marketing on Facebook and Instagram. However, Meta has petitioned the Oslo district court for a temporary injunction against the ban. 

The ban does not prohibit personalised marketing on Facebook or Instagram as such. Meta can, for example, target marketing based on information that users enter on their profile, such as place of residence, gender and age, or interests that users themselves state that they want to see marketing about. The decision also does not prevent Meta from showing behaviour-based marketing to users who give valid consent to it.

Google user tracking: A US court denied Google’s request to dismiss a lawsuit alleging that the company violated the privacy of millions of individuals by secretly tracking their internet usage, Reuters reports. The plaintiffs claim that Google’s analytics, cookies, and applications allowed the Mountain View, California-based company to follow their activity even when they used Google’s Chrome browser in “Incognito” mode and other browsers in “private” mode. The case covers Google users since June 2016 and demands at least 5,000 dollars in damages for each user.

Connected vehicles: Finally, the California privacy protection agency announced a review of data privacy practices by connected vehicle manufacturers and related technologies. These vehicles are embedded with several features including location sharing, web-based entertainment, smartphone integration, and cameras. Data privacy considerations are critical because these vehicles often automatically gather consumers’ locations, personal preferences, and details about their daily lives. They’re able to collect a wealth of information via built-in apps, sensors, and cameras, which can monitor people both inside and near the vehicle. 

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
