TechGDPR’s review of international data-related stories from press and analytical reports.
Legal processes and redress
EU Health Data Space: EU legislators are actively working on safeguards for the upcoming European Health Data Space. This includes promoting patients’ understanding and control of their personal health data. The latest amendments look at the main characteristics of electronic health data categories: patient summary, electronic prescription, electronic dispensation, medical image and image report, laboratory result, and discharge report. Under the Commission’s proposal, researchers, companies, and institutions will require a permit from a health data access body, to be set up in all member states. Access will only be granted for the use of de-identified data in approved research projects, carried out in closed, secure environments, the Sciencebusiness.com publication sums up.
Iowa privacy legislation: Iowa enacted its new comprehensive privacy law, making it the sixth US state to do so after California, Virginia, Colorado, Utah, and Connecticut. It will take effect in 2025. The law applies to anyone conducting business in Iowa, or producing goods or services targeted at Iowans, who meets one of the following thresholds: processing the personal data of at least 100,000 consumers; or processing the personal data of at least 25,000 consumers while deriving more than 50% of gross revenue from its sale. The law does not apply to financial institutions, nonprofit organizations, institutions of higher education, information bearing on consumers’ creditworthiness, various research data, protected health information, and more.
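The two applicability thresholds above can be expressed as a simple predicate. This is an illustrative sketch only (the function name and parameters are our own, and it is not legal advice):

```python
def iowa_law_applies(consumers_processed: int,
                     revenue_share_from_data_sales: float) -> bool:
    """Illustrative sketch of the Iowa thresholds described above.

    The law applies to an entity doing business in Iowa (or targeting
    Iowans) that either:
      - processes the personal data of at least 100,000 consumers, or
      - processes the personal data of at least 25,000 consumers AND
        derives more than 50% of gross revenue from its sale.
    """
    high_volume = consumers_processed >= 100_000
    data_seller = (consumers_processed >= 25_000
                   and revenue_share_from_data_sales > 0.5)
    return high_volume or data_seller
```

Note that the exemptions listed above (financial institutions, nonprofits, and so on) would still need to be checked separately.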
Utah minors protection: Utah enacted two laws to limit children’s access to social media, making it the first US state to demand parental consent before children can use Instagram and TikTok. It also makes suing social media companies for damages simpler. To date, US lawmakers have had difficulty enacting stricter federal laws governing online child safety. Under Section 230 of the US Communications Decency Act, online service providers are largely shielded from liability for content posted by their users.
Nor are online service providers required by federal statute to use any particular method of age verification. As a result, some impose minimum age restrictions and ask users to enter their birthdate or age before granting access to content, typically stating these restrictions in their terms of service. Under the Utah legislation, all users must complete age verification before creating a social media account, and minors under 18 must have parental or guardian consent.
AI white paper: Principles including safety, transparency, fairness, contestability, and redress will guide the use of AI in the UK, as part of a new pro-innovation national blueprint. Reportedly, Britain has more businesses offering AI goods and services than any other European nation, and hundreds more are founded annually. Regulators pledge to provide organisations with advice over the coming year, as well as other resources such as risk assessment templates. Currently, no deadline is envisaged in the UK for passing AI legislation. Meanwhile, the EU AI Act, which takes a more risk-based approach and is currently being debated by parliamentarians, can reasonably be expected this year.
Data protection by default: UK privacy regulator the ICO published resources to help UX designers, product managers, and software engineers embed privacy by default. The guidance looks at key privacy considerations for each stage of designing websites, apps, or other technology products and services, from kick-off to post-launch. The ICO has also published videos with experts, technologists, and designers.
Employment guide: The Danish data protection authority’s guidance on data protection in employment relationships has been revised (in Danish only). The update covers the acquisition of criminal records and references. The regulator also clarified an employer’s obligation to disclose information, trade union processing activities, worker monitoring needs, the use of IQ and personality tests, and more. In parallel, the Lithuanian regulator is preparing similar guidance for employees, businesses, and the public sector (in Lithuanian only).
Joint controllers: What is the difference between joint and independent data controllers? Joint controllers are established when the entities involved in processing perform it for the same or common purposes. Joint controllership can be established even when the entities pursue purposes that are only closely related or complementary, explains the Slovenian data protection authority. The purposes and means of processing are not always the same for all joint controllers, but they must be mutually determined via an agreement; they can also be defined by law. Consequently, joint controllers are jointly and severally liable for damages.
Suspected data breach: Pursuant to the GDPR, in the event of a personal data breach that is likely to cause a high risk to the rights and freedoms of individuals, the data controller must notify the data subjects without undue delay. However, notification is not mandatory if any of the conditions stipulated in Art. 34(3) of the GDPR are met. Regardless, in the case of a suspected breach (eg, unauthorised disclosure of a large amount of personal data), you have the right to ask the data controller (if it processed your data) whether your personal data was included in the incident, concludes the Croatian data protection agency.
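The Art. 34 notification rule described above is essentially a two-step test: notify on high risk, unless an exemption applies. A minimal sketch, with the exemption conditions paraphrased from Art. 34(3) and the function name our own:

```python
def must_notify_data_subjects(high_risk: bool,
                              data_unintelligible: bool,
                              risk_since_mitigated: bool,
                              notification_disproportionate: bool) -> bool:
    """Illustrative sketch of the GDPR Art. 34 decision described above.

    A controller must notify affected individuals of a breach likely to
    cause a high risk to their rights and freedoms, unless an Art. 34(3)
    exemption applies: the data was rendered unintelligible (eg encrypted),
    the high risk is no longer likely to materialise, or individual
    notification would involve disproportionate effort (in which case a
    public communication is required instead).
    """
    if not high_risk:
        return False  # Art. 34 notification duty is not triggered
    exemption = (data_unintelligible
                 or risk_since_mitigated
                 or notification_disproportionate)
    return not exemption
```

The Art. 33 duty to notify the supervisory authority is a separate, broader test and is not modelled here.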
ChatGPT ban: The Italian supervisory authority Garante has clamped down on ChatGPT. The limitation on processing Italian users’ data by OpenAI, the US company that developed and manages the platform, is temporary until it establishes adequate privacy procedures. ChatGPT suffered a data breach on March 20 concerning user conversations and the payment information of subscribers to the paid service. Garante noted the lack of information provided to users and other interested parties whose data OpenAI collects, and above all the absence of a legal basis justifying the collection and storage of personal data to train the algorithms.
Additionally, as the checks carried out showed, the information provided by ChatGPT does not always correspond to real data, resulting in inaccurate processing of personal data. Finally, although the service is aimed at people over 13, it uses no filter to verify users’ age, exposing minors to responses inappropriate for their degree of development and self-awareness. OpenAI, which has no office in the EU but has appointed a representative in the European Economic Area, must report within 20 days on the measures taken.
Wrongful copy: The Greek data protection authority investigated a complaint from a Vodafone subscriber who, after requesting access to her recorded conversations with the Vodafone call center, received a CD containing another person’s conversations. Although the complainant notified Vodafone immediately, the company took no investigative steps to confirm the incident, initially contenting itself with the processor’s response that it could not locate the complainant by phone; only subsequently did it contact her to return the CD. Vodafone was ordered to send the correct file and was fined 40,000 euros (Art. 15 and Art. 33 of the GDPR).
Email correspondence: A legitimate interest in processing personal data for legal defense does not override employees’ right to privacy. The Italian privacy authority fined a company that continued to use an employee’s email account after they had left the firm, viewing its content and setting up forwarding to another company employee. The former collaborator had gathered references from potential clients met at a trade fair. The company claimed that a legal dispute arose from the collaborator’s attempt to contact them. Fearing the loss of relationships with potential customers, the company had not only written to them explaining that the person had been removed, but had also viewed the communications.
GPS monitoring: Tehnoplus Industry in Romania was fined over a GPS system installed in a company car without the employee having been informed, and without less intrusive methods having first been exhausted to achieve the purpose of the processing: monitoring the service vehicle. Tehnoplus Industry excessively processed the complainant’s location data, including outside working hours. The purpose and legal basis of this processing, as well as the excessive storage period of the collected data (beyond the established 30-day limit), were also found unlawful.
In parallel, the French privacy regulator fined Cityscoot for geolocating customers almost permanently, in breach of the data minimisation principle. While an individual rented a scooter, the company collected the vehicle’s geolocation data every 30 seconds, and it also kept the history of these trips. None of the stated purposes of the processing (handling traffic offenses, customer complaints, user support, and theft management) justified such monitoring, and each could have been achieved without constant tracking.
Cybersecurity tools: The French regulator CNIL has updated its guidance on personal data security (in French). It supports professionals processing personal data by recalling the basic precautions to implement. Seventeen fact sheets cover the latest recommendations on authenticating users, tracing operations and managing incidents, securing the workplace, guiding IT development, securing exchanges with other organizations, encryption, and much more.
The European Union Agency for Cybersecurity has also released a tool to help small and medium-sized enterprises assess their cybersecurity maturity. The tool contributes to the implementation of the updated Network and Information Security (NIS2) Directive. Most SMEs are excluded from the Directive’s scope due to their size, and this work provides easily accessible guidance and assistance for their specific needs.
Similarly, the UK National Cyber Security Centre has launched two new services to help small organisations stay safe online:
- The Cyber Action Plan can be completed online in under 5 minutes and results in tailored advice for businesses on how they can improve their cyber security.
- Check your Cyber Security – which is accessible via the Action Plan – can be used by any small organisation including schools and charities and enables non-tech users to identify and fix cyber security issues within their businesses.
Mobile threat defense: The US NIST is investigating mobile threat defense (MTD) applications that provide real-time information about a device’s risk level. Like any other app, an MTD app is installed on a device by the user. It then detects undesirable activity and alerts users so they can stop or minimize the harm: for instance, it flags when it is time to update the operating system, or warns users when someone is eavesdropping on their internet connection. However, unless integrated with a mobile device management system, MTD applications are only marginally effective in an enterprise environment.
Child care apps: In the US, childcare facilities are using technology more and more, reports edsurge.com, which tells the story of a parent who signed her child up for child care. She wasn’t expecting to have to download an app to participate, and when that app began sending her photos of her child, she had additional questions. Laws like the Family Educational Rights and Privacy Act and the Children’s Online Privacy Protection Act don’t apply in these circumstances, so parents need to conduct some independent research. The other concern, she says, is that cameras can make teachers and other classroom employees anxious or otherwise not themselves: they may feel that administrators or parents don’t trust them, and may avoid activities like dancing.
You are (not) hired: Reportedly, a third of Australian companies rely on artificial intelligence to help them hire the right person, while no laws specifically govern AI recruitment tools. Applicants are often unaware that they will be subjected to an automated process or, if they are aware, on what basis they will be assessed. For instance, an AI might conclude you lack good communication skills if you don’t use standard English grammar, or fail to recognise cultural traits because it was trained on native speakers. Another concern is how physical disability is accounted for in something like a chat or video interview. Read more analysis by the Guardian in the original publication.
Vehicle data: Because data ownership remains undefined under EU law, the Commission’s proposed Data Act for fair access to such information, particularly in the vehicle sector, appears to have hit problems. Legislative proposals were expected to regulate a connected-car sector estimated to be worth more than 400 billion euros by the end of the decade. Now car services groups warn that only a few big players can access this data, skewing the market, Reuters reports.