Data protection digest 16-31 Dec 2024: citizens’ privacy awareness is on the rise, yet attitudes vary with income and position

Citizens' privacy awareness

Citizens’ privacy awareness: According to the latest survey by the Lithuanian data protection authority, a larger share of the public can correctly name an institution, (other than courts), that would help protect their personal data rights.

The regulator’s name, (VDAI), was indicated by 29% of respondents. 15% of respondents believe they encountered unlawful or improper processing of their data in the past year; almost half of them say they acted to protect their rights. People who are better informed about laws and regulations are more confident that organisations ensure their right to personal data protection. 65% of respondents say their employers comply with the requirements. However, trust in companies and institutions generally has been decreasing. Finally, people with higher incomes and higher positions perceive personal data protection conditions as more favourable, (72% of top and middle-level managers), as opposed to the unemployed and small entrepreneurs.

The research methodology on citizens’ privacy awareness can be seen here.

Stay up to date! Sign up to receive our fortnightly digest via email.

AI development and deployment

Bridging into 2025, the EU’s top data protection body, the EDPB, adopted an opinion on using personal data to develop and deploy AI models. It looks at: a) when and how AI models can be considered anonymous; b) whether and how legitimate interest can serve as a legal basis for developing or using AI models; and c) what happens if an AI model is developed using personal data that was processed unlawfully. It also considers the use of first- and third-party data. In addition, the EDPB is currently developing guidelines covering more specific questions, such as web scraping for AI training.

More legal updates

Norway tightens the requirements for consent for the use of cookies and similar technologies from 1 January 2025. The requirements are aligned with the EU GDPR. For consent to be valid under the new Norwegian law, it must be: 

  • voluntary
  • specific
  • informed
  • unambiguous
  • given through an active action
  • documentable
  • possible to withdraw as easily as it was given

The user must also be given accessible and understandable information that allows them to easily grasp the consequences of any consent. Until now, for example, it has been sufficient for default browser settings to allow cookies. The consent requirement does not apply to the technical storage of, or access to, information, (to transmit communications, or where strictly necessary to provide a service).

As of 2025, 19 US states have comprehensive consumer privacy laws, (effective between 2024 and 2026). Most of this new legislation protects the personal data of consumers within the state, (state residents), excluding individuals acting in employment or commercial contexts, explains a JDSupra publication. Only the California Consumer Privacy Act, (CCPA), as amended by the California Privacy Rights Act, (CPRA), applies equally to consumers, employees, and business-to-business commercial contacts. In parallel, the California Privacy Protection Agency announced increased CCPA fines and penalties as of 1 January 2025.

Processor certification

The French CNIL is working on a draft reference framework adapted to data (sub-)processors to create a new certification. A public consultation is open until 28 February. Under the GDPR, a data controller is required to use trusted processors who provide sufficient guarantees in the context of the services they provide.

These often include IT service providers, (hosting, maintenance, etc.), software integrators, IT security companies, digital service companies, and marketing or communication agencies. To obtain certification, a processor will need to provide proof of compliance with each of the criteria of the standard. The draft evaluation framework is made up of 90 control points, organised chronologically:

  • Contractualisation;
  • Preparation of the processing environment, including the security measures;
  • Implementation of the processing;
  • End of the processing.

Website reconstruction

Organisational errors during website reconstruction may result in data being exposed, states the Polish data protection regulator UODO. In the related case, a company, (Panek SA), did not implement security measures appropriate to its risk analysis.

It neither tested the solutions it introduced nor assessed their effectiveness. Due to a lack of appropriate communication between the controller and the processor, an employee of a subcontractor mistakenly placed files with data from the old service on a new page. These files were indexed by Google and thus became available to everyone, (data on 21,453 customers and employees of the company): names, email addresses, home addresses, and encrypted passwords. The company that built the website claimed it had not received information about the functionalities, (not mentioned in the data processing agreement). Panek, in turn, emphasised that the incident would not have occurred but for a server configuration error, for which, it said, the contractor’s IT services were responsible.

More from supervisory authorities

Video surveillance: One of the most common ways for entrepreneurs to protect their property is to install video surveillance cameras. If a company uses cameras to record in a place where people, (customers, employees, passersby), may be present, then the company is considered to be processing personal data and must take data protection requirements into account, states the Latvian regulator. The most commonly applied legal basis is the pursuit of legitimate interests.

This implies applying a balancing test: whether video surveillance will significantly infringe on the interests of the people observed. The organisation must also apply appropriate security measures and inform data subjects using an information sign that states the name of the data controller, contact information, and the purpose of the processing, as well as where further information can be found.

How to erase data: The Information Commissioner’s analysis states that 14 million UK people, (29%), don’t know how to erase their data from an old device or tech product. Over a quarter of UK adults plan to treat themselves to a new device this Christmas. However, the latest poll found that the average Brit has three unused devices sitting at home. Effective data erasure means that your data can’t be accessed by anybody else, either by mistake or for malicious purposes such as fraud. For example, a factory reset via the settings can adequately erase your personal information from most mobile phones.

Sports industry

The Irish DPC, in its latest survey, engaged with over 100 clubs across the four major sports by national participation.

Notably, 56% of sports clubs do not have a personal data retention schedule. 41% of clubs reported they do not have any data protection policies, including for subject access requests or other data subject rights under the GDPR, such as erasure or rectification. Finally, when a club introduces new types of technology, it is recommended to carry out a Data Protection Impact Assessment, (DPIA), to assess and mitigate the risks. But only 9% of the clubs had done so.

Cookie banners

The Liechtenstein regulator warned website operators about the obligation to obtain consent when using cookies that are not technically necessary or when passing data on to third parties. One of the most frequently observed errors is that many consent management tools do not technically ensure that, while the cookie banner is displayed, no further (tracking) scripts are executed and no technically unnecessary cookies are stored in the browser. For example, when a website is simply accessed, the personal data of the website visitor, (including the IP address), is often already transmitted to third parties.
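
The gating logic the regulator expects can be sketched in a few lines. This is a minimal illustration, not any particular consent tool’s API: the ConsentGate class and its method names are hypothetical, and a real tool must also handle per-purpose consent, withdrawal, and cookie deletion.

```typescript
// Minimal sketch: non-essential (tracking) scripts are queued, never executed,
// until the user actively consents via the cookie banner.
type ScriptLoader = () => void;

class ConsentGate {
  private consented = false;
  private pending: ScriptLoader[] = [];

  // Register a non-essential script; it runs only after consent is granted.
  register(loader: ScriptLoader): void {
    if (this.consented) {
      loader(); // consent already given, load immediately
    } else {
      this.pending.push(loader); // nothing executes, no cookies are set
    }
  }

  // Called by the banner's "accept" handler (an active user action).
  grantConsent(): void {
    this.consented = true;
    this.pending.forEach((loader) => loader());
    this.pending = [];
  }
}
```

The key property is that register() never executes a loader before grantConsent() is called, so no tracking request leaves the browser while the banner is still displayed.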

Customers’ loan applications

Finland’s Data Protection Commissioner fined Sambla Group, a provider of loan comparison services, 950,000 euros because, due to poor data security, information about customers’ loan applications had been accessible to third parties through personal links intended for customers. The links gave access to the loan applicant’s contact information, as well as information on income, housing expenses, marital status and children. The information had been directly accessible to anyone who knew a customer’s web address and had the technical expertise to exploit the security flaws.
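
The flaw described, links that serve data to anyone who discovers the URL, is commonly mitigated by making links both unguessable and verifiable server-side. A minimal sketch in TypeScript (Node.js) of one such approach, HMAC-signed links; the applicationId, the secret handling, and the function names are illustrative only, and a production system would also add link expiry and proper authentication:

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Illustrative only: a real secret must come from a secrets manager, never source code.
const SECRET = "server-side-secret";

// Derive an unguessable token bound to a specific loan application ID.
function signLink(applicationId: string): string {
  const sig = createHmac("sha256", SECRET).update(applicationId).digest("hex");
  return `${applicationId}.${sig}`;
}

// Verify the token before serving any application data; returns the ID or null.
function verifyLink(token: string): string | null {
  const [applicationId, sig] = token.split(".");
  if (!applicationId || !sig) return null;
  const expected = createHmac("sha256", SECRET).update(applicationId).digest("hex");
  const a = Buffer.from(sig);
  const b = Buffer.from(expected);
  // Constant-time comparison; mismatched lengths are rejected up front.
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  return applicationId;
}
```

With this scheme, guessing or incrementing an application ID is not enough: a request without a valid signature is rejected before any personal data is returned.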

More enforcement decisions

Data subject request in a foreign language: Data Guidance published an exceptional case concluded by the Spanish AEPD. It fined OK MOBILITY GROUP 100,000 euros, (lowered to 60,000 euros following voluntary payment and acknowledgement of non-compliance), for failing to reply to a data subject’s access request, to provide a defined retention period for personal data, and for supplying an incorrect fiscal identification number. The fact that the request was in German was not valid grounds for non-compliance, the AEPD concluded, because the firm offers its services in Germany and the contract was concluded in German.

Netflix fine: Between 2018 and 2020, Netflix did not provide customers with enough information about their data. Additionally, the information that Netflix did provide was unclear in some areas. The Dutch data protection authority therefore imposed a 4.75 million euro fine on the streaming service. Netflix collects various types of personal data from customers, ranging from email addresses, phone numbers and payment details to data about what and when customers watch. In addition, customers were given too little information when they asked Netflix what data the company collects about them.

KASPR data scraping fine: The French CNIL imposed a 240,000 euro fine on KASPR for collecting the contact details of LinkedIn users who had chosen to limit their visibility. KASPR markets a paid extension for the Chrome browser that allows its customers to obtain the professional contact details of people whose profiles they visit on LinkedIn. Around 160 million contacts are included in the database set up by the company. The CNIL noted that the fact that people had chosen to make their contact details visible to their 1st- and 2nd-level contacts did not amount to authorising KASPR to access and collect them.


Data security

Incident reporting obligation: The Belgian NIS2 cybersecurity authority issued guidelines for organisations on incident reporting obligations, (available in English, Dutch and French). An incident under the NIS2 law is defined as “an event compromising the availability, authenticity, integrity or confidentiality of data stored, transmitted or being processed, or of services that networks and information systems offer or make accessible”.

Notification of an event is mandatory when it constitutes a “significant” incident. This could be a) a suspected malicious event, b) an event compromising the availability of data, or c) an event causing or likely to cause material, physical or moral damage to other natural or legal persons. Recurring incidents linked through the same apparent root cause also belong to this list.

Security risks: As companies depend on accumulating more consumer data to develop products such as artificial intelligence, targeted advertising, or surveillance pricing tools, they may create valuable pools of information that bad actors can target for illicit gain, states the Federal Trade Commission. Its latest analysis looks at systemic causes of risk through the lens of data management, software development, and human-centred product design. Addressing security threats is nontrivial: security practices employed upstream and directed at systemic vulnerabilities of technology, such as implementing data policies and access controls, can minimise risk for consumers.

Companies must not only take reasonable measures to secure consumer data but also avoid misrepresenting their security practices.

Big Tech

AI Task Force: The US House Task Force on AI released a comprehensive 253-page report on the rapidly advancing technology. Generative AI systems can produce text, image, video, and audio content. These systems are trained on large sets of existing written, visual, or audio data.

They identify statistical patterns in this training data and then create novel content. The report evaluates AI policy proposals in public administration, education and workforce, agriculture, healthcare, financial services, and small business.

A Cambridge University study, meanwhile, warns that AI is about to get into your head like never before. After decades of dominance by the ‘attention economy’, in which websites sought to hook users for as long as possible in order to serve them adverts, an ‘intention economy’ is likely to replace it, with AI tools deployed to understand, forecast and manipulate human intentions and sell that data to companies. The report asserts that this emerging marketplace for ‘digital signals of intent’ could have a huge effect on human aspirations, behaviour, and psychology beyond selling products, and could interfere with free and fair elections, a free press, and fair market competition.

Do you need support on data protection, privacy or GDPR? TechGDPR can help.

Request your free consultation
