How is data protection related to information security?
According to the Estonian data protection regulator, the goal of information security is to protect an organisation’s business processes. This means taking responsibility for the security of the entire operating environment and being able to resist any activity that threatens the availability, authenticity, integrity, or confidentiality of the data processed in a system, or of the services provided and accessed through it.
Information assets include all IT resources – hardware, software, various data communication devices, and so on. However, the people working in an organisation, as well as its customers, can also be considered information assets. Data protection and information security are therefore two sides of the same coin: data protection sets out the basic principles of personal data processing, while information security helps to implement them.
Stay up to date! Sign up to receive our fortnightly digest via email.
Beyond the simple fact that ensuring information security and protecting assets makes good business sense, the obligation to implement information security stems, among other things, from data protection laws, which require that personal data be secured by appropriate measures. This means that each situation must be assessed individually. To start with:
- Map out what your organisation does and what business processes it involves.
- Identify the assets you have in place—whether they’re customer data, documents, employees, information systems, or security equipment.
- Don’t forget your “global defense zone”: your physical office, home office, coworking spaces, and other locations where your organisation’s assets and information might be located.
- If something major happens in any of these components, you need to know immediately if and how it will impact your organisation.
As a general approach, the Estonian regulator stresses, process as little personal data as possible and only when needed.
List of AI companies signed up to the EU Code of Practice
The Commission has published the full list of signatories so far to the EU’s generative AI Code of Practice, also known as the Code of Practice for general-purpose AI (GPAI) models, published on 10 July 2025. Signing will reduce their administrative burden and give them more legal certainty than if they demonstrated compliance through other methods.
Signatories include Amazon, Anthropic, Google, IBM, Microsoft, Mistral AI, OpenAI and a dozen other companies (some signatories may not appear on the list immediately). In addition, xAI signed up only to the Safety and Security chapter; this means it will have to demonstrate compliance with the AI Act’s transparency and copyright obligations via alternative adequate means.
The code has also been complemented by Commission guidelines and the Q&A on key concepts related to general-purpose AI models.
More legal updates
European Biotech Act: The Commission opened a consultation, running until 10 November, as part of the development of the European Biotech Act. The act will propose a series of measures to create an enabling environment that accelerates the transition of biotech products from laboratory to factory to market, while maintaining the highest safety standards for the protection of the population and the environment. It will also address growing biotech dependencies on data, storage, computing power, and AI.
In the EU, biotechnology reached a gross value added of 38.1 billion euros in 2022: the highest contribution came from medical and pharmaceutical biotechnologies, and the fastest-growing area was industrial biotechnology. At the same time, European biotech companies face an opportunity gap: the US has twice as many early-stage venture capital deals and three times as many late-stage deals. Over the last six years, 66 of the 67 biotech companies going public have chosen the US NASDAQ over European stock markets.
California privacy updates: The California Privacy Protection Agency (CPPA) has filed a judicial action seeking to enforce an investigative subpoena against Tractor Supply Company, a Fortune 500 company that bills itself as the nation’s largest rural lifestyle retailer. The CPPA’s petition alleges that Tractor Supply failed to comply with a subpoena seeking information about the company’s compliance with the California Consumer Privacy Act of 2018. The petition marks the CPPA’s first public disclosure of an ongoing investigation into a company and its first judicial action to enforce an investigative request. The agency has been investigating whether Tractor Supply failed to honour Californians’ right to opt out of the sale and sharing of their personal information online.
More from supervisory authorities
GDPR from A to Z: The German Federal Data Protection Commissioner (BfDI) has updated a catalogue that provides a compact compilation of the most important legal texts: the European General Data Protection Regulation (GDPR) and the Federal Data Protection Act (BDSG). In addition to the legal texts and the references to the GDPR, it contains explanations of specific topics and vague legal terms.
Data memorisation in LLMs: The BfDI has also concluded its consultation on processing personal data in large language models in a way that complies with data protection law. Civil society, industry, and scientific groups all took part. The consultation sought input on the limits of anonymisation, the memorisation of personal information, the risks of data extraction, and the protection of data subjects’ rights under the GDPR in AI systems.
AI in healthcare: The EU Publications Office offers a study on the deployment of AI in healthcare. Present-day healthcare systems face several complex challenges, including rising demand due to an ageing population, increasing prevalence of chronic and complex conditions, rising costs, and shortages in the healthcare workforce. AI has the potential to address some of these by improving operational efficiency, reducing administrative burdens, and enhancing diagnosis and treatment pathways.
E-store data minimisation
The Latvian DVI explains the minimum amount of data needed to place an order in an e-store. To ensure the fulfillment of an order, certain personal data must be collected and processed; this process can be conditionally called a mutual agreement. The following data is required to place an order:
- customer’s name and surname (for indication in a supporting document, for example, an invoice);
- email address (for sending invoices and order status messages);
- phone number (to ensure delivery, the courier also receives this information);
- delivery address or parcel machine address (depending on the selected delivery method).
The merchant must be able to clearly indicate why each type of data is necessary. For example, first and last name are necessary to fulfill a legal obligation, while other data is necessary to fulfill the requirements of the contract. If the service is “intangible” (eg, online courses), first name, last name and email address are sufficient, as these are needed to send the invoice and access data. A merchant may also need additional information if the product or service is individually tailored to the customer (eg, tailored clothing, selection of skin care products, manufacture of spectacles).
Customer data may only be used for the purposes originally specified. It may not be transferred to other parties unless there is a legal basis for this, such as the customer’s consent, a legal obligation or a legitimate interest. It may also be justified to use the data for related purposes such as archiving, if this does not conflict with the original purpose of obtaining the data.
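As an illustration only (the field and function names below are hypothetical, not from the DVI guidance), the minimisation principle can be expressed as a data model in which the fields collected vary with the fulfilment method:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Order:
    # Always collected: identifies the customer on the invoice (legal obligation)
    first_name: str
    last_name: str
    # Needed to send the invoice, order status messages, or access data
    email: str
    # Collected only for physical delivery: courier contact and destination
    phone: Optional[str] = None
    delivery_address: Optional[str] = None

def required_fields(intangible: bool) -> List[str]:
    """Return the minimum field set for the chosen fulfilment method."""
    base = ["first_name", "last_name", "email"]
    if intangible:  # eg an online course: invoice and access data suffice
        return base
    return base + ["phone", "delivery_address"]
```

The point of the sketch is that the schema itself documents why each field is collected, which is exactly what the merchant must be able to explain.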
Data deletion request
The DVI has also addressed the question: should the deletion request itself be erased when someone asks for data processed on the basis of their consent to be deleted? If a person withdraws consent to the processing of their data and requests the deletion of all data related to this consent, the organisation is obliged to stop processing this data as soon as possible and delete it, unless there is another legal basis for continuing to store or use it. This means that all data collected on the basis of consent must be deleted (eg, removing the person from the list of recipients of commercial communications).
However, the request document by which the person withdraws consent, as well as the organisation’s response to it, cannot be deleted along with that data, since the basis for processing this information is not the person’s consent within the meaning of the GDPR. They may be stored to serve the organisation’s interests in managing its documentation and protecting its rights (so that, if necessary, it can confirm that the request was received and fulfilled, and when).
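A minimal sketch of this logic (all names and the in-memory `db` structure are hypothetical, for illustration only): consent-based data is deleted, while the request and the reply are retained on a separate legal basis.

```python
def handle_erasure_request(db: dict, subject_id: str, request_doc: dict) -> dict:
    """Delete consent-based data but keep the request and reply on file."""
    # 1. Remove everything processed solely on the basis of consent,
    #    eg the entry on the commercial-communications mailing list.
    db["marketing_list"] = [s for s in db["marketing_list"] if s != subject_id]
    # 2. The withdrawal request and the organisation's response are NOT
    #    deleted: they are retained to prove that the request was received
    #    and fulfilled, and when (a legal basis other than consent).
    reply = {"subject": subject_id, "status": "erased"}
    db["request_log"].append({"request": request_doc, "reply": reply})
    return reply
```

In a real system the "request log" would typically live in a separate store with its own retention period, rather than alongside the operational data.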
More official guidance
Biometrics: Canada’s Privacy Commissioner has published guidance on biometrics for the public and private sectors. While biometrics can enhance security and help in service delivery, they can also raise privacy issues. Biometric information is intimately linked to an individual’s body, is often unique, and is unlikely to vary significantly over time. It can reveal sensitive information such as health information or information about race and gender characteristics. The guidance addresses, among other things, key considerations for organisations when planning and implementing initiatives involving biometric technology – transparency, safeguarding data, and accuracy, including testing of biometric systems.
IoT data security: America’s NIST finalized its ‘Lightweight Cryptography’ standard to protect small devices. Four algorithms are now ready for use to protect data created and transmitted by the Internet of Things and other electronics. The standard is built around a group of cryptographic algorithms in the Ascon family, which NIST selected in 2023 as the planned basis for its lightweight cryptography standard. They require less computing power and time than conventional cryptographic methods, making them useful for securing data from resource-constrained devices. For more technical information on the standard, visit the NIST Lightweight Cryptography Project page.
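Ascon, like other AEAD (authenticated encryption with associated data) schemes, exposes a seal/open interface that both encrypts data and authenticates it, so tampering is detected on decryption. The toy sketch below illustrates only that interface pattern using Python standard-library primitives; it is not Ascon, is not part of the NIST standard, and must never be used in production:

```python
import hashlib
import hmac

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: hash key || nonce || counter until enough bytes exist.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, nonce: bytes, plaintext: bytes, ad: bytes = b"") -> bytes:
    """Encrypt plaintext and append a 16-byte tag over nonce, AD and ciphertext."""
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ad + ct, hashlib.sha256).digest()[:16]
    return ct + tag

def open_(key: bytes, nonce: bytes, sealed: bytes, ad: bytes = b"") -> bytes:
    """Verify the tag first; only then decrypt. Raises ValueError on tampering."""
    ct, tag = sealed[:-16], sealed[-16:]
    expected = hmac.new(key, nonce + ad + ct, hashlib.sha256).digest()[:16]
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))
```

The appeal of the Ascon family for IoT is that it delivers this same seal/open behaviour with a single lightweight permutation, rather than the separate cipher-plus-MAC machinery sketched here.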
Optus data breach in Australia
The Australian Information Commissioner has filed civil penalty proceedings against Optus (telecommunications), following an investigation in relation to the data breach made public by Optus on 22 September 2022. The data breach involved unauthorised access to the personal information of millions of current, former and prospective customers of Optus, and the subsequent release of some of this information on the dark web. This included names, dates of birth, home addresses, phone numbers and email addresses, passport numbers, driver’s licence numbers, Medicare card numbers, birth certificate information, marriage certificate information, and armed forces, defence force and police identification information.
Based on this case, the Australian regulator asks all organisations to:
- implement procedures that ensure clear ownership and responsibility over internet-facing domains
- ensure that those requesting access to customers’ personal information are authorised to access that information
- layer security controls to avoid a single point of failure
- implement robust security monitoring procedures to ensure any vulnerabilities are detected and that any incidents are responded to in a timely manner
- appropriately resource privacy and cyber security, including when outsourced to third party providers
- regularly review practices and systems, including actively assessing critical and sensitive infrastructure, and act on areas for improvement in a timely manner.
Voiceprint for authentication purposes
The Swiss Federal Data Protection Commissioner has examined whether PostFinance (a bank serving retail and business clients) violates data protection regulations by using voice recognition as a means of authentication. It concluded the investigation on 16 May with a ruling instructing PostFinance to obtain the express consent of the persons concerned when creating voiceprints for voice recognition, and to delete voiceprints for which no explicit consent has been given.
Voiceprints are a type of biometric data and, under data protection law, are considered sensitive personal data if they enable the identification of an individual. Unlike a password, a voiceprint cannot be replaced in case of misuse.
In other news
Meta AI: The privacy advocacy group Noyb commissioned the Gallup Institute to survey 1,000 Meta users in Germany. Just 7% of respondents want Meta to use their personal information for AI, even though over 75% were aware of Meta’s plans.
In May this year, Meta decided to begin using EU personal data to train its AI systems by simply asserting a “legitimate interest” under Article 6 of the GDPR. Although nearly two-thirds of participants claim to have heard about Meta’s announcement, just 40% of Instagram or Facebook users can recall seeing the in-app message, which was concealed under a notification menu, or the email notice, which was sent with a subject line designed to make people ignore it.
Awareness of the issue increases significantly with age, while women are less inclined than men to give AI their data.
IBAN: Knowledge of an IBAN can in some cases allow a fraudster to issue illegitimate direct debit orders. A fraudster can also, more directly, usurp another person’s IBAN by providing it when creating a direct debit mandate as part of a subscription to a service. To reduce the risk of fraudulent use of your IBAN and minimise its consequences, the French regulator CNIL recommends:
- Monitor your bank account transactions regularly and block your bank account if necessary.
- Contact your usual bank advisor if you have any doubts.
- Check the list of authorised creditors (eg, the beneficiaries of direct debits) in your online banking space.
- When receiving a pre-filled direct debit mandate, or an alleged update of it, be vigilant about the information describing the creditor.
One click was nothing. But you gave away a lot
As digital technology allows for limitless information sharing with just a single click, the Latvian DVI is launching an educational public awareness campaign to encourage every digital user, but especially young people, to realise that personal data is a value, not an accidental footprint left on the internet. The campaign emphasises that seemingly harmless digital actions, such as posting your photos on social networks, participating in a free game, or clicking the “I agree” button without reading the contents of a document, can lead to widespread data transfers with consequences that are not always easy to predict or reverse.
Similarly, Privacy International publishes a series of educational case studies to answer the question of “Why privacy matters” for schoolchildren, workers, people with disabilities, protestors and even sports fans, among many others. Here are some notable points from the analyses:
- When surveillance creeps into classrooms and digital learning platforms, it threatens the freedom of pupils to feel safe to explore ideas, make mistakes and develop into their own unique selves.
- Employers are using surveillance to monitor, control, and exploit workers in ways that many may not even be aware of.
- The growing threat of intrusive surveillance such as AI-powered facial recognition in stadiums risks turning a vibrant cultural space into one of control and suspicion.
- Privacy is a universal right, but for people with disabilities, it’s often compromised in the very systems designed to support them.
- In society, dissent – especially through protest – is vital for progress, change, and holding power accountable. Without privacy, protestors risk losing their voices, and their own safety.
- Migrants have the same right to a private life and to be free from intrusive surveillance as anyone else. Yet, for people on the move, this right to privacy is under constant threat.
In case you missed it

Meta’s “story” photos: The Icelandic data protection regulator explains that Meta launched a feature that goes through photos on your phone and suggests what to post on Facebook. The social media app automatically selects photos or videos from your phone and sends them to Meta’s servers. The photos are then processed using artificial intelligence to display post suggestions in “Story”.
This is done without the user having specifically uploaded the photos or videos to the social media platform for publication there. Since this may be a significant intrusion into people’s privacy, and since the regulator has received reports that people did not realise the feature had been enabled, the regulator has provided instructions on how to disable it:
- Open the app on your phone.
- Press + at the top of the screen.
- Tap “Story”.
- Press the “Settings” gear in the top right corner.
- At the bottom is “Camera roll settings”.
- Turn off “Get camera roll suggestions when you’re browsing Facebook”.
Political advertising in the EU: Google and Meta have announced that they will suspend all political advertising services in the EU when the Political Advertising Transparency and Targeting Regulation starts to apply in October 2025, the Estonian regulator reports. The new regulation brings a number of operational and legal requirements that are difficult to meet. Google has therefore decided to suspend all political advertising services, including on YouTube, until there is greater clarity on the regulation’s implementation. Meta, for its part, believes the new rules will make current transparency and targeting systems too complex and ineffective, significantly reducing advertisers’ ability to reach the electorate.