Summary

Ministers have set out a vision for the UK’s technology strategy.  The UK is to become a “tech superpower” by 2030.  This will be achieved by creating “world-leading pro-innovation regulation” and influencing global technical standards.[1] 

However, the former Information Commissioner has pointed out that this kind of “digital growth” will only be possible if individuals are prepared to allow their personal data to be shared:  personal data is the “energy” which powers new technologies.[2]  If the data protection regime in the UK provides insufficient levels of protection for personal data, then this will erode public trust and undermine Ministers’ vision of the UK being a global leader in data-driven innovation. 

Whilst many of the changes suggested by this Bill will have little influence on data protection standards, we consider that there are some significant provisions which lower the standard of the protection of personal data in the UK.  Some of the relevant changes are set out on the face of the Bill, for example the Bill sets out a lower standard which needs to be met in order to enable the free flow of data from the UK to another jurisdiction.  However, other important protections such as the right not to be subject to solely automated decision-making could be removed through the powers which the Bill confers on Ministers.

Lowering the standard of the protection of personal data in the UK could have significant implications for EU-UK trade, EU-UK data sharing in a criminal justice context and for the UK’s adherence to international data protection standards, as well as for the complexity of the UK’s data protection regimes. 

  1. The UK’s current data protection framework (both in the context of general processing and criminal justice processing) has been assessed by the EU as being essentially equivalent to the standard in the EU.  The finding of essential equivalence is set out in an “adequacy decision”, which enables the free flow of data from the EU to the UK.  This decision could be revoked by the EU if it considers that the UK no longer ensures an adequate level of protection for EU data. A lowering of the UK’s standard of protection of personal data – through the divergence of the UK GDPR from the EU GDPR – therefore risks the free flow of data between the EU and the UK. 
  2. Without the free flow of data from the EU to the UK, British businesses would have to spend significant time and resources on red tape.  It has been estimated that a loss of the free flow of personal data from the EU to the UK could cost UK businesses up to £1.6bn. 
  3. Lowering of the protection of personal data also risks the free flow of data from the EU to the UK for law enforcement purposes and the potential suspension of the law enforcement cooperation provisions in the Trade and Cooperation Agreement. 
  4. Further, certain provisions of the EU-UK Withdrawal Agreement would come into effect if the UK lost data adequacy.  These would add to operational complexity for UK organisations.  They would be required to apply different data protection standards, depending on where the data they are processing originated. 
  5. Another significant consequence of the lowering of data protection standards in the UK is that UK standards would fall below the requirements of the Council of Europe’s updated Treaty on data protection, Convention 108+. Diverging from standards adhered to by the UK’s closest trading partners may have negative consequences and undermine Ministers’ vision for the UK as a global leader in this policy area.

Policy context

The vision set out by the Department for Science, Innovation and Technology is for the UK to “[leverage] post-Brexit freedoms to create world-leading pro-innovation regulation and influence global technical standards”.  Ministers have stated that they intend for the UK to become a “tech superpower by 2030”.  This will be achieved by legislation to amend the UK’s data protection frameworks in a way which will “promote innovation and boost trade, while protecting citizens’ rights.”[3]

Relationship between high data protection standards and growth in the digital sector

In her response to the Government’s consultation, “Data: a new direction”, the then Information Commissioner, Elizabeth Denham, highlighted that there is a significant link between new technologies and high data protection standards.  She stated that the “energy” powering new technologies was personal data:  information about people’s “behaviour, our interests, our spending patterns, our loves and likes, our beliefs, our health, sometimes even our DNA – the very building blocks that make us who we are.”   She stated that “the economic and societal benefits of this digital growth are only possible through earning and maintaining people’s trust and their willing participation in how their data is used. Data-driven innovations rely on people being willing to share their data.”[4]

Continuing relevance of EU standards

The EU standard for the protection of personal data is still relevant, despite the UK’s departure from the EU.  This is because the government has sought and obtained a formal finding from the EU that the UK standard of the protection of personal data is “essentially equivalent” to the EU standard.  This enables the free flow of data from the EU to the UK for both general data processing[5] and for the processing of personal data in a criminal justice context.[6]  

Aspects of the Bill that could undermine trust and lower the standard of protection

The Bill could lower the standard of protection of personal data in the UK for the following reasons:          

  • Lowering the test for data adequacy: the Bill introduces a new test for “adequacy” for international data transfers which is lower than the current test.
  • Widening the scope of scientific research: the inclusion of “commercial activity” within the scope of scientific research allows for commercial actors to benefit from derogations intended for scientific research.
  • Reduced independence of the Information Commissioner: government amendments to the Bill require the Information Commissioner to consider recommendations from the Secretary of State (“SoS”) as to the content of statutory codes on matters such as data sharing.  Where the Information Commissioner does not accept the SoS’s recommendations, the Information Commissioner will be required to provide justification for this.  This reduces the Information Commissioner’s independence.

The Bill will also give the SoS the power to make significant amendments to fundamental provisions by secondary legislation, such as:

  • Recognised legitimate interests: adding to the list of legitimate interests which do not require controllers to carry out a legitimate interest assessment. Additions to this list could remove protections from individuals in a number of areas such as where personal data is processed for reasons of commercial interest or for the purposes of employment law.
  • Purpose limitation: adding to the list of types of processing where the purpose limitation principle does not apply, thereby enabling processing of personal data for new purposes, which were not envisaged when the data was collected.
  • Insufficient safeguards for automated decision-making: deciding that certain types of automated processing fall outside the scope of safeguards from solely automated decision-making, thereby enabling the use of decision-making using algorithms without protections for individuals and increasing the risk of unreliability, bias and unfairness.
  • Processing in reliance on relevant international law: allowing the SoS to designate any treaty ratified by the UK as providing a basis in law to process data in the public interest, regardless of whether the treaty provides the requisite protections for data subjects.

Lowering of the test for adequacy

An important principle under the EU data protection regime is that personal data should not be subject to a lower standard of protection simply because it is sent outside the EU.  The GDPR contains mechanisms for ensuring that personal data which is transferred outside the EU is not subject to a lower standard of protection than when the data is in the EU.  Under the GDPR, there are three ways of transferring personal data outside the EU:

  • data adequacy;[7]
  • appropriate safeguards (the most used being standard contractual clauses in contracts between the data exporter and the data importer);[8]
  • derogations (which operate as exceptions to the principle that the standard of protection of personal data must not be lowered due to the transfer of the data outside the EU and can therefore only be relied on in limited circumstances).[9]

Data adequacy is the most useful transfer mechanism because a data exporter can simply transfer the personal data to the jurisdiction in question without needing to put any transfer mechanisms in place.  For a finding of adequacy, the European Commission must be satisfied that the country, the territory or one or more specified sectors within that country, or the international organisation in question provides a standard of protection of personal data which is essentially equivalent to the standard in the EU.  This does not mean that the relevant laws in the country or territory etc. must be exactly the same as in the EU, but rather that they achieve a very similar effect.  How this can be achieved without nearly identical protections for data subjects is a live consideration for both the EU and the UK.  

Article 45 of the GDPR sets out the factors which the European Commission must take into account when assessing data adequacy.  These include:

  • the data protection laws in the country, as well as other legislation governing human rights and fundamental freedoms and legislation in the areas of security, defence, national security and criminal law and the access of public authorities to personal data;
  • whether the country has one or more independent supervisory authorities responsible for ensuring and enforcing compliance with data protection legislation; and
  • the country’s international commitments.

In its current form, the UK GDPR replicates the EU GDPR’s provisions in respect of adequacy, with the Secretary of State making the assessment in place of the European Commission.  This, along with the other similarities between the two pieces of legislation, forms the basis of the EU’s adequacy decision in respect of the UK which allows the free flow of data from one to the other.

However, this position is significantly altered by the Bill, which deletes the current adequacy provisions in the UK GDPR and replaces them with a completely new test. The test will require the SoS to consider whether the level of protection provided for personal data in the other country “is not materially lower” than in the UK (rather than essentially equivalent). The amendments emphasise the importance of the outcome of the protections provided by another country’s laws, rather than a point-by-point comparison with the UK’s laws. In this regard, the SoS is instructed only to consider the protection of data subjects’ rights “taken as a whole” rather than whether specific rights are respected or even present in the other country.

The Bill also undermines the integrity of the adequacy assessment process. The SoS is empowered to take non-data protection considerations, including the desirability of facilitating data flows to a particular country, into account when deciding whether to declare a country as ‘adequate’.[10] The SoS is still required to amend or revoke an adequacy decision if she becomes aware that the level of protection provided for data subjects has become materially lower than in the UK.  However, the instances in which the SoS may become aware of such a fact are significantly reduced. The requirement to review adequacy decisions every four years is removed, and monitoring obligations are weakened.[11] Finally, the Bill does not address the fact that most UK adequacy decisions[12] were adopted into UK law from the EU by way of transitional legislation, without any substantive review of the protections provided by those countries for UK data subjects.

These amendments pose a real risk of lowering the bar in respect of data subjects’ rights and may allow the SoS to find countries adequate which the EU (or the UK under its current legislation) would not find adequate. The Bill also weakens oversight requirements, which could lead to countries remaining “adequate” despite substantial negative changes in their data protection laws.

Widening the scope of scientific research

The Bill widens the scope of scientific research by including “privately funded” and “commercial activity” within the definition.[13] It could be argued that this wording is merely clarificatory of the recitals in the UK GDPR.[14] However, research which only serves private interests is difficult to reconcile with international standards on scientific research as well as an opinion by the European Data Protection Supervisor, guidance from the ICO and academic commentary.

Reduced independence of the ICO

Currently, where the Information Commissioner prepares statutory codes under sections 121 – 124 of the Data Protection Act 2018, the SoS is required to lay those codes before Parliament.  Parliament decides whether to approve the relevant code.  Under the current version of the Bill the SoS decides whether to approve the code and can require the Information Commissioner to redraft it before it is laid before Parliament.  This approach was criticised as being incompatible with the regulatory independence of the Information Commissioner.  The amendments to the Bill[15] remove the requirement that the relevant code is approved by the SoS.  Instead, the Information Commissioner must consider recommendations made by the SoS in relation to the Information Commissioner’s codes of practice.  The Information Commissioner is required to justify either following or not following any of the SoS’s recommendations.  This reduces the Information Commissioner’s independence as compared with the current position.  

Recognised legitimate interests

Organisations must have a lawful basis under Article 6 UK GDPR to process personal data. Many organisations rely on the lawful basis that processing is necessary for the organisation’s legitimate interests. This is the most flexible lawful basis and can be relied upon to justify the processing of personal data in a wide range of contexts, including commercial contexts. However, in order to lawfully rely on it, organisations must conduct a ‘legitimate interests assessment’ which balances their interests against the rights and freedoms of the relevant individuals. If the individuals’ rights and freedoms outweigh the interests of the organisation, then the organisation’s interests cannot be said to be legitimate, and it cannot rely on that lawful basis to process the data. 

The Bill deviates from this position by introducing a list of ‘recognised legitimate interests’ as a new Annex 1 to the UK GDPR. If an organisation is processing personal data for one of these recognised legitimate interests, it will no longer be required to carry out a legitimate interest assessment: it can simply carry out the relevant processing of personal data without taking into account the rights and freedoms of individuals.

Many of the types of processing included in the list of recognised legitimate interests are sensible, including processing for the purposes of dealing with emergencies, national security, preventing or detecting crime and safeguarding vulnerable individuals. However, the list also enables the processing of personal data for the purposes of ‘democratic engagement’, including processing by a candidate for election or in relation to a referendum to the extent necessary for the election or referendum campaign (as appropriate). This is an extremely broad provision.  In particular, there appears to be no restriction on an election candidate or participant in a referendum processing personal data to carry out data analytics as part of a political campaign, including through the use of third-party services. As drafted, there is a risk that this analysis could be done without any consideration for the rights and freedoms of individuals, despite the potential intrusion into the private lives of those individuals – this was most evident in the exploitation of personal data held on Facebook in the context of the Brexit referendum.

It is inappropriate that the processing of personal data for the purposes of political campaigning should be given the same weight as processing for safeguarding, emergencies or national security, all of which relate to key risks that may need to be mitigated by processing personal data in exceptional circumstances. Processing for the purposes of democratic engagement does not combat any such risk, and – in fact – enhances the risk of the exploitation of and encroachment on individual’s rights and the democratic process that can arise in the context of data analytics and political campaigning. As such, there is a real risk that the inclusion of ‘democratic engagement’ in the list of recognised legitimate interests in Annex 1 will have a detrimental effect on individuals engaging with the democratic process, rather than encouraging such engagement.

This risk is particularly apparent in respect of children’s personal data. The amendment allows the processing of personal data for the purposes of democratic engagement with children aged 14 or over.[16] This is despite the fact that 16 is the youngest possible age that a child can vote in the United Kingdom, and only in certain jurisdictions and circumstances.[17] In most of the UK, individuals cannot vote at all until they are aged 18 or over.

The explanatory notes to the Bill state that these amendments “reflect the variations in voting age across the nation, where in some parts of the UK, such as Scotland, a person can register to vote at the age of 14 as an attainer”.[18] An attainer is someone who has registered to vote in advance of them being able to do so, to allow them to be on the electoral roll as soon as they turn the required age. It is an administrative process aimed at reducing the time between a child becoming eligible to be registered on the electoral roll and voting in an election. It is not clear how processing a child’s personal data for democratic engagement where that child is not legally entitled to vote could ever constitute a legitimate interest – let alone a recognised legitimate interest.

Further, the Bill allows the SoS to amend the list of recognised legitimate interests by regulation. This is a wide power which could enable the SoS to add any number of processing activities and interests, including commercial interests, to the list, so that organisations who wish to exploit personal data commercially would not need to balance their interests against the rights and freedoms of the individuals concerned. Any such additions would be made using the affirmative Parliamentary procedure, which would not subject them to sufficient Parliamentary scrutiny. While the SoS must take into account the interests and rights and freedoms of individuals in determining how to amend the list of recognised legitimate interests, this is not a sufficient safeguard. In particular, such determination is in the hands of the SoS alone, and there is no requirement: (i) on the SoS to conduct a balancing exercise under the UK GDPR; or (ii) that any amendments to the list introduced by the SoS must not be overridden by the interests, rights and freedoms of individuals.

Insufficient safeguards for automated decision-making

In its consultation, DCMS noted that, when used responsibly, data-driven artificial intelligence (AI) systems had the potential to bring “incredible benefits to our lives”.[19] However, the consultation also focused on responsible AI where risks were managed, and the rights of data subjects were respected.[20] The Department for Science, Innovation and Technology recently confirmed that public trust is central to accelerating the adoption of AI and that there is a need to reduce key risks such as bias, discrimination, infringement of privacy and undermining of human rights.[21]

Under current law, organisations cannot make decisions based solely on automated processing if the decision affects the legal rights (or other equally important matters) of data subjects unless the data subject consents or another specific derogation applies.[22] This need for human intervention acts as a key protection for the rights of individuals. The Bill removes this prohibition and inserts safeguards in place of this restriction. However, these safeguards contain key terms which the SoS can significantly alter by exercising her powers. For example, the scope of “meaningful human involvement”[23] determines whether an automated decision is based “solely on automated processing”[24] and therefore whether safeguards are required in respect of this processing. The SoS can, however, determine the cases in which meaningful human involvement is, or is not, to be treated as having taken place by exercising her powers under the Bill’s new Article 22D(1).

Under the Bill, automated processing safeguards apply when a decision is “significant.”[25] This includes decisions that have a “similarly significant effect for the data subject” [as a legal decision]. The SoS may again determine that such a decision falls (or does not fall) within this category under her powers in Article 22D(2). Moreover, the SoS can also determine what actions constitute a “safeguard” in the context of automated decision-making.[26] As a result of the above, the SoS therefore has significant ability to affect the scope and applicability of the safeguards on automated decision-making without sufficient Parliamentary scrutiny.

The government aims to strengthen the UK’s status as a world and European leader in science and technology[27] and build on the UK’s status as world-leading in research and development for AI and a hub for private sector research institutes.[28] In this context, the Bill’s reduced safeguards for automated decision-making, combined with the widening of the definition of, and derogations for, “scientific research”, deprioritise UK individuals’ data rights and weaken the public trust that the government seeks to cultivate.

Processing in reliance on relevant international law

The Bill, as amended by government amendment NC6, makes significant changes to core provisions in the UK GDPR relating to processing of data in the public interest – namely, Article 6 (lawfulness of processing), Article 9 (processing of special categories of personal data) and Article 10 (processing of personal data relating to criminal convictions and offences).

The UK GDPR recognises that the public interest can be a legitimate reason to process personal data. However, the basis to process must be clearly set out in UK laws, which must be proportionate and provide appropriate safeguards:

  1. Under Article 6 of the UK GDPR, it is lawful to process personal data if it is “necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”. But the basis for processing in this way must be laid down by domestic law, which meets an objective of public interest and be proportionate to the legitimate aim pursued.
  2. Special category personal data may be processed under Article 9 of the UK GDPR if it is necessary for “reasons of substantial public interest, on the basis of domestic law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject”.
  3. A controller seeking to process personal data relating to criminal convictions and offences data under Article 10 of the UK GDPR must be able to identify a lawful basis under Article 6 of the UK GDPR and ensure that the processing is either under the control of official authority or “authorised by domestic law providing for appropriate safeguards for the rights and freedoms of data subjects”.

Section 10 of the Data Protection Act 2018 states that the basis to process special category data in the substantial public interest is deemed to be “on the basis of domestic law” if it falls into one of the conditions set out in Part 2 of Schedule 1 of the Act. Similarly, section 10 also states that the processing of criminal convictions data is deemed to be “authorised by domestic law” if it falls into one of the conditions in Parts 1, 2 and 3 of Schedule 1 to the Act. Schedule 1 lists a range of conditions, including where the processing is necessary for the administration of justice, equality of opportunity or treatment and preventing or detecting unlawful acts.

The Bill seeks to provide an additional route for those wishing to point to a basis in law to process data in the public interest. In addition to domestic law, the Bill amends the provisions discussed above to enable reliance on a basis in international law – i.e. treaties – to process. This is achieved by creating new Schedule A1 to the DPA, which lists the relevant treaties on which controllers may rely. Processing in the public interest will be deemed to be “authorised by relevant international law” if it falls into one of the conditions in the new Schedule. One treaty is currently listed: the UK-US Agreement relating to access to electronic data for the purpose of countering serious crime.[29]

The SoS is given an extremely broad regulation-making power to add to this Schedule. The only fetters on this power are that (a) only treaties ratified by the UK can be added to the Schedule and (b) it is subject to the affirmative resolution procedure. Adding a treaty to the Schedule will amount to the SoS declaring that the treaty is proportionate and provides appropriate safeguards for data subjects – but the regulation-making power does not require her to turn her mind to those considerations when deciding to lay regulations. Instead, the SoS is empowered to bypass the protections required by the UK GDPR at present and designate any treaty she wishes.

No rationale is given by the government for such sweeping and significant changes to the UK GDPR, not even to address the fact that this amendment represents a significant departure from the EU’s approach under the EU GDPR (and therefore may pose a risk to EU adequacy). Without appropriate restrictions on the SoS’s power to add new treaties to the Schedule, this amendment represents a significant risk to data subject rights.

Proposed Amendments

The references to processing for the purposes of democratic engagement in the list of recognised legitimate interests in Annex 1 should be removed from the Bill.

The SoS should not be provided with the power to amend Annex 1 or Annex 2 to the UK GDPR. Such amendments should be made via separate Acts of Parliament so that they are subject to appropriate Parliamentary scrutiny.

In the alternative, the SoS’s power to amend Annex 1 (list of recognised legitimate interest) should be restricted so that the SoS must conduct an assessment of individuals’ rights and freedoms prior to amending Annex 1, and must not introduce any additional legitimate interests to the list where such interests are overridden by the rights and freedoms of individuals.

The SoS should not be provided with the power to determine the scope of key terms with respect to automated decision-making, particularly as the government has clear strategic interests that increase the risk of individuals’ rights being eroded. Such amendments should be made via separate Acts of Parliament so that they are subject to appropriate Parliamentary scrutiny.

The amendments which enable the processing of personal data in reliance on relevant international law should be deleted.  


[1] See GOV.UK. (n.d.). The UK’s International Technology Strategy. [online] Available at: https://www.gov.uk/government/publications/uk-international-technology-strategy/the-uks-international-technology-strategy

[2] See ico.org.uk. (2021). ICO response to DCMS consultation ‘Data: a new direction’. [online] Available at: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2021/10/response-to-dcms-consultation-foreword/

[3] See GOV.UK. (n.d.). UK Science and Technology Framework. [online] Available at: https://www.gov.uk/government/publications/uk-science-and-technology-framework

[4] See ico.org.uk. (2021). ICO response to DCMS consultation ‘Data: a new direction’. [online] Available at: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2021/10/response-to-dcms-consultation-foreword/

[5] See the EU’s adequacy decision under the GDPR in favour of the UK.  

[6] See the EU’s adequacy decision for the UK under the Law Enforcement Directive.

[7] See Article 45 of the GDPR.

[8] See Article 46 of the GDPR. 

[9] See Article 49 of the GDPR.

[10] See Article 45A(3) as inserted by the Bill.

[11] See Article 45C as inserted by the Bill.

[12] Except for the two that have been made by the UK post-Brexit in respect of the Republic of Korea and the United States of America.

[13] Article 2 of the Bill inserts a new paragraph 3 into Article 4 UK GDPR to confirm that references to processing for “scientific research purposes” include “purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

[14] Recital 159 UK GDPR states that processing of personal data for scientific research purposes should be interpreted broadly and includes “technological development and demonstration” as well as “privately funded research.”

[15] See amendment Gov 45.

[16] Section 87 of the Bill.

[17] See the House of Commons Library report ‘Who can vote in UK elections?’ for further details.

[18] See page 82 of the Explanatory Notes.

[19] See GOV.UK. (2021). Data: a new direction. [online]

[20] DCMS, Data: a new direction (PDF) [online], 10 September 2021, para 63.

[21] ‘A Pro-innovation Approach to AI Regulation’, Department for Science, Innovation & Technology, March 2023, paragraphs 4 and 5.

[22] Currently under Article 22 UK GDPR, the exemptions to the prohibition on decisions based solely on automated processing are where processing is (i) necessary for the purposes of a contract between the data subject and an organisation; (ii) authorised by law, e.g. to prevent fraud or tax evasion; or (iii) based on the data subject’s explicit consent.

[23] See Article 22A(1)(b)(ii).

[24] See Article 22A(1)(b)(ii).

[25] See Article 22A.

[26] See Article 22D(4) and (5).

[27] ‘The UK’s International Technology Strategy’, presented to Parliament by the Foreign Secretary and the Secretary of State for Science, Innovation and Technology, March 2023, para 3.

[28] ‘The UK’s International Technology Strategy’, presented to Parliament by the Foreign Secretary and the Secretary of State for Science, Innovation and Technology, March 2023, para 19.

[29] See GOV.UK. UK/USA: Agreement on Access to Electronic Data for the Purpose of Countering Serious Crime [CS USA No.6/2019]. [online]