
1. Broader Digital Policy Landscape Raises Privacy Concerns
Civil liberties organisations are watching new data laws in the context of wider UK digital governance changes, including proposals for digital identity systems and the use of automated technologies:
Digital ID Systems
- The government’s proposed national digital ID initiative (e.g., “BritCard”) has drawn widespread criticism from privacy advocates, who argue it could centralise sensitive personal information and enable increased state monitoring of citizens’ daily activities. Critics worry that a digital identity database linking employment eligibility, access to services, and other personal details could become a de facto surveillance tool if safeguards prove weak or are progressively loosened.
- Civil liberties groups such as Big Brother Watch have said national digital ID systems pose a “serious threat to civil liberties” because they allow the state to amass large volumes of personal data in centralised government databases, where it could be tracked and acted upon across contexts such as employment, housing, healthcare, and welfare.
- Parliamentary motions have explicitly flagged digital ID as posing risks of unprecedented levels of monitoring, tracking, and oversight of everyday activities by the state.
These concerns aren’t about DUAA directly but illustrate how data accessibility reforms intersect with other digital governance proposals to raise civil liberties alarms.
2. Digital Rights Groups’ Critiques of Data Law Reforms
Organisations such as the Open Rights Group and civil liberties advocates expressed unease during the bill’s parliamentary stages that some provisions could weaken rights protections or grant executive powers with limited scrutiny:
- The Open Rights Group warned that certain elements of the Data Use and Access Bill (the precursor to DUAA) could lower data protection standards and erode public trust, especially in how new technologies such as AI are governed.
- Other critics highlighted concerns about political oversight: in its earlier form, the bill included clauses that would have allowed the Secretary of State to amend key data protection rules by statutory instrument (secondary legislation), reducing parliamentary scrutiny over significant policy changes.
While many of these powers were scaled back or reframed before final passage, the debates signal continued civil liberties vigilance over the government’s ability to reshape data law through flexible executive powers.
3. Automated Decision-Making, AI, and Privacy
Civil liberties groups also flagged automated systems — especially those powered by AI — as a potential vector for unchecked data use:
- During parliamentary debate, civil liberties advocates urged lawmakers to retain strong protections against automated or AI-driven decisions that significantly affect individuals (for example, in benefits, law enforcement, or service eligibility). Some groups sent letters urging the removal of proposals to relax those safeguards.
- The Open Rights Group’s briefing warned that lowering these protections could make systems more opaque, particularly where algorithmic decision-making lacks transparency and accountability.
These concerns echo wider civil society debates about automated processing, algorithmic governance, and AI-enabled surveillance, especially where clear oversight is absent.
4. Historical Context Amplifies Worries About Surveillance
Some of the discomfort around data reforms is rooted in historical UK surveillance debates. For instance:
- Previous legislative efforts like the Communications Data Bill (2008), nicknamed the “Snooper’s Charter,” were heavily criticised by civil liberties campaigners for seeking to create extensive databases of email, web browsing, and communications metadata, widely seen as a step toward mass surveillance. Though the bill never became law, the legacy of those debates still shapes reactions to current data law changes.
- Broader digital safety legislation such as the Online Safety Act 2023 drew criticism from civil liberties organisations over expansive regulatory powers affecting speech, encryption, and platform content moderation, with some commentators warning of mission creep into surveillance.
This reflects a wider context where civil liberties groups scrutinise any expanded access to personal data — especially when tied into security or efficiency narratives.
5. Facial Recognition and Real-World Surveillance
While not part of DUAA itself, contemporary UK initiatives such as police use of live facial recognition (LFR) technology illustrate how government use of data and biometric systems can fuel civil liberties concern:
- The expansion of LFR, for example in police “surveillance vans,” has drawn criticism from campaigners over privacy invasion and insufficient oversight, with groups like Big Brother Watch calling it a “significant expansion of the surveillance state.”
While this is separate from DUAA, it speaks to public sensitivity about state access to biometric and personal data that feeds into concerns when data laws are updated.
