Drawn from published evidence and regulatory guidance specific to the government and public sector. Each risk is pre-scored on a 5×5 likelihood × impact matrix in the Risk Register tool and referenced in the generated policy.
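The 5×5 scoring can be sketched as follows. The banding rule is an illustrative assumption inferred from the entries in this register (impact-5 risks with likelihood 3 or above escalate to Critical); the actual Risk Register tool's cut-offs may differ.

```python
# Minimal sketch of a 5x5 likelihood x impact risk matrix.
# The banding thresholds below are illustrative assumptions,
# not the Risk Register tool's actual rules.

def risk_score(likelihood: int, impact: int) -> int:
    """Raw score: product of 1-5 likelihood and 1-5 impact."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must each be in 1..5")
    return likelihood * impact

def risk_band(likelihood: int, impact: int) -> str:
    """Illustrative banding: impact is weighted more heavily
    than likelihood, so maximum-impact risks escalate."""
    if impact == 5 and likelihood >= 3:
        return "Critical"
    if risk_score(likelihood, impact) >= 12:
        return "High"
    if risk_score(likelihood, impact) >= 6:
        return "Medium"
    return "Low"

# Entries above: (4, 5) and (3, 5) band as Critical, (4, 4) as High.
print(risk_band(4, 5), risk_band(3, 5), risk_band(4, 4))
```

Under these assumed thresholds the banding reproduces the scores shown on each entry below; a different tool configuration would simply swap out the cut-off values.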
Critical · Likelihood 4 · Impact 5
AI Welfare and Benefits Automation Causing Wrongful Denial of Citizens' Entitlements
AI systems used to determine eligibility for, calculate, or administer welfare benefits, housing allocations, disability assessments, tax credits, and social support services produce wrongful denials, incorrect payment calculations, or automated debt recovery actions against citizens who are legally entitled to support — at a scale and speed that overwhelm administrative review capacity — causing destitution, homelessness, and serious physical and mental health harm to the most vulnerable citizens, as documented in the Robodebt scandal in Australia and AI-driven benefits errors in multiple jurisdictions.
Critical · Likelihood 3 · Impact 5
AI Facial Recognition and Biometric Surveillance Producing Discriminatory Misidentification
AI facial recognition systems deployed by law enforcement for suspect identification, person-of-interest tracking, or border control produce false positive identifications at significantly higher rates for Black, Asian, and minority ethnic individuals, women, and children — as demonstrated in multiple independent evaluations of deployed police facial recognition systems — leading to wrongful arrests, unlawful detention, traumatic police encounters, and potentially wrongful prosecution of misidentified innocent persons.
Critical · Likelihood 3 · Impact 5
AI Criminal Risk Scoring Perpetuating Systemic Racial Bias in Justice Outcomes
AI recidivism risk assessment, bail risk scoring, sentencing advisory, and parole recommendation tools used in the justice system are trained on historical criminal justice data encoding decades of racially and socioeconomically discriminatory policing, charging, and sentencing practices — producing risk scores that systematically overestimate re-offending risk for minority ethnic defendants and underestimate it for white defendants, influencing judicial decisions on remand, sentencing, and release in ways that perpetuate structural racial inequality in criminal justice outcomes.
Critical · Likelihood 3 · Impact 5
AI Social Scoring and Predictive Profiling Undermining Democratic Rights and Civil Liberties
AI systems used by public authorities to create comprehensive citizen risk profiles, social trustworthiness scores, or predictive threat assessments — by aggregating data across government databases, social media monitoring, financial records, and behavioural analytics — create a surveillance infrastructure that chills the exercise of democratic rights including freedom of speech, freedom of assembly, and political participation, and, where used to determine access to public services, permits, or government contracts, constitute the social scoring practice prohibited under EU AI Act Article 5(1)(c).
Critical · Likelihood 4 · Impact 5
AI-Enabled Disinformation and Adversarial Manipulation of Government AI Systems
State and non-state adversaries deploy AI-generated disinformation — including deepfake government communications, AI-synthesised official statements, AI-manipulated public consultation responses, and adversarial inputs designed to manipulate AI systems used in border control, law enforcement, and critical infrastructure — undermining public trust in government communications, corrupting AI-assisted government decision-making, and potentially enabling mass manipulation of democratic processes through AI-generated influence at scale.
High · Likelihood 4 · Impact 4
AI Government Service Exclusion Creating Digital Divide and Access to Justice Gaps
AI-first public service delivery — including AI chatbots replacing human advisors, AI document processing replacing accessible human review, and AI-gated online service portals — systematically excludes citizens without digital skills or internet access, elderly citizens, disabled citizens, those with low literacy, and those in rural or economically deprived areas from accessing public services, welfare entitlements, legal aid, and justice mechanisms they are legally entitled to receive.