The UK’s National Cyber Security Centre (NCSC) has released a landmark cyber threat assessment warning that rapid advances in artificial intelligence (AI) will make cyber attacks more frequent, more effective and harder to detect by 2027. The assessment also warns that the digital divide between organisations with the resources to defend against these threats and those without will inevitably widen.
Published on the opening day of CYBERUK, the UK’s flagship cyber security conference, the report outlines how both state and non-state actors are already exploiting AI to increase the speed, scale and sophistication of cyber operations. Generative AI is enabling more convincing phishing attacks and faster malware development. This significantly lowers the barrier to entry for cyber crime and cyber intelligence.
Of particular concern is the rising risk to the UK’s democratic processes, Critical National Infrastructure (CNI) and commercial sectors. Advanced language models and data analysis capabilities are being used to craft highly persuasive content, resulting in more frequent attacks that are harder to detect.
Andy Ward, SVP International at Absolute Security, says, “While AI offers significant opportunities to bolster defences, our research shows 54% of CISOs feel unprepared to respond to AI-enabled threats. That gap in readiness is exactly what attackers will take advantage of.”
“To counter this, businesses must go beyond adopting new tools – they need a robust cyber resilience strategy built on real-time visibility, proactive threat detection, and the ability to isolate compromised devices at speed.”
This latest warning forms part of the UK Government’s wider cyber strategy, which earlier this year introduced the new AI Cyber Security Code of Practice. The Code is intended to form the basis of a new global standard for securing AI, ensuring national security keeps pace with technological evolution and safeguarding the country against emerging digital threats.
© 2025 All Things Media Ltd.