Delivery partners from the Laboratory for AI Security Research (LASR) have held their first joint event, marking the start of their work to bring together industry, academia and HMG experts to seize the benefits and opportunities of AI for UK national security.
The programme, launched by the Chancellor of the Duchy of Lancaster in November with over £8 million of government funding, is a new public-private partnership bringing together Government departments and agencies with innovation and research teams from Plexal, University of Oxford, The Alan Turing Institute and Queen’s University Belfast.
Building on the existing world-leading expertise of UK academic institutions, LASR will act as a new centre of excellence for AI security research. The Lab will engage with the broader cyber and AI ecosystem nationally and internationally to support the work of dedicated researchers, who will explore vulnerabilities in AI systems.
Taking a catalytic approach to the UK’s AI security ecosystem, the programme is seeking public and private input. Programme partners and the UK Government emphasised their commitment to commercialising research outputs where appropriate, supporting wider growth and prosperity.
LASR is part of the government’s wider work to improve the UK’s cyber defences and grow the economy, which includes the forthcoming Cyber Security and Resilience Bill and recent designation of data centres as critical national infrastructure.
Plexal, the innovation company headquartered on London’s Here East campus, will convene LASR’s multi-disciplinary approach and drive collaboration between stakeholders from across the global technology ecosystem. At a dedicated space in London, Plexal will bring industry and partners together to collaborate on AI security innovation, engaging with private industry to address the emerging security needs driven by increasing AI adoption and connecting innovation with policy requirements to support the commercialisation of these solutions.
“AI adoption presents tremendous economic and societal opportunities, but we must be mindful of emerging threats,” said Saj Huq, CCO and head of innovation at Plexal.
“Through this world-class LASR partnership, Plexal will drive the development and commercialisation of breakthrough solutions to enhance the resilience of the public and private sectors, creating growth vectors for the UK’s tech ecosystem.”
Oxford University will advance LASR with a multidisciplinary approach, drawing on expertise from across the University’s research ecosystem. Five departments from the Mathematical, Physical and Life Sciences Division (MPLS) will support an initial cohort of ten doctoral students conducting fundamental and applied research into AI and machine learning security. The Global Cyber Security Capacity Centre will investigate emerging system risks, with a particular focus on AI supply chains and national cybersecurity preparedness.
Professor James Naismith, Head of the MPLS Division at Oxford University, said: “LASR is an exciting and important collaboration between government and Oxford University scientists across the Mathematical, Physical and Life Sciences division (MPLS). We will work together to develop a rigorous understanding of the security and reliability of emerging AI systems. As these systems grow in power and utility, this work is extremely timely.
“Supporting an initial cohort of ten doctoral students across the mathematical and physical sciences, we are delighted to be attracting and training the next generation of researchers in this exciting initiative.”
The Alan Turing Institute, the national institute for data science and AI, will deliver cutting-edge research on AI security to help the UK stay one step ahead of hostile actors. Working with other LASR partners and academic institutions across the UK and internationally, the Institute will address challenges such as understanding vulnerabilities and detecting interference in AI models, and explore how to build safeguards to prevent AI systems from being used for malicious purposes.
Dr Jean Innes, CEO of The Alan Turing Institute, added: “The adoption of AI offers huge opportunities for businesses and public services, but we won’t realise the potential of these technologies unless we protect them from interference and attack.
“Mobilising the UK’s world-leading AI security research community to address this challenge will keep us one step ahead of those who seek to cause us harm.”
The Centre for Secure Information Technologies (CSIT), the UK’s national innovation and knowledge centre for cyber security based at Queen’s University Belfast, will harness its Cyber-AI hub to pioneer the integration of AI and cyber security, building on its existing facility with a dedicated maker space for Cyber-AI.
The hub will provide resources for collaboration between industry and academia to advance research and innovation, and to develop talent in the AI security domain.
Professor Paul Miller, Deputy Director of CSIT, concluded: “At CSIT, Queen’s University Belfast, we view AI security as the emerging frontier for cyber security. Our current research includes ensuring AI-based malware detection models are robust against adversarial malware attacks, which is crucial for securing the Internet of Things; and formal verification of systems of AI models deployed on autonomous vehicles.
“Building on this, CSIT believes the LASR programme is an exciting opportunity that will help position the UK as the go-to place globally for research and innovation in AI security.”