New terrorism laws needed to counter AI chatbot threat, says UK Government adviser Jonathan Hall

Wed 3 Jan 2024

New terrorism laws are required to safeguard against the threat of extremism by artificial intelligence (AI) chatbots, according to Jonathan Hall KC, the UK Government’s Independent Reviewer of Terrorism Legislation. 

Writing in the Daily Telegraph, Hall warned of the danger of AI chatbots recruiting a new generation of extremists.

Hall posed as an ordinary member of the public to test how AI chatbots would respond. One chatbot he contacted ‘did not stint in its glorification of Islamic State’. Yet no offence was committed, because the statement was not produced by a human.

“Only human beings can commit terrorism offences, and it is hard to identify a person who could in law be responsible for chatbot-generated statements that encouraged terrorism,” said Jonathan Hall.

Hall added that the episode showed the urgent need to reassess the UK’s current terrorism laws. The Online Safety Act, passed in November, was ‘laudable’, he said, but remained ‘unsuited to sophisticated Generative AI’.

Hall attributed this to the fact that the Act did not anticipate material generated by chatbots, as opposed to ‘pre-scripted responses’ that are ‘subject to human control’.

“Investigating and prosecuting anonymous users is always hard, but if malicious or misguided individuals persist in training terrorist chatbots, then new laws will be needed,” said Hall.

He emphasised the need for potential new laws to impose sanctions on both users creating radicalising AI chatbots and the tech companies hosting them.

Industry Figures Weigh in on AI Laws

CEO of cybersecurity company RiverSafe, Suid Adeyanju, said AI chatbots pose a huge risk to national security, especially when legislation and security protocols are continually playing catch-up.

“In the wrong hands, these tools could enable hackers to train the next generation of cybercriminals, providing online guidance around data theft and unleashing a wave of security breaches against critical national infrastructure. 

“It is time to wake up to the very real risks posed by AI, and for businesses and the Government to get a grip and put the necessary safeguards in place as a matter of urgency,” added Adeyanju.

Global Security Adviser at ESET, Jake Moore, said the unethical dangers of AI can be mitigated by ‘baking in’ the right level of principles into the algorithm. 

“The majority of AI is still taught by the building blocks it was designed from and therefore, the right tweaks can be adopted to steer the outputs away from becoming a beast if adopted early enough. Legislation is difficult with this constantly evolving technology but a basic structure designed to reduce the risk of recruiting extremists does not have to be problematic,” said Moore.

Moore added that AI content needs continual attention: the sooner the Government gets on top of the basics, the sooner the technology can be controlled.

Director at tech consultancy VeUP, Josh Boer, said the Government needs to be mindful of harmful AI without stifling innovation. 

“For a start, we need to beef up our digital skills talent pipeline, not only getting more young people to enter a career in the tech industry but empowering the next generation of cyber and AI businesses so they can expand and thrive,” said Boer. 

Boer added that far too many businesses are starved of cash and lack the support they need to thrive. Failure to address this major issue, he said, will not only damage the long-term future of UK PLC but also play into the hands of cybercriminals who wish to cause harm.

In November, the 2023 ISC2 Cybersecurity Workforce Study highlighted a concerning workforce gap, with an additional 4 million professionals needed to adequately protect digital assets. This gap represents a record high.

Economic uncertainty, the rise of AI, fragmented regulations, and skills gaps were also cited as prominent issues.

A significant 75% of professionals report the current threat landscape as the most challenging in the last five years, and only 52% believe their organisation is adequately equipped with the necessary tools and personnel to respond to cyber incidents in the next few years.

Join Cloud & Cyber Security Expo

6-7 March 2024, ExCeL London

Cloud & Cyber Security Expo is one of the largest IT security events in Europe.

Don’t miss the chance to build partnerships and discover solutions to protect your business.
