General Sir Richard Barrons warned that ‘scary’ advances in artificial intelligence could see AI-enhanced bioweapons used to target specific groups. He said governments must do more
The use of AI to enhance biological weapons could destroy the human race, a former top soldier has claimed. General Sir Richard Barrons spoke of a worst-case scenario in which “scary” advances in artificial intelligence could cause widespread devastation.
He argued that governments must do more to counter the threat before hostile states or actors use the technology to more easily develop deadly pathogens, the i newspaper reported. Gen Barrons said: “The most terrifying thing is the same technology that makes individually targeted medicines could make individually targeted pathogens.
“And one scary side of that is a state does it – so, a biowar. Essentially, this pathogen targets a characteristic of people they don’t like.
“It can be race, it can be colour, whatever – that’s horrible. Or scary, scary AI, which drives these things, produces and releases a pathogen that’s designed to kill all humans, and to which there is no antidote.
“None of that is imminently likely, but what it does say is, we need to think about the risk to our security, indeed our existence.” A report by the authoritative Washington DC-based Center for Strategic and International Studies warned earlier this year of the potential threat of AI being adopted by rogue states or terrorists.
It said popular large language models, such as ChatGPT, “could soon drastically lower” the barriers to “planning and executing biological attacks” caused by a lack of knowledge on the part of would-be creators. AI could inadvertently help “novices develop and acquire bioweapons by providing critical information and step-by-step guidance”, the report said.
It added: “While some of the leading companies are voluntarily imposing safeguards, the overall trajectory nevertheless points towards a near-term future in which policymakers must confront bioterrorism risks, not just from sophisticated state and terrorist organisations, but potentially from individuals with little technical background but access to popular LLMs.”
Future AI biological design tools, or BDTs, could assist in “developing more harmful or even novel epidemic- or pandemic-scale pathogens”, the report found. It said: “Malicious actors might use a more advanced BDT with generative capabilities to create a new version of bird flu that is both highly lethal and highly contagious.
“Such a scenario would make the Covid-19 pandemic – which has claimed more than 27 million lives and shaken the world economy – look like the common cold.” Boffins believe that while the potential harm from such weapons is high, the likelihood of them being used to devastating effect is still low.
Dr Alexander Ghionis, research fellow in chemical and biological security at the University of Sussex Business School, said: “The moment an actor cannot reliably predict who will be affected, for how long and under what conditions, the strategic value of an incapacitating agent quickly falls away. Once you look closely at intent and utility, the number of actors for whom an uncontrollable, highly lethal biological agent makes strategic sense is very small.
“The risks – blowback, loss of control, political isolation, economic self-harm – far outweigh any plausible benefit for most states, and for many non-state groups as well.” The Government has set up a review to evaluate “how advanced AI could assist chemical and biological misuse” and has worked with companies to address risks and develop their own thresholds.