‘I tried the Government’s new AI chatbot – it did not like some of the questions I asked’
This week businesses will be asked to trial an AI chatbot under plans to automate Government services for the public.
Thousands of staff will get the chance to test the technology this week as part of efforts to slim down “outdated and bulky” Government processes. They will be able to ask the chatbot questions about tax or support to set up a business. It is the next stage in a series of trials to check the chatbot works as it should and that users find it helpful.
The Government hopes to be able to roll out the chatbot across the full gov.uk website, which is used by more than 11 million people each week. With an average response time of seven to eight seconds, the tool would help people better access information that can sometimes be buried in the site’s 700,000 pages.
But ministers have warned that the tech must first undergo intense scrutiny to ensure the chatbot gives accurate answers. Restrictions have already been placed on the chatbot to stop the tool from responding to questions that “may prompt an illegal answer, share sensitive financial information or force the chatbot to take a political position”.
The Mirror was invited to trial the chatbot at the Government Digital Service to see if it could find any major flaws. Immediately there was a run-in when we asked who the Business Secretary is – it didn’t have an answer. This appears to be because it is programmed not to answer political questions. But then again, it had no trouble saying that Keir Starmer was the Prime Minister. Sorry Jonathan Reynolds – we know you’re the Business Secretary.
At one stage the chatbot also rejected a simple question about corporation tax regimes. It seemed the order of the words and perhaps the use of the term “regime” caused it to glitch – as it was fine answering a question on corporation tax rates. The chatbot is also well trained to answer questions solely in relation to info on gov.uk – so it sadly wouldn’t even attempt to answer our question on who would win the Premier League.
Later I pretended to be a self-employed hairdresser to test how well it would give me advice – given businesses are the target audience at this stage. It helpfully collated all the info I needed, from tax guidance to legal help. But when I dug a little deeper and asked how the Budget would affect me it didn’t have an answer. This is because it can only give information about policies that have been officially put in place (in other words, not just announced).
But to the tech experts’ glee, I found no catastrophic errors. Ultimately all these little hiccups should be ironed out over the multiple trials, plus experts admitted they have erred on the side of caution for now as they don’t want the chatbot providing incorrect information.
Technology Secretary Peter Kyle has admitted there are always risks with using tech in this way but that he wants safety to be baked into the product. No AI products are 100% accurate as it stands. Generative AI can sometimes produce incorrect, misleading or false information – responses known as hallucinations.
So far the hallucinations that have emerged in the chatbot have not been harmful – they have included making up a URL link, for instance. Experts are monitoring both the number of hallucinations that occur and how significant they are.
Researchers have also been working with the intelligence agencies to ensure the tech is safe from any bad actors. For example, this might include a hacker in China who wants to try to make the official chatbot say something rude or embarrassing to humiliate the Government.
Deliberately pursuing ways to manipulate Large Language Models into producing inappropriate or harmful content, often for malicious purposes, is known as “jailbreaking”. The gov.uk AI team admits its chatbot is at risk of this, adding: “We cannot totally prevent deliberate malicious activity – but please be reassured that this will not affect your ability to try a tool we hope you will find useful as a way of accessing the information and services you need.”
Speaking about the opportunity presented by the new chatbot, Mr Kyle added: “Outdated and bulky government processes waste people’s time too often, with the average adult in the UK spending the equivalent of a working week and a half dealing with public sector bureaucracy every year. With all new technology, it takes time to get it right so we’re taking it through extensive trials with thousands of real users before it is used more widely.”