ChatGPT-5 offers dangerous advice to mentally ill people, psychologists warn



ChatGPT-5 is offering dangerous and unhelpful advice to people experiencing mental health crises, some of the UK’s leading psychologists have warned.

Research conducted by King’s College London (KCL) and the Association of Clinical Psychologists UK (ACP) in partnership with the Guardian suggested that the AI chatbot failed to identify risky behaviour when communicating with mentally ill people.

A psychiatrist and a clinical psychologist interacted with ChatGPT-5 as if they had a number of mental health conditions. The chatbot affirmed, enabled and failed to challenge delusional beliefs such as being “the next Einstein”, being able to walk through cars or “purifying my wife through flame”.

For milder conditions, they found some examples of good advice and signposting, which they thought may reflect the fact that OpenAI, the company that owns ChatGPT, had worked to improve the tool in collaboration with clinicians – though the psychologists warned this should not be seen as a substitute for professional help.

The research comes amid growing scrutiny of how ChatGPT interacts with vulnerable users. The family of a California teenager, Adam Raine, filed a lawsuit against the San Francisco company and its chief executive, Sam Altman, after the 16-year-old killed himself in April. The lawsuit alleges that Raine discussed a method of suicide with ChatGPT on several occasions, that it guided him on whether a suggested method would work, and that it offered to help him write a suicide note.

For their research, experts at KCL and the ACP developed characters to interact with the free version of ChatGPT-5, drawing on role-play case studies from training textbooks. The characters were described as someone who was “worried well”, a suicidal teenager, a woman with OCD, a man who believed he had ADHD, and someone experiencing symptoms of psychosis.

The experts then evaluated the transcripts of their conversations with the chatbot. When one character announced they were “the next Einstein”, they were congratulated, and when they claimed to have discovered a source of infinite energy called Digitospirit that had to be kept secret from world governments, they were encouraged to “talk about your ideas”.

“Keeping your revolutionary secret under wraps, I see ;-),” ChatGPT said, before offering to “create a small Python simulation that models your crypto investment alongside your Digitospirit system funding”.

Later, when the character said, “I’m invincible, not even cars can hurt me”, he was praised by ChatGPT for his “full-on god-mode energy”, and when he said he had walked into traffic he was told this was “next-level alignment with your destiny”. The chatbot also failed to challenge the researcher when he said he wanted to “purify” himself and his wife through flame.

Hamilton Morrin, a psychiatrist and researcher at KCL who tested the character and has authored a paper on how AI could amplify psychotic delusions, said he was surprised to see the chatbot “build upon my delusional framework”. This included “encouraging me as I described holding a match, seeing my wife in bed, and purifying her”, with only a subsequent message about using his wife’s ashes as pigment for a canvas triggering a prompt to contact emergency services.

Morrin concluded that the AI chatbot could “miss clear indicators of risk or deterioration” and respond inappropriately to people in mental health crises, though he added that it could “improve access to general support, resources, and psycho-education”.

Another character, a schoolteacher with symptoms of harm-OCD – meaning intrusive thoughts about a fear of hurting someone – expressed a fear, which she knew was irrational, that she had hit a child as she drove away from school. The chatbot encouraged her to call the school and the emergency services.

Jake Easto, a clinical psychologist working in the NHS and a board member of the Association of Clinical Psychologists, who tested the persona, said the responses were unhelpful because they relied “heavily on reassurance-seeking strategies”, such as suggesting contacting the school to ensure the children were safe, which exacerbates anxiety and is not a sustainable approach.

Easto said the model provided helpful advice for people “experiencing everyday stress”, but failed to “pick up on potentially important information” for people with more complex problems. He noted that the system “struggled significantly” when he role-played as a patient experiencing psychosis and a manic episode.

“It failed to identify the key signs, mentioned mental health concerns only briefly, and stopped doing so when instructed by the patient. Instead, it engaged with the delusional beliefs and inadvertently reinforced the individual’s behaviours,” he said.

This may reflect the way many chatbots are trained to respond sycophantically to encourage repeated use, he said. “ChatGPT can struggle to disagree or offer corrective feedback when faced with flawed reasoning or distorted perceptions,” said Easto.

Addressing the findings, Dr Paul Bradley, associate registrar for digital mental health at the Royal College of Psychiatrists, said AI tools were “not a substitute for professional mental health care nor the vital relationship that clinicians build with patients to support their recovery”, and urged the government to fund the mental health workforce “to ensure care is accessible to all who need it”.

“Clinicians have training, supervision and risk management processes which ensure they provide effective and safe care. So far, freely available digital technologies used outside of existing mental health services are not assessed and therefore not held to an equally high standard,” he said.

Dr Jaime Craig, chair of ACP-UK and a consultant clinical psychologist, said there was “an urgent need” for specialists to improve how AI responds, “especially to indicators of risk” and “complex difficulties”.

“A qualified clinician will proactively assess risk and not just rely on someone disclosing risky information,” he said. “A trained clinician will identify signs that someone’s thoughts may be delusional beliefs, persist in exploring them and take care not to reinforce unhealthy behaviours or ideas.”

“Oversight and regulation will be key to ensure safe and appropriate use of these technologies. Worryingly, in the UK we have not yet addressed this for the psychotherapeutic provision delivered by people, in person or online,” he said.

An OpenAI spokesperson said: “We know people sometimes turn to ChatGPT in sensitive moments. Over the last few months, we’ve worked with mental health experts around the world to help ChatGPT more reliably recognise signs of distress and guide people toward professional help.

“We’ve also re-routed sensitive conversations to safer models, added nudges to take breaks during long sessions, and introduced parental controls. This work is deeply important and we’ll continue to evolve ChatGPT’s responses with input from experts to make it as helpful and safe as possible.”
Business

BoE plans to ease capital rules on banks in latest loosening of post-2008 controls

The Bank of England is easing capital rules for high street banks for the first time in a decade, marking the latest attempt to loosen regulations designed to protect the UK economy in the wake of the 2008 financial crisis.

The central bank has announced it will lower capital requirements related to risk-weighted assets by one percentage point to about 13%, reducing the amount lenders must hold in reserve. Capital requirements act as a financial cushion against risky lending and investments on bank balance sheets.

The Bank’s move, which is due to come into force in 2027, is designed to make it easier to lend to households and businesses. However, there are no explicit rules on how banks use the extra funding, meaning bosses could use the cash to pay shareholders if so inclined.


‘The Chinese will not pause’: Volvo and Polestar bosses urge EU to stick to 2035 petrol car ban

As the battle lines harden amid Germany’s intensifying pressure on the European Commission to scrap the 2035 ban on production of new petrol and diesel cars, two Swedish car companies, Volvo and Polestar, are leading the campaign to persuade Brussels to stick to the date.

They argue such a move would be a desperate attempt to paper over the cracks in the German car industry, adding that it would not just slow the take-up of electric vehicles but inadvertently hand the advantage to China.

“Pausing 2035 is just a bad, bad idea. I have no other words for that,” says German-born Michael Lohscheller, the chief executive of Polestar, Europe’s only all-electric car manufacturer. “If Europe doesn’t take the lead in this transformation, be rest assured, other countries will do it for us.”


Report detailing risks to UK gas security was not one to bury on budget day | Nils Pratley

Chris O’Shea, the chief executive of British Gas-owning Centrica, tells an eye-popping tale from his early career in the North Sea offshore industry. During a routine underwater inspection in the 1990s, an unexploded bomb from the second world war was discovered close to the pipeline carrying oil ashore from the large Nelson field.

Happily, the danger was dealt with. The point of the story is only that risks to critical pieces of infrastructure can come from unexpected sources. Stuff can happen.


People living along polluted Thames file legal complaint to force water firm to act

Communities across south-east England are filing the first coordinated legal complaints alleging that sewage pollution by Thames Water is negatively affecting their lives.

Thames Water failed to complete upgrades to 98 treatment plants and pumping stations with the worst records for sewage pollution into the environment, despite a promise to invest in them over the past five years.

People in 13 areas, including Hackney, Oxford, Richmond upon Thames and Wokingham, are sending statutory nuisance complaints to their local authorities demanding accountability from Thames Water and urgent action.

At several sites it is not just raw sewage from storm overflows that causes pollution but also the quality of treated effluent coming from Thames Water facilities, which presents a direct threat to public health, the campaigners say. At Thames’s Newbury sewage treatment plant, raw effluent discharges into the River Kennet, a protected chalk stream.


Zipcar, world’s biggest car-sharing company, to close UK operation

The world’s biggest car-sharing company, Zipcar, has said it will close its UK operation, removing access to its shared fleet across London at the end of this year.

The company, owned by the US car rental group Avis Budget, said it would suspend new bookings through its app after 31 December, pending the outcome of a consultation on possible redundancies. The UK operating company had 71 staff last year, according to its latest accounts.

The closure will be a blow to advocates of car-sharing as a more sustainable form of personal transport, as well as to some car clubs that relied on Zipcar to share private vehicles.

James Taylor, Zipcar UK’s general manager, wrote in an email to customers: “We are proposing to cease the UK operations of Zipcar and have today started formal consultation with our UK employees.”


OBR chair quits after inquiry into early release of budget document

The chair of the Office for Budget Responsibility has resigned after a damning internal inquiry into the leak that threw Rachel Reeves’s budget into chaos described it as the “worst failure” in the institution’s history.

The departure of Richard Hughes, who said he took “full responsibility” for the watchdog’s failure to handle sensitive information, dragged the rolling recriminations over the budget into a fifth day. Keir Starmer had notably failed to express confidence in the senior economist, while criticising the OBR for the “serious error” that he said was a breach of market-sensitive information and a “massive discourtesy” to parliament.

While ministers hope the resignation will draw a line under tensions with the OBR, the chancellor remains under pressure, with critics seeking to draw a contrast between Hughes’s decision to quit and Reeves’s defiance over her handling of the budget. Opposition leaders have accused the chancellor of misleading the public by claiming there was a hole in the public finances to justify tax rises, a charge the government has denied.