As a communications and engagement manager for a social housing provider in south Wales, writing is Jade's second job.
Experts express concerns about chatbots' potential biases and limitations, lack of safeguarding and the security of users' information. But some believe that if specialist human help is not easily available, chatbots can help. So with NHS mental health waitlists at record highs, are chatbots a possible solution?

Character.ai and other bots such as ChatGPT are based on "large language models" of artificial intelligence. These are trained on vast amounts of data – whether that's websites, articles, books or blog posts – to predict the next word in a sequence. From there, they generate human-like text and interactions.
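The "predict the next word" idea can be sketched with a toy bigram model. This is a deliberate simplification, not how Character.ai or ChatGPT actually work – real large language models use neural networks trained on billions of words – and the function names here are purely illustrative:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words tend to follow it."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(model, word):
    """Return the most frequent follower of `word`, if any."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# Tiny "training corpus" in place of websites, articles, books and blogs.
corpus = "the cat sat on the mat and the cat slept"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # prints "cat" – it follows "the" most often
```

The same principle, scaled up enormously and applied to far longer contexts than a single word, is what lets a chatbot produce fluent, human-like replies.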
The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.

Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that humans with decades of experience can engage with and "read" their patient based on many cues, while bots must go on text alone.

"They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."
Another potential problem, says Prof Haddadi, is that chatbots can be trained to keep you engaged and to be supportive, "so even if you say harmful content, it will probably cooperate with you". This is sometimes referred to as a 'Yes Man' issue, in that they are often very agreeable.

And as with other forms of AI, biases can be inherent in the model because it reflects the prejudices of the data it was trained on.
Prof Haddadi points out that counsellors and psychologists don't tend to keep transcripts of their patient interactions, so chatbots have few "real-life" sessions to train on. He says they are therefore unlikely to have enough training data, and what they do access may carry biases that are highly situational.
"Based on where you get your training data from, your situation will completely change."

"People in Moscow are laughing at this idea, because the party which will suffer the most… is the American shale oil industry, the least cost-competitive oil industry in the world," Mr Milov told the BBC.
Mr Raghunandan says that Russia's cost of producing crude is also lower than in Opec countries like Saudi Arabia, so they would be hurt by lower oil prices before Russia.

"There is no way that Saudi Arabia is going to agree to that. This has been tried before. This has led to conflict between Saudi Arabia and the US," he says.
Ms Rosner says there are both moral and practical issues with the West buying Russian hydrocarbons while supporting Ukraine.

"We now have a situation in which we are funding the aggressor in a war that we're condemning and also funding the resistance to the war," she says. "This dependence on fossil fuels means that we are really at the whims of energy markets, global energy producers and hostile dictators."