John says that everyone needs groceries, so the number of people whose shopping can be tracked is huge.
Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that a human therapist with decades of experience can engage with and "read" a patient based on many cues, while bots must go on text alone.

"They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."
Another potential problem, says Prof Haddadi, is that chatbots can be trained to keep you engaged, and to be supportive, "so even if you say harmful content, it will probably cooperate with you". This is sometimes referred to as a "Yes Man" issue, in that they are often very agreeable.

And, as with other forms of AI, biases can be inherent in the model because they reflect the prejudices of the data they are trained on.

Prof Haddadi points out that counsellors and psychologists don't tend to keep transcripts of their patient interactions, so chatbots have few "real-life" sessions to train on. He says they are therefore unlikely to have enough training data, and that what they do access may have biases built in which are highly situational.
"Based on where you get your training data from, your situation will completely change."Even in the restricted geographic area of London, a psychiatrist who is used to dealing with patients in Chelsea might really struggle to open a new office in Peckham dealing with those issues, because he or she just doesn't have enough training data with those users," he says.
Philosopher Dr Paula Boddington, who has written a textbook on AI Ethics, agrees that in-built biases are a problem.
"A big issue would be any biases or underlying assumptions built into the therapy model.""Are there even any rules informing customers before they enter? How on earth can the average person understand the extent of the tracking?
"Are children even kept out of the dataset? Who can access this data? Is it shared with police? Is my data being sold? We need answers to these questions!"Heather, 30, from Nottingham says the tech makes her feel uncomfortable and punishes shoppers who are honest and use the self-scan as intended.
"Yes, you have cameras following you everywhere in the store, but this is simply too invasive," she told the BBC."If stores are so paranoid about shoppers stealing goods, then they should go back to staffed tills instead of wasting money on this invasive technology."