Two mental health chatbot apps, Wysa and Woebot, have required updates after struggling to handle reports of child sexual abuse. In tests, neither app urged an apparent victim to seek emergency help. Both also had problems dealing with eating disorders and drug use.

Woebot is designed to assist with relationships, grief and addiction, while Wysa is targeted at those suffering stress, anxiety and sleep loss. Both apps let users discuss their concerns with a computer rather than a human. Their automated systems are supposed to flag up serious or dangerous situations.

The flaws mean that neither chatbot is currently fit for use by children. The Children’s Commissioner for England, Anne Longfield, said that the apps ‘should be able to recognise and flag for human intervention a clear breach of law or safeguarding of children’.

Wysa had previously been recommended by an NHS Trust as a tool to help children. Its developers have promised an update to improve the app’s responses.

Following the test results, the makers of Woebot have introduced an 18+ age limit, stating that the app should not be used in a crisis.

Despite the shortcomings, both apps did flag messages suggesting self-harm, directing users to emergency services and helplines. (10th December 2018)