🤖 Live chat and chatbot services
Safeguarding when using phone, live chat, or chatbots to provide instant support or advice, usually to anonymous users.
Take action
Keep all software up to date
Design chatbots’ user journeys and decision trees to identify safeguarding risks (see the sketch after this list)
Consider the needs of your users when choosing which messaging tool to use. For example, chatbots can be useful, but automated responses can also create risk. Think about what works best for complex problems or those that require critical thinking
Take action to reduce the risks listed below
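As a rough illustration of designing safeguarding into a chatbot’s journey, the sketch below shows a decision-tree node that carries a safeguarding level and routes free-text replies to an escalation branch. The node structure, keyword list, and function names are assumptions made for illustration, not features of any particular chatbot platform.

```typescript
// Illustrative sketch only: JourneyNode, SafeguardingLevel and
// SAFEGUARDING_KEYWORDS are hypothetical names, not a real platform's API.

type SafeguardingLevel = "none" | "monitor" | "escalate";

interface JourneyNode {
  id: string;
  prompt: string;                              // what the bot says at this step
  options: { label: string; next: string }[];  // designed journey choices
  safeguarding: SafeguardingLevel;             // risk level designed into this step
}

// Keywords checked alongside the designed journey so that free-text replies
// can still reach the safeguarding branch.
const SAFEGUARDING_KEYWORDS = ["unsafe", "afraid", "hurt me", "suicide", "abuse"];

function nextNode(
  current: JourneyNode,
  userReply: string,
  nodes: Map<string, JourneyNode>
): JourneyNode {
  const reply = userReply.toLowerCase().trim();
  const flagged = SAFEGUARDING_KEYWORDS.some(k => reply.includes(k));

  if (flagged || current.safeguarding === "escalate") {
    // Route to a dedicated safeguarding branch rather than continuing
    // the ordinary journey.
    return nodes.get("safeguarding-escalation")!;
  }

  const choice = current.options.find(o => o.label.toLowerCase() === reply);
  return choice ? nodes.get(choice.next)! : current; // stay on the step if no match
}
```

The design point is that the safeguarding branch exists in the journey itself, rather than relying on a user choosing the “right” option.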
Key risks
1. Being witnessed or overheard leads to further harm: a user at risk of abuse or harm is overheard or seen making contact, in a way that increases the risk of harm to them
Q. Are exit options available and obvious to the user?
2. User receives bad information: chatbots or call handlers give inappropriate or incorrect information because they haven’t been updated
Q. Are all services and organisations to which users are being signposted still functioning and appropriate?
Q. Is the chatbot updated regularly with relevant and useful information?
3. Chatbot fails to identify a safeguarding issue: because it wasn’t kept up to date or safeguarding journeys weren’t designed in
Q. Is the chatbot’s journey and decision-making process able to notify the organisation of a safeguarding concern? (see the sketch after this list)
4. Communication style distresses user: inappropriate online communication styles from chatbots or staff upsets users
Q. Is it clear to the user who or what they are speaking with?
Q. Do staff or the chatbot use an appropriate communication style when responding to a user’s emotional state?
Q. Is language used validating and empathetic?
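To make the escalation question in risk 3 concrete, here is a minimal, hypothetical sketch of how a chatbot’s safeguarding branch might notify the organisation. The alert shape, webhook URL, and function names are assumptions; a real service would follow its own safeguarding procedures and data-protection rules.

```typescript
// Hypothetical escalation hook: the names and webhook URL are placeholders.

interface SafeguardingAlert {
  sessionId: string;         // chat session only; no more personal data than needed
  triggeredAt: string;       // ISO timestamp
  reason: string;            // e.g. "keyword match" or "escalation step reached"
  transcriptExcerpt: string; // the message that triggered the concern
}

// Sends the alert to the team responsible for safeguarding so a human can
// review and respond. In practice this might be a case-management system,
// an on-call rota, or a monitored inbox.
async function notifySafeguardingTeam(alert: SafeguardingAlert): Promise<void> {
  await fetch("https://example.org/internal/safeguarding-alerts", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(alert),
  });
}

// Example: called from the chatbot's safeguarding-escalation branch.
notifySafeguardingTeam({
  sessionId: "session-1234",
  triggeredAt: new Date().toISOString(),
  reason: "keyword match",
  transcriptExcerpt: "I don't feel safe at home",
}).catch(err => console.error("Safeguarding alert failed to send", err));
```

Whatever the mechanism, the concern should reach a person who can act on it, not just sit in a log.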
Take more action
Staff should limit complex or sensitive conversations which cannot be resolved or adequately discussed in the time available for the call.
See it in action
Little Window chatbot (Chayn)
Using web chat to support clients (We Are With You)
Dive deeper