Live chat and chatbot services

Safeguarding when using phone, live chat, or chatbots to provide instant support or advice, usually to anonymous users.

Take action

  1. Keep all software up to date, and design chatbots’ user journeys and decision trees to identify safeguarding risks (see the sketch after this list)

  2. Consider the needs of your users when choosing which messaging tool to use. For example, chatbots can be useful, but automated responses can create risk too. Consider what works best for complex problems or those that require critical thinking

  3. Take action to reduce the risks listed below
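
To make action 1 concrete, here is a minimal sketch of a chatbot decision tree with a safeguarding branch designed in. It is not taken from any specific chatbot product: the node shape, the keyword list, and the notifySafeguardingLead hook are all illustrative assumptions.

```ts
// Minimal sketch: a decision tree where every branch passes through a
// safeguarding check, so risk identification is designed into the journey.

type ChatNode = {
  prompt: string;
  // Decide the next node from the user's reply; undefined ends the journey.
  next?: (reply: string) => ChatNode | undefined;
};

// Illustrative keyword list; a real service would tune this with its
// safeguarding lead and review it regularly.
const SAFEGUARDING_KEYWORDS = ["hurt", "unsafe", "scared", "abuse"];

// Hypothetical hook: raise the concern inside the organisation
// (email the safeguarding lead, open a case, etc.).
function notifySafeguardingLead(reply: string): void {
  console.log("Safeguarding concern flagged:", reply);
}

const safeguardingNode: ChatNode = {
  prompt: "It sounds like you may not be safe. Would you like to talk to a person right now?",
};

const startNode: ChatNode = {
  prompt: "Hi, what would you like help with today?",
  next: (reply) => {
    if (SAFEGUARDING_KEYWORDS.some((k) => reply.toLowerCase().includes(k))) {
      notifySafeguardingLead(reply);
      return safeguardingNode; // divert to the safeguarding branch
    }
    return undefined; // continue the ordinary advice journey here
  },
};

// Example: "I feel unsafe at home" triggers the safeguarding branch.
console.log(startNode.next?.("I feel unsafe at home")?.prompt);
```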

Key risks

1. Being witnessed or overheard leads to further harm: a user who is at risk of abuse or harm is seen or overheard contacting others, in a way that increases the harm and risk to them

  • Q. Are exit options available and obvious to the user? (See the quick-exit sketch after this list.)

2. User receives bad information: chatbots or call handlers give inappropriate or incorrect information because they haven’t been kept up to date

  • Q. Are all services and organisations to which users are being signposted still functioning and appropriate? (See the link-check sketch after this list.)

  • Q. Is the chatbot updated regularly with relevant and useful information?

3. Chatbot fails to identify safeguarding issue: because it wasn’t kept up to date or safeguarding journeys weren’t designed in

  • Q. Is the chatbot’s journey and decision-making process able to notify the organisation of a safeguarding concern? (The decision-tree sketch under ‘Take action’ shows one way to design this in.)

4. Communication style distresses user: inappropriate online communication styles from chatbots or staff upset users

  • Q. Is it clear to the user who or what they are speaking with?

  • Q. Do staff or the chatbot use appropriate communication styles when responding to a user’s emotional state?

  • Q. Is language used validating and empathetic?
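
One widely used way to make exit options obvious (risk 1) is a “quick exit” control that instantly replaces the chat page with a neutral site. A minimal browser sketch in TypeScript; the neutral URL, the button styling, and the Escape-key shortcut are illustrative choices, not requirements:

```ts
// Minimal sketch of a "quick exit" control for a web chat page.

const NEUTRAL_SITE = "https://www.bbc.co.uk/weather"; // example neutral page

function quickExit(): void {
  // location.replace() swaps the current history entry, so the back
  // button does not lead straight back to the chat.
  window.location.replace(NEUTRAL_SITE);
}

// A large, always-visible button...
const exitButton = document.createElement("button");
exitButton.textContent = "Quick exit";
Object.assign(exitButton.style, {
  position: "fixed",
  top: "1rem",
  right: "1rem",
  padding: "1rem",
});
exitButton.addEventListener("click", quickExit);
document.body.append(exitButton);

// ...plus a keyboard shortcut for users who cannot reach the mouse in time.
document.addEventListener("keydown", (event) => {
  if (event.key === "Escape") quickExit();
});
```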

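For risk 2, a scheduled check that signposted services still respond can catch dead links before users do. A minimal sketch; the URL list is illustrative, and a failed request only means a person should review that entry:

```ts
// Minimal sketch: flag signposted services that no longer respond.
// Run on a schedule (e.g. weekly) alongside a human review of content.

const SIGNPOSTED_SERVICES = [
  "https://www.samaritans.org", // illustrative entries
  "https://www.childline.org.uk",
];

async function checkSignposting(): Promise<void> {
  for (const url of SIGNPOSTED_SERVICES) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      if (!res.ok) {
        console.warn(`Review signposting: ${url} returned ${res.status}`);
      }
    } catch {
      console.warn(`Review signposting: ${url} is unreachable`);
    }
  }
}

checkSignposting();
```
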
Take more action

Staff should limit complex or sensitive conversations which cannot be resolved or adequately discussed in the time available for the call.

See it in action

  • Little Window chatbot (Chayn)
  • Using web chat to support clients (We Are With You)

Dive deeper

  • How to replace your drop-in service with an online chat (Catalyst)
  • Safeguarding girls and boys - when chatbots answer their private questions (UNICEF)