4. Apply principles
Universal digital safeguarding principles and common issues that you need to design for.
Whether you're creating a new service, iterating an existing one or taking an offline service online, you'll be carrying out the process of design. This guide won't show you how to do that (other guides will). But this chapter and the next will show you how to design with the safety of your users and staff in mind.
This chapter is about common principles and universal issues you should always consider when assessing risk and making decisions. You don't need to be an expert in any of them, but you do need to understand the basics of each.
Each one applies similarly to any digital service - both in terms of how it's delivered to the user and how it's enabled and supported via back office functions.
Use them to inform your risk assessment - whether you are writing it for the first time or reviewing it.
Then use Chapter 5 (Design for safety) to identify specific risks that may apply to your service.
There are some differences between working face-to-face and working online. Online support may create a barrier for some and an enabler for others. Interactions can be quite different. Think about what this means for contact styles, communication and the steps you take to safeguard your users. The best way to do this is to involve them in your decision-making.
Sometimes people's social inhibitions loosen when they are online, even when they aren't anonymous. They might disclose or share personal things more quickly and directly than they would in a face-to-face setting. This can be empowering and therapeutic for some, but lead to regret for others.
This effect can also lead to people posting abusive comments or bullying and trolling others - known as toxic disinhibition.
It can be hard to gauge people's emotions online. Non-verbal signs may be lacking and interactions can feel disjointed and uncertain.
Consider how:
You might pick up on users' emotions and mental states when you are not able to see them
You might honour their feelings
Some users will lack confidence in using technology. Staff may feel the same. And not everyone will have access to devices and data/WiFi. Some may have previously had traumatic experiences of using tech.
Consider providing:
Reassurance and extra support
Information about your data security, data sharing and confidentiality policies
Online it's often possible to shield or hide one's identity. This can make it easier for people to reach out for support in stages. However, it also creates the risk of impersonation and sometimes makes it difficult to support people in a crisis, because they can't be identified.
Many people experience home as a sanctuary and safe place. Others may not. Consider it from their point of view. Do they have a safe and confidential space? Think about how interacting with you from this place might be for them. Consider how they might feel after leaving an interaction with your service. Is there anything you can do to help make this moment easier for them? Ask your users what they need.
Privacy is about your users' ability to control access to information about themselves. Maintaining privacy is crucial to keeping your users safe because it means they can reveal personal details and information autonomously, at their own pace.
You put your users at risk when you don't protect their privacy. Privacy should never be traded away for functionality.
Think about privacy in relation to:
Information you store about your service users
Information service users share with you in conversation
Data stored on a platform you run or facilitate
Your risk assessment should detail potential privacy risks. That way you can put measures in place to reduce or eliminate them.
Inform your users: tell them what data of theirs you will be collecting and storing and for what purpose
Secure their data: store users' information in a secure, password-protected file or other secure database or server. See 'Designing for information security', below
Ask for consent: be sure they are aware of where their data is stored (so they can make an informed decision) and where and/or with whom it may be shared if a safeguarding risk arises. See 'Designing for consent', below
Preserve anonymity: when passing a user's information on to a third party, ensure it is adequately anonymised unless a safeguarding risk is identified. This might be relevant when you are seeking advice on a piece of casework (see the sketch after this list)
Respect boundaries: respect your users' individual preferences and the limits they place on what they choose to share with you
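As an illustration of the anonymisation point, here is a minimal sketch in Python of pseudonymising a case record before it is shared for casework advice. The field names and salt handling are hypothetical, not something this guide prescribes; the idea is simply that direct identifiers are dropped and replaced with a one-way reference code.

```python
# A minimal, illustrative sketch of pseudonymising a case record before sharing
# it with a third party for advice. Field names and salt handling are
# hypothetical - adapt them to your own records and key-management practice.
import hashlib

def pseudonymise(record: dict, salt: str) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    # One-way reference code: your team can regenerate it, the third party cannot reverse it
    reference = hashlib.sha256((salt + record["name"]).encode("utf-8")).hexdigest()[:12]
    return {
        "reference": reference,
        "issue_summary": record["issue_summary"],
        # name, phone and address are deliberately not copied across
    }

case = {"name": "Jo Bloggs", "phone": "07700 900123", "issue_summary": "Needs debt advice"}
print(pseudonymise(case, salt="a-value-kept-out-of-the-shared-file"))
```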
Samaritans are famous for their work with anonymous callers in crisis. Their privacy statement provides insight into how they manage anonymity and data.
Consent is defined as 'permission for something to happen or agreement to do something' (Oxford English Dictionary). Valid consent must be obtained before providing care or support and before storing users' information. Consent is only valid when each party is informed and aware of their rights and obligations.
In the UK, the age at which a child can give their own consent to online services processing their data is 13. GDPR Article 8 offers more guidance around consent when working with children.
Use opt-in consent: offer 'opt in' rather than 'opt out' options to help users actively choose to give consent (see the sketch below)
Use simple and clear language: this helps users understand what they are consenting to. Keep consent forms short and to the point
Create an engaging consent process: consider language, style and format to reduce the chance of users skimming information. Read Understanding Digital Consent (MEF)
Undertake a DPIA: carry out a Data Protection Impact Assessment when offering online services to children, and design consent forms to be engaging
GDPR and consent (ICO)
Gaining consent when involving users in service design (Girl Effect)
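As a simple illustration of the opt-in approach, here is a hedged sketch in Python of how a consent decision might be recorded. The field names are hypothetical; the point is that nothing is pre-ticked, the purpose is stated in plain language, and the decision is timestamped so it can be evidenced or withdrawn later.

```python
# A minimal sketch of recording opt-in consent. Consent is only stored as
# "given" when the user has actively opted in, and the record keeps the
# plain-language purpose they were shown.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str        # plain-language statement of what the data will be used for
    given: bool         # never defaults to True - the user must actively opt in
    recorded_at: str

def record_opt_in(user_id: str, purpose: str, user_ticked_box: bool = False) -> ConsentRecord:
    return ConsentRecord(
        user_id=user_id,
        purpose=purpose,
        given=bool(user_ticked_box),
        recorded_at=datetime.now(timezone.utc).isoformat(),
    )

print(record_opt_in("user-42", "Store your contact details so we can follow up on your request", user_ticked_box=True))
```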
Information security is important because it ensures your users' data is kept safe from third parties. It also gives your users confidence to share information that will help you to help them.
While there are many technical controls you can implement, staff behaviour is the biggest risk to your information security. Your risk assessment should document risks and list both technical and procedural ways in which you will keep data secure.
Provide staff training: so they understand the procedures required of them to keep data safe
Review your internal security: where are you keeping your data and in what format? Consider your offline storage practices, security of data in transit, cloud storage and data wiping procedures (a sketch of encrypting stored records follows this list). Implement the National Cyber Security Centre's Five Technical Controls.
Review access to data: consider who has access to what information and what limits should be in place
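To make the secure-storage point concrete, here is a hedged sketch of encrypting a case note before it is written to disk, so a lost or stolen device does not expose readable records. It assumes the widely used Python "cryptography" package, which is an example choice rather than a tool this guide mandates.

```python
# A minimal sketch of encrypting a case note at rest with the "cryptography"
# package. In practice the key should live in a secrets manager or password
# vault, never alongside the encrypted data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

note = "2024-05-01: user disclosed housing concerns".encode("utf-8")
token = fernet.encrypt(note)        # safe to store or back up
original = fernet.decrypt(token)    # only recoverable with the key
print(original.decode("utf-8"))
```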
Designing for accessibility is about making a service usable by as many people, with as many different needs, as possible. Lack of accessibility increases the risk of digital platforms being used incorrectly, because their function or purpose isn't clear to users.
When designing, consider needs based on: age, race, ethnicity, gender, sexuality, ability, religious belief, socio-economic status and location. Think about these through the twin lenses of function and content.
Clear and structured content
Language for all abilities
Alt text for images
Hyperlinks with descriptive link text for easy navigation
Clear headings and paragraphs in any descriptive content
Font-size options
Use colour carefully: consider using toned-down colour palettes for websites and apps catering to those with experiences of trauma, and keep enough contrast between text and background (see the sketch after these lists)
Diverse and inclusive images
Diverse and inclusive language (for example, inclusive gender pronouns)
Comfort: ask users which platforms they feel comfortable and capable of using
Easy set up: consider software that is easy for users with lower digital literacy to set up
Clear functions: use applications with clear functions and layouts for service users with lower digital literacy
Less is more: limit the number of software platforms you use to communicate with each user. Using too many can create confusion and overwhelm
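One practical way to act on "use colour carefully" is to check the contrast between text and background colours. The sketch below implements the standard WCAG 2.x contrast-ratio calculation; the formula and the usual 4.5:1 minimum for normal-sized text come from the WCAG guidelines rather than from this guide.

```python
# A small, self-contained sketch of the WCAG 2.x contrast-ratio calculation,
# useful as a quick check that text and background colours stay readable.
def relative_luminance(rgb: tuple[int, int, int]) -> float:
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(a: tuple[int, int, int], b: tuple[int, int, int]) -> float:
    lighter, darker = sorted((relative_luminance(a), relative_luminance(b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Dark grey text on a white background: roughly 9.7:1, comfortably above 4.5:1
print(round(contrast_ratio((68, 68, 68), (255, 255, 255)), 1))
```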
You need to apply your organisation's moral framework to the design and delivery of your digital services. This means building a workplace culture - as well as individual and team capacity - to take the initiative, handle risk and safeguard when delivering services this way.
You need to provide adequate policies and guidance, and include digital safeguarding in your safeguarding training. This is especially important for staff who are less comfortable with tech or less digitally literate, because without it they may expose their service users and themselves to harm through unsafe digital interactions.
Identify key risks: so staff have basic confidence in their awareness of digital risks. Technology Safety's website provides various resources and toolkits to help identify and mitigate risk, especially their Toolkit for Survivors
Identify key considerations: when using any new communication tool or software. This might include asking:
Is this tool accessible (both in terms of function and content) for the service user group for whom it's intended?
How is the data held and stored by owners of this platform?
What are the security implications of using this tool?
Are service users already using another tool which they feel more comfortable with?
Apply their skills to a digital space: affirm that staff can bring their existing skills online by understanding new risks and adopting new communication styles
Secure devices: so the hardware they use and the information it stores is as secure as possible. Show them how to keep settings, anti-spyware and antivirus protection up to date
Collect consent: so that staff/client relationships are open and equal
Protect confidentiality: understanding required standards and when it is acceptable to break confidentiality
Adapt communication styles: so they make it easy for users to engage through online services
Understand a user's situation: so they know to consider risks, benefits and implications of a user's chosen device and the space they are using it in
Discuss digital safety and security with users: so they pass on their good practice and knowledge to those they support
Link: The online disinhibition effect (Online Therapy Institute)
Laws: Most laws and regulations regarding privacy and security fall under the Data Protection Act 2018 and the General Data Protection Regulation (GDPR).
Link: Privacy Project (The New York Times)
Link: Designing a secure digital service (NCSC)