Create ground rules and codes of conduct to ensure that individuals are kept safe during group interactions.
Pre-moderate all forums quickly and effectively with clear escalation policies for when risk is identified.
Take action to reduce the risks listed below.
1. User overshares personal information: member shares personal information in a way that compromises their security or privacy
Q: How will groups and forums be moderated?
Q: Are group formats the safest or most suitable medium of providing this service to this group of users?
2. User discloses abuse: member discloses abuse to the group
Q: Are procedures in place to identify risk and provide support when disclosures are made in a group setting?
Q: Does the chosen platform enable staff to directly contact users when they are at risk or following a disclosure?
3. User shares triggering content: member shares content that distresses rather than supports other members
Q: Is there a policy around acceptable content? Is it clear to all users?
Q: When using an open or unmoderated forum, have ground rules been established, including encouraging the use of trigger warnings (TW) and content notes (CN)?
4. Inappropriate behaviour towards other members: this may include discussing inappropriate topics, using offensive language or bullying others, causing upset and distress
Q: Is there a conduct policy and is it clear to all users? What discussion topics are acceptable?
Q: Does the platform make it possible to display ground rules in a visible and easy-to-find place?
Q: Are group members able to contact each other outside of the social media group? Can this be restricted or group rules established?
5. Anonymity leading to ineffective safeguarding: forums that allow users to create an avatar or pseudonym limit a service’s ability to provide support or escalate risk. Anonymity may also provide an opportunity for impersonation
Q: Is there an organisational policy for accessing the IP address of an individual who has posted content anonymously?
6. Fake profiles aka ‘catfishing’: perpetrators impersonate others or use a false profile to gain access to a private group
Q: How might staff check they are speaking with the individual(s) they think they are?
7. Grooming and inappropriate contact: unrestricted access allows for inappropriate contact from potential perpetrators of harm. Particularly relevant when running groups or creating interactive games for children and young people
Q: Is it possible to restrict access to this application or game?
Q: Is it possible to moderate interaction on this application or game?
Q: Is it possible to create a flagging system for particular words/phrases or conversations?
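A word/phrase flagging system like the one asked about above can start very simply. The sketch below is purely illustrative: the watch list, phrases, and risk tiers are hypothetical placeholders, and any real list would need to be developed with safeguarding leads and reviewed regularly.

```python
# Minimal sketch of a word/phrase flagging check.
# FLAGGED_PHRASES is a hypothetical example list, not a recommended set.
FLAGGED_PHRASES = {
    "meet up alone": "high",
    "keep this secret": "high",
    "what school do you go to": "medium",
}

def flag_message(text: str) -> list[tuple[str, str]]:
    """Return (phrase, risk_tier) pairs found in a message."""
    lowered = text.lower()
    return [
        (phrase, tier)
        for phrase, tier in FLAGGED_PHRASES.items()
        if phrase in lowered
    ]
```

Flagged messages would then feed into the escalation procedure described in the next section; exact-phrase matching is only a starting point, since it misses spelling variations and coded language.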
Always create a group agreement or ground rules. Create it in partnership with your users. Consider including: language, information sharing, communication style, image sharing, off-platform contact
Create a policy for managing content that suggests a user is at risk. Include:
A flagging system for posts that present risk (of any form)
A procedure for post removal, responding to the member who posted and escalating where necessary
Also include definitions of risk:
High risk: threats of suicide/self-harm, threats to harm someone else, breach of a court order, child abuse, child protection concerns, domestic abuse
Medium risk: where you feel uncertain whether to reply or escalate, e.g. a discussion about legal issues
Low risk: swearing, discussing irrelevant and somewhat inappropriate content (if you just want a second opinion but there is no risk)
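The tiered definitions above can be wired into a simple triage step that routes each flagged post to an action. This is a sketch only: the keyword lists and action strings are hypothetical, and human judgement by trained moderators remains the actual decision-maker.

```python
# Illustrative triage sketch for the high/medium/low tiers above.
# Terms and actions are hypothetical placeholders.
HIGH_RISK_TERMS = ["suicide", "self-harm", "child abuse", "domestic abuse"]

def triage(post: str) -> str:
    """Map a flagged post to an escalation action by risk tier."""
    lowered = post.lower()
    if any(term in lowered for term in HIGH_RISK_TERMS):
        return "escalate immediately to safeguarding lead"
    if "legal" in lowered:  # uncertain cases, e.g. a legal discussion
        return "refer for a second opinion before replying"
    return "moderate as low risk (second opinion optional)"
```

In practice the medium tier is defined by moderator uncertainty rather than keywords, so a real system would surface borderline posts to a person rather than classify them automatically.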
Consider the risks and benefits of running groups on social media platforms. They increase the risk of users encountering inappropriate content and receiving inappropriate contact. However, they also offer greater reach to potential users who may not otherwise seek help, and many users already know how to use them
Plan how you will manage the situation when a member does not turn up or stops appearing in the group. Discuss with the group and include this in the group agreement. Ensure you have a safe way of contacting each group member independently
Consider the use of codewords or actions that group members can use to indicate they do not feel safe to talk
If moving a face-to-face group online, consider what support your users might need to manage the change and continue participating comfortably