
9.4 Inappropriate content 


We looked in more detail at what constitutes inappropriate content in Section 8, The right to have your say. Content may be inappropriate for a variety of reasons, for example because it is obscene or offensive, incites violence, or infringes copyright, and we use the generic term to cover all of these eventualities. Some offensive content will fall short of breaking the law, but other material may contravene legislation including the Defamation Act 2013, the Equality Act 2010, the Public Order Act 1986 and the Protection from Harassment Act 1997.

The legal position is that an institution may be vicariously liable for inappropriate content posted on a site that the institution hosts, or for remarks made by one of its employees if the institution is deemed to have authorised the activity (it is unlikely to be held responsible for remarks posted on an employee's personal blog). Institutions hosting their own social media platforms should therefore provide a notice and takedown procedure to minimise liability for illegal or offensive content. While liability lies with the online host, it is sensible to apply the same procedure to institutionally branded use of widely used platforms.

Where inappropriate content has originated from the institution's network (whether the site to which it is posted is institutionally or externally hosted), there should be an incident response procedure to trace the activity to specific computers and user accounts. This applies to content that infringes copyright as well as material that is illegal by its nature, e.g. racist remarks. The institution is unlikely to face penalties provided it does not have actual knowledge of the content and acts quickly once notified of it. It will, however, face greater liability if it has played a part in transmitting or altering the content.

Similarly, it will be more difficult to argue lack of knowledge of the content where the institution plays a role in moderating a particular platform or group. For example, if inappropriate material is posted to a chat room or bulletin board used by tutorial groups and monitored by a member of the institution's teaching staff, the institution could be deemed to have sufficient editorial control to be strictly liable for the content.

Other considerations are whether the institution needs a procedure to check the identity and provenance of a complainant before taking down material, and whether there should be a put-back procedure if, for example, disputed material is found not to infringe copyright or a complaint is deemed to be frivolous.

Some of the most common issues are libel and contempt of court, e.g. commenting on a legal case before it is permissible to do so. This is very easy to fall foul of, but also very easy to regulate against: institutional policy can simply state, "Do not discuss cases which are ...".

Good practice tips:

  • Clarify your procedures – have a clear mechanism for reporting inappropriate content including details of the complainant, their interest in the matter and their contact details. 

  • Apply a notice and takedown procedure – you can use the template provided by Web2Rights as a basis for this. 

  • Clarify roles and responsibilities – identify the person or persons responsible for deciding whether or not content is inappropriate. 

  • Define levels of seriousness and appropriate responses – create a rubric for categorising inappropriate content and determining the appropriate action. You may like to use the model suggested below. 

  • Keep records – maintain a log of all reported incidents and the outcome. 

  • Involve your network team – have an incident response procedure to trace the source of inappropriate content posted via the institution's network. 

  • Communicate – ensure in particular that any institutional social network moderators are aware of their responsibilities.




Categorisation model for dealing with inappropriate social media posts (adapted from Rowe 2014)

  • Level 1 – Trivial (action: none). These are relatively trivial comments that should not provoke an institutional reaction. Intervening in these cases is likely to be counterproductive. 

  • Level 2 – Minor (action: optional; possibly contact the student or staff member to discuss the comment). These are comments that are not particularly offensive but display a lack of respect or judgement. There is no imperative to contact the perpetrator, but the institution could choose to do so, e.g. to suggest a more appropriate or constructive way of making the comment or criticism. 

  • Level 3 – Moderate (action: issue a warning). Comments in this category warrant contact with the perpetrator. They will generally display a distinct lack of respect and judgement and risk causing offence. They should be addressed by providing advice and information, usually with a warning regarding appropriate behaviour. 

  • Level 4 – Serious (action: formal disciplinary action). Comments in this category warrant immediate contact with the perpetrator. They would generally be those that break a law (physical threats; racist, sexist, homophobic or other discriminatory comments); constitute bullying; or are admissions of, or offers to engage in, inappropriate behaviour in respect of academic matters (e.g. cheating, plagiarism, collusion). A warning will generally be insufficient in these cases: a formal admonishment would be the minimum action, and more severe penalties such as exclusion or expulsion (in the case of students) might be considered. In some extreme cases, law enforcement agencies may need to be involved.
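Institutions automating parts of their incident log might encode the rubric above as a simple lookup. The following is a minimal illustrative sketch only; the names and wording are hypothetical, not part of any institutional system or the Rowe (2014) model itself.

```python
# Illustrative sketch of the four-level categorisation model.
# All identifiers and response wordings here are hypothetical.

RESPONSES = {
    1: ("Trivial", "No action"),
    2: ("Minor", "Optional contact to discuss the comment"),
    3: ("Moderate", "Issue a warning"),
    4: ("Serious", "Formal disciplinary action"),
}

def recommended_action(level: int) -> str:
    """Return the recommended response for a post categorised at `level`."""
    if level not in RESPONSES:
        raise ValueError(f"Unknown level: {level}")
    category, action = RESPONSES[level]
    return f"Level {level} ({category}): {action}"
```

Such a mapping would support the "keep records" tip above by ensuring each logged incident records both its category and the action taken, in consistent wording.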



