What the High Court’s decision on liability and social media comments means for you

In a landmark decision earlier this month, the High Court ruled that organisations can be held liable as publishers of third-party comments on their social media accounts. This means anyone who runs a page can be held responsible for defamatory comments people leave on their posts.

While this decision has naturally caused anxiety, it doesn’t mean that organisations should shy away from encouraging conversation on their channels. By leveraging moderation tools already built into social media platforms and having a robust community management strategy in place, you can still build thriving communities.

How did this case come about?

The decision is part of ongoing litigation between Dylan Voller and several Australian media outlets over a series of allegedly defamatory comments published about Mr Voller on the outlets’ public Facebook pages in 2016 and 2017.

It is based on the finding of the Supreme Court of NSW that media outlets are liable for readers’ comments on their Facebook posts because a public page encourages and facilitates the publication of comments.

This ruling was upheld by the NSW Court of Appeal in June 2020 and was affirmed by the High Court of Australia on Wednesday 8 September 2021.

What was the outcome of the case?

A five-two majority held that facilitating and encouraging comments on a social media post amounts to participating in the communication of any defamatory content those comments contain, regardless of whether the owner of the post was aware of the comments.

What does this mean for you?

This does not only affect media outlets: any organisation or individual with a social media account can be found liable. Nor is the ruling Facebook-specific; it covers other social media platforms and websites with comment sections.

Therefore, if you post content on your social media page and encourage or invite comments, you are legally the publisher of those comments, and if any of them are defamatory, you can be sued.

How can you safeguard yourself and your business?

Since the case began, social media platforms have introduced different ways to moderate and limit the comments on posts.

Facebook: Page administrators can turn off comments, restrict who can comment, or automatically hide comments that contain specified keywords (a scripted example of keyword moderation follows this list).

Instagram: Account administrators can turn off comments on a post.

Twitter: Users can restrict who can reply to a tweet to only the accounts they follow or only the accounts mentioned in the tweet, which effectively turns off replies when no one else is mentioned.

LinkedIn: Publishers can turn off comments or restrict replies to connections only.
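
For pages that attract a high volume of comments, keyword-based hiding of the kind Facebook offers can also be automated. The sketch below is illustrative only, not a turnkey tool: it assumes a Facebook Page access token with the appropriate moderation permissions, and the token, post ID, keyword list and Graph API version are placeholders to replace with your own.

import requests

GRAPH_URL = "https://graph.facebook.com/v17.0"  # placeholder API version; use a current one
PAGE_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"           # placeholder Page access token with moderation permissions
POST_ID = "YOUR_PAGE_POST_ID"                   # placeholder ID of the page post to moderate
FLAGGED_KEYWORDS = {"scam", "fraud"}            # example word list; tailor it to your own risk profile

def fetch_comments(post_id):
    """Return the comments on a page post via the Graph API comments edge."""
    response = requests.get(
        f"{GRAPH_URL}/{post_id}/comments",
        params={"access_token": PAGE_TOKEN, "fields": "id,message"},
    )
    response.raise_for_status()
    return response.json().get("data", [])

def hide_comment(comment_id):
    """Hide a comment so it is no longer publicly visible, pending human review."""
    response = requests.post(
        f"{GRAPH_URL}/{comment_id}",
        params={"access_token": PAGE_TOKEN},
        data={"is_hidden": "true"},
    )
    response.raise_for_status()

def moderate(post_id):
    """Hide every comment on the post that contains a flagged keyword."""
    for comment in fetch_comments(post_id):
        text = comment.get("message", "").lower()
        if any(keyword in text for keyword in FLAGGED_KEYWORDS):
            hide_comment(comment["id"])
            print(f"Hid comment {comment['id']} for review")

if __name__ == "__main__":
    moderate(POST_ID)

Automation like this supplements rather than replaces human moderation, because a keyword filter will miss defamatory comments that don’t use the words you anticipated.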

Using these tools alone may not be enough to protect you or your business from the consequences of this decision. Limiting dialogue and user interaction in the comments also comes at a cost: social media channels and websites are among the biggest drivers of return on investment and audience sentiment, and switching comments off can reduce both.

Whether or not you choose to turn on these restriction features, you need a community management strategy in place. This includes regular moderation, clear escalation processes and terms of use for your channels. If this is something you’d like help with, please contact us.