Whether it is a late-night political rant, a spat between neighbours, or a post that does not meet group rules, Darin Sullivan carries the burden of responsibility for content on the community Facebook group he moderates.
“A well-run community group can be a really positive space … But they [can] become negative toxic places too,” he said.
Mr Sullivan is among five administrators for the Kiama Community Page, based on New South Wales’ South Coast.
The page boasts a membership of 23,300 people, nearly four times the population of the town.
When 1,260 Facebook accounts requested access to the group last month, Mr Sullivan's alarm bells rang.
“Most of them looked to be scam accounts,” he said.
He described the profiles as “fake or plain-looking”, known to share “spam links … [or] ask for money.”
The firefighter said he volunteered up to 14 hours a week replying to messages, supervising content, and increasingly dealing with “dodgy” requests.
“Requests from scam accounts have been exponential,” Mr Sullivan said.
“And that poses a real risk to the group members because not everyone is savvy enough to know … that’s a scam post or message.
“It’s become a very important role to be the firewall to the community.”
Lumped with admin responsibilities
Mr Sullivan said the Kiama Community Page started as a force for good in 2015 but, like most online groups, has not been immune to misinformation, personal attacks and, in some cases, defamation.
“We’ve had situations where we’ve had people in fatal car accidents,” he said.
“The first thing the community does is go to the page for information … often before the family involved has been notified.
“That sort of thing can unravel … and can get really delicate to manage.”
Mr Sullivan said identifying a scam account had become the most difficult part of his role.
“If administrators use an automatic system, or firewall, [scam accounts] can still make it in,” he said.
“We have to manually go through … then use measures that we’ve learned along the way as to whether we think that’s a fake account or not.
“It’s tricky and sometimes not obvious.”
‘Demand better security tools’
Fiona Martin, an associate professor in online and convergent media at the University of Sydney, described scams as an "incessant issue" that only the media conglomerates themselves could address.
“We are going to have to demand better security tools from Facebook so that ordinary people can actually keep control of their sites,” she said.
The researcher said community Facebook pages had assumed both the importance and function of traditional media.
“The administrators and the page-owners of Facebook communities have the same responsibilities as the publishers of a newspaper or a radio broadcast or TV broadcast,” Dr Martin said.
"And if you're just a normal local with perhaps no real background … knowledge of these kinds of legalities, it's a lot of responsibility.
“It would be really sad if it became too difficult to maintain these groups.”
Dr Martin believed anybody who managed a group larger than 10 people should seek formal training.
“Because it’s going to be more expensive in the long run … if you get tangled up in the courts,” she said.
New scam code incoming
Financial losses from scams have increased nearly five-fold since 2020 and cost the nation more than $3 billion, according to Assistant Treasurer and Minister for Financial Services Stephen Jones.
He agreed media conglomerates could “do so much more to keep their networks safe”.
“We need to kick the scammers out,” Mr Jones said.
He said he was working on a new code of practice designed to ensure content issued by banks, telecommunications providers and social media companies was "verified and lawful".
"We hope to have draft legislation out over the next month or so to give businesses, consumers and the community an idea of where we are heading on this," Mr Jones said.