One of the advantages of the internet is that it provides a wealth of knowledge to anyone with a device that can access it.
However, one of the downsides is that with so much information available, much of it is unverified, and some of it is inaccurate enough to be harmful.
Because of this, many believe social media companies, such as Facebook and YouTube, should be held accountable for the information shared on their websites.
A new research report released by watchdog group FRIENDS of Canadian Broadcasting argues these companies should be considered publishers, and thus held accountable for user-generated content published to their platforms.
“Our elected officials don’t need to create new laws to deal with this problem. They don’t need to define harmful content, police social media, or constrain free expression in any new way. All government needs to do is apply existing laws,” Daniel Bernhard, Executive Director for FRIENDS, said in a news release.
“But if a judge decides that content circulated on social media breaks the law, the platform which publishes and recommends that illegal content must be held liable for it,” he continued.
In their defense, social media companies have argued that they simply function as bulletin boards displaying user-generated content without editorial control, and that it would be impossible to identify illegal content among the 100 billion daily posts.
However, platforms such as Facebook tell advertisers that they have technology that recognizes content users post before it is published and pushed out to others.
Additionally, Facebook routinely exercises editorial control by promoting content users never asked to see, including extreme content that would land other publishers in legal trouble, and by concealing content from users without consulting them, another form of editorial control.
“Facebook and other social media platforms have complaints processes where they are alerted to potentially illegal or otherwise objectionable content. Yet it is their own community standards, not the law, which dictates whether they will remove a post,” George Carothers, director of research for FRIENDS, said in the same release.
“Even then, Facebook employees say that the company does not apply its own standards when prominent right-wing groups are involved,” he continued.