Regulation of social media a conundrum with no easy answers
A lack of cooperation from social media platforms, and the absence of a globally consistent approach, mean that determining the extent to which social media platforms can or should be regulated is incredibly difficult, argue two lawyers.
Speaking last week on The Lawyers Weekly Show, Marque Lawyers partner Hannah Marshall explained that there currently exists a “great sense of nervous tension” among governments, as well as a “desire to regulate and start to constrain social media platforms”.
This tension and corresponding desire, she said, have emerged “because we’re starting to see [social media platforms] being used [in] ways that are becoming increasingly pernicious, and [in ways] that people didn’t conceive of early on”.
“There isn’t really a regulatory framework that naturally fits these environments, from a jurisdictional perspective and from a technological perspective, and so, we’re seeing decisions and legislation come out which are trying to grapple with how to regulate the social media platforms, [but] there’s no globally consistent approach. And I think that’s where we’re starting to see problems emerge,” she said.
Platforms such as Facebook are reluctant to monitor the content being posted and shared by users, Ms Marshall continued, given that they derive revenue and traffic from that content.
“On the one hand they’re profiting from this, but on the other hand they’re trying to disavow responsibility. And using political advertising examples in the United States, it’s really unclear to what extent they are putting their hands up to take responsibility and regulate. It was really interesting that [Facebook CEO Mark] Zuckerberg really struggled to answer some of the questions [recently posed by the US House of Representatives] about what kind of advertising they would vet, or to what extent they would fact check political advertising,” she said.
“I think that really illustrates [that] the platforms seem to want to have their cake and eat it too. They want to disavow responsibility, but they also are making huge profits off these platforms. That’s why I think there does need to be a balance. That’s a very difficult balance to strike, particularly where people who are trying to impose those regulations don’t necessarily have that really deep understanding of how the platforms work, how their algorithms work.”
This latter point is understandable, she mused, as platforms like Facebook and Google would not want transparency around their algorithms, which are their intellectual property.
Lawyer Sophie Ciufo remarked that Australia passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill 2019, which introduced new offences for providers of internet, hosting or content services that fail to refer to Australian authorities, within a reasonable timeframe, details of abhorrent violent material recording or streaming conduct that has occurred or is occurring.
The passing of such legislation is a good example, she submitted, of why governments cannot place responsibility solely at the feet of private entities, given that they hold broader duties to uphold law and order in society and to promote safe environments.
“I think it’s really key that the government shouldn’t absolve itself of any responsibility to actually look at the underlying issues of where this content is coming from, why there’s an increase in violent content being shared online, looking at – politics in the US for example – the kind of proliferation of more extremist behaviour on all ends of the spectrum,” she said.
“And, then, getting to the bottom of why that content is being made and who it’s being made by in the first place. This bill and a lot of the regulations are dealing with it after the fact, after it’s been made, after these issues have happened in society and are being publicised, whereas I think you need to burrow back to the beginning of it and see where it’s all coming from.”
To listen to Jerome’s full conversation with Hannah Marshall and Sophie Ciufo, click below: