The government’s swift move to pass new legislation fining or potentially jailing social media companies and executives for failing to quickly remove violent content from their platforms has been met with real concern from the IT industry and legal sector over a lack of consultation.
The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill 2019 creates new offences for not “expeditiously” removing footage of criminal acts from content and hosting services, and for failing to notify the Australian Federal Police (AFP) “within a reasonable time” if violent material is found on providers' platforms.
This also impacts internet service providers and hosting providers, who will be obliged to report, remove or cease hosting "abhorrent violent material". There will be two new sets of offences under the law -- failure to remove abhorrent violent material "expeditiously" would carry up to three years' imprisonment or fines of up to 10 per cent of the platform's annual turnover; and failure of the social media platforms to notify the AFP would attract fines of up to $168,000 for an individual or $840,000 for a corporation.
The new legislation comes in the wake of last month's tragic Christchurch mosque attacks and the failure of social media companies such as Facebook to quickly take down footage of the attack.
Alex McCauley, CEO of tech start-up advocacy group StartupAUS, said the haste and lack of consultation with which the legislation was created are real concerns, and that emerging technologies will present many more thorny regulation questions in the future.
“Between this and the Assistance and Access (AA) Act, we're starting to see a trend towards jumping into anti-tech legislation in a knee-jerk fashion,” McCauley said. “This legislation was drafted and rushed through the Senate in less than three weeks.
"That's not enough time to get it right. There has been virtually no consultation, which has led to a poor piece of legislation. Nowhere is this clearer than the fact that the proposed law doesn't include a public interest exemption -- something that is deeply concerning.”
McCauley said the government needs to be thoughtful and deliberate about how it approaches these issues: engaging in consultation with industry, upskilling lawmakers on existing and emerging technologies, and planning ahead for what's coming and how to respond sensibly.
“If we rush it, we'll get it wrong. We then run the risk of hurting a promising Australian industry while simultaneously failing to protect the public adequately,” McCauley said. “Much like the AA Act, the Sharing of Violent and Abhorrent Media Bill is broad in scope, and with the ubiquity of businesses and citizens with an online presence, has potential applications and misapplications right across our society.
“We call on the government to learn from the mishandling of the AA Act and send the legislation to the PJCIS for review while conducting a meaningful consultation with stakeholders. A robust policy development process is critical here to make sure the law we pass will actually work.”
While the Law Council of Australia agrees that action needs to be taken in this area, its president Arthur Moses expressed disappointment and concern at the laws being "rammed" through in 24 hours without scrutiny or consultation, saying they will have negative unintended consequences.
“We now have a situation where important news can be censored across social media platforms, which is contrary to the democratic principle of a free press, which exists to hold governments to account,” Moses said. “While the Law Council agrees that action needs to be taken in this area, consultation was required. These laws should have been subject to the committee process.”
“When parliament returns after the federal election these laws must be reviewed and amendments made to deal with the negative impacts they have the potential of causing.”
Digital Rights Watch chair Tim Singleton Norton said it was wrong of the government to assume that an amendment to the Criminal Code was going to solve the wider issue of content moderation on the internet.
“Poorly designed criminal intermediary liability rules are not the right approach here, which the Government would know if it had taken the time to consult properly,” Norton said.
“The reality here is that there is no easy way to stop people from uploading or sharing links to videos of harmful content. No magic algorithm exists that can distinguish a violent massacre from videos of police brutality."
Lucie Krahulcova, Australia Policy Analyst at Access Now, added that regulating online speech in a matter of days was a tremendous mistake by the government.
“Reforming criminal law in a way that can heavily impact free expression online is unacceptable in a democracy. If Australian officials seek to ram through half-cooked fixes past Parliament without the proper expert advice and public scrutiny, the result is likely to be a law that undermines human rights. Last year’s encryption-breaking powers are a prime example of this,” she said.
“Regulating online speech in a few days is a tremendous mistake. Rather than pushing through reactionary proposals that make for good talking points, the Australian government and members of Parliament should invest in a measured, paced participatory reflection carefully aimed at achieving their legitimate public policy goals.”