MANILA, March 26 — A worrying trend has emerged in the last few years, where intermediaries around the world are being used as chokepoints to restrict freedom of expression online, and to hold users accountable for content.
“All communication across the Internet is facilitated by intermediaries: Service providers, social networks, search engines, and more,” said Electronic Frontier Foundation (EFF) senior global policy analyst Jeremy Malcolm.
“These services are all routinely asked to take down content, and their policies for responding are often muddled, heavy-handed, or inconsistent.
“That results in censorship and the limiting of people’s rights,” he told Digital News Asia (DNA) on the sidelines of RightsCon, an Internet and human rights conference hosted in Manila from March 24-25.
This year, the government of France is moving to implement regulation that makes Internet operators “accomplices” of hate-speech offences if they host extremist messages.
In February, the Motion Picture Association of America (MPAA) and the Recording Industry Association of America (RIAA) urged ICANN (the Internet Corporation for Assigned Names and Numbers) to ensure that domain name registries and registrars “investigate copyright abuse complaints and respond appropriately.”
Closer to home, the Malaysian Government passed a controversial amendment to the Evidence Act 1950 – Section 114A – back in 2012.
Under Section 114A, an Internet user is deemed the publisher of any online content unless proven otherwise. The new legislation also makes individuals and those who administer, operate or provide spaces for online community forums, blogging and hosting services, liable for content published through their services.
Due to the potential negative impact on freedom of expression, a roadmap called the Manila Principles on Intermediary Liability was launched during RightsCon.
The EFF, the Centre for Internet and Society (India), Article 19, and other global partners unveiled the principles, whose framework outlines clear, fair requirements for content removal requests and details how to minimise the damage a takedown can do.
For example, if content is restricted because it’s unlawful in one country or region, then the scope of the restriction should be geographically limited as well.
The principles also urge adoption of laws shielding intermediaries from liability for third-party content, which encourages the creation of platforms that allow for online discussion and debate about controversial issues.
“Our goal is to protect everyone’s freedom of expression with a framework of safeguards and best practices for responding to requests for content removal,” said Malcolm.
Jyoti Panday from the Centre for Internet and Society India noted that people ask for expression to be removed from the Internet for various reasons, good and bad, claiming the authority of myriad local and national laws.
“It’s easy for important, lawful content to get caught in the crossfire. We hope these principles empower everyone – from governments and intermediaries, to the public – to fight back when online expression is censored,” she said.
The Manila Principles can be summarised in six key points:
Intermediaries should be shielded by law from liability for third-party content
Content must not be required to be restricted without an order by a judicial authority
Requests for restriction of content must be clear, unambiguous, and follow due process
Laws and content restriction orders and practices must comply with the tests of necessity and proportionality
Laws and content restriction policies and practices must respect due process
Transparency and accountability must be built into laws and content restriction policies and practices
“Right now, different countries have differing levels of protection when it comes to intermediary liability, and we’re saying that there should be expansive protection across all content,” said Malcolm.
“In addition, there is no logic in distinguishing between intellectual property (IP) and other forms of content, as is the case in the United States, for example, where under Section 230 of the Communications Decency Act, intermediaries are not liable for third-party content, but that immunity does not extend to IP,” he added.
The Manila Principles have two main targets: Governments and intermediaries themselves. The coalition, led by EFF, will be approaching governments to present the document and discuss the recommendations on how best to establish an intermediary liability regime.
This includes immunising intermediaries from liability and requiring a court order before any content can be taken down.
On the intermediary side, the coalition will approach companies such as Facebook, Twitter and Google to discuss establishing transparency, responsibility and accountability in any actions taken.
“We recognise that a lot of the time, intermediaries are not waiting for a court order before taking down content, and we’re telling them to avoid removing content unless there is a sufficiently good reason and users have been notified and presented that reason,” said Malcolm.
The overall aim of the Manila Principles is to influence policy changes for the better.
Malcolm pointed out that by coincidence, some encouraging developments have taken place in India. On the same day the principles were released, the Indian Supreme Court struck down the notorious Section 66A of the country’s Information Technology Act.
Since 2009, the law had allowed both criminal charges against users and the removal of content by intermediaries based on vague allegations that the content was “grossly offensive or has menacing character,” or that false information was posted “for the purpose of causing annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred or ill will.”
Calling it a “landmark decision,” Malcolm noted that the case shows why the establishment and promotion of the Manila Principles are important.
“Not only is the potential overreach of this provision obvious on its face, but it was, in practice, misused to quell legitimate discussion online, including against the plaintiffs in that case – two young women, one of whom made an innocuous Facebook post mildly critical of government officials, and the other who ‘liked’ it,” he said.
The court, however, upheld Section 69A of the Act, which allows the Government to block online content; and Section 79(3), which makes intermediaries such as YouTube or Facebook liable for not complying with government orders for censorship of content. — Digital News Asia