
PART FOUR: PROPOSED LCA RULES MUST BE DISCARDED AND OVERHAULED TOTO CAELO

By Advocates Tekane Maqakachane and Katleho Nyabela

We can only conclude on this polemical issue of internet content regulation by referring to the research-based empirical findings and recommendations of the Association for Progressive Communications (APC). The APC recommended that, in considering whether to regulate online content, an independent regulator such as the LCA must start:

moving towards a process-based co-regulatory approach to online content regulation, which does not make platforms liable for hosting individual pieces of content, but instead imposes a legal obligation on them to fully disclose their self-regulatory efforts to address illegal and harmful content on their services. Such disclosures should be combined with mandatory oversight by an independent regulator in order to allow for independent scrutiny of the necessity, proportionality and effectiveness of these measures.

Given the human rights and the social and political freedoms at stake, the LCA must be alive to the fact that, as the APC puts it,

… any state-driven intervention to regulate content online should be subject to particular precautions. Any legitimate intervention must (1) aim for a minimum level of intervention in accordance with the principles of necessity and proportionality, (2) be based on an inclusive consultation process with all relevant stakeholders, and (3) not strengthen the dominant position of the large incumbents by creating barriers for new entrants to the market. Many regulatory proposals to counter online harms run the risk of unintentionally creating more harm than the initial harms they try to combat.

In approaching the online content regulation question, the LCA must attempt to benchmark against emerging international regulatory best practices and principles, informed by the UK, French, German and European Commission approaches, among others.

What emerges from these approaches is a governance model of internet regulation that focuses on regulating company processes rather than content as such.

According to the APC, just as a central bank would not punish a money-launderer or terrorist financier (a customer of the bank) who manipulates the bank’s system for illegal activity, but would intervene where a bank has not taken sufficient due diligence measures to prevent its infrastructure from being used for illegal purposes, in a similar manner, “[t]his governance structure requires platforms to develop processes that ensure a systematic approach to controlling and minimising well-defined risks, and which take as a starting point the precautions outlined above.”

Shifting scrutiny towards these processes would help address some of the causal factors that give rise to illegal and harmful content online, without unduly interfering with individuals’ rights and the open architecture of the internet…. This approach steers us away from the disproportionate attention that is given to the removal of individual pieces of content.

Rather than trying to regulate the impossible, i.e. removal of individual pieces of content that are illegal or cause undefined harm, we need to focus on regulating the behaviour of platform-specific architectural amplifiers of such content: recommendation engines, search engine features such as autocomplete, features such as “trending”, and other mechanisms that predict what we want to see next. These are active design choices over which platforms have direct control, and for which they could ultimately be held liable.

Regulating these architectural elements is a more proportionate response than regulating content as such…. This approach reserves the right of platforms to determine how they promote, demote, demonetise – or take any other procedural measure – regarding content on their platforms.

Digital service providers are likely to prove more effective in identifying hazards and developing technical solutions than a central regulatory authority. But this prerogative can no longer go unchecked: an independent regulator should be able to assess the effectiveness of these procedural measures against a set of statutory objectives that go beyond simplistic content-related benchmarks such as removal rates and response times.

The regulator should have a foundational mandate to respect and indeed vindicate individuals’ rights online, and platforms should always have recourse to judicial review to challenge disproportionate regulatory action.

In “How (Not) to Regulate the Internet”, Peter Pomerantsev, arguing for greater protection of fundamental human rights rather than the policing of internet content itself, gave a resounding warning to independent communications regulators (and the LCA had better heed it in the self-same terms) that:

Get the regulatory approach right and it will help formulate rights and democracy in a digital age; get it wrong and it will exacerbate the very problems it is trying to solve, and play into the games of authoritarian regimes all agog to impose censorship and curb the free flow of actual information across borders.

RECOMMENDATIONS

On the basis of the above, we recommend the following:

The proposed Draft Rules 2020 must be discarded and overhauled toto caelo.

The LCA must issue an extensive consultation paper, substantially similar to the UK Online Harms White Paper model, setting out: the justifications for the proposed new internet content regulatory framework; the choice of regulatory model selected by the LCA as relevant to Lesotho; the scope of the framework, covering both the internet content and the legal personae that fall within it; the content and legal personae excluded from it; the expanded role and responsibilities of the LCA within the framework; the role of Parliament in ensuring the LCA’s compliance with that role and those responsibilities; and so forth.

A thorough, transparent, inclusive, participatory and credible consultative process must be conducted by the LCA, preferably through webinars, and through workshops or pitsos for rural communities or their representatives. At the initial stages, all stakeholders must be consulted.

This must be followed by a series of engagement webinars and workshops to convene representatives of communication service providers, civil society actors and user groups.

Finally, the LCA must engage with, and draw on advice from, legal, regulatory, technical, online safety and law enforcement experts to inform the further development of its proposals.

The proposed substantive enactment, in the form of a Bill (later to become an Act) rather than Rules, must deal with the substantive and procedural matters that will form the new internet content and activity regulatory framework, consistent with international regulatory best practices and principles.

The Bill must establish an independent regulator for online harms (whose functions may, for the purposes of that Bill, be reposed in the LCA itself). We propose that the Bill be named the Lesotho Internet Regulation and Safety Bill, 2020. NW

Tekane Maqakachane and Katleho Nyabela are practising advocates of the courts of Lesotho. This is part four of their submission to the Lesotho Communications Authority (LCA) regarding the proposed promulgation of the LCA (Internet Broadcasting) Rules, 2020. Read part one here, part two here, and part three here.
