Building a Consistent, Transparent, and Proportionate Legal Framework for Internet Content Governance — Calling on the Executive Yuan to Ensure Internet Content-Related Laws Adhere to Common Legal Principles
Section I: Background
In recent years, various government agencies in Taiwan have, in response to the governance demands posed by unlawful internet content, incrementally introduced into their governing statutes provisions imposing content restriction obligations on internet service providers (hereinafter "service providers"). This trend toward fragmented legislation has, however, progressively generated systemic problems: inconsistent regulatory standards across different laws, inadequate protection of user rights, and compounding compliance costs. Some agencies have borrowed existing regulations as templates for their own legislative work without adequately accounting for differences in policy objectives and in the characteristics of the services being regulated. Regulatory instruments of high intensity, which ought to remain exceptional, have been inappropriately transplanted into other subject-matter domains. This not only constitutes a disproportionate interference with users' freedom of expression and communication privacy, but also undermines the predictability of regulatory enforcement and service providers' confidence in the institutional framework, to the detriment of the long-term development of Taiwan's digital environment.
The National Communications Commission (NCC) has undertaken work on systemic recommendations for internet-related legislation. The principles it has articulated, including the prohibition on general monitoring obligations, the Good Samaritan spirit, safe harbor mechanisms, and the principle of proportionality, have been endorsed by industry, academia, and civil society. Meaningful progress has been made in establishing these principles at the conceptual level. The central task of the next phase is to institutionalize them, ensuring that they are consistently and effectively implemented across the legislative and enforcement practices of all agencies.
In light of the foregoing, internet service providers, academic institutions, civil society organizations, and relevant professional communities set out the following positions and calls to action.
Section II: Our Positions
- Content restrictions constitute an interference with fundamental rights and must be subject to strict legal constraints.
The exercise of freedom of expression, freedom of communication, and access to information via the internet is protected under the Constitution and international human rights norms. Any restriction on content circulating online must be clearly prescribed by law, serve a legitimate purpose, comply with the principle of proportionality, and provide effective avenues for remedy.
- Different types of services should be governed by different regulatory logics.
Online advertising, product listings on e-commerce platforms, public posts on social media, private conversations on messaging applications, and data transmitted through internet access services differ fundamentally in their technical architecture, the extent to which service providers can access content, and their implications for freedom of expression. They should not be subjected to a single uniform regulatory framework. Paid online advertising broadcast to unspecified audiences is substantively different from user-generated content (UGC) or interpersonal communications, both in terms of the level of protection afforded to free expression and the feasibility of provider-side review. Legal design should draw these distinctions clearly.
- The international mainstream is “notice-and-action,” not prior surveillance.
Imposing general prior monitoring obligations on service providers will cause them to adopt over-blocking as a risk-avoidance strategy, thereby impairing users' freedom of expression and access to information. Even with respect to harmful content of universally recognized severity, such as child sexual abuse material, legal frameworks across jurisdictions generally do not impose general prior monitoring obligations on service providers, but instead address the problem through limited means such as mandatory reporting obligations, specific technical matching, and time-bound removal requirements.
- Existing laws lack a common set of legal principles to serve as a shared reference baseline.
The outcome of the current agency-by-agency approach to legislation is not a differentiated and rationally designed regulatory structure calibrated to the characteristics of different subject matters, but rather a patchwork of unjustified inconsistencies arising from the absence of shared guiding principles. Some laws have established safe harbor mechanisms under which service providers are shielded from liability after procedurally compliant handling; others impose fines on service providers even after they have completed removal pursuant to a notice, and contain no exemption provisions. These inconsistencies undermine the incentive for service providers to cooperate with internet content governance and erode trust in the institutional framework.
- Internet content governance should be a multi-stakeholder endeavor, not a transfer of responsibility.
In domains where the standard of unlawfulness is ambiguous or requires deep semantic judgment, unilaterally shifting onto service providers the investigative and adjudicative responsibilities that ought to be borne by competent authorities or the judiciary is tantamount to requiring private enterprises to exercise public powers by proxy. This transfer of responsibility not only exceeds the proper role of service providers as intermediaries, but also lacks democratic legitimacy and runs counter to international norms.
Section III: Our Calls to Action
- Prohibit the imposition of general prior monitoring obligations.
Service providers bear no obligation to proactively monitor user content or the content of interpersonal communications. This is a baseline of internet governance law — not a policy option subject to balancing. Content moderation measures that service providers implement voluntarily and autonomously in order to maintain user trust and safety should be permitted by law, and the fact that a provider has acted proactively should not increase its legal liability; however, such autonomous measures should remain subject to reasonable public scrutiny.
- Attribution of content liability should revert to the actor; service providers’ obligations are supplementary in nature.
The premise that internet service providers are neither the creators nor the publishers of unlawful content should be foundational for all agencies engaged in legislation. When agencies introduce content-handling provisions into their governing statutes, the primary regulatory target should be the actual person who created or published the unlawful content. Imposing content restriction obligations on service providers is a supplementary measure, one that applies when existing mechanisms prove ineffective in holding the actual actor accountable, and not the default regulatory pathway. Moreover, the imposition of such supplementary obligations must still be proportionate, assessed against the nature of the legal interest protected and the severity of the harm.
- Safe harbors and protection for good-faith proactive measures should be a common institution across all content governance laws.
After receiving formal notification from competent authorities or judicial bodies, service providers that complete handling in accordance with legal procedures within a reasonable timeframe should not face administrative penalties or bear civil liability for damages on account of the unlawful content in question. What service providers are obligated to do is establish and implement reasonable content moderation processes, not guarantee that no unlawful content will ever appear on their services. That is, service providers bear responsibility for process, not for outcome. These safe harbor protections should not be provisions specific to individual statutes but should constitute a common baseline institution across all relevant laws. In addition, service providers that, in good faith and for the purpose of maintaining user trust and safety, take self-regulatory content moderation measures that incidentally cause harm to third parties should, as a general matter, not be held liable in damages. This protection is designed to encourage service providers to take the initiative in maintaining a safe online environment, and to prevent them from being deterred from active management by concerns about potential third-party legal exposure.
- Adopt differentiated design based on the subject matter of regulation and the type of service.
Regulatory intensity should be calibrated according to: the clarity with which unlawfulness can be determined; the severity and irreversibility of the harm to protected legal interests; the degree to which the type of service allows the provider to access and intervene in content; and technical feasibility. Higher obligations may be imposed in respect of content whose unlawfulness is clear, whose harm to protected interests is serious, and where technical intervention is practicable. For content where the determination of unlawfulness is prone to gray areas, notice-and-action should constitute the basic framework. The regulatory logic applicable to public content should not be directly extended to interpersonal communications, as doing so is not technically feasible and constitutes a serious infringement of the confidentiality of correspondence. Paid advertising content broadcast to unspecified persons and user-generated content should likewise be subject to different management standards. With respect to advertising, the fact that service providers receive remuneration to serve as the broadcast medium provides a reasonable basis for imposing on them higher ex ante content review obligations, as well as obligations to verify and vet advertisers' identities. With respect to user-generated content, service providers are neither the creators nor the publishers of such content, and it is inappropriate to impose on them ex ante review obligations; notice-and-action should be the default.
- Ensure due process and effective remedies.
Decisions imposing content restrictions, whether made by competent authorities or judicial bodies, must cite their legal basis and contain: a specific factual description of the content or user activity alleged to be unlawful (including information sufficient to uniquely identify the specific unlawful content or user); a reasoned application of the cited law to the described facts; specification of the concrete measures required and their scope; and the remedies available to affected users and service providers. Both affected users and service providers should have the right to seek remedy with respect to individual decisions. Instances under some current regulations where service providers are still penalized after complying within a reasonable timeframe must be addressed through institutional reform.
- Enhance transparency and accountability on the government side.
Transparency obligations should not fall exclusively on service providers. Law enforcement agencies should periodically publish statistics on their requests to internet service providers for content or user activity restrictions and for data access, explain how these requests relate to enforcement outcomes, and enable society to assess whether the cost to rights imposed in pursuit of the public interest is reasonable.
- Establish an institutional cross-agency regulatory harmonization mechanism.
The Executive Yuan should establish a formal cross-agency review mechanism to conduct consistency-of-principles review of regulatory drafts, enforcement guidelines, and administrative measures involving internet content management across all agencies. This mechanism should be jointly driven by the NCC and the Ministry of Digital Affairs (MODA), and should specifically encompass: assessment to ensure that draft legislation meets safe harbor standards; requirements for transparent regulatory impact analyses addressing proportionality and potential adverse effects on innovation and freedom of expression; and obligations on agencies to initiate multi-stakeholder consultation processes in the course of legislation and amendment.
Furthermore, building on the NCC’s existing analytical work, the government should develop standardized model provisions — including model texts for safe harbor clauses, notice-and-action procedures, Good Samaritan protections, and due process safeguards — for agencies to draw upon when revising their governing statutes. This would accelerate legal harmonization, improve regulatory predictability, and ensure that rules governing emerging domains do not depart from coherent digital governance principles.
Section IV: Conclusion
We fully understand the government's resolve and responsibility to combat cybercrime, protect the safety of children and adolescents, prevent fraud, and maintain public order. This statement advocates that reasonable regulation of internet content must conform to the principles of a state governed by the rule of law: clarity, consistency, predictability, proportionality, and the provision of adequate procedural safeguards and avenues of relief where rights are restricted. Only in this way can we protect the internet's role as an open, free, and trustworthy public information infrastructure while maintaining the public interest, and prevent Taiwan's internet governance framework from deviating from the consensus of democracies governed by the rule of law.
We hope that the principles articulated in this statement can serve as the starting point for ongoing dialogue among government, industry, academia, and civil society — dialogue that will ultimately, through the establishment of consistent legal review frameworks, standardized legislative templates, and multi-stakeholder consultation processes, enable all parties to jointly build a digital environment that is safe, open, fully rights-protective, principled in its consistency, and faithful to democratic values.
Signatories
Initiating Organization: Taiwan Internet Governance Forum (TWIGF)
Supporting Signatories:
- Internet Service Providers: [to be listed]
- Industry Associations in the Digital Economy: [to be listed]
- Academia: [to be listed]
- Civil Society Organizations: [to be listed]
- Relevant Professional Communities: [to be listed]
This statement welcomes endorsement by all stakeholders who share the principles set out above.