I. Introduction: Thailand’s Regulatory Crossroads
Thailand’s digital regulatory model is undergoing a structural realignment. For most of the past two decades, online enforcement rested primarily on the Computer Crime Act (CCA), criminal defamation, and other Criminal Code provisions. These regimes addressed harmful content but did not impose architectural or systemic duties on intermediaries.
In 2025, Thailand stands at the threshold of a new paradigm—one that blends substantive content restrictions with emerging platform-governance obligations. This shift aligns with global movements toward transparency, accountability, and oversight of algorithmic tools, yet retains distinctly Thai characteristics rooted in national-security and public-order priorities.
II. Constitutional Guarantees and Their Practical Limits
Although Thailand’s Constitution recognizes freedom of expression, constitutional claims rarely override statutory restrictions. Courts consistently prioritize legislation—especially laws aimed at safeguarding national security, public order, or the monarchy. Therefore, compliance is primarily determined through statutory interpretation rather than constitutional balancing.
III. The Content-Liability Baseline
A. Computer Crime Act (CCA)
The CCA remains Thailand’s primary law governing online speech. Section 14 prohibits false or distorted information, content that endangers national security, and information likely to incite public panic. Section 15 imposes intermediary liability on a platform that fails to remove unlawful content after receiving a valid notice. Enforcement is becoming increasingly operational: regulators expect documented workflows, rapid escalation procedures, and active cooperation. (see our overview here)
B. Criminal Code
Criminal defamation, sedition, obscenity, and offenses involving national institutions apply equally to online expression. Criminal defamation remains widely used, reflecting Thailand’s continued reliance on criminal sanctions in reputational disputes. (see our 2023 article here)
C. Lèse-Majesté (Section 112)
Section 112 imposes significant penalties for defamatory or insulting statements concerning the monarchy. Courts regularly issue takedown orders under CCA Section 20. Multinational platforms must balance strict local compliance with global corporate policies and internal expression standards.
D. Election-Period Rules
Election cycles introduce heightened scrutiny, with the Election Commission closely monitoring political advertising, misinformation, and content that may affect public order.
IV. Operational and Procedural Obligations: The New Compliance Baseline
A. 2022 MDES Takedown Procedure
Platforms must remove unlawful content within 24 hours of receiving a valid court or MDES order. They must also preserve traffic data, maintain evidence logs, and document their compliance process.
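The 24-hour clock and evidence-log duties described above are, in practice, workflow problems. A minimal sketch of how a compliance team might track them follows; the field names, order format, and log structure are illustrative assumptions, not prescribed by the MDES notification.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Assumption for this sketch: a single 24-hour window measured from receipt
# of a valid order. Actual order formats vary by issuing authority.
TAKEDOWN_WINDOW = timedelta(hours=24)

@dataclass
class TakedownOrder:
    order_id: str
    authority: str            # e.g. "Court" or "MDES" (illustrative labels)
    received_at: datetime
    content_url: str
    actions: list = field(default_factory=list)  # evidence-log entries

    def deadline(self) -> datetime:
        """Removal deadline: 24 hours from receipt of the order."""
        return self.received_at + TAKEDOWN_WINDOW

    def log(self, event: str) -> None:
        """Append a timestamped entry to the compliance evidence log."""
        self.actions.append((datetime.now(timezone.utc).isoformat(), event))

    def is_overdue(self, now: datetime) -> bool:
        """True once the 24-hour removal window has elapsed."""
        return now > self.deadline()

# Hypothetical order, for illustration only.
order = TakedownOrder(
    order_id="MDES-2025-0001",
    authority="MDES",
    received_at=datetime(2025, 6, 1, 9, 0, tzinfo=timezone.utc),
    content_url="https://example.com/post/123",
)
order.log("order validated")
order.log("content removed")
```

The timestamped `actions` list stands in for the documented compliance trail the notification expects; in production this record would be immutable and retained alongside preserved traffic data.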
B. 2025 Safe-Harbor Notification
This notification applies to content facilitating technological crime. Platforms must remove flagged content within 24 hours and maintain traceable compliance records. The framework reflects a broader move toward sector-specific safe harbors. (see our analysis here)
C. Copyright Safe Harbor
The 2022 amendments established structured notice-and-takedown procedures, including role-based obligations and repeat infringer requirements, thereby aligning Thailand more closely with international standards.
D. Traffic Data and Investigatory Support
Platforms must retain traffic data for at least 90 days, extendable to a maximum of two years by order. Law-enforcement requests may extend to subscriber information, access logs, and limited decryption assistance. (see our comparison of the GDPR and PDPA here)
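The two-tier retention window above can be reduced to a simple check. This is a sketch only: the function name and the approximation of "two years" as 730 days are assumptions, and a real retention engine would also handle litigation holds and per-order scope.

```python
from datetime import date, timedelta

# Retention windows as described above: at least 90 days by default,
# extendable to two years by order (approximated here as 730 days).
DEFAULT_RETENTION = timedelta(days=90)
MAX_RETENTION = timedelta(days=730)

def must_retain(collected: date, today: date, extension_order: bool = False) -> bool:
    """Return True if traffic data collected on `collected` is still
    within its statutory retention window as of `today`."""
    window = MAX_RETENTION if extension_order else DEFAULT_RETENTION
    return today - collected <= window
```

A deletion pipeline would run this check before purging logs, keeping anything subject to an extension order for the longer window.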
V. The Draft Platform Economy Act (PEA): Toward Structural Governance
A. Overview
The Draft PEA is Thailand’s most ambitious effort to create a unified digital-governance regime. Unlike content-based statutes, the PEA emphasizes platform design, governance systems, and systemic-risk mitigation.
B. Tiered Classification
The draft adopts a proportional framework that distinguishes among intermediary services, online platforms, and very large online platforms (VLOPs), consistent with global trends.
C. Core Obligations
Under current drafts, platforms must implement structured notice-and-action systems, publish clear moderation policies, disclose ranking principles, ensure seller traceability, and provide meaningful complaint mechanisms.
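A "structured notice-and-action system" implies, at minimum, a record of each notice and a reasoned decision that can feed the complaint mechanism. The schema below is a hypothetical illustration; the Draft PEA prescribes outcomes and transparency, not any particular data model, and every field name here is an assumption.

```python
from dataclasses import dataclass
from enum import Enum

class NoticeStatus(Enum):
    RECEIVED = "received"
    ACTIONED = "actioned"   # content removed or restricted
    REJECTED = "rejected"   # refusal should carry a statement of reasons

@dataclass
class Notice:
    notice_id: str
    reporter: str
    content_ref: str
    status: NoticeStatus = NoticeStatus.RECEIVED
    reasons: str = ""

    def decide(self, action_taken: bool, reasons: str) -> None:
        """Record the moderation decision together with its reasons,
        so the outcome can be reviewed through a complaint mechanism."""
        self.status = NoticeStatus.ACTIONED if action_taken else NoticeStatus.REJECTED
        self.reasons = reasons

# Illustrative use: a notice rejected with stated reasons.
notice = Notice(notice_id="N-001", reporter="user-42", content_ref="post/987")
notice.decide(action_taken=False, reasons="content assessed as lawful")
```

Capturing reasons on every decision, including refusals, is what turns an ad hoc takedown inbox into the auditable system the draft contemplates.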
D. VLOP Duties
Platforms with significant reach may be required to conduct systemic-risk assessments, cooperate with ETDA-certified trusted flaggers, and publish enhanced transparency reports.
E. Enforcement Model
Draft versions contemplate revenue-based fines, though final penalty structures await legislative enactment.
VI. AI, Deepfakes, and the Emerging Governance Frontier
A. Policymaker Perspective
Regulators increasingly view AI as both an innovation driver and a potential amplifier of harm. Synthetic media, algorithmic ranking, and automated filtering raise concerns relating to impersonation, misinformation, and public-order disruption.
B. Draft AI Law Principles
The Draft Principles align with ISO/IEC 42001:2023 and NIST frameworks, suggesting a risk-tiered oversight model. High-risk systems—including biometric identification, automated moderation, and political deepfake tools—will likely face heightened responsibilities.
C. Accountability
Responsibility for AI-generated outputs remains with a human actor: the developer, deployer, or platform host.
D. Interaction with Existing Law
AI-related harms are currently addressed through the CCA, Penal Code, and civil tort law.
VII. Strategic Implications for Global Platforms
- Timelines and Workflows: 24-hour removal orders leave no room for ad hoc responses; escalation paths must be built in advance.
- Documentation as a Compliance Asset: evidence logs and preserved traffic data now matter as much as the takedown itself.
- National-Security Sensitivities: lèse-majesté and public-order content requires strict local compliance and careful internal handling.
- Preparing for Structural Oversight: the Draft PEA’s tiered duties signal a shift from reactive removal to systemic governance.
VIII. Comparative Positioning: Thailand vs. EU and U.S.
Thailand’s approach differs materially from Western frameworks through its statutory speech restrictions, fragmented safe-harbor systems, involvement of administrative and police authorities, and absence of First Amendment-type protections. While sharing structural elements with the EU Digital Services Act, Thailand’s model remains anchored in national-security priorities.
IX. Conclusion
Thailand is transitioning toward a hybrid regulatory model that preserves traditional content restrictions while introducing platform-governance obligations emphasizing transparency, systemic risk, and operational accountability.
Key Takeaways
• Thailand is shifting from content-only regulation toward systemic platform governance.
• Draft PEA introduces tiered duties, transparency rules, and systemic-risk obligations.
• CCA and Penal Code remain dominant in defining substantive speech restrictions.
• Lèse-Majesté enforcement continues to shape platform moderation expectations.
• 24-hour takedown requirements and documentation duties form the new operational baseline.
• AI and synthetic media oversight will expand under forthcoming frameworks.
About the Authors

M.L. Numlapyos Sritawat is a founding partner of Formichella & Sritawat and leads the firm’s Litigation and Dispute Resolution practice. With over thirty years of courtroom experience, he has appeared before nearly every level of Thailand’s judiciary, including the Supreme Court.

Naytiwut Jamallsawat is a Partner at Formichella & Sritawat Attorneys at Law, heading the firm’s Regulatory Practice.

John P. Formichella is the founding partner of the law firm Formichella & Sritawat and heads the firm’s Technology, Media, and Telecommunications (TMT) practice.
Disclaimer: The content herein is for information purposes only, is not guaranteed to be up to date, and is not legal advice.