Legal framework

Legal regime

Does your jurisdiction have a legal regime governing or addressing online safety? If so, how does it operate?

Portugal operates a multi-tiered regulatory framework for online safety. At its foundation lies the European Union's Digital Services Act (Regulation (EU) 2022/2065 (DSA)), which applies to all providers of intermediary services, including very large online platforms (VLOPs) and very large online search engines (VLOSEs).

Additional intermediary-service obligations relevant to online safety arise under Decree-Law No. 7/2004 of 7 January, which transposed the e-Commerce Directive.

In parallel, Portugal applies sector-specific online safety obligations to audiovisual services under the Audiovisual Media Services Directive (EU) 2018/1808, as transposed by the Television and On-Demand Audiovisual Services Act (Law No. 27/2007 of 30 July). Notably, the rules applicable to video-sharing platform providers require that audiovisual commercial communications be clearly identifiable and comply with substantive restrictions, including prohibitions on hidden advertising and subliminal techniques, as well as safeguards against harmful content, particularly concerning minors (eg, content likely to cause physical, mental or moral harm, or that exploits children's inexperience or credulity).

Online harms covered

Which online harms are covered under the relevant legislation and how are these harms defined?

Under the DSA, online harms are not defined as a closed set of prohibited content categories. Instead, it addresses harms through a risk-based approach: VLOPs and VLOSEs must identify, assess and mitigate systemic risks linked to the design and use of their services, including the dissemination of illegal content and actual or foreseeable negative effects on fundamental rights, civic discourse, public security, public health and individuals’ physical or mental well-being.

In parallel, the audiovisual framework sets out sector-specific protections for video-sharing platforms. Law No. 27/2007 addresses, in particular, content that may impair the physical, mental or moral development of minors, content involving incitement to violence or hatred and certain prohibited commercial practices (including manipulative subliminal techniques). Decree-Law No. 7/2004 remains relevant, as it also references unlawful and criminal content affecting human dignity, public order and the protection of minors, including child sexual abuse material, incitement to hatred or violence and, in certain circumstances, serious violations of sexual privacy.

Online services covered

Which online services are covered under the law and how are these services defined?

Both the DSA and Decree‑Law No. 7/2004 apply to defined categories of information society services: that is, services normally supplied for remuneration, at a distance, by electronic means and at the individual request of the service recipient. Within this framework, they cover several types of intermediary services, including online platforms and online search engines, through the following categories:

  • mere conduit: the transmission of information in a communication network or the provision of access to such a network (eg, internet service providers);
  • caching: the automatic, intermediate and temporary storage of information provided for the sole purpose of making onward transmission to other recipients more efficient (eg, content delivery networks); and
  • hosting: the storage of information provided by and at the request of the service recipient (eg, social media platforms).

In the digital audiovisual sector, Law No. 27/2007 also regulates audiovisual commercial communications, on-demand audiovisual services and video-sharing platforms. Video-sharing platforms are defined as services whose principal purpose (or essential functionality) is to provide user-generated videos or programmes to the public, organised by the platform provider, for information, entertainment or education purposes, via electronic networks.

Territorial scope

What is the territorial scope of the relevant law?

The DSA, like the e-Commerce Directive, applies to intermediary services offered to recipients in the EU, including where the provider is not established in the EU. By contrast, Law No. 27/2007 applies to video-sharing platform services where the provider is established in Portugal, in line with the Audiovisual Media Services Directive.

Codes of practice

Are there any codes of practice or other non-binding guidelines or recommendations relating to online safety in your jurisdiction?

Under the DSA, the European Commission and the European Board for Digital Services have promoted voluntary codes of conduct relevant to online safety, such as the Revised Code of Conduct on Countering Illegal Hate Speech Online (Code of Conduct+). The Code of Conduct+ is intended to strengthen the handling of content qualifying as illegal hate speech under EU and national law and, while adherence is voluntary, it may support platforms in demonstrating compliance with DSA risk-mitigation obligations relating to the dissemination of illegal content.

Portugal also participates in the Better Internet for Kids (BIK+) initiative, and has developed complementary soft-law and awareness programmes, particularly in relation to minors and digital literacy.

Harmful versus illegal content

How does the law in your jurisdiction distinguish between harmful and illegal content?

The DSA defines illegal content as information that, in itself or in relation to an activity, does not comply with EU law or the law of a member state. This concept is central to delineating the scope of providers' content moderation and enforcement decisions under the DSA.

By contrast, harmful content is not formally defined in the DSA. It is commonly understood as content that may produce adverse effects on users. Given the inherent vagueness of this category, relying on it as a basis for removal decisions may raise freedom of expression concerns and, in practice, leave significant discretion to online service providers to identify and remove content they deem harmful. The European Commission has emphasised that illegal content should be distinguished from harmful content in the context of online content moderation.

Extremist and terrorism-related content

How does your jurisdiction regulate the dissemination of extremist and terrorism-related content online?

Under the DSA, terrorist content qualifies as illegal content. In addition, the regulation provides for an exceptional crisis response mechanism under which VLOPs and VLOSEs may, following a recommendation from the European Board for Digital Services, be required to adopt proportionate and time-limited measures to address serious cross-border threats, including terrorism.

More specifically, the dissemination of terrorist content online is regulated by Regulation (EU) 2021/784, which requires hosting service providers to comply with removal or blocking orders issued by competent authorities. The Regulation also contains safeguards, including an exclusion for content disseminated for educational, journalistic, artistic or research purposes, or for the prevention or countering of terrorism, based on an assessment of the actual purpose of the dissemination.

In addition, Law No. 60/2025 of 22 October granted the government a legislative authorisation to adopt implementing legislation and to designate the Judiciary Police as the competent authority to issue removal or blocking decisions, subject to subsequent validation by a court decision within 48 hours.

Disinformation versus misinformation

How, if at all, does the law in your jurisdiction distinguish between misinformation and disinformation online? Does it include malinformation?

Portuguese law does not establish a general, binding legal classification defining misinformation, disinformation and malinformation. Instead, the DSA addresses these issues through a systemic-risk framework, particularly concerning risks to society, democracy and fundamental rights.

Law No. 27/2021 of 17 May (Portuguese Charter of Human Rights in the Digital Age) previously contained a statutory definition of disinformation in its article 6, but those provisions have since been repealed.

The repealed text described disinformation as a false or misleading narrative created, presented and disseminated to obtain an economic advantage or to deliberately deceive the public, where it may cause public harm (including impacts on democratic political processes, public policy-making and public goods). It also listed examples (such as manipulated or fabricated texts or videos, inbox-flooding practices and networks of fictitious followers) and established a complaint mechanism involving the media regulator. This approach raised concerns, particularly from a freedom of expression perspective, regarding the design and potential implications of the mechanism.