Content Policy

Version 1, dated September 19, 2025

Introduction

Purpose

At Merlin Software, we strive for transparent, reliable, and secure communication. This policy describes how we handle content that we publish or host via our digital platforms, within the framework of the Digital Services Act (DSA).

This policy is part of our Information Security Management System (ISMS) and aligns with processes compliant with the ISO/IEC 27001:2022 standard.

Reference Framework

Digital Services Act

Scope

This policy applies to all content published or hosted by Merlin Software via:

  • Our website: https://www.merlincrisis.com;
  • Our SaaS solutions: CrisisSuite and the Simulated Media Tool (SMT);
  • Additional communication channels used to reach large audiences, such as email newsletters.

Examples of content:

  • Website content, product information, blogs, and news articles;
  • User manuals and knowledge base articles;
  • Content entered or generated within our platforms, including:
      • Factual information about (preparation for) crises and incidents, entered and managed by customers in CrisisSuite;
      • Simulated content, such as training scenarios, messages, or media items in the SMT.

Guidelines for Content Publication

We ensure that content:

  • is accurate and up-to-date, to the extent that this falls within our responsibility;
  • is not misleading, except for simulated content (see below);
  • does not contain illegal or harmful material, such as hate speech, incitement, defamation, or disinformation;
  • is relevant and functional within the framework of incident and crisis management, crisis communication, or training.

Exception for Simulated Content

CrisisSuite and the SMT may contain intentionally inaccurate or misleading information. This is essential for simulating crisis situations, training crisis response teams, and developing information literacy skills under pressure. Such content is clearly identified within the platform and is exempt from the usual requirements for accuracy and factuality.

Illegal Content

Illegal content refers to any information that violates applicable laws or regulations. This may include, for example, statements or material that are punishable by law, infringe upon the (property) rights of others, or are otherwise unlawful.

The assessment depends on the context and relevant legislation (Dutch and European law). We adhere to the definitions as stipulated in the Digital Services Act and consider signals from competent authorities.

Not all inappropriate or undesirable content is illegal by definition. However, we may still assess reports regarding such content based on this policy and the terms of use of our services.

Restrictions and Measures

In the event of a violation of this policy, laws or regulations, or contractual agreements, we may take measures.

Possible measures include:

  • Temporary or permanent removal of content;
  • Temporary restriction of access to certain functionalities;
  • Suspension or termination of access to the platform or a specific environment.

Measures are always proportionate. The user and/or organization involved will be informed and may object.

Management and Moderation

Content on our website is published and maintained by authorized personnel.

For content within our platforms:

  • CrisisSuite: Merlin Software does not play an active role in moderating factual content, but facilitates reporting and escalation if content appears to violate the law.
  • SMT: Simulated content may be reviewed automatically or manually if there is cause to do so.

All actions regarding reports, moderation, and any measures are carried out within the framework of our ISMS.

Reporting Inappropriate or Illegal Content

Reports can be submitted via the Report Illegal Content form.

This form is also intended for authorities.

We will process reports of suspected illegal or inappropriate content within 48 hours. The reporter will receive an acknowledgment of receipt from us, unless the report pertains to criminal offenses involving sexual abuse and no contact details have been provided. The report will then be carefully assessed by the responsible party within the ISMS, after which the reporter will be informed of the outcome, where possible.

Please note: If we receive a report indicating content within our platforms that involves (a suspicion of) a criminal offense threatening the life or safety of individuals, we will inform the responsible organization (the client). This will not apply if the nature of the situation indicates that the client is already aware.

Transparency in Moderation

In the event of content removal or modification:

  • We will inform the responsible party (if possible) with the reason and an explanation;
  • We will provide an opportunity to object or offer further clarification;
  • All moderation actions and decisions are logged and stored in accordance with our information security policy, with appropriate access restrictions.

Other Complaints or Reports

Complaints regarding our handling of content or moderation decisions (outside of formal reports) can be submitted via Contact.

We take all reports seriously and handle them with care and discretion.

Processing of Personal Data

Personal data processed by us in the context of reports or moderation is handled in accordance with the GDPR. More information on how we manage personal data can be found in our Privacy Statement.