Deploy a Business Event

Last modified: April 29, 2026

Introduction

Once you have created a service in Studio Pro 9.24 and above, you can start modeling with it in your app and deploy your business event.

Modeling with Business Events (All Supported Studio Pro Versions)

Business events are defined using entities that specialize the PublishedBusinessEvent entity included in the Mendix Business Events service.

  1. In your domain model, double-click the entity you want to publish as a business event to display the entity properties.
  2. In the Generalization field, click Select and choose the PublishedBusinessEvent entity.

The base values for your entity are taken from PublishedBusinessEvent, and your entity behaves like a specialized entity. For more information, see Generalization, Specializations, and Inheritance.

The text with the blue background above the entity indicates that it is a specialized entity based on the PublishedBusinessEvent entity in the BusinessEvents service.

Using the Publish Business Event Activity

After defining your business events and adding them to a published service, you can publish the events in your microflows whenever a notable event occurs.

Do this using the Publish business event activity:

  1. Open the microflow in which the business events will be published.
  2. Create an object of the business event entity you want to publish.
  3. In the Toolbox, search for the Publish business event action, drag it, and place it in your microflow.
  4. Double-click Publish business event to display the Publish Business Event property box.
  5. Enter the following information:
    • Subject – This can be anything you consider useful, such as a short description of what to expect in the payload, similar to an email subject. It helps subscribed apps decide whether the event is useful to them.
    • Event Data – Enter the entity representing the business event that you want to publish.
    • Task Queue/Output – These values are not currently used for business events and should be left unchanged.

Business Event Entities

Your domain model must include the PublishedBusinessEvent and ConsumedBusinessEvent entities in order to publish and consume business events. The DeadLetterQueue and Outbox entities are part of the Mendix Business Events service.

  • PublishedBusinessEvent – This non-persistable entity has the fields that every published event includes. Every published business event inherits from this entity. The three fields can be set from the Java action. This entity defines what your published business events look like.
  • ConsumedBusinessEvent – This entity has the fields that every consumed event includes. Every consumed business event inherits from this entity. These fields are set from the service, as are any additional fields that match the payload of the event. This defines what you want to receive from the business events you subscribe to.
  • DeadLetterQueue – This persistable entity in the domain model of the Business Events service keeps a historical record of events that failed or produced errors when received by the consumer. You can query the DeadLetterQueue entity to determine which received events could not be processed and to troubleshoot them.
  • Outbox – This entity is used to store the event before it is sent. This entity is connected to the microflow where a business event is triggered. If the microflow fails, the entity is removed as part of the same transaction. If the event broker is down at runtime, business events accumulate in the Outbox. They are retried at increasing intervals for 48 hours and fail after that time. Once an event is successfully delivered, it is deleted from the Outbox.

Dead Letter Queue for Failed Messages

Every time a business event is received, it is transformed to match the entity created as part of the subscription. If that entity has changed because of a newly imported AsyncAPI document, the event may no longer be processable. In such a scenario, the business event is moved to the Dead Letter Queue, which contains the representation of the entity in the data column.

The most important fields in this entity to be checked when there are errors include the following:

  • type
  • source
  • subject
  • data

Use these fields to transform the payload back into a Mendix entity. If the subject is missing from the original event, the value is an empty string. If the consumed event does not have the correct format, the event does not go to the Dead Letter Queue but throws an error.
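As a sketch, you could inspect undeliverable events with an OQL query over these fields. The `BusinessEvents` module name and the capitalized attribute names below are assumptions; check the Dead Letter Queue entity in your local copy of the service for the exact names.

```sql
-- Sketch only: module and attribute names are assumptions, not confirmed API.
SELECT Type, Source, Subject, Data
FROM BusinessEvents.DeadLetterQueue
```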

Mendix Event Broker

Within Mendix Cloud, a Mendix Event Broker is available for easy application deployment using the Mendix Business Events module. For more information, see Mendix Event Broker.

Topics and Channels

Events are placed in channels (also known as topics). Apps subscribed to a channel receive events published to that channel.

Events published by Free Apps are published to one shared company channel on a multitenant free Event Broker. Events published by apps running on licensed nodes are published to their own channels on the company Event Broker. These channels, implemented as topics on Kafka, are automatically created when the app publishing the events is deployed.

For information on setting topics and channels for your own Kafka clusters (Bring Your Own Kafka), see Configuring Deployment Constants for Your Own Kafka Cluster.

Error Handling

Event publishing is part of the transaction where the publishing occurs. This means if you decide that something has gone wrong in your microflow logic and you roll back all changes, the publishing of your events is also rolled back. No event is sent to other apps.

This is implemented as follows:

  • Published events are stored in a temporary entity table
  • When your transactions are completed successfully, the events are delivered to the Mendix Event Broker
  • If the publishing microflow fails and changes are rolled back, this also includes published events

Deployment

Business Events offers four different deployment models:

  • Deploying locally with the Local Setup Tool
  • Free apps using a free multitenant event broker
  • Production apps using the Mendix Event Broker running in Mendix Cloud
  • Apps running their own Kafka cluster (Bring Your Own Kafka)

Local Deployment

Use the Local Setup Tool for local deployments. For more information, see Using the Business Events Local Setup Tool.

Free App Deployment

When you deploy your apps to the free cluster, a free event broker is provided and configured automatically. In the Mendix Free App environment, there is a limit of 1000 events per app per day.

Any free app in your organization can receive any event published by a free app in your organization, as all free apps share a single free channel for your company.

Production Deployment

To deploy to production, you must have a subscription to the Mendix Event Broker. For more information, see the Mendix Event Broker License section of Mendix Event Broker.

Make sure you enable the Mendix Event Broker for every app and environment before deploying. See Mendix Event Broker for more information.

Warning Message When Enabling Mendix Event Broker

If you enabled the Mendix Event Broker for an environment, you may receive a warning that it is not possible to enable the event broker service. If you see this message, do the following in the Services tab of the production environment screen:

  1. Select the checkbox for the environment.
  2. Transport the .mda file to the environment.
  3. Restart the environment.

Deploy Order

The app that defines a business event service (app A) must be deployed and run before the app that uses that business event service (app B) is run.

When this requirement is not met, app B either terminates or, when using Business Events service version 3.7.0 and higher, produces errors in the log.

When this occurs, do the following:

  1. Ensure app A has started in the same space as app B.
  2. Restart app B.

Apps Running in Your Own Kafka Cluster (Bring Your Own Kafka)

Business events are powered by Apache Kafka (see Mendix Event Broker). If you want to use your own Kafka cluster instead of the Mendix Event Broker, see Configuring Deployment Constants for Your Own Kafka Cluster. Running your own cluster is referred to as Bring Your Own Kafka (BYOK).

Configuring Deployment Constants for Your Own Kafka Cluster

The Business Events service exposes its configuration via constants. These are set during deployment to connect the app to your Kafka cluster.

All the constants are part of the Mendix Business Events service.

  • BusinessEvents.ServerUrl – Configure your Kafka bootstrap servers here as host1:port1,host2:port2,.... This setting is used to connect the app.

  • BusinessEvents.Username and BusinessEvents.Password – The service supports various Kafka authentication mechanisms. Below version 3.12.0, only the SASL/SCRAM SHA-512 authentication mechanism is supported.

  • BusinessEvents.EventBrokerSpace – This setting groups events into Kafka topics, with each business event getting its own topic. Set the EventBrokerSpace value to your environment names (or Kubernetes namespaces), such as test or production. Each business event defined in an app then gets a separate topic per environment; for example, an OrdersReceived business event deployed to two different environments results in two topics. A topic is named in the form businessevents.<channel>.<EventBrokerSpace>, where the channel is written as a UUID and is used to group events.

    For further explanation on topics and channels, see Topics and Channels, above.

  • TruststoreLocation and TruststorePassword (optional) – The service supports adding a Truststore and password to allow for SSL verification of the server.

  • ConsumerStartupDelaySeconds (optional) – Business Event consumers are started automatically as part of the after startup microflow. You can delay their startup by setting this constant. The startup happens in a separate thread, which means the after startup microflow can finish even though the Business Event consumers are still waiting to be started. Only values above 1 have any effect.
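As an illustration, the BYOK constants above might be supplied through an on-premises m2ee.yaml. The hostnames, credentials, and the use of m2ee-tools here are assumptions; set the constants through whatever mechanism your deployment pipeline uses.

```yaml
# Hypothetical m2ee.yaml excerpt; hostnames and credentials are placeholders.
mxruntime:
  MicroflowConstants:
    BusinessEvents.ServerUrl: "kafka-1.internal:9092,kafka-2.internal:9092"
    BusinessEvents.Username: "byok-client"
    BusinessEvents.Password: "byok-secret"
    BusinessEvents.EventBrokerSpace: "test"
```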

Additional Authentication Mechanisms

In versions 3.12.0 and above of the module, you can configure different authentication mechanisms using additional constants.

Typical Kafka client authentication details may look like the following:

security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="my-user" password="my-password";
sasl.mechanism=PLAIN

You would use the following constants to configure this in the module:

  • AuthnOverrideSecurityProtocol with value SASL_SSL
  • AuthnOverrideSaslJaasConfigClassName with value org.apache.kafka.common.security.plain.PlainLoginModule
  • AuthnOverrideSaslMechanism with value PLAIN
  • Username with value my-user
  • Password with value my-password

DevOps Tasks Not Covered When Running Your Own Kafka Cluster

Operating your own Kafka cluster falls outside the scope of the Mendix Cloud environment. The following DevOps tasks should be taken into consideration (this list is not exhaustive):

  • Client user name and password provision on Kafka – The creation of usernames and passwords on the Kafka cluster must be managed by the customer.
  • Topic creation on Kafka – Unless the Kafka cluster is configured with auto.create.topics.enable set to true (default setting in Apache Kafka), topics must be created by the customer. See Topics and Channels for more details.
  • Access Control – Unless the Kafka cluster is configured with allow.everyone.if.no.acl.found set to true (default setting in Apache Kafka), the ACLs must be maintained by the customer.
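If you create topics manually, the steps above can be sketched as follows. The channel UUID is a placeholder (read the real one from the channels section of the exported AsyncAPI document), and the partition and replication settings are assumptions to adapt to your cluster.

```shell
# Compose the topic name from its parts.
CHANNEL="0f1e2d3c-4b5a-6978-8899-aabbccddeeff"  # placeholder; taken from the AsyncAPI document
SPACE="test"                                     # matches the BusinessEvents.EventBrokerSpace constant
TOPIC="businessevents.${CHANNEL}.${SPACE}"
echo "$TOPIC"

# Then create it on your cluster (uncomment and point at your brokers):
# kafka-topics.sh --create --bootstrap-server host1:port1 \
#   --topic "$TOPIC" --partitions 3 --replication-factor 3
```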

Managing Topics and Consumer Groups on Your Own Kafka Cluster

The channel UUID can be retrieved by inspecting the exported AsyncAPI document under the channels section of the document.

A topic is named in the form of businessevents.<channel>.<EventBrokerSpace>. A channel is written as a UUID and is used to group events.

In version 3.12.0 and above of the module, additional constants are exposed to make further configuration of topics and consumers easier:

  • ByokTopicPrefix – This constant can be added when the EventBrokerSpace prefix is configured. It ensures that all topics are prefixed by the value of this constant followed by a dot and the rest of the topic name.

    For example, if the value of ByokTopicPrefix is myawesomeproject and EventBrokerSpace has the value acceptance, then you can expect topic name(s) to be of the form myawesomeproject.businessevents.<channel>.acceptance.

  • CustomConsumerGroupIdPrefix – If your app is consuming business events and you require your consumer groups to have a certain fixed prefix value, you can configure this constant.

  • OverrideHeartbeatTopic – When the business events module is producing events, it checks its connection to Kafka by producing ping messages to a topic called the heartbeat topic. This defaults to topic _mx_heartbeat_producer_connection. You can configure this constant to override the default heartbeat topic.

Local Testing

For development and testing, it is useful to run all your apps on your local workstation, including the event broker. You can do this by running Kafka through docker-compose.

Using the Business Events Local Setup Tool

The Mendix Business Events Local Setup Tool helps you deploy locally by setting up a Docker container with Kafka. The tool's repository includes the required docker-compose.yml file.

Start your Docker cluster using the command docker-compose up. This downloads or updates all the required Docker images and starts Kafka.
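For orientation, a minimal single-node Kafka service in a compose file might look like the sketch below. The image and port mapping are assumptions; the docker-compose.yml shipped with the Local Setup Tool is authoritative and should be preferred.

```yaml
# Minimal sketch only; use the docker-compose.yml from the Local Setup Tool instead.
services:
  kafka:
    image: apache/kafka:latest   # single-node broker in KRaft mode
    ports:
      - "9092:9092"
```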

Using PostgreSQL Database (Optional)

You can configure the app running in Studio Pro to use the Postgres database created using Docker. Remember to use a different database name for every app.

Below is an example of a Postgres service that you can add to your docker-compose.yml file.

  postgres:
    image: postgres:latest
    environment:
      POSTGRES_DB: cspdb-dev
      POSTGRES_USER: mendix
      POSTGRES_PASSWORD: mendix
      PGPASSWORD: mendix
    ports:
      - "25432:5432"
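To point a locally running app at this database, you could set custom runtime settings in Studio Pro (App Settings > Configurations > Custom). The values below mirror the compose service above; the setting names assume the standard Mendix custom runtime settings for database configuration.

```yaml
# Custom runtime settings matching the Postgres service above (values are the
# placeholders from the compose file; change them per app).
DatabaseType: PostgreSQL
DatabaseHost: "localhost:25432"
DatabaseName: cspdb-dev
DatabaseUserName: mendix
DatabasePassword: mendix
```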

Read More