Maia in Private Mendix Platform

Last modified: February 13, 2026

Introduction

Mendix AI Assistance (Maia) refers to Mendix Platform capabilities that leverage artificial intelligence (AI) and machine learning (ML) to assist developers in application development. Maia is designed to help development teams in modeling and delivering Mendix applications faster, more consistently, and with higher quality.

Maia Capabilities in Private Mendix Platform

Mendix AI Assistance (Maia) in Private Mendix Platform has the following capabilities:

Starting point for app creation:

  • Start with Maia - a starting point in Studio Pro that helps you to start the app development process. Based on a required text description and an optional image or PDF, it generates an app that includes a domain model, data management overview pages, test data, and a tailored homepage. For more information, see Start with Maia.

Recommenders:

  • Best Practice Recommender – helps you inspect your app against Mendix development best practices, detecting and pinpointing development anti-patterns and, in some cases, automatically fixing them. For more information, see Best Practice Recommender.
  • UI Recommender – helps you easily add new widgets to a page in Mendix Studio Pro without losing the context of what you are currently working on. For more information, see UI Recommender.
  • Workflow Recommender – helps you model and configure workflows in Mendix Studio Pro. It gives you contextualized recommendations on the next best activity in your workflow based on context-related information. For more information, see Workflow Recommender.

Generators:

  • Maia for Domain Model – helps you generate new domain models, and explain and provide suggestions for existing domain models. For more information, see Maia for Domain Model.

    For Private Mendix Platform, this feature is supported for version 11.6 and newer.

  • Maia for Pages – helps you generate a page. It helps you add and configure widgets based on a text input and an optional image. After a page is generated, you can continue in the same session to ask Maia for further improvements and explanations. For more information, see Maia for Pages.

    For Private Mendix Platform, this feature is supported for version 11.6 and newer.

  • Maia for Workflows – helps you generate a workflow. Based on a use case that you provide as text input or an image, Maia can help you start creating your workflows. For more information, see Maia for Workflows.

  • Validation Assist – helps you build validation microflows in a more automated way using pre-built expressions. For more information, see Validation Assist.

Changing the LLM Provider for Maia

Starting in Private Mendix Platform 2.6, instead of using the default Large Language Model, you can connect Maia to one of several different models of your choice. Currently, this includes the following LLMs (see also the summary sketch after the list):

  • Anthropic
    • Small text model - Claude Haiku 4.5
    • Large text model - Claude Opus 4.6
  • AWS Bedrock
    • Small text model - Claude Haiku 4.5
    • Large text model - Claude Sonnet 4.5
    • Fallback model - Claude Sonnet 4
  • Azure
    • o3-mini
  • OpenAI
    • GPT-5-Mini
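
For quick reference when filling in the Admin Mode form described in Configuring a Custom LLM for Maia below, the combinations above can be summarized as a small mapping. This is only an illustrative sketch: the dictionary name and keys are invented here, while the model ID strings are taken from the configuration steps later in this document.

```python
# Illustrative summary of the provider and model combinations listed above.
# The dictionary name and keys are invented for this sketch; the model ID strings
# are the ones documented in "Configuring a Custom LLM for Maia" below, and
# "<your regional model ID>" remains a placeholder for your AWS Bedrock regional prefix.
MAIA_LLM_OPTIONS = {
    "Anthropic": {
        "small_text_model": "claude-haiku-4-5-20251001",  # Claude Haiku 4.5
        "large_text_model": "claude-opus-4-6",            # Claude Opus 4.6
    },
    "AWS Bedrock": {
        "small_text_model": "<your regional model ID>.anthropic.claude-haiku-4-5-20251001-v1:0",   # Claude Haiku 4.5
        "large_text_model": "<your regional model ID>.anthropic.claude-sonnet-4-5-20250929-v1:0",  # Claude Sonnet 4.5
        "fallback_model": "<your regional model ID>.anthropic.claude-sonnet-4-20250514-v1:0",      # Claude Sonnet 4
    },
    "Azure": {"all_model_fields": "o3-mini"},
    "OpenAI": {"all_model_fields": "gpt-5-mini-2025-08-07"},  # GPT-5-Mini
}
```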

Feature Comparison

The following table shows which Maia capabilities are supported by various models.

| Maia Capability | AWS Bedrock | Anthropic | Azure | OpenAI |
| --- | --- | --- | --- | --- |
| Maia for Pages | Full support | Full support | Simple requests only | Simple requests only |
| Maia for Domain Model | Full support | Full support | Full support | Full support |
| Maia for Workflows | Full support | Full support | Simple requests only | Does not support workflow creation |
| Start with Maia | Full support | Full support | Full support | Full support |
| Best Practice Recommender | Full support | Full support | Full support | Full support |
| UI Recommender | Full support | Full support | Full support | Full support |
| Workflow Recommender | Full support | Full support | Full support | Full support |
| Validation Assist | Full support | Full support | Full support | Full support |

In summary, Maia on AWS Bedrock and Anthropic models can handle complex requests with high accuracy and reliability. For Azure and OpenAI models, the output quality for some request types may be low or inconsistent.

Configuring a Custom LLM for Maia

To configure Maia to use your own Large Language Model, perform the following steps:

  1. Log in to Private Mendix Platform as a user with Company Administrator access rights.

  2. Switch to Admin Mode by clicking the profile picture in the top right corner of the screen and selecting Switch to Admin Mode.

  3. In the left navigation menu, open the Manage section.

  4. Fill out the following information:

    • Maia Appgen URL - Enter the URL where Maia is installed. For more information, see Private Mendix Platform Quick Start Guide: Installing Maia.

    • LLM Provider - Select your LLM provider.

    • Small Text Model - Enter one of the following models, depending on your chosen LLM:

      • For AWS Bedrock Claude models, enter <your regional model ID>.anthropic.claude-haiku-4-5-20251001-v1:0
      • For Anthropic models, enter claude-haiku-4-5-20251001
      • For Azure models, enter o3-mini
      • For OpenAI models, enter gpt-5-mini-2025-08-07
    • Small Files Model - Enter one of the following models, depending on your chosen LLM:

      • For AWS Bedrock Claude models, enter <your regional model ID>.anthropic.claude-haiku-4-5-20251001-v1:0
      • For Anthropic models, enter claude-haiku-4-5-20251001
      • For Azure models, enter o3-mini
      • For OpenAI models, enter gpt-5-mini-2025-08-07
    • Large Text Model - Enter one of the following models, depending on your chosen LLM:

      • For AWS Bedrock Claude models, enter <your regional model ID>.anthropic.claude-sonnet-4-5-20250929-v1:0
      • For Anthropic models, enter claude-opus-4-6
      • For Azure models, enter o3-mini
      • For OpenAI models, enter gpt-5-mini-2025-08-07
    • Large Files Model - Enter one of the following models, depending on your chosen LLM:

      • For AWS Bedrock Claude models, enter <your regional model ID>.anthropic.claude-sonnet-4-5-20250929-v1:0
      • For Anthropic models, enter claude-opus-4-6
      • For Azure models, enter o3-mini
      • For OpenAI models, enter gpt-5-mini-2025-08-07
    • Fallback Model - Enter one of the following models, depending on your chosen LLM:

      • For AWS Bedrock Claude models, enter <your regional model ID>.anthropic.claude-sonnet-4-20250514-v1:0
      • For Anthropic models, enter claude-opus-4-6
      • For Azure models, enter o3-mini
      • For OpenAI models, enter gpt-5-mini-2025-08-07
    • API Key - Specify the secret source for your LLM provider's API key.
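
After completing these steps, you may want to confirm that the API key and model IDs you entered actually reach a working model. The following is a minimal sketch that runs outside Private Mendix Platform, assuming the official anthropic and boto3 Python SDKs are installed; the AWS region and the regional prefix in the Bedrock model ID are illustrative placeholders, and the model IDs are the ones listed in the steps above. Adjust the calls accordingly if you use Azure or OpenAI.

```python
# Minimal connectivity check for the model IDs configured for Maia.
# Assumes the official `anthropic` and `boto3` SDKs are installed and that your
# credentials match the values entered in Admin Mode. The region and the regional
# prefix in the Bedrock model ID are illustrative placeholders.
import anthropic
import boto3

ANTHROPIC_API_KEY = "sk-ant-..."                     # key planned for the API Key field
ANTHROPIC_SMALL_TEXT_MODEL = "claude-haiku-4-5-20251001"

BEDROCK_REGION = "eu-central-1"                      # example region
BEDROCK_LARGE_TEXT_MODEL = "eu.anthropic.claude-sonnet-4-5-20250929-v1:0"  # "eu." is an example prefix


def check_anthropic() -> None:
    """Send a short test prompt to the Anthropic small text model."""
    client = anthropic.Anthropic(api_key=ANTHROPIC_API_KEY)
    reply = client.messages.create(
        model=ANTHROPIC_SMALL_TEXT_MODEL,
        max_tokens=16,
        messages=[{"role": "user", "content": "Reply with OK."}],
    )
    print("Anthropic reachable:", reply.content[0].text)


def check_bedrock() -> None:
    """Send a short test prompt to the AWS Bedrock large text model via the Converse API."""
    client = boto3.client("bedrock-runtime", region_name=BEDROCK_REGION)
    reply = client.converse(
        modelId=BEDROCK_LARGE_TEXT_MODEL,
        messages=[{"role": "user", "content": [{"text": "Reply with OK."}]}],
        inferenceConfig={"maxTokens": 16},
    )
    print("Bedrock reachable:", reply["output"]["message"]["content"][0]["text"])


if __name__ == "__main__":
    check_anthropic()
    check_bedrock()
```

If either call fails with an authentication or model-not-found error, correct the API key or model ID in Admin Mode before relying on Maia in Studio Pro.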