@@ -7,7 +7,7 @@ weight: 20

## Introduction

The **GenAI Resources** section provides a detailed overview of all Mendix GenAI resources available within your company, allowing Mendix Admins to seamlessly provision and deprovision GenAI resources as needed. With this feature, Mendix Admins can efficiently manage all GenAI resources directly within the [Control Center](https://controlcenter.mendix.com/index.html) through a self-service capability, ensuring streamlined operations and improved governance. For more information, refer to [Accessing GenAI Resources](/appstore/modules/genai/v2/mx-cloud-genai/resource-packs/#accessing-genai-resources).

## Prerequisites

@@ -44,7 +44,7 @@ When provisioning a new resource, enter the following information:
* **Display Name** – The name of the resource.
* **Environment** – The environment for which the resource is created, such as Test, Acceptance, or Production.
* **Mendix Cloud Region** – The cloud region where the resource will be hosted.
* **Cross-region inference** – Specifies whether the selected model supports cross-region inference. For more information, refer to the [Settings](/appstore/modules/genai/v2/mx-cloud-genai/Navigate-MxGenAI/#settings) section of *Navigate through the Mendix Cloud GenAI Portal*.
* **Available Text Generation Models** – A list of the supported models you can choose from, for example, Anthropic Claude Sonnet V4.
* **Size** – The subscription plan, which determines the number of tokens available for the resource.
* **User** – The name of the user for whom the resource was initially provisioned.
90 changes: 90 additions & 0 deletions content/en/docs/genai/_index.md
@@ -0,0 +1,90 @@
---
title: "Enrich Your Mendix App with GenAI Capabilities"
url: /appstore/modules/genai/
linktitle: "Agentic AI"
description: "Describes how to use Mendix's generative AI capabilities to build agentic applications."
weight: 16
---

## Introduction {#introduction}

With Mendix generative AI (GenAI) capabilities, you can create engaging, intelligent experiences with a variety of AI models and your own data. Build AI-powered applications with Agents Kit, a set of components that support implementations ranging from simple text generation to complex multi-step agentic workflows.

Agents Kit 2.0 is available for Studio Pro 11.12 and above. Agents Kit 1.0 is available for Studio Pro 10.24 and above. Older versions of some Marketplace modules and the GenAI Showcase App are available in Studio Pro 9.24.2.

These pages document the modules, connectors, and apps for building agentic applications with models from Amazon Bedrock, OpenAI, Mistral, Google Gemini, and other platforms.

{{% alert color="info" %}}
These pages focus on building agentic applications with Agents Kit. For AI assistance while building apps, see [Mendix AI Assistance (Maia)](/refguide/mendix-ai-assistance/). For pre-trained machine learning models, see [ML Kit](/refguide/machine-learning-kit/).
{{% /alert %}}

### Typical Use Cases

Mendix supports a variety of generative AI tasks by integrating with tools such as Amazon Bedrock or Microsoft Foundry. Typical use cases include the following:

* Create conversational UIs for AI-powered chatbots and integrate them into your Mendix applications.
* Connect any model through GenAI connectors or by integrating your connector into the GenAI Commons interface.
* Connect your data to ground GenAI systems with data from your application and the rest of your IT landscape.
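
All of these use cases come down to the same basic interaction: sending messages to a model and reading its reply. In a Mendix app, the GenAI connectors and microflows handle this for you; as a language-neutral sketch of what such a call looks like under the hood, here is a minimal Python example against an OpenAI-compatible chat completions endpoint. The endpoint URL, model name, and API key are illustrative assumptions, not Mendix specifics.

```python
import requests

API_KEY = "sk-..."  # assumption: replace with your own key

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o",  # any chat model supported by the endpoint
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize what Mendix is in one sentence."},
        ],
    },
    timeout=30,
)
response.raise_for_status()
# The assistant's reply is in the first (and usually only) choice.
print(response.json()["choices"][0]["message"]["content"])
```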

### Getting Started

To familiarize yourself with the GenAI capabilities of Mendix, explore the sections below based on your experience level:

#### Familiar with GenAI

If you are already familiar with GenAI and want to start building, see the [How to Build Smarter Apps Using GenAI](/appstore/modules/genai/how-to/) guide to start building your first GenAI-powered application and access additional resources.

#### New to GenAI

If you are new to GenAI, follow the steps below:

1. Familiarize yourself with the [concepts](/appstore/modules/genai/get-started/) such as prompt engineering, Retrieval Augmented Generation (RAG), and function calling (ReAct); a short RAG sketch follows these steps.
2. Select the right architecture to support your use case.
3. Obtain the required credentials for your selected architecture.
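
To make the RAG concept from step 1 concrete, the sketch below shows the pattern in plain Python: embed the user's question, retrieve the most similar document chunks, and ground the prompt with them. `embed` and `chat` are hypothetical helpers standing in for an embeddings call and a chat completions call; in a Mendix app, the connectors and knowledge bases cover these steps for you.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(question, chunks, chunk_embeddings, embed, top_k=3):
    """Return the top_k chunks most similar to the question."""
    query_embedding = embed(question)
    scored = sorted(
        zip(chunks, chunk_embeddings),
        key=lambda pair: cosine_similarity(query_embedding, pair[1]),
        reverse=True,
    )
    return [chunk for chunk, _ in scored[:top_k]]

def answer_with_rag(question, chunks, chunk_embeddings, embed, chat):
    # Ground the system prompt with the retrieved context.
    context = "\n\n".join(retrieve(question, chunks, chunk_embeddings, embed))
    system_prompt = "Answer using only the context below.\n\n" + context
    return chat(system=system_prompt, user=question)
```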

## Available Models {#models}

Mendix connectors offer direct support for the following models:

| Architecture | Models | Category | Input | Output | Additional capabilities |
| -------------- | --------------------- | --------------------- | ------------------- | ----------- | ----------------------- |
| Mendix Cloud GenAI | [Anthropic Claude Sonnet Models](/appstore/modules/genai/v2/mx-cloud-genai/resource-packs/#supported-models) | Chat Completions | text, image, document | text | Function calling |
| | [Cohere Embed Models](/appstore/modules/genai/v2/mx-cloud-genai/resource-packs/#supported-models) | Embeddings | text | embeddings | |
| Microsoft Foundry (OpenAI) / OpenAI | gpt-4, gpt-4-turbo, gpt-4o, gpt-4o mini, gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-5.0, gpt-5.0-mini, gpt-5.0-nano, gpt-5.1, gpt-5.2, o1, o1-mini, o3, o3-mini, o4-mini | Chat completions | text, image, document (OpenAI only) | text | Function calling |
| | DALL·E 2, DALL·E 3, gpt-image-1 | Image generation | text | image | |
| | text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large | Embeddings | text | embeddings | |
| Mistral | Mistral Large 3, Mistral Medium 3.1, Mistral Small 3.2, Ministral 3 (3B, 8B, 14B), Magistral (Small, Medium) | Chat Completions | text, image | text | Function calling |
| | Codestral, Devstral (Small, Medium), Open Mistral 7B, Mistral Nemo 12B | Chat Completions | text | text | Function calling |
| | Mistral Embed, Codestral Embed | Embeddings | text | embeddings | |
| Google Gemini | Gemini 2.5 Flash (+ Preview Sep 2025), Gemini 2.5 Flash-Lite (+ Preview Sep 2025), Gemini 2.5 Pro, Gemini Flash Latest, Gemini Flash-Lite Latest, Gemini Pro Latest | Chat Completions | text, image | text | Function calling |
| | Gemini 3 Flash Preview, Gemini 3 Pro Preview | Chat Completions | text, image | text | |
| Amazon Bedrock | Amazon Titan Text G1 - Express, Amazon Titan Text G1 - Lite, Amazon Titan Text G1 - Premier | Chat Completions | text, document (except Titan Premier) | text | |
| | AI21 Jamba-Instruct | Chat Completions | text | text | |
| | AI21 Labs Jurassic-2 (Text) | Chat Completions | text | text | |
| | Amazon Nova Pro, Amazon Nova Lite | Chat Completions | text, image, document | text | Function calling |
| | Amazon Titan Image Generator G1 | Image generation | text | image | |
| | Amazon Titan Embeddings Text v2 | Embeddings | text | embeddings | |
| | Anthropic Claude 3 Sonnet, Anthropic Claude 3.5 Sonnet, Anthropic Claude 3.5 Sonnet v2, Anthropic Claude 3 Haiku, Anthropic Claude 3 Opus, Anthropic Claude 3.5 Haiku, Anthropic Claude 3.7 Sonnet, Anthropic Claude 4.5 Sonnet, Anthropic Claude 4.5 Haiku, Anthropic Claude 4.5 Opus | Chat Completions | text, image, document | text | Function calling |
| | Cohere Command | Chat Completions | text, document | text | |
| | Cohere Command Light | Chat Completions | text | text | |
| | Cohere Command R, Cohere Command R+ | Chat Completions | text, document | text | Function calling |
| | Cohere Embed English, Cohere Embed Multilingual | Embeddings | text | embeddings | |
| | DeepSeek, DeepSeek-R1 | Chat Completions | text | text | |
| | Meta Llama 2, Meta Llama 3 | Chat Completions | text, document | text | |
| | Meta Llama 3.1 | Chat Completions | text, document | text | Function calling |
| | Mistral AI Instruct | Chat Completions | text, document | text | |
| | Mistral Large, Mistral Large 2 | Chat Completions | text, document | text | Function calling |
| | Mistral Small | Chat Completions | text | text | Function calling |
| | OpenAI gpt-oss-20b, gpt-oss-120b | Chat Completions | text | text | |

For more details on limitations and supported model capabilities for the Bedrock Converse API used in the Chat Completions operations, see [Supported models and model features](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html) in the AWS documentation.
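
For orientation, the following is a minimal sketch of calling the Converse API directly with the AWS SDK for Python (boto3); the Mendix connector performs an equivalent call for you. The model ID and region are assumptions — use a model that is enabled in your own AWS account.

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # assumption: any Converse-capable model
    messages=[{"role": "user", "content": [{"text": "Hello, Bedrock!"}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.5},
)
# The Converse API returns the assistant message as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```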

The available showcase applications offer implementation inspiration for many of the listed models.

### Connecting to Other Models

In addition to the models listed above, you can also connect to other models by implementing one of the following options:

* To connect to other [foundation models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-features.html) and implement them in your app, use the [Amazon Bedrock connector](/appstore/modules/aws/amazon-bedrock/).
* To connect to [Snowflake Cortex LLM](https://docs.snowflake.com/en/sql-reference/functions/complete-snowflake-cortex) functions, [configure the Snowflake AI Data Connector for Snowflake Cortex Analyst](/appstore/connectors/snowflake/snowflake-ai-data-connector/#cortex-analyst).
* To implement your own connector that is compatible with other components, use the [GenAI Commons](/appstore/modules/genai/commons/) interface and follow the instructions in [Build Your Own GenAI Connector](/appstore/modules/genai/how-to/byo-connector/).
@@ -142,6 +142,6 @@ This pattern is supported both by [OpenAI](https://platform.openai.com/docs/guid

The agent concept combines prompts, RAG (Retrieval Augmented Generation), and ReAct patterns in a single call. These components of agent-based logic are all supported by our Agents Kit. Using LLMs, business logic can be enriched by enabling AI agents to reason and autonomously execute actions while being grounded in domain-specific knowledge. With Mendix's Agents Kit, agents become a seamless part of your application's logic.

For an overview of the components that help you get started, refer to [the Agents Kit components](/appstore/modules/genai/v2/#components).

In addition, you can integrate agentic behavior in a Mendix app by leveraging external agents through cloud infrastructure providers. In this case, the Mendix app does not store the agent definition. Instead, it only calls the external agent. For example, [Agents for Amazon Bedrock](https://aws.amazon.com/bedrock/agents/) provides this functionality for Amazon Bedrock. You can find out how to use this in your Mendix application in [Invoking an Agent with the InvokeAgent Operation](/appstore/modules/aws/amazon-bedrock/#invokeagent) section of the *Amazon Bedrock* module documentation.
@@ -33,7 +33,7 @@ For more general information on this topic, see [OpenAI: Function Calling](https

All platform-supported connectors ([Mendix Cloud GenAI](/appstore/modules/genai/mx-cloud-genai/MxGenAI-connector/), [OpenAI](/appstore/modules/genai/openai/), and [Amazon Bedrock Connector](/appstore/modules/aws/amazon-bedrock/)) support function calling by leveraging the [GenAI Commons module](/appstore/modules/genai/commons/). Function calling is supported for all chat completions operations. All entity, attribute, and activity names in this section refer to the GenAI Commons module.

Functions in Mendix are essentially microflows that can be registered within the request to the LLM. The LLM connector takes care of handling the tool call response as well as executing the function microflows until the LLM returns the final assistant response. Function microflows can have zero, one, or multiple primitive input parameters such as Boolean, Datetime, Decimal, Enumeration, Integer, or String. Additionally, they may accept the [Request](/appstore/modules/genai/v2/genai-for-mx/commons/#request) or [Tool](/appstore/modules/genai/v2/genai-for-mx/commons/#tool) objects as inputs. The microflow can only return a String value.

To enable function calling, a `ToolCollection` object that is associated with one or more `Function` objects must be added to the request.
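
In Mendix, registering the `ToolCollection` and executing the function microflows is handled by the connector. As a language-neutral illustration of the loop the connector runs, here is a hedged sketch using the OpenAI Python SDK; the `get_ticket_status` function and model name are hypothetical stand-ins for a function microflow and your configured model.

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def get_ticket_status(ticket_id: str) -> str:
    """Hypothetical stand-in for a function microflow; returns a String."""
    return f"Ticket {ticket_id} is in progress."

# Registering the function, analogous to a ToolCollection with one Function.
tools = [{
    "type": "function",
    "function": {
        "name": "get_ticket_status",
        "description": "Look up the status of a support ticket.",
        "parameters": {
            "type": "object",
            "properties": {"ticket_id": {"type": "string"}},
            "required": ["ticket_id"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the status of ticket 42?"}]
while True:
    reply = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    message = reply.choices[0].message
    if not message.tool_calls:
        print(message.content)  # final assistant response
        break
    messages.append(message)  # keep the tool-call request in the history
    for call in message.tool_calls:
        arguments = json.loads(call.function.arguments)
        result = get_ticket_status(**arguments)  # execute the registered function
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
```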

@@ -22,7 +22,7 @@ To understand the basics of MCP, it is important to know the common terminology.

### MCP Host

The MCP host is typically the application that facilitates interaction with LLMs. While a chat interface is the most common use case, the host can support a variety of interaction use cases. The host takes care of the communication between users and models, while enabling users to manage their AI use, for example, managing credentials or historical chat conversations. A host can be a Mendix application that uses [GenAI Commons](/appstore/modules/genai/commons/) and a compatible connector to interact with LLMs, for example, a chat interface built with [Conversational UI](/appstore/modules/genai/conversational-ui/).

### MCP Client

@@ -37,9 +37,9 @@ A user prompt is another fundamental type. It is the user’s input, question, o

### Context Prompt

Depending on the project or use case, adding contextual information to the model may be necessary. Normally, this information, called the context prompt or conversation history, is sent in the same interaction as the system and user prompt. It captures the history of the conversation so that the model stays coherent and context-aware for the end user. In the Mendix app chatbot setup, developers configure this within their application, and it is included in the request sent to the LLM using the [Chat Completions (with history)](/appstore/modules/genai/v2/genai-for-mx/commons/#chat-completions-with-history) operation.

To understand this concept, imagine a user interacting with a chatbot and asking, *How should I start?* If, in previous interactions, the user asked about Mendix, the LLM will understand that the question refers to Mendix apps. In cases where the context is not needed, such as command-based interactions like *Turn on the lights*, where the LLM does not need any conversation history, developers can use operations like [Chat Completions (without history)](/appstore/modules/genai/v2/genai-for-mx/commons/#chat-completions-without-history).
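
As a minimal illustration of the difference between the two operations, the message lists below mirror what a chat completions API receives. The structure is a generic Python sketch, not the Mendix domain model.

```python
# Prior turns of the conversation, stored by the application.
history = [
    {"role": "user", "content": "What is Mendix?"},
    {"role": "assistant", "content": "Mendix is a low-code development platform..."},
]

# Chat Completions (with history): the context makes "How should I start?" unambiguous.
with_history = [{"role": "system", "content": "You are a helpful assistant."}]
with_history += history
with_history.append({"role": "user", "content": "How should I start?"})

# Chat Completions (without history): a self-contained, command-style request.
without_history = [
    {"role": "system", "content": "You control the smart home."},
    {"role": "user", "content": "Turn on the lights."},
]
```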

## Typical Components of a Prompt
