We’re excited to unveil a powerful new feature in Business Automation Manager Open Edition (BAMOE)—the BAMOE MCP (Model Context Protocol) server. This addition marks a significant step forward in making BAMOE’s decision automation capabilities seamlessly consumable within AI agentic environments.
As AI agents become increasingly autonomous and context-aware, the demand for dynamic, rule-driven decision-making systems is growing rapidly. These agents require not just raw intelligence, but the ability to reason, evaluate, and act based on structured logic and business rules—precisely where BAMOE shines.
The BAMOE MCP server acts as a dedicated and isolated server layer that exposes Business Services—such as rule evaluation, decision models, and process orchestration—through a standardized interface tailored for agentic workflows. Whether you're building intelligent assistants, autonomous systems, or orchestrating complex business logic, the BAMOE MCP server empowers your agents to reason, decide, and act with precision.
The Business Services exposed in 9.3.0 are:
- Decision (DMN) model execution
- Rule model execution
- Process API, which includes the following actions:
  - Create new process instances
  - Retrieve a list of process instances, or a single instance's information
  - Modify process instances
  - Delete process instances
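As a rough sketch of the Process API calls the MCP server wraps, the snippet below assumes the Kogito-style REST convention in which each process is exposed under its own path; the actual routes come from your service's OpenAPI document, so treat the paths here as illustrative:

```python
# Hypothetical URL builder for Kogito-style process REST endpoints, as
# typically exposed by a BAMOE Business Services instance. Paths may differ
# in your deployment; check the service's OpenAPI v3 document.
BASE_URL = "http://localhost:8080"

def process_url(process_id, instance_id=None):
    """Build the REST URL for a process definition or a single instance."""
    url = f"{BASE_URL}/{process_id}"
    return f"{url}/{instance_id}" if instance_id is not None else url

# Create:  POST   process_url("orders")        with the process variables as JSON
# List:    GET    process_url("orders")
# Get one: GET    process_url("orders", "42")
# Modify:  PUT    process_url("orders", "42")  with the updated variables
# Delete:  DELETE process_url("orders", "42")
print(process_url("orders", "42"))
```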
This new capability will be available as a Tech Preview in BAMOE version 9.3.0, giving early adopters a chance to explore and shape the future of agentic AI integration. We invite you to try it out and share your feedback!
Prerequisites and Setup
- A running BAMOE Business Services instance, which:
- is Quarkus-based;
- exposes OpenAPI v3 endpoints;
- The Business Services should use meaningful model, field, and node names, to improve the effectiveness of interaction with AI agents.
- The "bamoe / mcp-server" image available in your Container Orchestration Platform. The only required parameter is the set of BAMOE OpenAPI v3 endpoints.
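For the OpenAPI v3 requirement, a quick sanity check is to fetch the service's OpenAPI document (Quarkus serves it at `/q/openapi` by default) and confirm it declares a 3.x version. A minimal sketch that validates an inline sample rather than making a network call:

```python
import json

def is_openapi_v3(spec):
    """Return True if a parsed OpenAPI document declares a 3.x version."""
    return str(spec.get("openapi", "")).startswith("3.")

# Against a running Quarkus-based Business Services instance you would fetch:
#   curl http://localhost:8080/q/openapi?format=json
sample = json.loads('{"openapi": "3.0.3", "paths": {"/orders": {}}}')
print(is_openapi_v3(sample))  # True
```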
Execution
Once the setup is complete, the MCP server is ready to run. At startup, it parses all exposed BAMOE Business Services, translating their capabilities into MCP Tools. These tools can then be consumed by any MCP-compatible client or agentic AI environment that supports the MCP protocol—enabling end users to leverage BAMOE’s capabilities seamlessly within AI-driven workflows.
After startup, the MCP Server begins serving requests from both MCP Clients and AI Agents. Its internal state is immutable, meaning that any changes made to the underlying Business Services are not automatically reflected. To apply updates or modifications, a new MCP Server bootstrap is required.
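Conceptually, the bootstrap step boils down to flattening the OpenAPI `paths` section into one tool descriptor per operation. The sketch below is a simplified illustration of that idea, not the actual server code; the real mapping also carries parameter and response schemas:

```python
def openapi_to_tools(spec):
    """Derive MCP-style tool descriptors from a parsed OpenAPI v3 document.
    Simplified: one tool per (path, HTTP method) operation."""
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            fallback = f"{method}_{path.strip('/').replace('/', '_')}"
            tools.append({
                "name": op.get("operationId", fallback),
                "description": op.get("summary", ""),
            })
    return tools

spec = {"paths": {"/orders": {"post": {
    "operationId": "createOrder",
    "summary": "Create a new Order process instance"}}}}
print([t["name"] for t in openapi_to_tools(spec)])  # ['createOrder']
```

Because this tool catalog is built once at bootstrap, redeploying or changing the underlying Business Services means restarting the MCP Server to rebuild it.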
Please refer to the IBM documentation for more details about the MCP Server setup and execution.
Any MCP-compatible client that supports the MCP protocol is a valid choice to consume the BAMOE MCP Server. Examples are Goose, 5ire, LM Studio, and Claude Desktop.
To enable communication between an MCP Client and an MCP Server, you typically register a new extension in the MCP Client of your choice, since the MCP Server is treated as an extension of the client. The required data are:
The SSE transport type is supported but deprecated. STDIO (Standard I/O) is not supported.
Examples
One of the most natural and impactful use cases for the exposed MCP Tools is their integration with AI Agents, such as large language models (LLMs). This interaction enables agents to dynamically invoke decision models, rule engines, and process APIs—bringing BAMOE’s automation capabilities directly into agentic workflows.
Below is an example of a DMN model execution via LLM chat interaction, where an agent processes a traffic violation scenario using the TrafficViolation decision model (dmn-quarkus-example):
Goose is used as the MCP-compatible client, and Ollama with Granite4.0-preview:tiny as the LLM.
Note: Results may vary depending on the specific MCP client implementation and the LLM used. Different agents may interpret inputs or invoke tools in slightly different ways, leading to variations in execution flow or output. Any LLM that supports MCP Tools should return similar results; however, not all LLMs support MCP Tools.
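Behind the chat interaction, the tool call reduces to a JSON payload built against the DMN model's input nodes. The sketch below is hypothetical: the field names follow the TrafficViolation model from the dmn-quarkus-example and must match your model's input nodes exactly, which is one reason meaningful names matter:

```python
import json

def traffic_violation_payload(points, speed_limit, actual_speed):
    """Build the input-data payload for the TrafficViolation DMN model.
    Keys mirror the model's input nodes (dmn-quarkus-example)."""
    return json.dumps({
        "Driver": {"Points": points},
        "Violation": {"Type": "speed",
                      "Speed Limit": speed_limit,
                      "Actual Speed": actual_speed},
    })

# The agent submits this body to the model's REST endpoint via the MCP tool.
payload = traffic_violation_payload(10, 100, 135)
```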
Below is the same example using a different LLM (qwen3:1.7b):
Below is an example of a BPMN model life-cycle management via LLM chat interaction, where a process instance is started, updated, and removed, based on the Order.bpmn model (process-quarkus-example):
Goose is used as the MCP-compatible client, and Ollama with Granite4.0-preview:tiny as the LLM.
Conclusion
With this new feature, BAMOE officially enters the AI Agentic era — marking the first step in a broader strategy to integrate AI capabilities into the platform.
The MCP Server will continue to evolve, with enhanced capabilities designed to better address business needs and increase overall effectiveness — reinforcing IBM’s commitment to advancing and expanding BAMOE.
References
https://modelcontextprotocol.io/