Introducing the Instana MCP server: Conversational AI for observability

Authors: @Madhu Tadiparthi, @Elina priyadarshinee, @Guangya Liu

The Model Context Protocol (MCP) is an innovative framework that enables connecting AI assistants, such as Claude, GitHub Copilot, and other LLM-powered tools, with external tools and data sources through standardized interfaces. MCP servers transform complex API interactions into natural language conversations. This approach makes enterprise tools accessible through simple queries. Integrating MCP with an enterprise tool, such as Instana, lays a new foundation for AI-driven observability.

Instana is a leading application performance monitoring (APM) and observability platform that provides real-time insights into application performance, infrastructure health, and user experience. With automatic discovery, distributed tracing, and comprehensive monitoring across cloud-native and traditional environments, Instana helps organizations maintain optimal application performance and quickly resolve issues before they impact users. In this blog, you will get an overview of the Instana MCP server. You’ll learn how to configure the MCP server to communicate with MCP clients, such as Claude Desktop, using both the standard input and output (stdio) and streamable HTTP communication modes.

Instana MCP server

The Instana MCP server is built as a comprehensive wrapper around Instana’s public REST APIs. It translates conversational queries into precise API calls and formats responses for AI assistants. This approach eliminates the traditional barriers between monitoring data and development workflows, allowing teams to access critical observability insights without leaving their preferred AI-powered environments.
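
To make the idea concrete, here is a minimal sketch of what such a wrapper can look like using the MCP Python SDK: a tool function that accepts simple parameters, calls an Instana REST endpoint, and returns a response formatted for an AI assistant. The tool name, endpoint path, query parameter, and environment variable names are illustrative assumptions, not code from the repository.

# Minimal sketch of an MCP tool wrapping an Instana REST endpoint.
# Endpoint path, parameter, and environment variable names are assumptions.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("instana-sketch")

INSTANA_BASE_URL = os.environ["INSTANA_BASE_URL"]    # e.g. https://your-instana-instance.instana.io
INSTANA_API_TOKEN = os.environ["INSTANA_API_TOKEN"]

@mcp.tool()
def list_applications(name_filter: str = "") -> str:
    """List applications monitored by Instana, optionally filtered by name."""
    resp = requests.get(
        f"{INSTANA_BASE_URL}/api/application-monitoring/applications",  # assumed endpoint
        headers={"Authorization": f"apiToken {INSTANA_API_TOKEN}"},
        params={"nameFilter": name_filter} if name_filter else None,
        timeout=30,
    )
    resp.raise_for_status()
    apps = resp.json().get("items", [])
    # Return a compact, readable summary the AI assistant can reason over.
    return "\n".join(app.get("label", "unknown") for app in apps) or "No applications found."

if __name__ == "__main__":
    mcp.run(transport="stdio")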

As an open-source project on GitHub, the Instana MCP server encourages community contributions and customizations. This makes enterprise-grade observability accessible to development teams of all sizes while maintaining flexibility to adapt to organizational needs.

Architecture

The Instana MCP server uses a clean, modular architecture designed for scalability and ease of use.

Key architectural components:

  • MCP host integration: Connects seamlessly with popular AI platforms, such as Claude Desktop, GitHub Copilot, and Visual Studio Code.
  • Dual transport support: Supports both stdio and streamable HTTP transport modes for maximum compatibility.
  • Modular tool organization: Organizes tools by functional categories, such as application resources, infrastructure monitoring, and alert configuration.
  • Instana API abstraction: Provides a clean abstraction layer over Instana’s REST APIs with intelligent response formatting.

Supported Instana tools

The Instana MCP server offers more than 40 specialized tools covering Instana’s full observability capabilities. These tools are organized into logical categories, enabling you to monitor and analyze systems through natural language interactions.

Overview of tool categories

🚀 Application performance monitoring

  • Application metrics: Provides real-time and historical performance data for applications, endpoints, and services.
  • Application resources: Supports discovery and management of applications, services, and their dependencies.
  • Alert configuration: Enables complete lifecycle management of Application smart alert configurations.

🏗️ Infrastructure monitoring

  • Infrastructure resources: Includes host monitoring, snapshot analysis, and software inventory management.
  • Infrastructure catalog: Supports exploration of available metrics and plugins.
  • Infrastructure topology: Provides dependency mapping and relationship analysis between components.
  • Infrastructure analysis: Enables deep metrics analysis.

Key capabilities

  • Comprehensive API coverage: Access to Instana’s complete REST API suite, including metrics retrieval, resource discovery, alert management, and infrastructure analysis.
  • Intelligent query processing: Features automatic parameter validation, pagination handling, response formatting, and contextual error messaging optimized for AI interactions.
  • Flexible filtering: Offers advanced filtering options, such as time-based queries, tag-based searches, metric thresholds, and text-based discovery across your observability stack (see the sketch after this list).
  • Natural language interface: Each tool is designed to understand conversational queries such as:

  • “Show me applications with high error rates in the last hour”
  • “What infrastructure plugins are available for monitoring databases?”
  • “List all active alert configurations for our production services.”
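
As a rough illustration of this translation, the sketch below shows how a request such as “Show me applications with high error rates in the last hour” could be turned into an explicit time window and threshold before the Instana API is called. The field names here are assumptions chosen for illustration, not the server’s actual request schema.

# Sketch: turning a conversational request into concrete query parameters.
# Field names (timeFrame, windowSize, errorRateThreshold) are illustrative assumptions.
import time

def build_error_rate_query(window_minutes: int = 60, error_rate_threshold: float = 0.05) -> dict:
    # "last hour" becomes an explicit time window in epoch milliseconds.
    now_ms = int(time.time() * 1000)
    return {
        "timeFrame": {
            "to": now_ms,                               # end of the window
            "windowSize": window_minutes * 60 * 1000,   # look back this far, in milliseconds
        },
        # "high error rates" becomes a concrete threshold to filter results against.
        "errorRateThreshold": error_rate_threshold,
    }

print(build_error_rate_query())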

Getting started

Explore available tools through simple queries, such as:

  • “What Instana monitoring capabilities are available?”
  • “Show me tools for application performance analysis.”
  • “Help me understand my infrastructure topology.”

For detailed tool documentation, parameter specifications, and advanced usage examples, visit the GitHub repository. You can find comprehensive API references and integration guides there.

After exploring available tools, you can integrate the Instana MCP server with Claude Desktop to enable conversational observability in your environment.

Connecting Claude Desktop

Claude Desktop provides a straightforward way to interact with the Instana MCP server, offering a seamless chat interface for querying observability data.

For more details, refer to the Claude Desktop Setup guide.
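
As a reference point, a stdio-based entry in claude_desktop_config.json generally follows the shape sketched below. Treat the repository path and environment variable names as placeholders and use the exact values from the Claude Desktop Setup guide for your environment.

{
  "mcpServers": {
    "Instana MCP Server": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/instana-mcp-server", "src/core/server.py"],
      "env": {
        "INSTANA_BASE_URL": "https://your-instana-instance.instana.io",
        "INSTANA_API_TOKEN": "your_instana_api_token"
      }
    }
  }
}

If you use selective tool loading (described later in this post), the same --tools flag can be appended to the args array.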

Using Instana tools in Claude Desktop

Once connected, you can see the Instana tools available in the Claude Desktop interface. You can now use natural language to query your observability data:

Examples

Query application performance: “Show me all applications monitored by Instana and their current health status.”

Investigate infrastructure issues: “Get me the recent Kubernetes events from the last 6 hours that might indicate deployment problems.”

Analyze alert configurations: “List all smart alert configurations for our production applications and show which ones are currently active.”

Explore available metrics: “What monitoring capabilities does Instana have for PostgreSQL databases? Show me the available metrics and payload keys.”

Streamable HTTP mode

Use streamable HTTP mode for web-based deployments or when you need multiple concurrent connections:

  1. Start the MCP server:
uv sync && uv run src/core/server.py --transport streamable-http --debug

  2. Configure Claude Desktop for streamable HTTP:

{
  "mcpServers": {
    "Instana MCP Server": {
      "command": "npx",
      "args": [
        "mcp-remote", "http://0.0.0.0:8000/mcp/",
        "--header", "instana-base-url: https://your-instana-instance.instana.io",
        "--header", "instana-api-token: your_instana_api_token"
      ]
    }
  }
}

Note that this setup allows more robust connections and easier debugging of server-side issues.

The MCP server supports selective tool loading to optimize performance and reduce resource usage. You can enable only the tool categories you need for your specific use case.

Available tool categories

  • infra — Infrastructure monitoring tools
  • app — Application performance tools
  • events — Event monitoring tools

Usage

For example, to load only infrastructure and event tools:

uv run src/core/server.py --tools infra,events --transport streamable-http

To load only application performance tools:

uv run src/core/server.py --tools app --transport streamable-http

Advantages of using the Instana MCP server

🚀 Accelerated troubleshooting

Transform complex API queries into natural language requests. Instead of crafting intricate REST API calls, you can ask: “Show me recent alerts for the robot-shop application” or “What Kubernetes events occurred in the last 24 hours?”

🛠️ Developer-friendly integration

Integrate observability data directly into your development environment through VS Code, Claude Desktop, or custom MCP clients. This eliminates context switching between tools.

📊 Comprehensive API coverage

Access more than 40 Instana API endpoints covering:

  • Application performance: Metrics, endpoints, services, and data analytics
  • Infrastructure monitoring: Host details, snapshots, and topology mapping
  • Alert management: Smart alert configurations, baselines, and notifications
  • Catalog information: Available metrics, plugins, and search capabilities

🔧 Flexible deployment options

Deploy using stdio mode for direct integration or streamable HTTP mode for web-based applications. Note that streamable HTTP mode supports multiple concurrent connections.

🌐 Open-source collaboration

The Instana MCP server is built as an open-source project, encouraging community contributions and customizations for specific enterprise needs.

Getting started with contributions

Check out the contribution guidelines for details on how to contribute.

Contribution guidelines

  • Follow existing code patterns and naming conventions.
  • Write comprehensive tool descriptions to assist the LLM in understanding when to use each tool (see the sketch after this list).
  • Include error handling and meaningful error messages.
  • Add tests for the new functionality.
  • Update documentation, including README.md and inline comments.
  • Submit pull requests with clear descriptions of your changes.
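
As an illustration of the description and error-handling guidelines, a contributed tool might look like the sketch below: the docstring tells the LLM when the tool applies, and failures return a readable message instead of an unhandled exception. The endpoint path and parameter names are assumptions, not code from the repository.

# Sketch of a contributed tool: descriptive docstring plus meaningful errors.
# Endpoint path, parameter, and environment variable names are assumptions.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("instana-contrib-sketch")

@mcp.tool()
def get_host_snapshots(host_query: str) -> str:
    """Fetch recent infrastructure snapshots for hosts matching a query.

    Use this tool when the user asks about the current state, configuration,
    or health of a specific host monitored by Instana.
    """
    base_url = os.environ["INSTANA_BASE_URL"]
    api_token = os.environ["INSTANA_API_TOKEN"]
    try:
        resp = requests.get(
            f"{base_url}/api/infrastructure-monitoring/snapshots",  # assumed endpoint
            headers={"Authorization": f"apiToken {api_token}"},
            params={"query": host_query},
            timeout=30,
        )
        resp.raise_for_status()
        return str(resp.json())
    except requests.RequestException as exc:
        # Meaningful error message the LLM can relay back to the user.
        return f"Failed to fetch snapshots for '{host_query}': {exc}"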

Known issues

In the current MCP setup, each time an LLM agent wants to invoke a tool, it must first evaluate which tool to use. This typically requires:

  • Injecting all available tool descriptions (name, purpose, input/output schema) into the prompt.
  • Asking the LLM to reason over the full list to select the right tool.

However, this introduces the following issues:

  • High token consumption occurs, especially when the number of tools increases.
  • Increased latency and cost for every tool selection step.
  • Repetitive prompt construction, even when the tool choice is obvious or reused across queries.

To address this issue, the MCP host or client should implement strategies that reduce token usage when selecting tools from the MCP server.
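
One such strategy, sketched below purely as an illustration rather than a feature of the Instana MCP server, is for the client to pre-filter the tool list so that only tools whose names or descriptions match the user’s query are injected into the prompt.

# Hypothetical client-side pre-filter: forward only tools whose descriptions
# mention keywords from the user's query, reducing prompt tokens.
def select_relevant_tools(tools: list[dict], query: str, limit: int = 5) -> list[dict]:
    words = {w.lower() for w in query.split()}
    scored = []
    for tool in tools:
        text = (tool.get("name", "") + " " + tool.get("description", "")).lower()
        score = sum(1 for w in words if w in text)
        if score:
            scored.append((score, tool))
    # Keep the best matches only; everything else stays out of the prompt.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [tool for _, tool in scored[:limit]]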

Get started today

Ready to revolutionize your observability workflow?

  1. Explore the project: Visit the GitHub repository.
  2. Try the examples: Follow the quick start guide to connect with Claude Desktop or GitHub Copilot.
  3. Join the community: Contribute new tools, report issues, or suggest enhancements.
  4. Share your experience: Share how you use the Instana MCP server in your workflow.

The future of observability is conversational, and with the Instana MCP server, that future is available today. Use it to transform how you interact with your monitoring data and unlock higher levels of productivity in your development and operations workflows.

We welcome contributions from the community! The Instana MCP server is designed to be extensible, making it easy for developers to add new tools and enhance existing functionality.

Additional resources

1. Cline MCP Rules

2. MCP Use Tool Access Control

The Instana MCP server is an open-source project. Visit the GitHub repository to get started, contribute, or learn more about integrating AI-powered observability into your development workflow.

