MCP Server for AI Agents
Why?
The Model Context Protocol (MCP) is an open standard that lets AI assistants and agents connect to external tools and data sources. Instead of writing custom integration code, AI applications discover and call tools through a standardized protocol.
Infinispan ships a built-in MCP server that exposes cache operations as MCP tools. This means AI assistants and agents such as Claude, GitHub Copilot, or custom LLM-powered applications can directly:
- Read and write cache entries — store and retrieve data without writing any Infinispan client code.
- Query and search — run Ickle queries and vector similarity searches through natural language.
- Manage caches — create, configure, and inspect caches as part of an agentic workflow.
- Access cluster state — check cluster health, member nodes, and cache statistics.
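At the protocol level, each of these capabilities is exposed as a named MCP tool that clients invoke with a JSON-RPC 2.0 `tools/call` request. As a hedged sketch, a cache write might look like this on the wire; the tool name `put_entry` and its argument names are illustrative assumptions, not Infinispan's documented tool schema (discover the real one via `tools/list`):

```python
import json

# Hypothetical MCP tools/call request writing one cache entry.
# "put_entry" and its argument names are assumptions for illustration;
# the server's tools/list response describes the actual tool schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "put_entry",  # assumed tool name
        "arguments": {
            "cache": "products",  # target cache
            "key": "sku-1001",
            "value": json.dumps({"name": "widget", "price": 9.99}),
        },
    },
}

wire = json.dumps(request)
print(wire)
```

The agent never constructs these frames by hand in practice; its MCP client library builds them from the tool schema it discovered, which is what makes the integration code-free.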
How it works
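The general flow follows the MCP specification: a client connects to the server over a supported transport, performs an `initialize` handshake, discovers the available tools with `tools/list`, and then invokes them with `tools/call`, receiving results as JSON-RPC responses. The sketch below shows that client-side message sequence; the frame shapes follow the MCP spec, while the Infinispan tool name at the end (`get_entry`) is an illustrative assumption:

```python
import itertools
import json

# Minimal sketch of the client-side MCP message sequence:
# initialize -> tools/list (discovery) -> tools/call (invocation).
# Frame shapes follow the MCP JSON-RPC spec; the tool name used at
# the end ("get_entry") is an illustrative assumption.

_ids = itertools.count(1)

def rpc(method, params=None):
    """Build one JSON-RPC 2.0 request frame with an auto-incremented id."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return msg

handshake = [
    rpc("initialize", {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "demo-agent", "version": "0.1"},
    }),
    rpc("tools/list"),        # discover the tools the server exposes
    rpc("tools/call", {       # invoke one of them
        "name": "get_entry",  # assumed tool name
        "arguments": {"cache": "products", "key": "sku-1001"},
    }),
]

for frame in handshake:
    print(json.dumps(frame))
```

Because discovery is part of the protocol, an agent learns the server's tool names, descriptions, and argument schemas at connection time rather than from hard-coded integration code.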
Use cases
AI-assisted development and debugging — connect your IDE’s AI assistant to a running Infinispan cluster. Ask questions like “show me all sessions older than 1 hour” or “what’s the hit ratio for the products cache?” without writing queries manually.
Autonomous agents with persistent state — agents that need durable, structured state can read and write it through Infinispan via MCP without bundling a client library; the protocol handles tool discovery, argument serialization, and error reporting.
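To make the persistent-state pattern concrete, here is a sketch of a thin state wrapper an agent might keep over its MCP session. The transport is stubbed with an in-memory dict so the flow is runnable here; a real agent would send the same `tools/call` requests to the Infinispan MCP server, and the tool names (`put_entry`, `get_entry`) are assumptions:

```python
import json

# Sketch of an agent persisting structured state through MCP cache
# tools. The transport is an in-memory stand-in for the MCP round
# trip; tool names "put_entry"/"get_entry" are illustrative assumptions.

class AgentState:
    def __init__(self, transport, cache="agent-state"):
        self.transport = transport  # callable: (tool_name, args) -> result
        self.cache = cache

    def save(self, key, value):
        # Serialize the value and write it via the assumed put tool.
        self.transport("put_entry", {
            "cache": self.cache, "key": key, "value": json.dumps(value)})

    def load(self, key):
        # Read back via the assumed get tool; None means no entry.
        raw = self.transport("get_entry", {"cache": self.cache, "key": key})
        return json.loads(raw) if raw is not None else None

# In-memory stand-in for the server side of the MCP calls.
_store = {}
def fake_transport(tool, args):
    if tool == "put_entry":
        _store[(args["cache"], args["key"])] = args["value"]
    elif tool == "get_entry":
        return _store.get((args["cache"], args["key"]))

state = AgentState(fake_transport)
state.save("task-42", {"step": 3, "done": False})
print(state.load("task-42"))  # -> {'step': 3, 'done': False}
```

Swapping `fake_transport` for a real MCP client session is the only change needed; the agent-facing `save`/`load` interface stays the same.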
Multi-agent collaboration — multiple AI agents connected to the same Infinispan cluster can share state through cache operations exposed via MCP, enabling coordinated workflows.