Looking to extend your LLMs with real-world capabilities like web fetching and file access? Start with Jeremias Werner's blog, a hands-on guide to deploying Model Context Protocol (MCP) servers on IBM Cloud Code Engine, backed by IBM Cloud Object Storage (COS).
What You'll Learn:
- How to deploy MCP servers (mcp-npx-fetch, server-filesystem) as containers on Code Engine.
- How to expose them via SSE using supergateway for remote LLM access.
- How to integrate with COS for persistent, scalable, and secure file storage.
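The two deployment steps above can be sketched on the command line. This is an illustrative outline only: the image name, registry namespace, project name, and mount path are hypothetical placeholders, not values from the blog.

```shell
# Sketch only: image, namespace, project, and path names are hypothetical.

# 1. Inside the container, wrap a stdio MCP server with supergateway so it
#    is reachable over HTTP/SSE (this would be the container's entrypoint):
npx -y supergateway \
  --stdio "npx -y @modelcontextprotocol/server-filesystem /mnt/data" \
  --port 8000

# 2. Deploy the container image to Code Engine as an auto-scaling application:
ibmcloud ce project select --name my-mcp-project
ibmcloud ce application create \
  --name mcp-filesystem \
  --image icr.io/my-namespace/mcp-filesystem:latest \
  --port 8000
```

Because Code Engine scales to zero, the wrapped server consumes no compute while idle and spins up when a client connects.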
Why It Matters:
- Code Engine provides serverless, auto-scaling infrastructure, perfect for MCP tools that sit idle until called.
- COS enables LLMs to read/write files across sessions, share context between agents, and store outputs securely.
- Supergateway bridges local MCP tools to HTTP endpoints, making them usable by clients like Claude Desktop and LangChain.
Use Cases:
- Summarize live web pages with fetch_txt.
- Read and write shared documents with read_txt and write_txt.
- Build enterprise-grade tools that wrap internal services like IBM MQ or Process Mining.
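Under the hood, clients such as Claude Desktop or LangChain invoke tools like fetch_txt through JSON-RPC 2.0 messages, which supergateway relays to the MCP server over SSE. A minimal sketch of building such a tools/call request (the example URL is a placeholder, not from the blog):

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Ask the fetch server to return a page as plain text for summarization.
msg = make_tool_call(1, "fetch_txt", {"url": "https://example.com"})
print(msg)
```

The same envelope works for read_txt and write_txt; only the tool name and arguments change.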
🔗 Ready to build smarter agents with scalable infrastructure?
Read the full blog here
------------------------------
Danielle Kingberg
Sr. GTM Product Manager, IBM Cloud Object Storage
------------------------------