If you're using Claude Desktop with MCP (Model Context Protocol) servers, you've probably noticed something frustrating: each tool wants its own runtime. One needs Node.js 20, another requires Python 3.12 with specific packages, and suddenly your clean development machine is cluttered with conflicting dependencies.
There's a better way. In this guide, I'll show you how to containerize your MCP servers so that nothing gets installed on your host system except Docker itself.
The Problem with Direct Installation
A typical Claude Desktop MCP configuration looks like this:
{
"mcpServers": {
"some-tool": {
"command": "npx",
"args": ["-y", "some-mcp-package"],
"env": {
"API_KEY": "your-key"
}
}
}
}
This approach has several problems:
- npx downloads packages to your npm cache on every run
- Different tools need different Node versions — version managers help, but add complexity
- Python-based MCPs need virtual environments or pollute your system Python
- Updates can break things — no easy rollback mechanism
- Onboarding a new teammate means walking them through the whole mess again
The Solution: Docker All The Things
The fix is straightforward: run each MCP server inside its own Docker container. Claude Desktop doesn't care where the server actually lives — it communicates over stdio, so as long as the container is running and talking the right protocol, everything works.
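To make "talking the right protocol" concrete, here is roughly what the first message on that stdio pipe looks like. This is a simplified sketch: the real MCP initialize params also carry a protocol version and client capabilities, elided here.

```python
import json

# Claude Desktop and the server exchange newline-delimited JSON-RPC 2.0
# messages over the process's stdin/stdout, so the transport is the same
# whether the server runs on the host or inside `docker run -i`.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {},  # simplified; real params include protocolVersion, capabilities, etc.
}
wire_message = json.dumps(initialize_request) + "\n"
```

Because the contract is just "JSON lines on a pipe," Docker's `-i` flag is all the plumbing you need.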
Here's the same configuration, Dockerized:
{
"mcpServers": {
"some-tool": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "API_KEY=your-key",
"some-tool-mcp"
]
}
}
}
The key flags:
- -i — Interactive mode (required for stdio communication)
- --rm — Remove container after exit (no cleanup needed)
- -e — Pass environment variables (API keys, credentials)
Step-by-Step: Dockerizing an MCP Server
Let's walk through a real example. We'll containerize a Google Slides MCP server.
Step 1: Create the Dockerfile
FROM node:20-slim
# Install git (needed to clone the repo)
RUN apt-get update && apt-get install -y git && \
rm -rf /var/lib/apt/lists/*
# Clone and build the MCP server
WORKDIR /app
RUN git clone https://github.com/matteoantoci/google-slides-mcp.git . && \
npm install && \
npm run build
# Run via stdio
ENTRYPOINT ["node", "/app/build/index.js"]
Step 2: Build the Image
docker build -t google-slides-mcp .
Step 3: Configure Claude Desktop
Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or the equivalent on your platform:
{
"mcpServers": {
"google-slides": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "GOOGLE_CLIENT_ID=your-client-id",
"-e", "GOOGLE_CLIENT_SECRET=your-secret",
"-e", "GOOGLE_REFRESH_TOKEN=your-token",
"google-slides-mcp"
]
}
}
}
Step 4: Restart Claude Desktop
Quit completely (Cmd+Q on macOS) and reopen. Your MCP server now runs in a container — if Claude shows the tool in its toolbar, you're done.
Handling Output Files with Volume Mounts
Some MCP servers generate files — images, diagrams, documents. These need to be accessible on your host system. Use Docker volume mounts:
{
"mcpServers": {
"image-generator": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-v", "/path/on/host:/output",
"-e", "OUTPUT_DIR=/output",
"-e", "API_KEY=your-key",
"image-generator-mcp"
]
}
}
}
The -v flag maps a host directory to a container path. Files written to /output inside the container appear in /path/on/host on your machine.
On macOS, make sure the host path is under a directory Docker Desktop is allowed to access (check Settings → Resources → File Sharing). Otherwise the mount silently fails.
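One hedge against that failure mode: create and verify the host directory before wiring it into the config. A small sketch (the helper name and example path are illustrative, not prescribed by any tool):

```python
import pathlib

def ensure_output_dir(path):
    """Create the host side of a volume mount and return its absolute path.
    A missing host directory is a common cause of confusing mount behavior:
    Docker may create it with surprising ownership, or the mount may appear empty."""
    p = pathlib.Path(path).expanduser().resolve()
    p.mkdir(parents=True, exist_ok=True)
    return p

# Example (path is illustrative):
# print(ensure_output_dir("~/mcp-docker/data/images"))  # paste into the -v argument
```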
Project Structure for Multiple MCP Servers
When managing several containerized MCPs, organization matters. Here's a structure that scales:
mcp-docker/
├── servers/
│ ├── google-slides/
│ │ └── Dockerfile
│ ├── image-generator/
│ │ └── Dockerfile
│ └── diagram-tool/
│ └── Dockerfile
├── data/
│ ├── images/ # image-generator output
│ └── diagrams/ # diagram-tool output
└── README.md
Each server gets its own Dockerfile. Shared output goes in data/. The README documents how to rebuild images.
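With this layout, rebuilding every image is a short script. Here's a sketch assuming the <name>-mcp tag convention used above; build_all is a hypothetical helper, not part of any tooling:

```python
import pathlib
import subprocess

def build_all(root="servers", runner=subprocess.run):
    """Build one image per servers/<name>/Dockerfile, tagged <name>-mcp."""
    commands = []
    for dockerfile in sorted(pathlib.Path(root).glob("*/Dockerfile")):
        context = dockerfile.parent
        cmd = ["docker", "build", "-t", f"{context.name}-mcp", str(context)]
        commands.append(cmd)
        runner(cmd, check=True)  # swap in a stub runner for a dry run
    return commands
```

The injectable runner makes the script easy to test without Docker installed; in normal use, just call build_all() from the repo root.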
Common Patterns
Pattern 1: npm Package MCP
For MCP servers published to npm:
FROM node:20-slim
RUN npm install -g @some-org/some-mcp-server
ENTRYPOINT ["some-mcp-server"]
Pattern 2: GitHub Repository MCP
For MCP servers only available as source:
FROM node:20-slim
RUN apt-get update && apt-get install -y git && rm -rf /var/lib/apt/lists/*
WORKDIR /app
RUN git clone https://github.com/org/repo.git . && npm install && npm run build
ENTRYPOINT ["node", "dist/index.js"]
Pattern 3: Python MCP
For Python-based MCP servers:
FROM python:3.12-slim
RUN pip install some-mcp-package
ENTRYPOINT ["python", "-m", "some_mcp_module"]
Pattern 4: Multi-Runtime MCP
Some MCPs need both Node.js and Python:
FROM node:20-slim
RUN apt-get update && apt-get install -y python3 python3-pip && \
rm -rf /var/lib/apt/lists/*
RUN ln -s /usr/bin/python3 /usr/bin/python
# --break-system-packages needed on Debian 12+ based images
RUN pip3 install --break-system-packages required-python-package
RUN npm install -g required-npm-package
ENTRYPOINT ["the-mcp-command"]
The --break-system-packages flag is required on newer Debian-based images (like node:20-slim) where pip refuses to install into the system Python by default. It's safe here because the container is disposable.
Troubleshooting
"Server disconnected" errors
Check that your Dockerfile's ENTRYPOINT runs the correct command. Test manually:
echo '{"jsonrpc":"2.0","method":"initialize","params":{},"id":1}' | \
docker run -i --rm your-mcp-image
You should get a JSON response back.
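The same check can be scripted so you can run it against every image after a rebuild. A minimal sketch, assuming the server answers the first request with a single line of JSON; smoke_test and its simplified initialize params are hypothetical, not part of any MCP SDK:

```python
import json
import subprocess

def smoke_test(cmd, timeout=30):
    """Send a JSON-RPC initialize request to `cmd` over stdio and
    return the parsed first line of its response (hypothetical helper)."""
    request = json.dumps({"jsonrpc": "2.0", "id": 1,
                          "method": "initialize", "params": {}}) + "\n"
    proc = subprocess.run(cmd, input=request, capture_output=True,
                          text=True, timeout=timeout)
    return json.loads(proc.stdout.splitlines()[0])

# Against a containerized server (requires the image to be built):
# smoke_test(["docker", "run", "-i", "--rm", "your-mcp-image"])
```

If json.loads raises here, you've found the problem: the server is writing something other than protocol JSON to stdout.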
Container starts but tools don't appear
The MCP server is probably printing something non-JSON to stdout — startup banners, log lines, anything. Even one extra line breaks the protocol because Claude is expecting pure JSON from the first byte. Check if the server has a "quiet" mode or redirect logs to stderr.
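If you control the server's source, the fix is to keep stdout reserved for protocol messages. A minimal sketch in Python (the log/send helpers are illustrative, not from any SDK):

```python
import json
import sys

def log(message):
    """Diagnostics go to stderr, which the protocol parser never sees."""
    print(message, file=sys.stderr)

def send(payload):
    """Only JSON-RPC messages are ever written to stdout."""
    line = json.dumps(payload)
    print(line, flush=True)
    return line

log("server starting")  # safe: stderr is invisible to Claude
send({"jsonrpc": "2.0", "id": 1, "result": {}})  # protocol output only
```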
Permission denied on volume mounts
See the volume mounts section above — Docker Desktop needs explicit permission to access host directories via Settings → Resources → File Sharing.
Wrapping Up
Dockerizing your MCP servers takes maybe an hour of upfront work per server. After that, adding a new MCP to your setup is a Dockerfile and two config lines. Rolling back a bad update is one command. Onboarding someone new is 'install Docker, clone the repo, done.'
The MCP ecosystem is moving fast — new servers are appearing weekly. Having containers as your deployment unit means you can try something new on Friday afternoon without spending Monday morning cleaning up your system.
Need Help?
If your team is trying to get AI tooling into a production workflow without it turning into a dependency nightmare, that's exactly the kind of thing I help with.
[Get in touch on Upwork] | [Get in Touch Directly →]