A production-ready MCP (Model Context Protocol) server that exposes a Python REPL as an AI tool.
Plug it into Claude, Cursor, or any MCP-compatible AI client, and watch it execute real Python code on demand.
You / AI Agent → MCP Tool Call → This Server → Live Python Output → Back to AI
Live Demo: This server is actively running and connected to Claude.ai as a custom tool.
- One tool, infinite power: `run_python` executes any Python code and returns the output
- Cloud-ready: deploy to Render, Railway, or any platform in minutes
- Plug & Play: works with Claude, Cursor, and any MCP-compatible client
- Built on FastMCP: the fastest way to build MCP servers in Python
- Minimal setup: 3 dependencies, ~10 lines of core code
```bash
git clone https://github.com/Pokemon455/fastmcp-python-repl-server.git
cd fastmcp-python-repl-server
pip install -r requirements.txt
python main.py
```

Server starts at: http://localhost:10000/mcp
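For reference, the "~10 lines of core code" presumably look something like the sketch below. This is a minimal reconstruction assuming the FastMCP 2.x API and langchain-experimental's `PythonREPL`; it is not copied from this repo, so tool name and details may differ from the actual `main.py`:

```python
from fastmcp import FastMCP
from langchain_experimental.utilities import PythonREPL

mcp = FastMCP("python-repl")
repl = PythonREPL()

@mcp.tool()
def run_python(code: str) -> str:
    """Execute a Python code string and return whatever it prints."""
    return repl.run(code)

if __name__ == "__main__":
    # Serve over HTTP so MCP clients can connect at http://localhost:10000/mcp
    mcp.run(transport="http", host="0.0.0.0", port=10000)
```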
Add this to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "python-repl": {
      "url": "http://localhost:10000/mcp",
      "transport": "streamable-http"
    }
  }
}
```

- Fork this repo
- Go to render.com → New Web Service
- Connect your GitHub repo
- Set start command: `python main.py`
- Done! Your MCP server is live
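If you prefer infrastructure-as-code, the same steps can be captured in a `render.yaml` blueprint. Field names below follow Render's blueprint spec; the service name is a hypothetical placeholder:

```yaml
services:
  - type: web
    name: fastmcp-python-repl   # placeholder; pick your own
    runtime: python
    buildCommand: pip install -r requirements.txt
    startCommand: python main.py
```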
| Package | Purpose |
|---|---|
| `fastmcp` | MCP server framework |
| `langchain-experimental` | Python REPL execution |
| `uvicorn` | Lightning-fast ASGI server |
Executes any Python code string and returns the output.

Example:

```python
# Input
code = "print(sum([1, 2, 3, 4, 5]))"

# Output
"15"
```

- Python REPL via MCP
- Render/Railway deployment support
- API key authentication
- Sandboxed execution environment
- Support for file I/O & pip installs
- Docker support
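Until sandboxing lands, the tool's contract is essentially "exec the string, return what it printed". A stdlib-only sketch of that behavior (the server itself delegates to langchain-experimental's REPL, so this function is illustrative, not the actual implementation):

```python
import io
from contextlib import redirect_stdout

def run_python(code: str) -> str:
    """Illustrative stand-in: execute code, return captured stdout."""
    buf = io.StringIO()
    with redirect_stdout(buf):
        # Note: no sandboxing; the code string runs with full interpreter access
        exec(code, {})
    return buf.getvalue().strip()

print(run_python("print(sum([1, 2, 3, 4, 5]))"))  # -> 15
```

This is also why the sandboxing roadmap item matters: anything an MCP client sends reaches `exec` directly.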
PRs are welcome! If you have ideas to make this better, open an issue or submit a pull request.
Pokemon455 · AI/ML & MCP Developer · GitHub
If this saved you time, drop a star! It keeps the project alive.