Gantz Run

v0.0.0

Turn your local scripts and APIs into MCP tools that AI agents can use

🚀 Quick Start

Get up and running in under a minute

1
Install Gantz Run

Install script (recommended for macOS/Linux):

curl -fsSL https://gantz.run/install.sh | sh

Also available via Homebrew, or go install for Go developers.
2
Create gantz.yaml

Minimal example:

name: my-tools
version: "1.0.0"

tools:
  - name: hello
    description: Say hello to someone
    parameters:
      - name: name
        type: string
        required: true
    script:
      shell: echo "Hello, {{name}}!"

Script tools (external commands, args, timeouts):

name: my-tools
version: "1.0.0"

tools:
  - name: analyze
    description: Run analysis script
    parameters:
      - name: input
        type: string
        required: true
    script:
      command: python3
      args: ["./scripts/analyze.py", "{{input}}"]
      working_dir: "/path/to/project"

  - name: deploy
    description: Run deploy script
    parameters:
      - name: env
        type: string
        default: "staging"
    script:
      command: ./scripts/deploy.sh
      args: ["{{env}}"]
      timeout: "5m"

HTTP tools (REST APIs with headers and JSON extraction):

name: api-tools
version: "1.0.0"

tools:
  - name: list_products
    description: Get all products
    parameters: []
    http:
      method: GET
      url: "https://mockserver.gantz.ai/products"
      extract_json: "data"

  - name: get_inventory
    description: Get inventory (API key auth)
    parameters: []
    http:
      method: GET
      url: "https://mockserver.gantz.ai/inventory"
      headers:
        X-API-Key: "demo-api-key-123"
      extract_json: "data"
3
Start the Server
$ gantz run

Gantz Run v0.1.0
Loaded 2 tools from gantz.yaml

Connecting to relay server...

  MCP Server URL: https://happy-panda.gantz.run

Press Ctrl+C to stop
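
Before wiring up an AI client, you can sanity-check the tunnel with a couple of raw JSON-RPC calls, the same ones the Gemini client further down sends. This is a minimal sketch, assuming the relay accepts plain JSON POSTs at the tunnel URL; the hostname is the example one from the output above, and the hello tool is the one defined in step 2.

#!/usr/bin/env python3
"""Quick sanity check of a Gantz Run tunnel (illustrative sketch)."""
import httpx

MCP_URL = "https://happy-panda.gantz.run"  # replace with your own tunnel URL

# List the tools the server exposes
resp = httpx.post(MCP_URL, json={"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
print(resp.json().get("result", {}).get("tools", []))

# Call the hello tool from the gantz.yaml in step 2
resp = httpx.post(MCP_URL, json={
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "hello", "arguments": {"name": "World"}},
})
print(resp.json().get("result", {}).get("content", []))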
✨ Features

Everything you need to build AI-powered tools

⚡ Script Tools: Run shell commands with {{param}} placeholders
🌐 HTTP Tools: Call REST APIs with headers and JSON extraction
🔗 Instant Tunneling: Get a public URL without port forwarding
🔐 Environment Vars: Use ${VAR} to inject secrets securely (see the sketch below)
📦 JSON Extraction: Extract nested data with dot notation (see the sketch below)
💻 Cross-Platform: Pre-built binaries for all platforms
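
The environment-variable and JSON-extraction features combine naturally in a single HTTP tool. The sketch below is illustrative: only the ${VAR} and extract_json syntax come from the examples above, while the URL, header name, and nested data.items path are assumptions.

# Hypothetical tool showing ${VAR} secret injection and dot-notation extraction.
# The endpoint, header, and "data.items" response shape are illustrative only.
tools:
  - name: list_orders
    description: Get orders from an authenticated API
    parameters: []
    http:
      method: GET
      url: "https://api.example.com/orders"
      headers:
        Authorization: "Bearer ${API_TOKEN}"  # injected from the environment
      extract_json: "data.items"              # dot notation into nested JSON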
🤖 AI Client Examples

Connect your MCP server to AI models

mcp_client_anthropic.py (pip install anthropic)
#!/usr/bin/env python3
"""Gantz MCP Client for Anthropic Claude"""
import anthropic

ANTHROPIC_API_KEY = "your-anthropic-api-key"
MCP_URL = "https://YOUR-TUNNEL.gantz.run/sse"

def main():
    client = anthropic.Anthropic(api_key=ANTHROPIC_API_KEY)

    response = client.beta.messages.create(
        model="claude-sonnet-4-5-20250929",
        max_tokens=2000,
        messages=[{"role": "user", "content": "list all files"}],
        mcp_servers=[{
            "type": "url",
            "url": MCP_URL,
            "name": "gantz",
        }],
        betas=["mcp-client-2025-04-04"]
    )

    for content in response.content:
        if hasattr(content, 'text'):
            print(content.text)

if __name__ == "__main__":
    main()
mcp_client_gemini.py (pip install google-genai httpx)
#!/usr/bin/env python3
"""Gantz MCP Client for Google Gemini"""
import httpx
from google import genai
from google.genai import types

GEMINI_API_KEY = "your-gemini-api-key"
MCP_URL = "https://YOUR-TUNNEL.gantz.run"

class MCPClient:
    def __init__(self, url):
        self.url = url
        self.tools = []

    def list_tools(self):
        resp = httpx.post(self.url, json={
            "jsonrpc": "2.0", "id": 1, "method": "tools/list"
        })
        self.tools = resp.json().get("result", {}).get("tools", [])
        return self.tools

    def call_tool(self, name, arguments):
        resp = httpx.post(self.url, json={
            "jsonrpc": "2.0", "id": 1, "method": "tools/call",
            "params": {"name": name, "arguments": arguments}
        })
        content = resp.json().get("result", {}).get("content", [])
        return content[0].get("text", "") if content else ""

    def get_gemini_tools(self):
        return [types.FunctionDeclaration(
            name=t["name"], description=t.get("description", ""),
            parameters=t.get("inputSchema", {"type": "object"})
        ) for t in self.tools]

def main():
    mcp = MCPClient(MCP_URL)
    mcp.list_tools()

    client = genai.Client(api_key=GEMINI_API_KEY)
    tool_config = types.Tool(function_declarations=mcp.get_gemini_tools())

    response = client.models.generate_content(
        model="gemini-2.0-flash",
        contents="list all files",
        config=types.GenerateContentConfig(tools=[tool_config])
    )

    for part in response.candidates[0].content.parts:
        if part.function_call:
            result = mcp.call_tool(part.function_call.name, dict(part.function_call.args))
            print(result)

if __name__ == "__main__":
    main()
mcp_client_openai.py (pip install openai-agents)
#!/usr/bin/env python3
"""Gantz MCP Client for OpenAI Agents SDK"""
import asyncio
from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client
from agents.mcp import MCPServerStreamableHttp

OPENAI_API_KEY = "your-openai-api-key"
MCP_URL = "https://YOUR-TUNNEL.gantz.run/sse"

async def main():
    set_default_openai_client(AsyncOpenAI(api_key=OPENAI_API_KEY))

    async with MCPServerStreamableHttp(
        name="Gantz MCP Server",
        params={"url": MCP_URL, "timeout": 30},
    ) as server:
        agent = Agent(
            name="Assistant",
            instructions="You are a helpful assistant.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "list all files")
        print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

🔒 We don't store your request or response data. Tunnel URLs are temporary.