Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
Updated Aug 4, 2025 - Python
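As a quick illustration of the OpenAI-format interface described above, the sketch below calls two different providers through litellm's completion() function. The specific model identifiers and the use of environment variables for provider API keys are assumptions made for the example, not requirements taken from this listing.

```python
# Minimal sketch: calling different providers through litellm's OpenAI-style
# completion() interface. Provider credentials (e.g. OPENAI_API_KEY,
# ANTHROPIC_API_KEY) are assumed to be set in the environment.
from litellm import completion

messages = [{"role": "user", "content": "Summarize MCP in one sentence."}]

# The call shape stays the same; only the model string selects the provider.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
claude_resp = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

# Responses mirror the OpenAI chat-completion schema.
print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)
```

When the Proxy Server (LLM Gateway) is used instead of the SDK, the same kind of request is typically sent by pointing a standard OpenAI client at the proxy's base URL.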
Build Secure and Compliant AI agents and MCP Servers. YC W23
MCP Gateway and Registry
CLI and Flask‑based web application that transforms plain‑English prompts into production‑ready, multi‑agent AI workflows. It generates native YAML for IBM WatsonX Orchestrate, Python code for CrewAI, CrewAI Flow, LangGraph, or ReAct, and includes a built‑in FastAPI MCP server wrapper for seamless deployment to the MCP Gateway.
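The sketch below is not this repository's code; it only illustrates what wrapping a generated workflow as an MCP server tool can look like, here using the FastMCP class from the official MCP Python SDK rather than the project's built-in FastAPI wrapper. The run_workflow entry point, the tool name, and the single prompt parameter are all hypothetical.

```python
# Hypothetical sketch: exposing a generated multi-agent workflow as an MCP tool.
# `run_workflow` stands in for whatever entry point the generator emits.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("workflow-server")

def run_workflow(prompt: str) -> str:
    # Placeholder for the generated CrewAI / LangGraph / ReAct workflow.
    return f"Workflow result for: {prompt}"

@mcp.tool()
def execute_workflow(prompt: str) -> str:
    """Run the generated multi-agent workflow for a plain-English prompt."""
    return run_workflow(prompt)

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default; gateways may expect an HTTP transport
```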