Freely convert any LLM provider API to another. OpenAI ↔ Claude ↔ DeepSeek ↔ Gemini. One codebase, unlimited flexibility.
Dealing with multiple LLM providers is painful
Each provider has its own request/response format: OpenAI uses a messages array, while Claude uses content blocks.
Switching providers means rewriting your entire codebase. Migration takes weeks.
Supporting multiple providers multiplies your code complexity and testing effort.
Choose your path to LLM freedom
SDK for seamless LLM API conversion
Zero-dependency TypeScript library that converts between any LLM provider format using an Intermediate Representation (IR) pattern.
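The IR pattern can be sketched in a few lines: every provider format converts to and from one neutral shape, so N providers need 2N converters instead of N² pairwise mappings. The types below are illustrative only, not the library's actual interfaces.

```typescript
// Neutral IR message (illustrative shape, not the library's real type)
interface IRMessage {
  role: "user" | "assistant" | "system";
  text: string;
}

// OpenAI-style message → IR
function fromOpenAI(msg: { role: string; content: string }): IRMessage {
  return { role: msg.role as IRMessage["role"], text: msg.content };
}

// IR → Claude-style content blocks
function toClaude(msg: IRMessage): {
  role: string;
  content: { type: "text"; text: string }[];
} {
  return { role: msg.role, content: [{ type: "text", text: msg.text }] };
}

// OpenAI format in, Claude format out
const claudeMsg = toClaude(fromOpenAI({ role: "user", content: "Hello!" }));
console.log(JSON.stringify(claudeMsg));
// → {"role":"user","content":[{"type":"text","text":"Hello!"}]}
```

Adding a new provider then means writing one pair of converters against the IR, never touching the existing ones.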
npm install @amux.ai/llm-bridge

Local LLM proxy with GUI
Cross-platform desktop app that runs a local proxy server. Manage API keys, monitor requests, and switch between providers, all without writing code.
Everything you need for LLM integration
Convert any format to any other format seamlessly
Core package has no runtime dependencies
Create custom adapters for any provider
Native SSE support for real-time responses
Full function/tool calling support across providers
Vision and image support where available
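As an illustration of what streaming support involves: providers deliver responses as Server-Sent Events, so a streaming bridge has to split each frame into data lines before it can translate chunks between formats. A minimal sketch (not the library's internals):

```typescript
// Parse a raw SSE frame into its data payloads, dropping the
// OpenAI-style "[DONE]" stream terminator.
function parseSSEFrame(frame: string): string[] {
  return frame
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((data) => data !== "[DONE]");
}

const frame = 'data: {"delta":"Hel"}\ndata: {"delta":"lo"}\ndata: [DONE]\n';
const chunks = parseSSEFrame(frame).map((d) => JSON.parse(d).delta);
console.log(chunks.join("")); // → "Hello"
```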
See how easy it is
// Use OpenAI format, call Claude API
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
});

// Your existing OpenAI-style code works as-is
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
});

Who benefits from Amux
Integrate multiple LLM providers with a single SDK. Switch models without code changes.
Implement provider failover, optimize costs, and maintain vendor flexibility.
Compare models easily. Run the same prompts across different providers for benchmarking.
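Provider failover, mentioned above, reduces to an ordered retry loop: try each provider in turn and return the first success. The Provider type and call signature here are hypothetical, for illustration only:

```typescript
// A provider is anything that can answer a prompt (hypothetical interface).
type Provider = { name: string; call: (prompt: string) => Promise<string> };

// Try providers in order until one succeeds; rethrow the last error if all fail.
async function withFailover(providers: Provider[], prompt: string): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return await p.call(prompt);
    } catch (err) {
      lastError = err; // fall through to the next provider
    }
  }
  throw lastError;
}

// Example: the primary provider is rate limited, the fallback answers.
const providers: Provider[] = [
  { name: "primary", call: async () => { throw new Error("rate limited"); } },
  { name: "fallback", call: async (p) => `echo: ${p}` },
];
withFailover(providers, "Hello").then((r) => console.log(r)); // → "echo: Hello"
```

The same loop doubles as a benchmarking harness: replace the early return with collecting every provider's answer to the same prompt.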
Get started in minutes with our SDK or Desktop app