Open Source

Bidirectional
LLM API Adapter

Freely convert any LLM provider API to another. OpenAI ↔ Claude ↔ DeepSeek ↔ Gemini. One codebase, unlimited flexibility.

OpenAI
Anthropic
DeepSeek
Moonshot
Zhipu
Qwen
Google
MiniMax
Open WebUI
Cline
Dify
Cherry Studio
Codex
Obsidian
Mate GPT
Sider
Claude Code
Open Code

The Problem

Dealing with multiple LLM providers is painful

Different API Formats

Each provider has its own request/response format. OpenAI uses messages, Claude uses content blocks.
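For instance, the same single-turn request looks like this in each provider's native shape (field names follow the public OpenAI and Anthropic chat APIs; the model names are just examples):

```typescript
// The same "Hello!" request in two providers' native formats.
const openaiRequest = {
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }], // content is a plain string
};

const anthropicRequest = {
  model: 'claude-3-5-sonnet-latest',
  max_tokens: 1024, // required by Anthropic, absent in OpenAI
  messages: [
    // content is an array of typed content blocks
    { role: 'user', content: [{ type: 'text', text: 'Hello!' }] },
  ],
};
```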

Vendor Lock-in

Switching providers means rewriting your entire codebase. Migration takes weeks.

Maintenance Burden

Supporting multiple providers multiplies your code complexity and testing effort.

The Solution

Choose your path to LLM freedom

For Developers

LLM Bridge

SDK for seamless LLM API conversion

Zero-dependency TypeScript library that converts between any LLM provider format using an Intermediate Representation (IR) pattern.

  • Zero runtime dependencies
  • Full TypeScript support
  • 7+ official adapters
  • Streaming & tool calling
  • Custom adapter support
  • Battle-tested in production
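The IR pattern mentioned above can be sketched roughly as follows. The interface and type names here are illustrative, not the library's actual API: each adapter translates only between its provider's wire format and one shared internal shape, so N providers need N adapters instead of N×N pairwise converters.

```typescript
// Hypothetical sketch of the Intermediate Representation (IR) pattern.
interface IRMessage {
  role: 'system' | 'user' | 'assistant';
  text: string;
}

interface Adapter<Req> {
  toIR(request: Req): IRMessage[];   // provider format -> IR
  fromIR(messages: IRMessage[]): Req; // IR -> provider format
}

// A minimal OpenAI-style adapter under these assumptions:
type OpenAIReq = {
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
};

const openaiLike: Adapter<OpenAIReq> = {
  toIR: (req) => req.messages.map((m) => ({ role: m.role, text: m.content })),
  fromIR: (msgs) => ({
    messages: msgs.map((m) => ({ role: m.role, content: m.text })),
  }),
};
```

Converting between two providers is then just `targetAdapter.fromIR(sourceAdapter.toIR(request))`.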
npm install @amux.ai/llm-bridge

View Documentation
For Everyone

Amux Desktop

Local LLM proxy with GUI

Cross-platform desktop app that runs a local proxy server. Manage API keys, monitor requests, and switch between providers, all without writing code.

  • Beautiful GUI interface
  • Local proxy server
  • API key management
  • Real-time monitoring
  • Request logging
  • Multi-language support
Download Now

Core Capabilities

Everything you need for LLM integration

Bidirectional

Convert any format to any other format seamlessly

Zero Dependencies

Core package has no runtime dependencies

Extensible

Create custom adapters for any provider

Streaming

Native SSE support for real-time responses
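Streaming providers deliver responses as Server-Sent Events: each chunk arrives on a `data:` line, and OpenAI-style streams end with a `[DONE]` sentinel. A generic parsing sketch (not Amux's internal implementation) looks like this:

```typescript
// Extract the data payloads from a raw SSE chunk, dropping the
// OpenAI-style "[DONE]" terminator.
function parseSSEChunk(chunk: string): string[] {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice('data: '.length))
    .filter((payload) => payload !== '[DONE]');
}
```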

Tool Calling

Full function/tool calling support across providers
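Tool definitions are another place the formats diverge. The same weather tool, declared per each provider's public API (the tool itself is a made-up example):

```typescript
// OpenAI nests the definition under `function` and calls the schema `parameters`.
const openaiTool = {
  type: 'function',
  function: {
    name: 'get_weather',
    description: 'Get the current weather for a city',
    parameters: {
      type: 'object',
      properties: { city: { type: 'string' } },
      required: ['city'],
    },
  },
};

// Anthropic flattens the definition and calls the schema `input_schema`.
const anthropicTool = {
  name: 'get_weather',
  description: 'Get the current weather for a city',
  input_schema: {
    type: 'object',
    properties: { city: { type: 'string' } },
    required: ['city'],
  },
};
```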

Multimodal

Vision and image support where available

Simple & Intuitive

See how easy it is

example.ts
// Use OpenAI format, call the Claude API
// (adapter export names assumed to match the package's documented API)
import { createBridge, openaiAdapter, anthropicAdapter } from '@amux.ai/llm-bridge';

const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
});

// Your existing OpenAI-style code works as-is
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
});

Use Cases

Who benefits from Amux

App Developers

Integrate multiple LLM providers with a single SDK. Switch models without code changes.

Enterprises

Implement provider failover, optimize costs, and maintain vendor flexibility.
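With a uniform API, failover reduces to trying providers in order. A hypothetical sketch, where each entry stands in for a bridge call to a different provider:

```typescript
// Try each provider in order; return the first success, throw the last
// error if all fail. The provider callbacks are assumptions, not Amux API.
async function withFailover<T>(providers: Array<() => Promise<T>>): Promise<T> {
  let lastError: unknown;
  for (const call of providers) {
    try {
      return await call();
    } catch (err) {
      lastError = err; // fall through to the next provider
    }
  }
  throw lastError;
}
```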

Researchers

Compare models easily. Run the same prompts across different providers for benchmarking.

Ready to simplify your LLM integration?

Get started in minutes with our SDK or Desktop app

Open Source · MIT License · Zero Dependencies