Documentation
Getting Started
Absolute Router is a smart LLM proxy that automatically routes each request to a fast or a slow model based on prompt complexity. It exposes an OpenAI-compatible API, so it drops into existing OpenAI-based integrations with no code changes beyond the base URL.
Installation
You can use Absolute Router directly with your existing OpenAI client libraries. There's no need to install any additional packages.
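Because the router speaks the standard OpenAI wire format, any HTTP client works as well. The sketch below uses the built-in fetch and assumes the usual /chat/completions route and response shape (not confirmed by this page, but implied by OpenAI compatibility):

// No SDK needed: a plain fetch against the OpenAI-compatible endpoint.
// Assumes the standard /chat/completions route and response shape.
const res = await fetch('https://router.abso.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer your-api-key',
  },
  body: JSON.stringify({
    model: 'fast:gpt-4o|slow:o1-mini',
    messages: [{ role: 'user', content: 'What is the meaning of life?' }],
  }),
});
const data = await res.json();
console.log(data.choices[0]?.message?.content);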
Usage
There are two ways to configure the models for Absolute Router:
- A single "model" parameter that encodes both models in one string
- Separate "fastModel" and "slowModel" entries in the request "metadata", each with its own configuration
1. Single Model Parameter
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your-api-key',
  baseURL: 'https://router.abso.ai/v1',
});

const response = await openai.chat.completions.create({
  model: 'fast:gpt-4o|slow:o1-mini', // This defines both fast and slow models
  messages: [{ role: 'user', content: 'What is the meaning of life?' }],
  stream: true,
});

for await (const chunk of response) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
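The model string encodes both options in the form fast:&lt;fast-model&gt;|slow:&lt;slow-model&gt;; for each request, the router picks one of the two based on prompt complexity.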
2. Separate Model Parameters
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your-api-key',
  baseURL: 'https://router.abso.ai/v1',
});

const response = await openai.chat.completions.create({
  model: 'gpt-4o', // Any model name works here; it is overridden by the metadata below
  messages: [{ role: 'user', content: 'What is the meaning of life?' }],
  stream: true,
  metadata: {
    fastModel: {
      name: 'gpt-4o',
      apiUrl: 'https://api.openai.com/v1',
      apiKey: 'your-openai-api-key',
    },
    slowModel: {
      name: 'o1-mini', // o1-mini is an OpenAI model, so it uses the OpenAI endpoint
      apiUrl: 'https://api.openai.com/v1',
      apiKey: 'your-openai-api-key',
    },
  },
});

for await (const chunk of response) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
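Each of "fastModel" and "slowModel" takes its own "name", "apiUrl", and "apiKey", so the two models do not have to sit behind the same provider: pointing one of them at a different OpenAI-compatible endpoint (with the matching API key) should work the same way.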
API Reference
Absolute Router is compatible with the OpenAI API. For detailed information on available parameters and response formats, please refer to the OpenAI API documentation.