TypeScript SDK
Simple, typed, and extensible LLM library
Abso makes calling various LLMs simple, typed, and extensible. It provides a unified interface while maintaining full type safety and streaming capabilities.
OpenAI-compatible
Fully Typed
Streaming
Embeddings
import { abso } from "abso"

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

console.log(result.choices[0].message.content)
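Streaming uses the same client. The sketch below is an assumption of how a streaming call might look, with a chat.stream method yielding OpenAI-style delta chunks; check the Abso docs for the exact method name and chunk shape.

// Assumed streaming API, mirroring chat.create (verify against the Abso docs)
const stream = await abso.chat.stream({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

for await (const chunk of stream) {
  // Each chunk carries an incremental delta, OpenAI-style
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}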
Smart Router
Route LLM requests intelligently
A smart LLM proxy that automatically routes each request to a fast or slow model based on prompt complexity, while adding minimal latency of its own.
OpenAI Compatible
Automatic Model Selection
Fallbacks
Low Latency
curl https://router.abso.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "fast:gpt-4o|slow:o1-mini",
    "messages": [
      {"role": "user", "content": "What is the meaning of life?"}
    ],
    "stream": true
  }'
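Because the router speaks the OpenAI wire protocol, any OpenAI-compatible client should be able to use it by overriding the base URL. A minimal TypeScript sketch with the official openai package, assuming the base URL https://router.abso.ai/v1 inferred from the curl example above:

import OpenAI from "openai"

const client = new OpenAI({
  baseURL: "https://router.abso.ai/v1", // assumed from the endpoint above
  apiKey: process.env.OPENAI_API_KEY,
})

const stream = await client.chat.completions.create({
  model: "fast:gpt-4o|slow:o1-mini", // router chooses fast or slow per prompt
  messages: [{ role: "user", content: "What is the meaning of life?" }],
  stream: true,
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}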