๐Ÿฌ Abso AI
TypeScript SDK

Simple, typed, and extensible LLM library

Abso makes calling various LLMs simple, typed, and extensible. It provides a unified interface across providers while maintaining full type safety and streaming capabilities.

๐Ÿ” OpenAI-compatible
๐Ÿ” Fully Typed
๐Ÿ“ฆ Streaming
๐Ÿงฎ Embeddings
```ts
import { abso } from "abso"

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o"
})

console.log(result.choices[0].message.content)
```
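
Streaming and embeddings follow the same OpenAI-style shape as `chat.create`. The sketches below are illustrative only: they assume the SDK exposes `abso.chat.stream` and `abso.embeddings.create` methods and OpenAI-style response objects; check the API reference for the exact names.

```ts
import { abso } from "abso"

// Streaming (assumed `chat.stream` variant yielding OpenAI-style chunks).
const stream = await abso.chat.stream({
  messages: [{ role: "user", content: "Write a haiku about routers" }],
  model: "gpt-4o"
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}
```

```ts
import { abso } from "abso"

// Embeddings (assumed OpenAI-compatible `embeddings.create` method).
const res = await abso.embeddings.create({
  model: "text-embedding-3-small",
  input: ["A sentence to embed"]
})

console.log(res.data[0].embedding.length)
```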
Smart Router

Route LLM requests intelligently

A smart LLM proxy that automatically routes each request to a fast or a slow model based on prompt complexity, while adding minimal latency of its own.

🚀 OpenAI Compatible
🤖 Automatic Model Selection
🔄 Fallbacks
⚡ Low Latency
```bash
curl https://router.abso.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "fast:gpt-4o|slow:o1-mini",
    "messages": [
      {"role": "user", "content": "What is the meaning of life?"}
    ],
    "stream": true
  }'
```
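
Because the router speaks the OpenAI API, any OpenAI-compatible client can target it by overriding the base URL. Below is a minimal sketch using the official `openai` npm package; the endpoint and the `fast:...|slow:...` model string are taken from the curl example above, and whether your OpenAI key is the right credential depends on your router setup.

```ts
import OpenAI from "openai"

// Point the standard OpenAI client at the Abso router instead of api.openai.com.
const client = new OpenAI({
  baseURL: "https://router.abso.ai/v1",
  apiKey: process.env.OPENAI_API_KEY,
})

// "fast:...|slow:..." tells the router which model to use for simple prompts
// (fast) and which to use when the prompt looks complex (slow).
const stream = await client.chat.completions.create({
  model: "fast:gpt-4o|slow:o1-mini",
  messages: [{ role: "user", content: "What is the meaning of life?" }],
  stream: true,
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}
```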