Overview

Siraya AI provides OpenAI-compatible API endpoints, letting you use multiple AI providers through a familiar interface. You can keep your existing OpenAI client libraries, tools, and workflows without code rewrites; switching to Siraya AI only requires changing the base URL.

The OpenAI-compatible API follows the same request and response specification as the OpenAI API.

Base URL

The OpenAI-compatible API is available at the following base URL:

https://llm.siraya.pro/v1

Authentication

The OpenAI-compatible API supports the following authentication method:

  • API key: Use your Siraya AI API key with the Authorization: Bearer <token> header
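As a minimal sketch of the Bearer scheme described above (the key value is a placeholder):

```python
API_KEY = "<API_KEY>"  # your Siraya AI API key

# Every request to the API must carry the key as a Bearer token.
headers = {"Authorization": f"Bearer {API_KEY}"}
print(headers["Authorization"])
```

The OpenAI client libraries shown below build this header for you from the `api_key` argument.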

Supported endpoints

Siraya AI supports the following OpenAI-compatible endpoints:

  • GET /models - List available models
  • POST /chat/completions - Create chat completions with support for streaming, attachments, tool calls, and structured outputs
  • POST /embeddings - Generate vector embeddings
  • POST /rerank - Rerank documents by relevance to a query
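For the two endpoints not shown in the examples below, here is a hedged sketch of the request bodies. The embeddings body follows the standard OpenAI shape; the query/documents shape for the rerank body follows the common rerank convention and is an assumption, as are the placeholder model IDs (list real IDs via GET /models).

```python
import json

# Body for POST /embeddings (standard OpenAI shape).
# "<EMBEDDING_MODEL_ID>" is a placeholder, not a real model ID.
embeddings_body = {
    "model": "<EMBEDDING_MODEL_ID>",
    "input": ["The quick brown fox"],
}

# Body for POST /rerank. The query/documents fields below follow the
# common rerank-API convention and are an assumption, not confirmed here.
rerank_body = {
    "model": "<RERANK_MODEL_ID>",
    "query": "What is the capital of France?",
    "documents": [
        "Paris is the capital of France.",
        "Berlin is the capital of Germany.",
    ],
}

print(json.dumps(embeddings_body))
```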

Integration with existing tools

You can use Siraya AI's OpenAI-compatible API with existing tools and libraries such as the OpenAI client libraries. Point your existing client at Siraya AI's base URL and use your Siraya AI API key for authentication.

OpenAI client libraries

Python

from openai import OpenAI

client = OpenAI(
  base_url="https://llm.siraya.pro/v1",
  api_key="<API_KEY>",
)

completion = client.chat.completions.create(
  model="claude-3-5-sonnet@20240620",
  messages=[
    {
      "role": "user",
      "content": "What is the meaning of life?"
    }
  ]
)

print(completion.choices[0].message.content)

TypeScript

import OpenAI from 'openai';

const openai = new OpenAI({
  baseURL: 'https://llm.siraya.pro/v1',
  apiKey: '<API_KEY>',
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: 'claude-3-5-sonnet@20240620',
    messages: [
      {
        role: 'user',
        content: 'What is the meaning of life?',
      },
    ],
  });

  console.log(completion.choices[0].message);
}

main();
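Since POST /chat/completions supports streaming, here is a sketch of the request body with streaming enabled. The model name is reused from the examples above; whether you send this body directly or via a client library, the flag works the same way.

```python
import json

# Body for POST /chat/completions with streaming enabled.
# With "stream": true, the server sends incremental chunks
# instead of a single final response.
body = {
    "model": "claude-3-5-sonnet@20240620",
    "messages": [
        {"role": "user", "content": "What is the meaning of life?"},
    ],
    "stream": True,
}
print(json.dumps(body))
```

With the OpenAI Python client shown above, the equivalent is passing `stream=True` to `client.chat.completions.create(...)` and iterating over the returned chunks.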

List models

Retrieve a list of all available models that can be used with the Siraya AI.

Endpoint

GET /models

Example request

Python

from openai import OpenAI

client = OpenAI(
    api_key='<API_KEY>',
    base_url='https://llm.siraya.pro/v1'
)

models = client.models.list()
print(models)

TypeScript

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: '<API_KEY>',
  baseURL: 'https://llm.siraya.pro/v1',
});

const models = await openai.models.list();
console.log(models);
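The response follows the OpenAI list shape, where each entry's `id` is the value you pass as the `model` parameter. A sketch over an illustrative response (the single entry is taken from the chat example above):

```python
# Illustrative GET /models response in the OpenAI list shape.
models_response = {
    "object": "list",
    "data": [
        {"id": "claude-3-5-sonnet@20240620", "object": "model"},
    ],
}

# Each entry's "id" is the value to pass as the "model" parameter.
model_ids = [m["id"] for m in models_response["data"]]
print(model_ids)
```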

Error handling

The API returns standard HTTP status codes and error responses:

Common error codes

  • 400: Bad Request (invalid or missing params, CORS)
  • 401: Invalid credentials (OAuth session expired, disabled/invalid API key)
  • 402: Your account or API key has insufficient credits. Add more credits and retry the request.
  • 403: Your chosen model requires moderation and your input was flagged
  • 408: Your request timed out
  • 429: You are being rate limited
  • 502: Your chosen model is down or we received an invalid response from it
  • 503: There is no available model provider that meets your routing requirements
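Several of these codes (timeouts, rate limits, provider outages) are transient and worth retrying with backoff; which codes to treat as retryable is a judgment call, and the set below is one reasonable reading of the table above. A self-contained sketch with a stand-in `send` function:

```python
import time

# Codes from the table above that usually indicate a transient failure.
RETRYABLE = {408, 429, 502, 503}

def with_retries(send, max_attempts=3, base_delay=1.0):
    """Call `send()` until it succeeds or a non-retryable status is returned.

    `send` returns a (status_code, body) pair; retries use exponential backoff.
    """
    for attempt in range(max_attempts):
        status, body = send()
        if status not in RETRYABLE:
            return status, body
        if attempt < max_attempts - 1:
            time.sleep(base_delay * 2 ** attempt)
    return status, body

# Example with a stand-in `send` that is rate limited once, then succeeds.
responses = iter([(429, {"error": {"code": 429}}), (200, {"ok": True})])
status, body = with_retries(lambda: next(responses), base_delay=0.0)
print(status)  # 200
```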

Error response format

{
    "error": {
        "message": "",
        "type": "",
        "param": "",
        "code": 429
    }
}
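A client can surface these fields uniformly regardless of the status code. A small parsing sketch, with made-up field values for demonstration:

```python
import json

# An illustrative error payload (the field values are invented here;
# only the shape matches the format documented above).
raw = (
    '{"error": {"message": "Rate limit exceeded", '
    '"type": "rate_limit_error", "param": null, "code": 429}}'
)
error = json.loads(raw)["error"]
print(f"{error['code']}: {error['message']}")
```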