API Documentation

Learn how to integrate with our AI proxy API

Endpoint
Send requests to our proxy API
POST /api/proxy
Request Body
Required and optional parameters
{
  "provider": "openai",           // "openai" | "anthropic"
  "model": "gpt-3.5-turbo",      // Model name
  "messages": [                   // Required: Array of messages
    {
      "role": "user",
      "content": "Hello, world!"
    }
  ],
  "stream": false,               // Optional: Enable streaming
  "temperature": 0.7,            // Optional: 0-2
  "maxTokens": 1000             // Optional: Max response tokens
}
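The shape above can be written down as a TypeScript type with a small client-side check (a sketch: the field names and the 0-2 temperature range come from the example above, but the server's actual validation rules are not documented here, so the checks are assumptions):

```typescript
// Request body for POST /api/proxy, mirroring the example above.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

interface ProxyRequest {
  provider: 'openai' | 'anthropic';
  model: string;
  messages: ChatMessage[]; // required
  stream?: boolean;        // optional
  temperature?: number;    // optional, 0-2
  maxTokens?: number;      // optional
}

// Minimal pre-flight check before sending; returns a list of problems
// (empty if the body looks valid). The rules here are guesses based on
// the comments in the example body, not the server's real validator.
function validateProxyRequest(body: ProxyRequest): string[] {
  const errors: string[] = [];
  if (!['openai', 'anthropic'].includes(body.provider)) {
    errors.push('unknown provider');
  }
  if (!body.model) {
    errors.push('model is required');
  }
  if (!Array.isArray(body.messages) || body.messages.length === 0) {
    errors.push('messages must be a non-empty array');
  }
  if (body.temperature !== undefined && (body.temperature < 0 || body.temperature > 2)) {
    errors.push('temperature must be between 0 and 2');
  }
  return errors;
}
```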
Example Usage
JavaScript/TypeScript example
const response = await fetch('/api/proxy', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    provider: 'openai',
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'user', content: 'Hello!' }
    ],
    temperature: 0.7
  })
});

if (!response.ok) {
  throw new Error(`Request failed with status ${response.status}`);
}

const data = await response.json();
console.log(data.text);
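When `stream: true` is set, the response body can be consumed incrementally with the Fetch streaming API. The exact chunk format the proxy emits when streaming is not specified on this page, so the sketch below only decodes raw text chunks as they arrive (the `streamProxy` helper is hypothetical, not part of the API):

```typescript
// Hypothetical streaming consumer; assumes the proxy writes text
// chunks directly to the response body when `stream: true`.
async function streamProxy(
  body: object,
  onChunk: (text: string) => void,
): Promise<void> {
  const response = await fetch('/api/proxy', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...body, stream: true }),
  });
  if (!response.ok || !response.body) {
    throw new Error(`Proxy request failed: ${response.status}`);
  }
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Decode with stream: true so multi-byte characters split
    // across chunks are handled correctly.
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```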
Supported Providers
Available LLM providers and models

OpenAI

Models: gpt-3.5-turbo, gpt-4, gpt-4-turbo, gpt-4o

Anthropic

Models: claude-3-haiku, claude-3-sonnet, claude-3-opus, claude-3-5-sonnet
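The provider for a given model can be resolved from the two lists above (a convenience sketch; the model names are copied from this page and the supported set may grow over time):

```typescript
// Maps each documented model to its provider, per the lists above.
const MODELS: Record<string, 'openai' | 'anthropic'> = {
  'gpt-3.5-turbo': 'openai',
  'gpt-4': 'openai',
  'gpt-4-turbo': 'openai',
  'gpt-4o': 'openai',
  'claude-3-haiku': 'anthropic',
  'claude-3-sonnet': 'anthropic',
  'claude-3-opus': 'anthropic',
  'claude-3-5-sonnet': 'anthropic',
};

// Returns the provider for a documented model, or undefined if the
// model is not on this page's lists.
function providerFor(model: string): 'openai' | 'anthropic' | undefined {
  return MODELS[model];
}
```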