Guide to FastRouter's OpenAI-compatible Chat Completions endpoint, with SDK and direct API examples in Python, TypeScript/JavaScript, and cURL for seamless LLM interactions.
Introduction
FastRouter provides an OpenAI-compatible endpoint that lets you interact with a wide range of large language models (LLMs) using familiar schemas and SDKs. This makes it simple to generate text, answer questions, build conversational agents, or integrate LLMs into your applications — with zero vendor lock-in.
Whether you're using the OpenAI SDK or making direct HTTP calls, FastRouter routes your requests intelligently and efficiently, with full support for streaming.
Before you begin, replace placeholders like <FASTROUTER_API_KEY> with your actual API key, available in your dashboard (see Keys & Settings docs).
OpenAI-Compatible Chat Request Format
FastRouter uses the same schema as OpenAI’s /v1/chat/completions endpoint.
You can specify models from multiple providers by passing a provider-prefixed model ID (for example, anthropic/claude-4.5-sonnet). The examples below use the official OpenAI SDKs:
Python
from openai import OpenAI

client = OpenAI(
    base_url="https://go.fastrouter.ai/api/v1",  # FastRouter base URL
    api_key="<FASTROUTER_API_KEY>",  # Replace with your FastRouter API key
)

completion = client.chat.completions.create(
    model="anthropic/claude-4.5-sonnet",  # Replace with your model ID
    messages=[
        {"role": "user", "content": "What is the meaning of life?"}
    ],
)

print(completion.choices[0].message.content)
TypeScript
import OpenAI from 'openai';

const openai = new OpenAI({
  baseURL: 'https://go.fastrouter.ai/api/v1', // FastRouter base URL
  apiKey: '<FASTROUTER_API_KEY>', // Replace with your FastRouter API key
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: 'anthropic/claude-4.5-sonnet', // Replace with your model ID
    messages: [
      { role: 'user', content: 'What is the meaning of life?' }
    ],
  });
  console.log(completion.choices[0].message.content);
}

main();
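Because the endpoint speaks OpenAI's request format over plain HTTPS, you can also skip the SDK and issue the request directly. A minimal sketch using only the Python standard library (a cURL call would send the same URL, headers, and JSON body); the environment-variable guard is illustrative:

```python
import json
import os
import urllib.request

API_KEY = os.environ.get("FASTROUTER_API_KEY", "<FASTROUTER_API_KEY>")

# Same body shape as OpenAI's /v1/chat/completions.
payload = {
    "model": "anthropic/claude-4.5-sonnet",  # Replace with your model ID
    "messages": [{"role": "user", "content": "What is the meaning of life?"}],
}

req = urllib.request.Request(
    "https://go.fastrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",  # FastRouter key as a bearer token
        "Content-Type": "application/json",
    },
)

# Only make the network call when a real key is configured.
if API_KEY != "<FASTROUTER_API_KEY>":
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```

The response follows the OpenAI schema, so the generated text lives at `choices[0].message.content` regardless of which underlying provider served the request.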