Chat Completion API
Guide to FastRouter's OpenAI-compatible Chat Completion endpoint, including SDK and direct API examples in Python, TypeScript, Curl, and JavaScript for seamless LLM interactions.
Introduction
FastRouter provides an OpenAI-compatible endpoint that lets you interact with a wide range of large language models (LLMs) using familiar schemas and SDKs. This makes it simple to generate text, answer questions, build conversational agents, or integrate LLMs into your applications — with zero vendor lock-in.
Whether you're using the OpenAI SDK or making direct HTTP calls, FastRouter routes your requests intelligently and efficiently, with full support for streaming.
Before you begin, replace placeholders like <FASTROUTER_API_KEY> with your actual API key, available in your dashboard (see Keys & Settings docs).
OpenAI-Compatible Chat Request Format
FastRouter uses the same schema as OpenAI’s /v1/chat/completions endpoint.
You can specify models from multiple providers, such as:
openai/gpt-5.1
google/gemini-3-pro-preview
anthropic/claude-4.5-sonnet
and more
Endpoint
POST https://go.fastrouter.ai/api/v1/chat/completions
Base URL: https://go.fastrouter.ai/api/v1
Payload fields include:
model: Model ID in provider/model format
messages: Array of chat message objects
stream (optional): Set to true to enable streaming responses
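Assembled, a minimal request body looks like the following sketch (the model ID and message content are illustrative placeholders):

```json
{
  "model": "openai/gpt-5.1",
  "messages": [
    { "role": "user", "content": "Hello!" }
  ],
  "stream": false
}
```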
Using the OpenAI SDK
FastRouter is fully compatible with the OpenAI SDK. Just point the SDK to FastRouter’s base URL.
Calling FastRouter Directly (No SDK)
Perfect for scripts, testing, or minimal environments.
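As a sketch of a direct call with no SDK at all, the snippet below builds the same request with only the Python standard library (the model ID and prompt are illustrative; sending is guarded so the snippet is copy-paste safe without a key):

```python
import json
import os
import urllib.request

# Same payload schema as OpenAI's /v1/chat/completions.
payload = {
    "model": "openai/gpt-5.1",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,  # set to True for streaming responses
}

req = urllib.request.Request(
    "https://go.fastrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ.get('FASTROUTER_API_KEY', '<FASTROUTER_API_KEY>')}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending requires a valid key, so the network call is guarded here.
if os.environ.get("FASTROUTER_API_KEY"):
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
        print(body["choices"][0]["message"]["content"])
```

The same request works from curl or any HTTP client: POST the JSON body to the endpoint with an `Authorization: Bearer <FASTROUTER_API_KEY>` header.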
Key Takeaway
FastRouter’s Chat Completion endpoint makes AI integration simple by:
Mimicking OpenAI’s schema
Supporting SDKs across languages
Allowing direct HTTP calls
Supporting streaming
Letting you route requests to the best models from multiple providers