Chat Completion API

Guide to FastRouter's OpenAI-compatible Chat Completion endpoint, including SDK and direct API examples in Python, TypeScript, cURL, and JavaScript for seamless LLM interactions.

Introduction

FastRouter provides an OpenAI-compatible endpoint that lets you interact with a wide range of large language models (LLMs) using familiar schemas and SDKs. This makes it simple to generate text, answer questions, build conversational agents, or integrate LLMs into your applications — with zero vendor lock-in.

Whether you're using the OpenAI SDK or making direct HTTP calls, FastRouter routes your requests intelligently and efficiently, with full support for streaming.

Before you begin, replace placeholders like <FASTROUTER_API_KEY> with your actual API key, available in your dashboard (see Keys & Settings docs).


OpenAI-Compatible Chat Request Format

FastRouter uses the same schema as OpenAI’s /v1/chat/completions endpoint. You can specify models from multiple providers, such as:

  • openai/gpt-5.1

  • google/gemini-3-pro-preview

  • anthropic/claude-4.5-sonnet

  • and more


Endpoint

POST https://go.fastrouter.ai/api/v1/chat/completions

Base URL

https://go.fastrouter.ai/api/v1

Payload fields include:

  • model: Model ID (provider/model)

  • messages: Array of chat messages

  • stream (optional): Enables streaming responses
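
Putting these fields together, a minimal non-streaming request body might look like this (the model ID and messages are illustrative placeholders):

```json
{
  "model": "openai/gpt-5.1",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "stream": false
}
```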


Using the OpenAI SDK

FastRouter is fully compatible with the OpenAI SDK. Just point the SDK at FastRouter’s base URL and pass your FastRouter API key.


Calling FastRouter Directly (No SDK)

Calling the endpoint over plain HTTP is perfect for scripts, quick testing, or minimal environments where you don't want an SDK dependency.
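
As a sketch, the same request can be made with nothing but the Python standard library. The payload mirrors the request format above; the API key is a placeholder, so the example only sends the request once a real key has been filled in.

```python
# Sketch: calling FastRouter's chat completions endpoint directly,
# using only the standard library (no SDK).
import json
import urllib.request

API_KEY = "<FASTROUTER_API_KEY>"  # replace with your key from the dashboard
URL = "https://go.fastrouter.ai/api/v1/chat/completions"

payload = {
    "model": "openai/gpt-5.1",  # illustrative model ID
    "messages": [{"role": "user", "content": "Hello!"}],
}
request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Only send the request once a real key has been filled in.
if API_KEY != "<FASTROUTER_API_KEY>":
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
        print(body["choices"][0]["message"]["content"])
```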


Key Takeaway

FastRouter’s Chat Completion endpoint makes AI integration simple by:

  • Mimicking OpenAI’s schema

  • Supporting SDKs across languages

  • Allowing direct HTTP calls

  • Supporting streaming

  • Letting you route requests to the best models from multiple providers
