All systems operational

Premium AI API Access

Lightning-fast inference with enterprise-grade reliability. Access the world's most advanced AI models through a single, OpenAI-compatible endpoint.

Get Started → View Models
Status: Operational
Protocol: OpenAI Standard
Security: AES-256
⚡ Connection

API Base Endpoint

https://ai.fluxai.fun/v1
● STABLE
📖 Documentation

How to Use Your API Key

1. Get Your API Key

Contact us on Telegram to get your personal API key, along with your chosen model and request limit.

Get Key →
2. Set Base URL

Configure your OpenAI-compatible client to use our endpoint:

https://ai.fluxai.fun/v1
3. Authenticate

Pass your API key in the Authorization header:

Authorization: Bearer <your-api-key>
4. Choose a Model

Specify the model you were assigned. Use the :cloud suffix for automatic routing:

"model": "kimi-k2.6:cloud"
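
Steps 2–4 can also be assembled by hand, without any SDK. A minimal sketch in Python; the API key is a placeholder, and the `build_chat_request` helper is illustrative, not part of any official client:

```python
# Minimal sketch: assemble the request from steps 2-4 by hand.
# "your-api-key-here" is a placeholder; swap in the key and model
# you were assigned.
import json

BASE_URL = "https://ai.fluxai.fun/v1"
API_KEY = "your-api-key-here"

def build_chat_request(prompt: str) -> tuple[str, dict, dict]:
    """Return (url, headers, payload) for a chat completion call."""
    url = f"{BASE_URL}/chat/completions"       # step 2: base URL
    headers = {
        "Authorization": f"Bearer {API_KEY}",  # step 3: auth header
        "Content-Type": "application/json",
    }
    payload = {
        "model": "kimi-k2.6:cloud",            # step 4: assigned model
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

url, headers, payload = build_chat_request("Hello!")
print(json.dumps(payload))

# To actually send it (requires the third-party `requests` package):
#   import requests
#   r = requests.post(url, headers=headers, json=payload)
#   print(r.json()["choices"][0]["message"]["content"])
```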
💻 Code Examples

Quick Start

Python

```python
import openai

client = openai.OpenAI(
    base_url="https://ai.fluxai.fun/v1",
    api_key="your-api-key-here"
)

response = client.chat.completions.create(
    model="kimi-k2.6:cloud",
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)
```

cURL

```shell
curl https://ai.fluxai.fun/v1/chat/completions \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "kimi-k2.6:cloud",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Node.js

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://ai.fluxai.fun/v1',
  apiKey: 'your-api-key-here',
});

const response = await client.chat.completions.create({
  model: 'kimi-k2.6:cloud',
  messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(response.choices[0].message.content);
```
🔑 Note: Your API key is tied to a specific model. Using a different model will return a 403 error. The :cloud suffix tells our proxy to auto-route to the correct model version.
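
Since a wrong model returns a 403, clients can handle it explicitly. A sketch of a status-code helper; the 403 meaning comes from the note above, while the 401 and 429 meanings are the usual OpenAI-compatible conventions and are assumptions here:

```python
# Sketch: interpret common error statuses from the proxy.
# 403 = model/key mismatch (per the note above); 401 and 429 are
# assumed to follow standard OpenAI-compatible semantics.
def explain_status(status_code: int) -> str:
    if status_code == 401:
        return "Missing or invalid API key - check the Authorization header."
    if status_code == 403:
        return "Model not permitted for this key - use your assigned model."
    if status_code == 429:
        return "Request limit reached - slow down or ask for a higher limit."
    return f"Unexpected status {status_code} - inspect the response body."

# With the official openai Python SDK (v1+), an HTTP 403 surfaces as
# openai.PermissionDeniedError:
#   try:
#       client.chat.completions.create(model="some-other-model", ...)
#   except openai.PermissionDeniedError:
#       print(explain_status(403))
```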
🧠 Models

Available Models