FAQ
Common questions about endpoint formats, model access, billing, and support.
What is this service?
It is a unified LLM API gateway: you purchase an API key, top up quota, and integrate through common API-compatible request formats.
Which endpoint formats are supported?
Chat Completions-compatible, Responses-compatible, and Messages-compatible endpoints are available.
What Base URL should I use?
Most clients work with https://gpt-agent.cc/v1 as the Base URL. If a client requires a full endpoint URL instead, append the matching path: chat/completions, responses, or messages.
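As a minimal sketch of the rule above, the snippet below joins the Base URL with the conventional path suffix for each endpoint format. The exact path names are an assumption based on the common Chat Completions, Responses, and Messages conventions; confirm them against your client's documentation.

```python
# Sketch: building full endpoint URLs from the Base URL.
# Path suffixes assume the usual Chat Completions / Responses /
# Messages conventions; verify against your client's docs.

BASE_URL = "https://gpt-agent.cc/v1"

ENDPOINT_PATHS = {
    "chat_completions": "/chat/completions",
    "responses": "/responses",
    "messages": "/messages",
}

def full_endpoint(fmt: str) -> str:
    """Return the full endpoint URL for a given API format."""
    return BASE_URL.rstrip("/") + ENDPOINT_PATHS[fmt]

print(full_endpoint("chat_completions"))
# https://gpt-agent.cc/v1/chat/completions
```

If your client only accepts a Base URL, pass https://gpt-agent.cc/v1 directly and let the client append the path itself.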
How is quota billed?
Most models are billed by actual token usage. Purchased quota does not expire.
Can high-volume teams request a custom plan?
Yes. Teams with higher volume or procurement needs can contact support for custom quota and support arrangements.