MODRO
Model Router

Quick Start

Go from zero to real traffic in a few minutes.

Use your own OpenAI API key, point your app at our hosted /v1, choose a preset, and open the signed private dashboard URL returned in the response header.

1. Bring your own OpenAI key

Your application still uses an OpenAI API key. We do not ask you to create a new billing system or move token spend into a separate account.

2. Send traffic to hosted /v1

Replace your base URL with our hosted endpoint. Keep the same OpenAI-style chat completions request format.
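The base-URL swap can be sketched with Python's standard library. The endpoint, model name, and key below are placeholders, not real values; substitute the hosted /v1 URL you were given and your own OpenAI key.

```python
import json
import urllib.request

BASE_URL = "https://modro.example/v1"  # placeholder: your hosted /v1 endpoint
API_KEY = "sk-..."                     # your existing OpenAI API key, unchanged

def build_chat_request(messages, model="gpt-4o-mini"):
    """Build an OpenAI-style chat completions request; only the base URL changes."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request([{"role": "user", "content": "Hello"}])
# urllib.request.urlopen(req) would send it once your real key is in place.
```

The request body and headers are exactly what you already send to OpenAI; the only change is the host.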

3. Open your private dashboard

On the first request, read the X-Client-Dashboard-Url response header and open it. That signed URL preserves tenant isolation automatically.
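Extracting that header is a one-liner in most HTTP clients; a small case-insensitive helper (a sketch, since HTTP header names are case-insensitive on the wire):

```python
def dashboard_url(headers):
    """Return the signed dashboard URL from response headers, or None.

    Accepts any mapping of header names to values; lookup is
    case-insensitive because HTTP header names are.
    """
    for name, value in headers.items():
        if name.lower() == "x-client-dashboard-url":
            return value
    return None
```

Pass it the headers of your first response (for example `resp.headers` from `urllib`), then open the returned URL in a browser.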

Copy-Paste Snippets
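Steps 2 and 3 together, as one hedged starting point. Everything below uses only the Python standard library; the base URL, model name, and key are placeholders, and the `opener` parameter is injectable so the function can be exercised without a live endpoint.

```python
import json
import urllib.request

BASE_URL = "https://modro.example/v1"  # placeholder: your hosted /v1 endpoint
API_KEY = "sk-..."                     # your existing OpenAI API key

def chat(messages, model="gpt-4o-mini", opener=urllib.request.urlopen):
    """POST one OpenAI-style chat completion.

    Returns (parsed response body, signed dashboard URL). The dashboard
    URL arrives in the X-Client-Dashboard-Url response header.
    """
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with opener(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
        return body, resp.headers.get("X-Client-Dashboard-Url")
```

Call it as `body, url = chat([{"role": "user", "content": "Hello"}])` and open `url` after the first successful request.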

Recommended first move

Start with the balanced preset. For most teams it is the cleanest first experience: low friction, strong quality, and clear savings, with no risky settings changes required.

Preset meanings

  • Cheapest: lowest-cost setup for aggressive cost control.
  • Balanced: sensible default for most traffic.
  • Highest quality: strongest deep path when output quality matters most.
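If you set the preset programmatically, it can help to validate the name before writing it to settings. The helper below is illustrative client-side code, not part of the API; the normalized preset identifiers are an assumption based on the names above.

```python
# Preset names as described above; descriptions are for error messages only.
PRESETS = {
    "cheapest": "lowest-cost setup for aggressive cost control",
    "balanced": "sensible default for most traffic",
    "highest_quality": "strongest deep path when output quality matters most",
}

def validate_preset(name):
    """Normalize a preset name and reject anything not in PRESETS."""
    key = name.strip().lower().replace(" ", "_").replace("-", "_")
    if key not in PRESETS:
        raise ValueError(f"unknown preset {name!r}; choose one of {sorted(PRESETS)}")
    return key
```

This keeps typos out of your settings payload instead of surfacing them as server-side errors later.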

Tier roles

  • Utility: cheapest tasks and low-stakes requests.
  • Fast: default tier for routine production traffic.
  • Deep: expensive tier for prompts that need more reasoning or detail.
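To make the tier roles concrete, here is a deliberately toy chooser. It is NOT the router's actual logic (the real routing policy is server-side); the length threshold and keyword list are invented purely to illustrate what each tier is for.

```python
def pick_tier(prompt, low_stakes=False):
    """Toy illustration of the three tier roles -- not the real routing policy."""
    # Utility: cheapest tasks and low-stakes requests.
    if low_stakes or len(prompt) < 40:
        return "utility"
    # Deep: prompts that need more reasoning or detail (keywords are invented).
    if any(kw in prompt.lower() for kw in ("prove", "analyze", "step by step")):
        return "deep"
    # Fast: default tier for routine production traffic.
    return "fast"
```

The point is the shape of the decision, not the heuristics: cheap tier for throwaway work, deep tier for reasoning-heavy prompts, fast tier for everything else.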

What happens after the first request

Once your app sends traffic successfully, open the signed dashboard URL. From there, you can confirm traffic volume, see savings accumulate, and adjust model policy from /client/settings without exposing any admin-only controls.