Commit ba3b847

docs: updates for #2593 (#2594)
1 parent e32f0f8 commit ba3b847

File tree: 1 file changed (+33, -0 lines)


docs/models/index.md

Lines changed: 33 additions & 0 deletions

@@ -108,6 +108,39 @@ result = await Runner.run(

If you need prefix-based model routing (for example mixing `openai/...` and `litellm/...` model names in one run), use [`MultiProvider`][agents.MultiProvider] and set `openai_use_responses_websocket=True` there instead.

`MultiProvider` keeps two historical defaults:

- `openai/...` is treated as an alias for the OpenAI provider, so `openai/gpt-4.1` is routed as model `gpt-4.1`.
- Unknown prefixes raise `UserError` instead of being passed through.
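These defaults can be sketched as a toy routing function. This is a simplified illustration, not the SDK's implementation; `KNOWN_PREFIXES`, `resolve_model_name`, and the exact mode handling are assumptions made for demonstration:

```python
# Toy model of prefix-based routing defaults; illustrative only,
# NOT the SDK's actual code.
class UserError(Exception):
    pass

# Assumption: the prefixes this document mentions.
KNOWN_PREFIXES = {"openai", "litellm"}

def resolve_model_name(name: str,
                       openai_prefix_mode: str = "alias",
                       unknown_prefix_mode: str = "error") -> str:
    prefix, sep, rest = name.partition("/")
    if not sep:
        return name  # no prefix: handed to the default provider unchanged
    if prefix == "openai":
        # Historical default: "openai/gpt-4.1" is an alias for "gpt-4.1".
        return rest if openai_prefix_mode == "alias" else name
    if prefix in KNOWN_PREFIXES:
        return name  # routed to the matching provider (details elided here)
    if unknown_prefix_mode == "model_id":
        return name  # pass the literal namespaced ID through
    # Historical default: unknown prefixes are rejected.
    raise UserError(f"Unknown model prefix: {prefix!r}")
```

With the defaults, `resolve_model_name("openai/gpt-4.1")` yields `gpt-4.1` and an unrecognized prefix raises `UserError`; switching either mode to `"model_id"` passes the literal string through, mirroring the options shown below.
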
When you point the OpenAI provider at an OpenAI-compatible endpoint that expects literal namespaced model IDs, opt into the pass-through behavior explicitly. In websocket-enabled setups, keep `openai_use_responses_websocket=True` on the `MultiProvider` as well:
```python
from agents import Agent, MultiProvider, RunConfig, Runner

provider = MultiProvider(
    openai_base_url="https://openrouter.ai/api/v1",
    openai_api_key="...",
    openai_use_responses_websocket=True,
    openai_prefix_mode="model_id",
    unknown_prefix_mode="model_id",
)

agent = Agent(
    name="Assistant",
    instructions="Be concise.",
    model="openai/gpt-4.1",
)

result = await Runner.run(
    agent,
    "Hello",
    run_config=RunConfig(model_provider=provider),
)
```
Use `openai_prefix_mode="model_id"` when a backend expects the literal `openai/...` string, and `unknown_prefix_mode="model_id"` when it expects other namespaced model IDs such as `openrouter/openai/gpt-4.1-mini`. Both options work on `MultiProvider` with or without websocket transport; the example above keeps the websocket enabled only because it is part of the transport setup described in this section. The same options are also available on [`responses_websocket_session()`][agents.responses_websocket_session].

If you use a custom OpenAI-compatible endpoint or proxy, websocket transport also requires a compatible websocket `/responses` endpoint. In those setups you may need to set `websocket_base_url` explicitly.
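Setting `websocket_base_url` explicitly usually means mirroring the HTTP base URL with a websocket scheme. A minimal sketch of that derivation as a standalone helper (the function name is hypothetical and not part of the SDK):

```python
from urllib.parse import urlsplit, urlunsplit

def derive_websocket_base_url(base_url: str) -> str:
    """Map an HTTP(S) base URL to its websocket equivalent (illustrative only)."""
    parts = urlsplit(base_url)
    scheme = {"https": "wss", "http": "ws"}.get(parts.scheme)
    if scheme is None:
        raise ValueError(f"unsupported scheme: {parts.scheme!r}")
    return urlunsplit((scheme, parts.netloc, parts.path, parts.query, parts.fragment))

# e.g. derive_websocket_base_url("https://openrouter.ai/api/v1")
# yields "wss://openrouter.ai/api/v1"
```

Whether your proxy actually serves the websocket `/responses` endpoint at that derived URL depends on the proxy, so verify before hard-coding it.
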
Notes:
