Commit ed08789

docs: clarify responses websocket transport resolution

1 parent ef34644 · commit ed08789

2 files changed: 7 additions & 1 deletion

docs/models/index.md

Lines changed: 5 additions & 1 deletion

@@ -85,6 +85,8 @@ set_default_openai_responses_transport("websocket")

 This affects OpenAI Responses models resolved by the default OpenAI provider (including string model names such as `"gpt-5.2"`).

+Transport selection happens when the SDK resolves a model name into a model instance. If you pass a concrete [`Model`][agents.models.interface.Model] object, its transport is already fixed: [`OpenAIResponsesWSModel`][agents.models.openai_responses.OpenAIResponsesWSModel] uses websocket, [`OpenAIResponsesModel`][agents.models.openai_responses.OpenAIResponsesModel] uses HTTP, and [`OpenAIChatCompletionsModel`][agents.models.openai_chatcompletions.OpenAIChatCompletionsModel] stays on Chat Completions. If you pass `RunConfig(model_provider=...)`, that provider controls transport selection instead of the global default.
+
 You can also configure websocket transport per provider or per run:

 ```python
@@ -106,9 +108,11 @@

 If you need prefix-based model routing (for example mixing `openai/...` and `litellm/...` model names in one run), use [`MultiProvider`][agents.MultiProvider] and set `openai_use_responses_websocket=True` there instead.

+If you use a custom OpenAI-compatible endpoint or proxy, websocket transport also requires a compatible websocket `/responses` endpoint. In those setups you may need to set `websocket_base_url` explicitly.
+
 Notes:

-- This is the Responses API over websocket transport, not the [Realtime API](../realtime/guide.md).
+- This is the Responses API over websocket transport, not the [Realtime API](../realtime/guide.md). It does not apply to Chat Completions or non-OpenAI providers unless they support the Responses websocket `/responses` endpoint.
 - Install the `websockets` package if it is not already available in your environment.
 - You can use [`Runner.run_streamed()`][agents.run.Runner.run_streamed] directly after enabling websocket transport. For multi-turn workflows where you want to reuse the same websocket connection across turns (and nested agent-as-tool calls), the [`responses_websocket_session()`][agents.responses_websocket_session] helper is recommended. See the [Running agents](../running_agents.md) guide and [`examples/basic/stream_ws.py`](https://github.com/openai/openai-agents-python/tree/main/examples/basic/stream_ws.py).
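The precedence described in the added lines above (a concrete model object fixes its transport, a `RunConfig` provider overrides the global default, and the global default applies to bare string model names) can be sketched as a small resolution function. This is an illustrative toy only, not SDK code: `ConcreteModel`, `set_default_transport`, and `resolve_transport` are hypothetical stand-ins for the real classes and helpers named in the diff.

```python
# Toy illustration of the documented transport-selection precedence
# (not SDK code; names here are hypothetical stand-ins).

from dataclasses import dataclass


@dataclass
class ConcreteModel:
    # Stands in for OpenAIResponsesWSModel / OpenAIResponsesModel etc.,
    # whose transport is fixed by the object itself.
    transport: str  # "websocket" or "http"


_global_default = "http"


def set_default_transport(transport: str) -> None:
    # Stands in for set_default_openai_responses_transport(...).
    global _global_default
    _global_default = transport


def resolve_transport(model, run_provider_transport=None) -> str:
    if isinstance(model, ConcreteModel):
        return model.transport          # 1. concrete object: already fixed
    if run_provider_transport is not None:
        return run_provider_transport   # 2. RunConfig(model_provider=...) wins
    return _global_default              # 3. global default for string names


set_default_transport("websocket")
print(resolve_transport("gpt-5.2"))                                  # websocket
print(resolve_transport(ConcreteModel("http")))                      # http
print(resolve_transport("gpt-5.2", run_provider_transport="http"))   # http
```

The point of the sketch is the ordering: the global default is only consulted when neither a concrete model object nor a run-level provider has already decided the transport.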

docs/running_agents.md

Lines changed: 2 additions & 0 deletions

@@ -54,6 +54,8 @@ If you enable the OpenAI Responses websocket transport, you can keep using the n

 This is the Responses API over websocket transport, not the [Realtime API](realtime/guide.md).

+For transport-selection rules and caveats around concrete model objects or custom providers, see [Models](models/index.md#responses-websocket-transport).
+
 ##### Pattern 1: No session helper (works)

 Use this when you just want websocket transport and do not need the SDK to manage a shared provider/session for you.
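The trade-off between this pattern and a session helper can be illustrated with a self-contained toy. The real helper is `responses_websocket_session()`; `FakeConnection`, `shared_session`, and `run_turn` below are hypothetical stand-ins that only demonstrate why sharing one connection across turns avoids repeated socket setup.

```python
# Toy sketch of connection reuse across turns (illustrative only;
# the real SDK helper is responses_websocket_session()).

from contextlib import contextmanager


class FakeConnection:
    opened = 0  # counts how many "sockets" were opened in total

    def __init__(self):
        FakeConnection.opened += 1

    def send(self, turn: str) -> str:
        return f"response to {turn!r}"


@contextmanager
def shared_session():
    # One connection for everything inside the `with` block.
    yield FakeConnection()


def run_turn(turn, conn=None):
    # Without a shared session, each turn opens its own connection.
    conn = conn or FakeConnection()
    return conn.send(turn)


# Pattern 1 (no session helper): one connection per turn.
run_turn("turn 1")
run_turn("turn 2")

# With a session helper: both turns reuse the same connection.
with shared_session() as conn:
    run_turn("turn 1", conn)
    run_turn("turn 2", conn)

print(FakeConnection.opened)  # 3: two standalone turns + one shared session
```

Pattern 1 still works correctly; the helper simply amortizes connection setup when a workflow spans multiple turns or nested agent-as-tool calls.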
