
Commit 149f5ce

docs: add non-OpenAI provider code example (#2792)
1 parent dddbce1 commit 149f5ce

File tree

1 file changed: +11 −0 lines changed


docs/models/index.md

Lines changed: 11 additions & 0 deletions
@@ -199,6 +199,17 @@ You can integrate other LLM providers with these built-in paths:
 In cases where you do not have an API key from `platform.openai.com`, we recommend disabling tracing via `set_tracing_disabled()`, or setting up a [different tracing processor](../tracing.md).

+``` python
+from openai import AsyncOpenAI
+
+from agents import Agent, OpenAIChatCompletionsModel, set_tracing_disabled
+
+set_tracing_disabled(disabled=True)
+
+provider = AsyncOpenAI(api_key="YOUR_API_KEY", base_url="YOUR_PROVIDER_BASE_URL")
+model = OpenAIChatCompletionsModel(model="your-model-name", openai_client=provider)
+
+agent = Agent(name="Helping Agent", instructions="You are a helpful agent.", model=model)
+```
+
 !!! note

     In these examples, we use the Chat Completions API/model, because many LLM providers still do not support the Responses API. If your LLM provider does support it, we recommend using Responses.

0 commit comments
