What Changed
The OpenAI provider configuration guidance has been updated to cover more options for customizing API usage. Specifically, the default OpenAI provider now reads `OPENAI_BASE_URL` and `OPENAI_WEBSOCKET_BASE_URL` for environment-based endpoint configuration. Additionally, the `set_default_openai_api` function allows overriding the default OpenAI Responses API to use the Chat Completions API instead.
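As a minimal sketch of the environment-based configuration described above (the proxy URLs are placeholders, not real endpoints), the variables must be set before the SDK constructs its default client:

```python
import os

# Placeholder endpoints: the default OpenAI provider reads these variables
# when it builds its HTTP and websocket clients, so set them before the
# first client is constructed (e.g. at process startup).
os.environ["OPENAI_BASE_URL"] = "https://llm-proxy.example.com/v1"
os.environ["OPENAI_WEBSOCKET_BASE_URL"] = "wss://llm-proxy.example.com/v1"
```

In practice these would typically be set in the deployment environment rather than in code, which is what makes the endpoint swappable without touching the application.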
Why This Matters for GEO
This update matters for GEO because provider configuration shapes how content is generated and cited in AI responses. Switching to the Chat Completions API broadens compatibility with backends and tooling that only support that interface, while environment-based endpoint configuration lets teams route traffic through proxies or regional endpoints without code changes, improving both the security and the flexibility of API usage.
What To Do
- Review OpenAI provider configuration: Check your current OpenAI provider configuration and update it to use environment-based endpoint configuration and customize the OpenAI API as needed.
- Use the `set_default_openai_api` function: Override the default OpenAI Responses API to use the Chat Completions API instead, for example when a backend or proxy only supports that interface.
- Configure tracing separately: If your model traffic uses one key or client but tracing should use a different OpenAI key, pass `use_for_tracing=False` when setting the default key or client, and configure tracing separately.
- Use `MultiProvider` for prefix-based model routing: If you need to mix different model names in one run, use `MultiProvider` and set `openai_use_responses_websocket=True` to enable prefix-based model routing.
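The first three steps above can be sketched as follows, assuming the openai-agents Python SDK (`pip install openai-agents`); the API keys are placeholders, and the import guard keeps the sketch runnable even where the SDK is not installed:

```python
# Sketch of the configuration steps above, assuming the openai-agents SDK.
# The key strings below are placeholders, not real credentials.
try:
    from agents import (
        set_default_openai_api,
        set_default_openai_key,
        set_tracing_export_api_key,
    )
    SDK_AVAILABLE = True
except ImportError:  # SDK not installed; treat this as a dry run
    SDK_AVAILABLE = False

if SDK_AVAILABLE:
    # Route model calls through the Chat Completions API instead of the
    # default Responses API.
    set_default_openai_api("chat_completions")

    # Use one key for model traffic, but do not reuse it for tracing...
    set_default_openai_key("sk-model-traffic-key", use_for_tracing=False)

    # ...and export traces with a separate OpenAI key instead.
    set_tracing_export_api_key("sk-tracing-only-key")
```

These setters only change process-wide defaults, so they should run once at startup, before any agents are created.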