
Conversation

@OiPunk (Contributor) commented Feb 10, 2026

Summary

Fixes an issue where prompt-managed built-in tools can fail with:

Tool choice 'web_search_preview' not found in 'tools' parameter.

This happens when a prompt is supplied, no SDK-local tools are passed, and a tool_choice value is still forwarded in the request: the Responses API rejects the named tool choice because it finds no matching entry in the tools parameter.
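A minimal repro sketch of the failing setup, assuming the SDK's public Agent / Runner / ModelSettings surface (the prompt id is a placeholder):

from agents import Agent, ModelSettings, Runner

agent = Agent(
    name="searcher",
    # The prompt template is expected to attach web_search_preview server-side.
    prompt={"id": "pmpt_...", "version": "1"},
    # No SDK-local tools are passed ...
    tools=[],
    # ... but a named tool_choice is still forwarded, which the Responses API
    # rejects because there is no matching entry in the (empty) tools parameter.
    model_settings=ModelSettings(tool_choice="web_search_preview"),
)

result = Runner.run_sync(agent, "What changed in the latest release?")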

What changed

  • In OpenAIResponsesModel._fetch_response(...), when prompt mode is active and local tools are omitted, tool_choice is also omitted (see the sketch after this list).
  • Added regression test coverage to ensure tool_choice is omitted in this prompt-managed-tools path.
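
A self-contained sketch of the decision this change introduces (illustrative only; the actual code in _fetch_response uses the openai-python omit sentinel, stubbed here as _OMIT):

_OMIT = object()  # stand-in for the SDK's omit sentinel


def resolve_tool_params(prompt, converted_tools, tool_choice):
    """Return the (tools, tool_choice) pair to forward to responses.create()."""
    prompt_manages_tools = prompt is not None and len(converted_tools) == 0
    if prompt_manages_tools:
        # Forwarding a named tool_choice with no matching entry in tools triggers:
        # "Tool choice 'web_search_preview' not found in 'tools' parameter."
        return _OMIT, _OMIT
    return converted_tools, tool_choice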

Why this is safe

  • Only affects the prompt path where SDK-local tools are absent and prompt config is expected to manage tools.
  • Existing behavior for non-prompt flows and prompt flows with explicit SDK tools remains unchanged.

Validation

  • uv run --with ruff ruff check src/agents/models/openai_responses.py tests/test_openai_responses.py
  • uv run --with mypy --with socksio mypy src/agents/models/openai_responses.py tests/test_openai_responses.py
  • uv run --with pytest --with socksio pytest -q tests/test_openai_responses.py
  • Targeted trace coverage run to confirm modified lines were exercised.

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 8e93bffd17


should_omit_model = prompt is not None and not self._model_is_explicit
model_param: str | ChatModel | Omit = self.model if not should_omit_model else omit
should_omit_tools = prompt is not None and len(converted_tools_payload) == 0
should_omit_tool_choice = should_omit_tools and tool_choice is not omit


P2: Respect explicit "none" tool_choice with prompt-managed tools

This condition now omits every non-default tool_choice whenever prompt is set and local tools are empty, which also drops valid literals like "none"/"required" rather than only problematic named choices. In prompt-template flows where tools are defined server-side, ModelSettings(tool_choice="none") will be silently ignored, so callers can no longer reliably disable tool use per request and behavior can change unexpectedly.
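
A sketch of one way to narrow the condition along these lines (illustrative only, not code from this PR): drop tool_choice only when it names a specific tool, so mode literals such as "auto", "none", and "required" still pass through (the existing omit-sentinel check from the quoted hunk would stay in place).

def should_omit_tool_choice(prompt, converted_tools, tool_choice):
    # Converted tool_choice values are either mode literals ("auto"/"none"/"required")
    # or objects naming a particular tool, e.g. {"type": "web_search_preview"}.
    prompt_manages_tools = prompt is not None and len(converted_tools) == 0
    names_specific_tool = not isinstance(tool_choice, str) or tool_choice not in (
        "auto",
        "none",
        "required",
    )
    return prompt_manages_tools and names_specific_tool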
