
thought_signature missing error when using SkillToolset + LiteLlm + Gemini thinking models (regression from 1.23.0 → 1.26.0) #4650

@jxd010203

Description


🔴 Required Information

Describe the Bug:

When using google-adk>=1.26.0 with LiteLlm and a Gemini thinking model (gemini-3-flash-preview), any call the agent makes to the load_skill tool (registered internally by SkillToolset) fails with a 400 Bad Request error from the Gemini API. The error states:

function call load_skill in the 2. content block is missing a thought_signature

This suggests that ADK 1.26.0 does not properly relay the thought_signature field, which Gemini thinking models require, back to the Gemini API when constructing the follow-up request after a function call result.
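The relay step described above can be sketched as follows. This is a hypothetical helper over simplified dict messages, not actual ADK or LiteLLM types; the field names follow this report:

```python
# Hypothetical sketch of the relay step the report describes: when echoing a
# prior assistant turn back to the API, any thought_signature attached to a
# function call must be carried over verbatim, not dropped. Message shapes
# are simplified dicts, not actual ADK/LiteLLM types.

def rebuild_assistant_message(model_response: dict) -> dict:
    """Rebuild the assistant message for the follow-up request,
    preserving thought_signature when the model provided one."""
    message = {"role": "assistant", "tool_calls": []}
    for call in model_response.get("tool_calls", []):
        echoed = {
            "id": call["id"],
            "type": "function",
            "function": dict(call["function"]),
        }
        # The behavior the report implies is missing in 1.26.0:
        # copy the signature through to the next request.
        if "thought_signature" in call:
            echoed["thought_signature"] = call["thought_signature"]
        message["tool_calls"].append(echoed)
    return message

# A thinking model's response containing a signed function call:
response = {
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "load_skill", "arguments": "{}"},
        "thought_signature": "abc123",
    }]
}

rebuilt = rebuild_assistant_message(response)
assert rebuilt["tool_calls"][0]["thought_signature"] == "abc123"
```

If the relay step drops the extra field (keeping only id/type/function), the follow-up request no longer carries the signature, matching the 400 observed here.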

The same code works correctly with google-adk==1.23.0.

Steps to Reproduce:

  1. Install google-adk==1.26.0, litellm==1.81.6
  2. Create an agent that uses LiteLlm with gemini-3-flash-preview (via OpenAI-compatible endpoint) and SkillToolset:
from google.adk.agents.llm_agent import Agent as LlmAgent
from google.adk.models.lite_llm import LiteLlm
from google.adk.skills import load_skill_from_dir
from google.adk.tools.skill_toolset import SkillToolset

model = LiteLlm(
    model="gemini-3-flash-preview",
    api_base="<your-api-base>",
    api_key="<your-api-key>",
    custom_llm_provider="openai",
)

skill_toolset = SkillToolset(
    skills=[load_skill_from_dir("path/to/your-skill")]
)

root_agent = LlmAgent(
    model=model,
    name="test_agent",
    description="Test agent",
    instruction="You are a test agent. Use the skill when asked.",
    tools=[skill_toolset],
)

  3. Send a message that triggers the agent to call load_skill
  4. The Gemini API returns a 400 error about a missing thought_signature

Expected Behavior:
The agent should successfully call the load_skill function and return its result, exactly as it does with google-adk==1.23.0. The thought_signature from the model's thinking response should be preserved and sent back in subsequent API calls.
Observed Behavior:
The request fails with a 400 Bad Request:
openai.BadRequestError: Error code: 400 - {'error': {'code': 400, 'message': 'Unable to submit request because function call load_skill in the 2. content block is missing a thought_signature. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/thought-signatures', 'status': 'INVALID_ARGUMENT'}}

Full traceback:
Traceback (most recent call last):
  File "/opt/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 840, in acompletion
    headers, response = await self.make_openai_chat_completion_request(
  File "/opt/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 190, in async_wrapper
    result = await func(*args, **kwargs)
  File "/opt/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 460, in make_openai_chat_completion_request
    raise e
  File "/opt/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 437, in make_openai_chat_completion_request
    await openai_aclient.chat.completions.with_raw_response.create(
  File "/opt/app/.venv/lib/python3.12/site-packages/openai/_legacy_response.py", line 384, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
  File "/opt/app/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2700, in create
    return await self._post(
  File "/opt/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1884, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1669, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'code': 400, 'message': 'Unable to submit request because function call load_skill in the 2. content block is missing a thought_signature. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/thought-signatures', 'status': 'INVALID_ARGUMENT'}}
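The rejection above can be illustrated with a toy re-creation of the server-side check implied by the error message. This is illustrative only, not the actual Gemini API implementation:

```python
# Toy re-creation of the server-side validation implied by the 400 error,
# to show why a history whose function-call block lost its signature is
# rejected. Not the actual Gemini API code; content blocks are plain dicts.

def validate_history(contents: list[dict]) -> None:
    """Reject any function-call part that carries no thought_signature."""
    for i, block in enumerate(contents, start=1):
        for part in block.get("parts", []):
            call = part.get("function_call")
            if call and "thought_signature" not in part:
                raise ValueError(
                    f"function call {call['name']} in the {i}. content block "
                    "is missing a thought_signature"
                )

# Follow-up request history where the signature was dropped in block 2:
history = [
    {"parts": [{"text": "Use the skill"}]},
    {"parts": [{"function_call": {"name": "load_skill", "args": {}}}]},
]

try:
    validate_history(history)
except ValueError as e:
    print(e)  # function call load_skill in the 2. content block is missing a thought_signature
```

Adding "thought_signature" back to the function-call part in block 2 makes the same history pass the check, which is the behavior seen on 1.23.0.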

Environment Details:
ADK Library Version: google-adk==1.26.0
Desktop OS: Linux (containerized, Python 3.12)
Python Version: 3.12
Model Information:
Are you using LiteLLM: Yes (litellm==1.81.6)
Which model is being used: gemini-3-flash-preview (via OpenAI-compatible endpoint with custom_llm_provider="openai")

Metadata

Labels

models [Component] Issues related to model support