Ollama integration fails with tool calls: "role" field missing in Message validation #2

@HECer

Description

@HECer

Bug Description

When using OpenCook with an Ollama provider (specifically qwen3.5:9b), the agent crashes during the iterative loop (typically Step 3) when attempting to send a tool call result back to the model. The crash is triggered by a Pydantic validation error because the message object passed to the Ollama client lacks a mandatory role field.

Error traceback snippet:

pydantic_core._pydantic_core.ValidationError: 1 validation error for Message
role
  Field required [type=missing, input_value={'call_id': 'e8a01082-5c3... 'function_call_output'}, input_type=dict]
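The failure can be reproduced in isolation with a minimal Pydantic model standing in for the ollama library's Message schema (a sketch; the real `ollama.Message` has more fields, but `role` is the required one that matters here):

```python
from pydantic import BaseModel, ValidationError

# Minimal stand-in for the ollama library's Message model
# (assumption: the real model declares `role` as required).
class Message(BaseModel):
    role: str
    content: str = ""

# Dict shaped like the one ollama_client.py builds for a tool result:
# it carries call_id and type, but no "role" key.
bad = {"call_id": "e8a01082-...", "type": "function_call_output"}

try:
    Message.model_validate(bad)
except ValidationError as exc:
    # The single error reported is the missing `role` field,
    # matching the traceback above.
    print(exc.errors()[0]["loc"], exc.errors()[0]["type"])
```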

Steps to Reproduce

  1. Install OpenCook: pip install git+https://github.com/OpenDataBox/OpenCook.git
  2. Create a configuration file (config.json) with an Ollama provider:
{
  "provider": "ollama",
  "model": "qwen3.5:9b",
  "model_base_url": "http://localhost:11434",
  "max_steps": 50,
  "console_type": "simple",
  "agent_type": "code_agent"
}
  3. Run the agent:
    opencook run "some task" --config-file config.json --provider ollama --model qwen3.5:9b --api-key dummy --working-dir .
  4. Observe the crash: the agent successfully initiates a tool call but fails immediately when passing that tool's output back to the model for the next LLM turn.

Expected Behavior

The tool call result should be formatted as a valid message object containing a role field (e.g., "role": "tool") before being sent back to the Ollama API, allowing the model to process the observation.

Actual Behavior

The ollama_client.py constructs a message dictionary containing the call_id and output, but omits the role key. The underlying Ollama Python library uses Pydantic for schema enforcement and rejects the malformed dictionary.

Environment

  • OpenCook version: 0.1.0
  • Python: 3.10
  • OS: Windows 11
  • Ollama version: latest
  • Model: qwen3.5:9b

Root Cause

The issue is located in code_agent/utils/llm_clients/ollama_client.py at approximately line 83. The method _create_ollama_response (or the message-history assembly logic it feeds) fails to inject the required role attribute into the dictionary representing a tool's output.

Suggested Fix

Ensure that the dictionary for tool responses includes the "role": "tool" key and maps the output to the "content" key as expected by the Ollama/OpenAI-compatible schema.

Proposed Message Structure:

{
    "role": "tool",
    "tool_call_id": "...", # or call_id depending on library version
    "content": "..."
}
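
A minimal sketch of the conversion the fix needs to perform (the helper name and the internal record's field names are assumptions; adjust to match what ollama_client.py actually uses):

```python
def to_ollama_tool_message(tool_output: dict) -> dict:
    """Convert OpenCook's internal function_call_output record into a
    message dict the Ollama client will accept (sketch; field names assumed).
    """
    return {
        "role": "tool",                           # the missing key causing the crash
        "tool_call_id": tool_output["call_id"],   # or "call_id", per library version
        "content": str(tool_output.get("output", "")),
    }

# Example: a record shaped like the one in the traceback becomes a valid message.
record = {"call_id": "e8a01082-...", "type": "function_call_output", "output": "42"}
msg = to_ollama_tool_message(record)
```

Stringifying the output before placing it under "content" also guards against non-string tool results being rejected by the same schema validation.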
