
remove hardcoded agent temp #2068

Open
filip-michalsky wants to merge 4 commits into main from fm/stg-1826-remove-default-temp

Conversation

Collaborator

@filip-michalsky commented Apr 29, 2026

why

Reasoning models (e.g. openai/gpt-5-mini) emit the AI SDK warning 'The "temperature" setting is not supported by this model' whenever the
v3 agent runs against them, because V3AgentHandler hardcoded temperature: 1 on every generateText/streamText call. There was no constructor
or agent-config option to override it, so users running reasoning models with experimental: true had no way to silence the warning.
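A minimal sketch of the pre-fix pattern (option shapes and function names below are assumed for illustration, not the actual handler code): temperature: 1 was forced into every call's options, so the AI SDK saw it as user-set even on models that reject the setting.

```typescript
// Hypothetical option shape, assumed for illustration.
interface AgentCallOptions {
  model: string;
  prompt: string;
  temperature?: number;
}

// Sketch of the old behavior: temperature hardcoded on every call,
// with no constructor or agent-config override available.
function buildLegacyOptions(model: string, prompt: string): AgentCallOptions {
  return { model, prompt, temperature: 1 };
}

const legacy = buildLegacyOptions("openai/gpt-5-mini", "book a flight");
console.log(legacy.temperature); // 1 — always present, hence the warning
```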

what changed

  • Removed the hardcoded temperature: 1 from both the non-streaming and streaming agent paths in
    packages/core/lib/v3/handlers/v3AgentHandler.ts.
  • Agent calls now omit temperature entirely, so the AI SDK no longer flags it as user-set on models that don't accept it. Models that do accept
    temperature still default to the AI SDK provider's own default.
  • Added unit coverage in packages/core/tests/unit/agent-temperature.test.ts asserting neither code path passes a temperature to the LLM
    client.
  • Added a patch changeset (.changeset/remove-agent-temperature-default.md).
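The key detail in the change above is that the temperature key is omitted entirely, not set to undefined or some other value, so the SDK treats it as unset and the provider default applies. A hedged sketch of that idea (shapes assumed, not the real v3AgentHandler code):

```typescript
// Hypothetical option shape, assumed for illustration.
interface CallOptions {
  model: string;
  messages: { role: string; content: string }[];
  temperature?: number;
}

// Sketch of the post-fix behavior: the returned object simply has
// no temperature property at all.
function buildAgentOptions(model: string, userText: string): CallOptions {
  return {
    model,
    messages: [{ role: "user", content: userText }],
  };
}

const opts = buildAgentOptions("openai/gpt-5-mini", "navigate to example.com");
console.log(Object.prototype.hasOwnProperty.call(opts, "temperature")); // false
```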

test plan

  • pnpm --filter @browserbasehq/stagehand test agent-temperature — new unit tests pass
  • Existing v3 agent unit suite stays green
  • Manual: agent.execute(...) against openai/gpt-5-mini with experimental: true — confirm the 'The "temperature" setting is not
    supported' warning no longer appears in stderr
  • Manual: same call against a non-reasoning model (e.g. openai/gpt-4.1-mini) — confirm agent still completes a multi-step task end-to-end
  • Manual: hybrid-mode agent run on a reasoning model — verify streaming path is also clean
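The unit-test idea from the plan above can be sketched as follows (the mock client and captured shapes are assumptions for illustration, not the actual agent-temperature.test.ts contents): stub the LLM client, exercise both the non-streaming and streaming paths, and assert that no temperature ever reached the client.

```typescript
// Shape of what we capture from each call; assumed for illustration.
type Captured = { temperature?: number };
const calls: Captured[] = [];

// Mock LLM client that records the options each path passes in.
const mockClient = {
  generateText(options: Captured) {
    calls.push(options);
    return { text: "ok" };
  },
  streamText(options: Captured) {
    calls.push(options);
    return { stream: [] };
  },
};

// Simulate both agent code paths calling the client without a temperature.
mockClient.generateText({});
mockClient.streamText({});

const leaked = calls.some((c) => "temperature" in c);
console.log(leaked); // false — neither path passed a temperature
```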

@changeset-bot

changeset-bot Bot commented Apr 29, 2026

🦋 Changeset detected

Latest commit: a27102b

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 4 packages
Name                                Type
@browserbasehq/stagehand            Patch
@browserbasehq/stagehand-evals      Patch
@browserbasehq/stagehand-server-v3  Patch
@browserbasehq/stagehand-server-v4  Patch


@github-actions
Contributor

github-actions Bot commented Apr 30, 2026

✱ Stainless preview builds

No changes were made to the SDKs.


This comment is auto-generated by GitHub Actions and is automatically kept up to date as you push.
If you push custom code to the preview branch, re-run this workflow to update the comment.
Last updated: 2026-04-30 20:28:44 UTC

@filip-michalsky filip-michalsky marked this pull request as ready for review April 30, 2026 20:23
Contributor

@cubic-dev-ai Bot left a comment

No issues found across 3 files

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.
Architecture diagram
sequenceDiagram
    participant App as Client Application
    participant Agent as V3AgentHandler
    participant SDK as LLMClient (AI SDK)
    participant Model as LLM Provider (e.g. OpenAI)

    App->>Agent: execute(instruction) or stream(...)
    
    Note over Agent: Prepare tools, messages,<br/>and stop conditions

    alt Non-Streaming Path
        Agent->>SDK: CHANGED: generateText(options)
        Note right of Agent: NEW: temperature is omitted<br/>(previously hardcoded to 1)
        SDK->>Model: Request without "temperature" field
    else Streaming Path
        Agent->>SDK: CHANGED: streamText(options)
        Note right of Agent: NEW: temperature is omitted
        SDK->>Model: Stream request without "temperature" field
    end

    alt Model is Reasoning Model (e.g. gpt-5-mini)
        Model-->>SDK: Processes request successfully
        Note over Model: No AI SDK warning emitted<br/>because temperature was not set
    else Model is Standard Model (e.g. gpt-4o)
        Model-->>SDK: Processes request successfully
        Note over Model: Uses provider's internal<br/>default temperature
    end

    SDK-->>Agent: LLM Result/Stream
    Agent-->>App: Task Result / Execution Logs

@filip-michalsky filip-michalsky changed the title remove agent temp remove hardcoded agent temp Apr 30, 2026
@filip-michalsky
Collaborator Author

That failing auto-captcha test is going to get removed.

