Using ChatDev with local Ollama #572

@DN2048

Description

I'm trying to use ChatDev with a locally hosted Ollama service, but ChatDev seems to simply ignore the setting.

I have BASE_URL=http://localhost:11434 in the .env file, but on the frontend page all the workflow examples use either Gemini or OpenAI, and under Model Provider I can't select anything other than those two.

Is there an extra setting somewhere? Does it require a specific workflow?

PS: I know the Ollama service is up and running because I'm using it from other local services.
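For anyone debugging the same setup, here is a minimal reachability sketch (not part of ChatDev; the helper name is hypothetical). A running Ollama server answers plain GET requests on its base URL, so this separates "Ollama is down" from "ChatDev is ignoring BASE_URL":

```python
import urllib.request
import urllib.error

BASE_URL = "http://localhost:11434"  # same value as in the .env file

def ollama_reachable(base_url: str = BASE_URL, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url.

    A running Ollama instance responds to GET / with status 200.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns True but ChatDev still only offers Gemini and OpenAI, the problem is likely in how the frontend reads the provider list rather than in the Ollama service itself. Note that clients speaking the OpenAI-compatible API usually need the `/v1` suffix on the base URL (e.g. `http://localhost:11434/v1`); whether ChatDev appends that itself is an assumption worth checking.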
