
[llama-3.1 70B] Open Interpreter's prep steps did not complete after setting the model #1371

@mickitty0511

Description


Describe the bug

Following your official doc on using Ollama models, I tried to run Llama 3.1 with Open Interpreter. However, errors occurred during the prep steps that run after the model is set. I would appreciate a detailed resolution or an explanation of what happened in my case, and I hope the developers can reproduce this error and advise.

Reproduce

Follow the official doc.

I ran these commands (a sanity-check sketch follows the list):

  • ollama run llama3.1
  • interpreter --model ollama/llama3.1
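
As a side note, here is a minimal sketch for verifying, independently of Open Interpreter, that the Ollama side answers for this model string. It goes through LiteLLM, which Open Interpreter uses for model routing; the api_base value and the prompt are assumptions for illustration:

    # Minimal sketch: check that Ollama responds for ollama/llama3.1.
    # Assumptions: Ollama is listening on its default port 11434 and
    # the model was already pulled via `ollama run llama3.1`.
    import litellm

    response = litellm.completion(
        model="ollama/llama3.1",
        messages=[{"role": "user", "content": "Say hello."}],
        api_base="http://localhost:11434",
    )
    print(response.choices[0].message.content)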

Open Interpreter then asked whether I wanted to create a new profile file; I answered n.

The error was as follows:

[2024-07-30T03:56:01Z ERROR cached_path::cache] ETAG fetch for https://huggingface.co/llama3.1/resolve/main/tokenizer.json failed with fatal error
Traceback (most recent call last):

json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 2 (char 1)
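
For what it's worth, the URL in the log looks suspect: llama3.1 is not a namespaced Hugging Face repo id, so the tokenizer fetch cannot resolve to a real file. A minimal sketch of what I assume is happening (the response body is an error page rather than tokenizer.json, and parsing it as JSON fails):

    # Sketch (assumption): the tokenizer URL from the log points at a repo
    # that does not exist, so the body is an error page, not JSON, and
    # parsing it raises the same class of JSONDecodeError seen above.
    import json
    import requests

    url = "https://huggingface.co/llama3.1/resolve/main/tokenizer.json"
    resp = requests.get(url)
    print(resp.status_code)  # not 200: "llama3.1" is not a valid repo id
    try:
        json.loads(resp.text)
    except json.JSONDecodeError as e:
        print(e)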

Expected behavior

Based on the official docs, I expected the prep steps to complete successfully.

Screenshots

No response

Open Interpreter version

0.3.4

Python version

3.11.5

Operating System name and version

Windows 11

Additional context

No response
