
Adapt the ollama streaming tool call #6078

Open
@EntropyYue



Relevant environment info

- OS:
- Continue version:
- IDE version:
- Model:
- config:
- OR link to assistant in Continue hub:

Description

Ollama already supports streaming tool calls, but Continue still treats this as unsupported, which prevents Continue from streaming the model's output back in agent mode.
This also causes models that use `<think>...</think>` to fail to render normally: the model's answer ends up included inside the collapsed "Thought for ..." block.
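For reference, here is a minimal sketch of Ollama's native streaming tool-call behavior against its `/api/chat` endpoint. It assumes Ollama v0.8+ (which added streaming responses with tool calling) running locally; the `get_weather` tool and the model name are placeholders, not part of this report:

```typescript
// Minimal sketch: stream a chat with tools from Ollama's native API.
// Assumes Ollama >= 0.8 at localhost:11434; the tool below is a placeholder.
async function streamWithTools(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen3",
      stream: true, // Ollama emits NDJSON chunks when this is set
      messages: [{ role: "user", content: "What is the weather in Paris?" }],
      tools: [
        {
          type: "function",
          function: {
            name: "get_weather",
            description: "Get the current weather for a city",
            parameters: {
              type: "object",
              properties: { city: { type: "string" } },
              required: ["city"],
            },
          },
        },
      ],
    }),
  });

  // Each line of the response body is one JSON chunk; tool calls arrive
  // incrementally in message.tool_calls instead of only after the stream ends.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.tool_calls) {
        console.log("tool call chunk:", chunk.message.tool_calls);
      } else if (chunk.message?.content) {
        process.stdout.write(chunk.message.content);
      }
    }
  }
}

streamWithTools().catch(console.error);
```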
A workaround is to use an OpenAI-compatible API with the Ollama model, which works normally.
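For example, a `config.json` entry along these lines should exercise the workaround, since Ollama exposes an OpenAI-compatible endpoint at `/v1`. The title and port are assumptions; this is a sketch, not a confirmed configuration from the report:

```json
{
  "models": [
    {
      "title": "Qwen3 via Ollama (OpenAI-compatible)",
      "provider": "openai",
      "model": "qwen3",
      "apiBase": "http://localhost:11434/v1"
    }
  ]
}
```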

To reproduce

Use the ollama provider with a model that supports reasoning and tool calls, such as qwen3.

Log output
