
Unresponsive chat when using remote self-hosted Ollama server  #946

@MattyRad


Relevant environment info

- OS: Pop!_OS 22.04
- Continue: 0.8.17
- IDE: VSCode
- Ollama: 0.1.28

Description

I can successfully use Ollama locally.

I'd prefer to use a remote self-hosted Ollama instance. When I add an apiBase (e.g. 192.168.1.xxx:11434), the Continue extension no longer works: submitting a chat freezes/hangs on the rainbow outline (indicating it is waiting for a response).
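For reference, a minimal sketch of the relevant config.json entry, assuming the Ollama provider; the model name and address are placeholders:

```json
{
  "models": [
    {
      "title": "Ollama (remote)",
      "provider": "ollama",
      "model": "llama2",
      "apiBase": "http://192.168.1.xxx:11434"
    }
  ]
}
```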

To reproduce

  1. Run a remote instance of Ollama. Verify Ollama is running with curl 192.168.1.xxx:11434 (responds with "Ollama is running").
  2. Add an apiBase value to the Ollama model entry in config.json.
  3. Submit a sample query/chat.
  4. Get no response and no error; the form is frozen with the "loading" animation.
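To narrow down whether the extension or the network is at fault, the remote endpoint can be exercised directly from the machine running VS Code. This is a diagnostic sketch; the address and model name are placeholders:

```shell
# Liveness check; should print "Ollama is running"
curl http://192.168.1.xxx:11434

# Exercise the generate endpoint directly; a hang or error here
# would point at the server or network rather than the extension
curl http://192.168.1.xxx:11434/api/generate \
  -d '{"model": "llama2", "prompt": "hello", "stream": false}'
```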

Log output

N/A

I opened the Developer Tools window in VS Code, but I did not see any output related to Continue.

Metadata


Labels

area:chat (Relates to chat interface), ide:vscode (Relates specifically to VS Code extension), kind:bug (Indicates an unexpected problem or unintended behavior)
