Unresponsive chat when using remote self-hosted Ollama server #946
Closed
Labels: area:chat (Relates to chat interface), ide:vscode (Relates specifically to VS Code extension), kind:bug (Indicates an unexpected problem or unintended behavior)
Description
Before submitting your bug report
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: PopOS 22.04
- Continue: 0.8.17
- IDE: VSCode
- Ollama: 0.1.28

Description
I can successfully use Ollama locally.
I'd prefer to use a remote self-hosted Ollama instance. When I add an apiBase (e.g. 192.168.1.xxx:11434), the Continue extension no longer works: submitting a chat freezes/hangs on the rainbow outline (indicating it's waiting on a response).
To reproduce
- Run a remote instance of Ollama. Verify Ollama is running with `curl 192.168.1.xxx:11434` (responds with `Ollama is running`)
- Add an `apiBase` value to the `ollama` model in `config.json`
- Submit a sample query/chat
- Get no response and no error; the form is frozen with the "loading" animation
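For reference, a sketch of what the model entry in `config.json` looks like with the `apiBase` set (the IP address, model name, and title here are placeholders, not the exact values from my setup):

```json
{
  "models": [
    {
      "title": "Remote Ollama",
      "provider": "ollama",
      "model": "llama2",
      "apiBase": "http://192.168.1.xxx:11434"
    }
  ]
}
```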
Log output
N/A
I opened the Developer Tools window in VSCode, but I did not see any output related to Continue.