Replies: 3 comments 1 reply
It was the first thing I thought of when they told me about this project. I already have an Open WebUI instance installed on my company's local server that, for example, connects to N8N while emulating the OpenAI API, so I understand it shouldn't be too difficult to connect it to clawdbot here as well. I hope it gets implemented soon.
This already exists! ✨ Moltbot has an OpenAI-compatible Chat Completions endpoint.

Documentation: https://docs.molt.bot/gateway/openai-http-api.md

Enable it in config:

```json5
{
  gateway: {
    http: {
      endpoints: {
        chatCompletions: { enabled: true }
      }
    }
  }
}
```

This should work exactly as described in the feature request: different agents can be exposed as different "models" in the API.
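Once the endpoint is enabled, any OpenAI-style client should be able to talk to it. A minimal sketch, assuming the endpoint follows the standard Chat Completions request shape; the base URL and the agent name `"support-agent"` are placeholders, not values from the docs:

```python
import json
import urllib.request

# Hypothetical gateway address; substitute your own host and port.
BASE_URL = "http://localhost:8080/v1"

def build_payload(model: str, content: str) -> dict:
    """Standard OpenAI Chat Completions request body.
    The agent to talk to goes in the "model" field."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }

def chat(model: str, content: str) -> dict:
    """Send the request to the gateway and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(model, content)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires a running gateway):
# reply = chat("support-agent", "Hello!")
# print(reply["choices"][0]["message"]["content"])
```

This is the same wire format Open WebUI and other OpenAI-compatible UIs emit, which is why they can be pointed at the endpoint directly.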
Perhaps it's the lack of sleep, but I don't think I explained myself well. What I'm trying to do is get Moltbot to use my local LLMs in Open WebUI instead of the APIs from OpenAI, Google, Anthropic, etc. As I understand it, what you gave me is the opposite: using the different Moltbots from Open WebUI... I don't know if I've made a mistake... xD
If there were an OpenAI-compatible endpoint, then any UI could be used to chat with Clawd, for example https://github.com/open-webui/open-webui.
You could expose different modes or identities via the models API: instead of model names, you would get different Clawds, like "General Clawd", "Support Clawd", etc. Each such model has an identifier, and the UI would send chat completion requests with that model identifier to the API. Clawd could then internally process the messages and respond accordingly.
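To illustrate the idea, here is a sketch of how a UI would consume such a models list. The response shape is the standard OpenAI `/v1/models` list format; the agent ids (`general-clawd`, `support-clawd`) are made up for this example:

```python
import json

# A hypothetical /v1/models response in the standard OpenAI list shape,
# where each entry is an agent identity rather than an actual LLM.
sample_response = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "general-clawd", "object": "model"},
    {"id": "support-clawd", "object": "model"}
  ]
}
""")

# A UI like Open WebUI would show these ids in its model picker, then put
# the chosen id in the "model" field of each chat completion request.
agent_ids = [m["id"] for m in sample_response["data"]]
print(agent_ids)  # ['general-clawd', 'support-clawd']
```

On the server side, the gateway would route each request to the agent named in its `"model"` field, so one endpoint can serve many identities.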