Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
I would expect the regular Completions endpoint to follow the OpenAI API just like the ChatCompletions one does, and to be served at /v1/completions.
Current Behavior
Currently only the ChatCompletions endpoint follows the OpenAI standard; the regular Completions endpoint is exposed at /completion.
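For illustration, here is a minimal sketch contrasting the two request shapes. The `/completion` field names (`prompt`, `n_predict`) follow llama.cpp's server README; the `/v1/completions` field names (`model`, `prompt`, `max_tokens`) follow the OpenAI completions API. The model name and prompt are placeholders, not values from this issue:

```python
import json

# Current llama.cpp server endpoint (non-OpenAI shape): POST /completion
llama_request = {
    "prompt": "Once upon a time",  # placeholder prompt
    "n_predict": 64,               # llama.cpp's name for the token limit
}

# Proposed OpenAI-compatible endpoint: POST /v1/completions
openai_request = {
    "model": "local-model",        # placeholder model name (OpenAI clients send one)
    "prompt": "Once upon a time",
    "max_tokens": 64,              # OpenAI's name for the token limit
}

# Same intent, different field names -- which is why clients like
# SillyTavern currently need llama.cpp-specific handling.
print(json.dumps(llama_request))
print(json.dumps(openai_request))
```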
Environment and Context
This would greatly improve compatibility with software such as SillyTavern that benefits from the greater control of crafting the whole prompt itself, while avoiding extra code to handle the llama.cpp server separately.