
server: make completions endpoint follow openai api just like chatcompletion #4497

@nyxkrage

Description


Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

I would expect the regular Completions endpoint to follow the OpenAI API just like the ChatCompletions one does, and to be served at /v1/completions. A rough sketch of the request shape this would enable is shown below.
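
As an illustration, this is roughly what an OpenAI-style completions request to the proposed route could look like. This is a minimal sketch, not part of the issue: the host/port, model name, and prompt below are placeholders, and only the standard OpenAI Completions fields (prompt, max_tokens, temperature, choices[].text) are assumed.

```python
# Hypothetical OpenAI-compatible request against the proposed /v1/completions route.
# Host/port and model name are placeholders, not values from this issue.
import requests

response = requests.post(
    "http://localhost:8080/v1/completions",
    json={
        "model": "local-model",        # placeholder model name
        "prompt": "Once upon a time",  # raw prompt, fully controlled by the client
        "max_tokens": 64,
        "temperature": 0.8,
    },
)

# OpenAI-compatible responses carry the generated text under choices[].text
print(response.json()["choices"][0]["text"])
```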

Current Behavior

Currently, only the ChatCompletions endpoint follows the OpenAI standard; the regular Completions endpoint is served at /completion with its own request format (see the sketch below).
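
For contrast, a request to the existing endpoint looks roughly like the sketch below. This is an assumption based on the server README at the time of writing (prompt/n_predict in the request, content in the response); the host/port and prompt are placeholders.

```python
# Sketch of the current llama.cpp server /completion endpoint, for comparison.
# Field names (prompt, n_predict, content) are assumed from the server README;
# host/port are placeholders.
import requests

response = requests.post(
    "http://localhost:8080/completion",
    json={
        "prompt": "Once upon a time",
        "n_predict": 64,  # llama.cpp-specific name for the token limit
    },
)

# The current server returns the generated text under "content" rather than choices[].text
print(response.json()["content"])
```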

Environment and Context

This would make llama.cpp much easier to use with software that benefits from the greater control of crafting the whole prompt itself, such as SillyTavern, while avoiding extra code to handle the llama.cpp server as a special case.
