Misc. bug: No documentation for webui-mcp-proxy in server #20384

@strawberrymelonpanda

Description

Name and Version

ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
version: 8247 (ae87863)
built with GNU 13.3.0 for Linux x86_64

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-server

Command line

llama-server \
  --models-preset ./presets.ini \
  --models-max 1

Problem description & steps to reproduce

Not sure if this is considered a bug or a feature request, but I couldn't find any documentation about --webui-mcp-proxy / MCP CORS proxy in the server README (or anywhere else).

It seems like at least a mention ought to be there, and ideally an explanation of why you might want to use it with local MCP servers.
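
For illustration, here is roughly what I'd expect the docs to cover. This is only my guess from the flag name (I'm assuming it's a boolean toggle; the exact syntax and semantics are exactly what the documentation should confirm):

# Guess: the web UI is served from llama-server's origin, so the browser
# blocks direct fetches to a local MCP server running on another port
# (CORS). Presumably --webui-mcp-proxy tells llama-server to proxy those
# MCP requests itself, so they appear same-origin to the browser:
llama-server \
  --models-preset ./presets.ini \
  --models-max 1 \
  --webui-mcp-proxy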

First Bad Commit

No response

Relevant log output

No response
