
Bug: llama-server cannot open shared object file libllama.so after building from SYCL-Dockerfile #9416

@HumerousGorgon

Description


What happened?

After building the llama-server-intel.Dockerfile into an image and running it with the correct docker run args, every attempt ends in:
/llama-server: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory

Building the CLI version instead works: a prompt can be sent and a response is generated.
I have tried using --no-cache on the docker build operation, in case previously cached files were interfering.
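A useful first step here is to check, from inside the container, whether the dynamic linker can actually resolve the server binary's dependencies. This is only a diagnostic sketch: it assumes the image is tagged llama-server-cpp-sycl as in the run command in this report, and that the binary sits at /llama-server as the error message suggests:

```shell
# Open a shell in the image instead of the default entrypoint
# (image tag taken from the docker run command in this report).
docker run --rm -it --entrypoint /bin/bash llama-server-cpp-sycl

# Inside the container: list the shared libraries the server needs
# and see which ones the loader reports as "not found".
ldd /llama-server

# Locate where libllama.so actually ended up in the image, if anywhere.
find / -name 'libllama.so*' 2>/dev/null
```

If ldd shows libllama.so as "not found" but find locates it, the problem is the loader's search path rather than a missing build artifact.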

I used a bit of a hodge-podge command, based on the standard options from the (non-SYCL) llama-server Dockerfile.

Name and Version

Version: llama-server-intel.Dockerfile.
Command run:
docker run -it -p 11434:11434 -v /mnt/user/models/model-files/:/app/models/ --device /dev/dri/renderD128:/dev/dri/renderD128 --device /dev/dri/card0:/dev/dri/card0 llama-server-cpp-sycl -m "/app/models/Meta-Llama-3-8B-Instruct-fp16.gguf" -n 2048 -e -ngl 33 --host 0.0.0.0 --port 11434
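If the library is present in the image but not on the loader's search path, a possible workaround is to point the loader at its directory via LD_LIBRARY_PATH. The /app path below is an assumption based on where the non-SYCL llama-server Dockerfile places its files; adjust it to wherever libllama.so was actually found:

```shell
# Same run command as above, with LD_LIBRARY_PATH added so the loader
# also searches /app (assumed library location) for libllama.so.
docker run -it -p 11434:11434 \
  -v /mnt/user/models/model-files/:/app/models/ \
  --device /dev/dri/renderD128:/dev/dri/renderD128 \
  --device /dev/dri/card0:/dev/dri/card0 \
  -e LD_LIBRARY_PATH=/app \
  llama-server-cpp-sycl \
  -m "/app/models/Meta-Llama-3-8B-Instruct-fp16.gguf" \
  -n 2048 -e -ngl 33 --host 0.0.0.0 --port 11434
```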

What operating system are you seeing the problem on?

Linux

Relevant log output

/llama-server: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory

Metadata

Assignees

No one assigned

Labels

bug-unconfirmed, high severity (used to report high severity bugs in llama.cpp: malfunctioning hinders important workflow), stale
