Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
I update llama-cpp-python on every release, and it usually installs without any problem.
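The exact command isn't shown above, so this is an assumed example of a standard pip upgrade of the package:

```shell
# Typical upgrade path; this normally builds and installs cleanly.
pip install --upgrade llama-cpp-python
```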
Current Behavior
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [38 lines of output]
*** scikit-build-core 0.5.0 using CMake 3.27.4 (wheel)
*** Configuring CMake...
2023-09-12 23:23:47,426 - scikit_build_core - WARNING - libdir/ldlibrary: /home/outscale/miniconda3/envs/react-qback/lib/libpython3.9.a is not a real file!
2023-09-12 23:23:47,427 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/home/outscale/miniconda3/envs/react-qback/lib, ldlibrary=libpython3.9.a, multiarch=x86_64-linux-gnu, masd=None
loading initial cache file /tmp/tmpokfl0ylr/build/CMakeInit.txt
-- The C compiler identification is GNU 11.4.0
-- The CXX compiler identification is GNU 11.4.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /home/outscale/miniconda3/envs/react-qback/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /home/outscale/miniconda3/envs/react-qback/bin/c++ - skipped
-- Detecting CXX compile features
ninja: error: '/tmp/pip-install-cj9hww_r/llama-cpp-python_b6fd4b268fde4a7f9c385ad01df0f9ba/.git/modules/vendor/llama.cpp/index', needed by '/tmp/pip-install-cj9hww_r/llama-cpp-python_b6fd4b268fde4a7f9c385ad01df0f9ba/vendor/llama.cpp/build-info.h', missing and no known rule to make it
*** CMake build failed
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
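Reading the output, the ninja error says the generated vendor/llama.cpp/build-info.h depends on the git index file .git/modules/vendor/llama.cpp/index, which does not exist in the source tree pip unpacked, so ninja has no rule to produce it. As an unconfirmed workaround (a sketch, not a verified fix), clearing pip's cache and forcing a clean rebuild avoids reusing a stale source tree:

```shell
# Possible workaround (unverified): drop any cached wheel/source so pip
# rebuilds from a fresh tree without stale .git references.
pip cache purge
pip install --no-cache-dir --force-reinstall llama-cpp-python
```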
Environment and Context
I'm using two machines: one CPU-only and one with an A100 GPU. The install/upgrade fails on both.
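For completeness, a sketch of the standard commands for collecting the relevant toolchain details on both machines (outputs not reproduced here):

```shell
# Gather basic environment info on each machine.
python3 --version          # Python interpreter version
pip show llama-cpp-python  # currently installed package version
cmake --version            # CMake picked up by scikit-build-core
ninja --version            # Ninja generator version
nvidia-smi                 # GPU details (A100 machine only)
```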