Providing scalapack with intel-oneapi-mkl +cluster breaks a few packages #37374
Steps to reproduce the issue
Let me explain the issue with intel-oneapi-mkl +cluster. The ScaLAPACK, LAPACK, and BLAS interfaces are provided by intel-oneapi-mkl, with the ScaLAPACK (cluster) libraries enabled through the "+cluster" variant. To satisfy the cluster library interfaces for many client packages at once, the natural approach is to set the requirement globally in packages.yaml:
$> cat packages.yaml
packages:
  scalapack:
    require: 'intel-oneapi-mkl +cluster'
  lapack:
    require: 'intel-oneapi-mkl +cluster'
  blas:
    require: 'intel-oneapi-mkl +cluster'
Adding this +cluster variant breaks the builds of a few packages. Without the +cluster variant, intel-oneapi-mkl is never concretized as the provider of the ScaLAPACK, LAPACK, and BLAS libraries.
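A narrower configuration may sidestep part of the problem (this is a sketch I have not verified on every failing package, and it assumes the non-MPI consumers only need BLAS/LAPACK): require +cluster only for the scalapack virtual, so that packages that merely link BLAS/LAPACK do not force the variant themselves:

```yaml
packages:
  scalapack:
    # Only ScaLAPACK consumers need the cluster (BLACS/ScaLAPACK) libraries
    require: 'intel-oneapi-mkl +cluster'
  lapack:
    require: 'intel-oneapi-mkl'
  blas:
    require: 'intel-oneapi-mkl'
```

Since a spec with +cluster still satisfies the plain 'intel-oneapi-mkl' requirement, the concretizer can unify all three virtuals onto one intel-oneapi-mkl instance when a package actually needs ScaLAPACK.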
For reference, here are some of the errors I am facing when enabling "intel-oneapi-mkl +cluster" to get the cluster libs interface.
- arpack-ng and sundials fail with a similar error signature:
$> spack --env . install
==> Starting concretization
==> Environment concretized in 31.10 seconds.
==> Concretized arpack-ng%intel
==> Concretized arpack-ng%intel
- 2ywjnjj [email protected]%[email protected]~icb~ipo+mpi+shared build_system=cmake build_type=RelWithDebInfo generator=make arch=linux-sles15-haswell
[+] sjtcyti ^[email protected]%[email protected]~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-sles15-haswell
[+] aqz2wny ^[email protected]%[email protected]~symlinks+termlib abi=none build_system=autotools arch=linux-sles15-haswell
[+] ttkoosz ^[email protected]%[email protected] build_system=autotools arch=linux-sles15-haswell
[+] on3muvc ^[email protected]%[email protected]~docs~shared build_system=generic certs=mozilla arch=linux-sles15-haswell
[+] 44xqvtw ^ca-certificates-mozilla@2023-01-10%[email protected] build_system=generic arch=linux-sles15-haswell
[+] lqahcum ^[email protected]%[email protected]+cpanm+open+shared+threads build_system=generic arch=linux-sles15-haswell
[+] 2jh7jzs ^[email protected]%[email protected]+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-sles15-haswell
[+] fmjuxfz ^[email protected]%[email protected]~debug~pic+shared build_system=generic arch=linux-sles15-haswell
[+] a7oxann ^[email protected]%[email protected] build_system=autotools arch=linux-sles15-haswell
[+] pndnpl6 ^[email protected]%[email protected] build_system=autotools libs=shared,static arch=linux-sles15-haswell
[+] w4eg7l5 ^[email protected]%[email protected] build_system=autotools arch=linux-sles15-haswell
[+] jkkaq5j ^[email protected]%[email protected] build_system=autotools patches=bbf97f1 arch=linux-sles15-haswell
[+] cf7cfdn ^[email protected]%[email protected]+optimize+pic+shared build_system=makefile arch=linux-sles15-haswell
[+] kbzqf3m ^[email protected]%[email protected]~guile build_system=autotools arch=linux-sles15-haswell
[+] s5ehoc7 ^[email protected]%[email protected]+cluster+envmods~ilp64+shared build_system=generic arch=linux-sles15-haswell
[+] 3jzbxbh ^[email protected]%[email protected]+envmods build_system=generic arch=linux-sles15-haswell
[+] prg6h64 ^[email protected]%[email protected]+envmods~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-sles15-haswell
...
...
...
==> arpack-ng: Executing phase: 'cmake'
==> arpack-ng: Executing phase: 'build'
==> Error: ProcessError: Command exited with status 2:
'/dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/gmake/4.4.1-intel-kbzqf3m/bin/make' '-j24'
1289 errors found in build log:
>> 572 ld: /dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/intel-oneapi-mkl/2023.1.0-intel-s5ehoc7/mkl/2023.1.0/lib/intel64/libmkl_blacs_intelmpi_lp64.so: undefined reference to `MPI_Waitall'
>> 573 ld: /dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/intel-oneapi-mkl/2023.1.0-intel-s5ehoc7/mkl/2023.1.0/lib/intel64/libmkl_blacs_intelmpi_lp64.so: undefined reference to `MPI_Abort'
...
- latte:
$> spack install latte
==> Installing latte-1.2.2-3in27dsvmo7wu3vw75xdyzdurfymc4j6
==> Error: ProcessError: Command exited with status 2:
58 errors found in build log:
...
...
023.1.0-gcc-552aahy/mkl/2023.1.0/lib/intel64/libmkl_blacs_intelmpi_lp64.so -lpthread -lm /usr/lib64/libdl.so
>> 556 /usr/bin/ld: /dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/intel-oneapi-mkl/2023.1.0-gcc-552aahy/mkl/2023.1.0/lib/intel64/libmkl_blacs_intelmpi_lp64.so: undefined reference to `MPI_Waitall'
>> 557 /usr/bin/ld: /dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/intel-oneapi-mkl/2023.1.0-gcc-552aahy/mkl/2023.1.0/lib/intel64/libmkl_blacs_intelmpi_lp64.so: undefined reference to `MPI_Abort'
...
- hpl:
$> spack install hpl
==> Installing hpl-2.3-dvsnvkdinsj4a5oksk6r75uctvy3jn3f
==> Error: ProcessError: Command exited with status 77:
2 errors found in build log:
9 checking whether the C compiler works... no
>> 10 configure: error: in `/gpfs/scratch/pr28fa/di36pex/di36pex/haswell/spack-stage-hpl-2.3-dvsnvkdinsj4a5oksk6r75uctvy3jn3f/spack-src':
>> 11 configure: error: C compiler cannot create executables
12 See `config.log' for more details
- atompaw:
$> spack install atompaw
==> Installing atompaw-4.2.0.2-q6jmhh5qabgr3yhtuyopsjzcqekjq2b6
==> Error: ProcessError: Command exited with status 1:
'/gpfs/scratch/pr28fa/di36pex/di36pex/haswell/spack-stage-atompaw-4.2.0.2-q6jmhh5qabgr3yhtuyopsjzcqekjq2b6/spack-src/configure' '--prefix=/dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/atompaw/4.2.0.2-gcc-q6jmhh5' '--with-linalg-libs=-L/dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/intel-oneapi-mkl/2023.1.0-gcc-552aahy/mkl/2023.1.0/lib/intel64 -L/usr/lib64 -lmkl_scalapack_lp64 -lmkl_cdft_core -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_intelmpi_lp64 -lpthread -lm -ldl' '--enable-libxc' '--with-libxc-incs=-I/dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/libxc/6.1.0-gcc-m2dquhn/include' '--with-libxc-libs=-L/dss/dsshome1/09/di36pex/SPACK_lrz_dir/spack19/opt/haswell/libxc/6.1.0-gcc-m2dquhn/lib -lxcf90 -lxc'
1 error found in build log:
110 checking how to hardcode library paths into programs... immediate
111 configure:
112 configure: ===== Looking for Linear Algebra libraries
113 configure: ...Linear Algebra: trying with command-line options
114 checking for library containing zhpev... no
115 checking for library containing zgemm... no
>> 116 configure: error: Linear Algebra libraries (blas, lapack) were not found with the specified --with_linalg_libs/--with_linalg_prefix
- R:
$> spack install r +X+external-lapack
==> Installing r-4.2.2-u7bzlaku22m3kgrpyt6dblzq6i7wjaqo
...
1 error found in build log:
396 checking whether 'struct tm' includes tm_zone... yes
397 checking whether 'struct tm' includes tm_gmtoff... yes
398 checking for dgemm_ in -L/dss/dsshome1/lrz/sys/spack/release/23.1.0/opt/haswell/intel-oneapi-mkl/2023.1.0-gcc-552aahy/mkl/2023.1.0/lib/intel64 -L/usr/lib64 -lmkl_scalapack_lp64 -lmkl_cdft_core -lmkl_gf_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_intelmpi_lp64 -lpthread -lm -ldl... no
>> 399 configure: error: BLAS was specified but not available
Error message
Information on your system
$> spack debug report
- Spack: 0.20.0.dev0 (4363b1c)
- Python: 3.6.12
- Platform: linux-sles15-haswell
- Concretizer: clingo
Additional information
No response
General information
- I have run spack debug report and reported the version of Spack/Python/Platform
- I have run spack maintainers <name-of-the-package> and @mentioned any maintainers
- I have uploaded the build log and environment files
- I have searched the issues of this repo and believe this is not a duplicate