Incompatible Package Error when executing setup-meta-modules #1360
Description
Describe the bug
Building the devel branch of the jedi-ufs-env stack and nco (as of 10/28/2024) with Intel results in a "Package with matching name is incompatible" error when it tries to create the meta modules. This bug is very similar to #1126.
To Reproduce
Note: I have forced the use of Open MPI 4.1.6 by requesting it in site/packages.yaml to facilitate using TotalView (the 5.x line has removed the MPIR process acquisition interface required by TotalView).
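For reference, the pin described above can be expressed in site/packages.yaml roughly as follows (a minimal sketch; the exact file location and surrounding entries are site-specific assumptions):

```yaml
# Hypothetical site/packages.yaml fragment pinning Open MPI to 4.1.6
# so that TotalView's MPIR process-acquisition interface still works
# (the Open MPI 5.x line removed it):
packages:
  openmpi:
    require: "@4.1.6"
  all:
    providers:
      mpi: [openmpi]
```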
Ask spack to build jedi-ufs-env and nco, either with the command spack add jedi-ufs-env%intel nco%intel or by editing your spack.yaml. The concretized output lists openmpi under both jedi-ufs-env and nco, but only one openmpi is built. I can provide the concretize output if needed. The stack builds to completion and spack module lmod refresh succeeds (and creates only one openmpi module), but spack stack setup-meta-modules errors:
Configuring basic directory information ...
... script directory: /opt/src/spack-stack/spack-ext/lib/jcsda-emc/spack-stack/stack
... base directory: /opt/src/spack-stack/spack-ext/lib/jcsda-emc/spack-stack
... spack directory: /opt/src/spack-stack/spack
Configuring active spack environment ...
... environment directory: /opt/src/spack-stack/envs/devel2024-10-28_intel
Parsing spack environment main config ...
... install directory: /opt/spack-stack
Parsing spack environment modules config ...
... configured to use lmod modules
... module directory: /opt/spack-stack/modulefiles
Parsing spack environment package config ...
... list of possible compilers: '['[email protected]', 'gcc', 'clang', 'oneapi', 'xl', 'nag', 'fj', 'aocc']'
... list of possible mpi providers: '['[email protected]', 'openmpi', 'mpich']'
['intel', 'Core', 'openmpi', 'module-index.yaml']
... stack compilers: '{'intel': ['2021.8.0']}'
... stack mpi providers: '{'openmpi': {'4.1.6-26xih74': {'intel': ['2021.8.0']}}}'
... core compilers: ['gcc@=11.4.1']
Preparing meta module directory ...
... meta module directory : /opt/spack-stack/modulefiles/Core
... preferred compiler: intel
Creating compiler modules ...
... ... appending /opt/spack-stack/modulefiles/intel/2021.8.0 to MODULEPATHS_SAVE
... configuring stack compiler [email protected]
... ... CC : /opt/sw/apps/intel/oneapi/compiler/2023.0.0/linux/bin/intel64/icc
... ... CXX : /opt/sw/apps/intel/oneapi/compiler/2023.0.0/linux/bin/intel64/icpc
... ... F77 : /opt/sw/apps/intel/oneapi/compiler/2023.0.0/linux/bin/intel64/ifort
... ... FC : /opt/sw/apps/intel/oneapi/compiler/2023.0.0/linux/bin/intel64/ifort
... ... COMPFLAGS: setenv("CFLAGS", "-diag-disable=10441")
setenv("CXXFLAGS", "-diag-disable=10441")
setenv("FFLAGS", "-diag-disable=10441")
... ... MODULELOADS: load("intel/2021.8.0")
... ... MODULEPREREQS: prereq("intel/2021.8.0")
... ... ENVVARS : prepend_path("LD_LIBRARY_PATH", "/opt/sw/apps/intel/oneapi/compiler/2023.0.0/linux/compiler/lib/intel64_lin")
... ... MODULEPATHS : prepend_path("MODULEPATH", "/opt/spack-stack/modulefiles/intel/2021.8.0")
... writing /opt/spack-stack/modulefiles/Core/stack-intel/2021.8.0.lua
==> Error: Package with matching name is incompatible: ordereddict([('version', ['4.1.6']), ('require', '~internal-hwloc+two_level_namespace')])
The stack is still usable, but no stack-openmpi or stack-python modules are created. I suspect this is also the actual reason I got the identical error in #1126 after moving aside one of the openmpi builds; the stack in that issue was likewise jedi-ufs-env and nco.
Expected behavior
spack stack setup-meta-modules should complete without errors.
System:
Linux workstation running Rocky Linux 9. The Intel compiler is oneAPI 2023.0.0 (providing the 2021.8.0 legacy compilers icc, icpc, and ifort), and GCC is 11.4.1, installed via the system package manager.
Additional context
I suspect I might be able to work around this by simply adding a depends_on("nco", type="run") to the spack-ext/repos/spack-stack/packages/jedi-ufs-env/package.py file, but I have not tested this yet. A quick look at spack-ext/lib/jcsda-emc/spack-stack/stack/meta_modules.py makes me think that any time an MPI package is referenced more than once, this error will occur even if the references are, in fact, to the exact same package.
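To illustrate the suspected failure mode, here is a purely hypothetical sketch (the function name, data shapes, and logic are my assumptions, not the real meta_modules.py code): if the duplicate check rejects any repeated package name outright, a second reference to the very same openmpi spec (one per root package) would trigger the "incompatible" error, whereas a check that compares the full entries would tolerate identical duplicates.

```python
# Hypothetical sketch (NOT the actual spack-stack code) of a
# duplicate-reference check over MPI providers collected while
# walking an environment. The same provider may legitimately appear
# once per root package (here: jedi-ufs-env and nco).

def check_mpi_providers(referenced_specs):
    """referenced_specs: list of (name, config) tuples.

    Raises only when the same package name appears with *different*
    configurations; identical duplicates are accepted.
    """
    seen = {}
    for name, config in referenced_specs:
        if name in seen and seen[name] != config:
            # This mirrors the error text from the report.
            raise ValueError(
                f"Package with matching name is incompatible: {config}")
        seen[name] = config
    return seen

# Both roots pull in the same openmpi spec:
openmpi_cfg = {"version": ["4.1.6"],
               "require": "~internal-hwloc+two_level_namespace"}
specs = [("openmpi", openmpi_cfg), ("openmpi", dict(openmpi_cfg))]
providers = check_mpi_providers(specs)
```

If the real code instead raises on the mere presence of a repeated name, that would explain why the error fires even though only one openmpi was built.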