PETSc: MPI configuration not stored properly #7953
Description
@BarrySmith @jedbrown @ckhroulev
Summary
PISM fails to configure because PETSc has not stored the path to MPI in its configuration files. This is probably caused by a change in the Spack package; specifically, the removal of the --with-mpi-dir flag in commit d04ae9a2.
Details
When PETSc builds, it creates the file <prefix>/lib/petsc/conf/petscvariables, containing a number of key-value pairs that describe the PETSc installation. Programs that build against PETSc later read and rely on this file; PISM, for example:
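For illustration, here is a hypothetical two-line excerpt of such a petscvariables file (the real file holds many more variables, and the install paths are assumptions), along with the kind of extraction a build system performs on it:

```shell
# Hypothetical excerpt of <prefix>/lib/petsc/conf/petscvariables;
# the actual file holds many more variables and real install paths.
cat > /tmp/petscvariables <<'EOF'
PETSC_CC_INCLUDES = -I/opt/petsc/include -I/opt/openmpi/include
PETSC_WITH_EXTERNAL_LIB = -L/opt/petsc/lib -lpetsc
EOF

# Extract the value of PETSC_CC_INCLUDES, as a build system might:
grep '^PETSC_CC_INCLUDES' /tmp/petscvariables | sed 's/^[^=]*= *//'
```

In a correctly configured installation, the extracted value carries -I flags for both PETSc itself and MPI.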
https://github.com/pism/pism
PISM's CMake build looks up the PETSc variable PETSC_CC_INCLUDES:
https://github.com/pism/pism/blob/master/CMake/FindPETSc.cmake#L159
It then uses this variable to construct correct compile command lines for the small test programs it uses to deduce PETSc's configuration.
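Schematically, the variable's value gets spliced into a test-compile command line like this (the paths and the conftest.c name are illustrative, not taken from PISM's actual CMake code):

```shell
# Sketch of what a configure-time test compile does with the variable:
# splice the PETSc include flags into a compiler command line.
# Paths and file names here are illustrative.
includes="-I/opt/petsc/include -I/opt/openmpi/include"
echo "cc ${includes} -o conftest conftest.c"
```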
The Problem
At the point in the PISM CMake build where the test programs are compiled, the variable ${includes} needs to contain an -I flag for MPI (in my case, OpenMPI). However, it does not. This traces back to PETSC_CC_INCLUDES not containing an -I flag for MPI in the PETSc file petscvariables (above).
OpenMPI is likely missing from PETSC_CC_INCLUDES because the Spack package no longer provides PETSc with the --with-mpi-dir flag, as of commit d04ae9a2.
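To make the broken state concrete, here is a sketch of a petscvariables whose PETSC_CC_INCLUDES lacks any MPI include path (the contents are assumed for illustration), with a quick check that shows nothing for MPI was recorded:

```shell
# Illustration of the broken state reported here: a petscvariables
# whose PETSC_CC_INCLUDES lacks any MPI include path (contents assumed).
cat > /tmp/petscvariables.broken <<'EOF'
PETSC_CC_INCLUDES = -I/opt/petsc/include
EOF

# A quick check like this comes up empty:
grep -q 'openmpi' /tmp/petscvariables.broken || echo "no MPI include path in PETSC_CC_INCLUDES"
```

Any test program that transitively includes mpi.h then fails to compile, which is what PISM's configure step observes.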
Proposed Fix
I will try restoring --with-mpi=1 and --with-mpi-dir while keeping --with-cc, --with-cxx, and --with-fc. Thoughts from @jedbrown and @BarrySmith on possible side effects of this, or on a better approach involving PETSc internals, would be appreciated. I will report back on what I find.
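The proposed flag set can be sketched as follows; the flags are printed rather than passed to PETSc's ./configure, and the MPI prefix and wrapper paths are assumptions, not taken from the actual Spack package:

```shell
# Hedged sketch of the proposed PETSc configure flags; the MPI prefix
# and compiler-wrapper paths below are illustrative assumptions.
MPI_DIR=/opt/openmpi
printf '%s\n' \
  "--with-mpi=1" \
  "--with-mpi-dir=${MPI_DIR}" \
  "--with-cc=${MPI_DIR}/bin/mpicc" \
  "--with-cxx=${MPI_DIR}/bin/mpicxx" \
  "--with-fc=${MPI_DIR}/bin/mpif90"
```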