petsc: run a test#1145

Merged
tgamblin merged 2 commits into spack:develop from davydden:pkg/petsc_tests
Jun 30, 2016

Conversation

@davydden
Member

No description provided.

@tgamblin
Member

@davydden: is this ready to merge?

@davydden
Member Author

@tgamblin should be good now.

@tgamblin tgamblin merged commit a7b8cb6 into spack:develop Jun 30, 2016
@nrichart
Contributor

This breaks petsc. The test is an MPI code run sequentially. On our site we have mvapich2+slurm, and if you do not run it with srun it fails.
With +slurm no mpirun wrapper is installed, so simply adding mpirun will not solve it.

@davydden
Member Author

@nrichart I will check the example file and, if needed, switch to another one.

However, the fact that you were not able to run it with MPI indicates, in my opinion, a bigger issue not related to petsc per se.

@nrichart
Contributor

No, don't get me wrong, I am able to run it.
srun ./ex50 <parameters> works fine, but it means you cannot hardcode mpirun in the package.py of petsc or other packages,
since, depending on the MPI variant, the mpirun wrapper might not be installed.

@davydden
Member Author

davydden commented Jul 1, 2016

@nrichart how about we introduce another attribute on the MPI wrappers, spec['mpi'].run, which for OpenMPI would be mpirun and for mvapich2+slurm would be srun, whereas without slurm it should probably also be mpirun. I am not sure about the parameter for the number of processes; I think it also differs among MPI implementations (-np, -n, etc.).

I would then switch these tests to mpi-only and use spec['mpi'].run to run them.
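A minimal sketch of the proposed lookup. Note that spec['mpi'].run does not exist in Spack at this point; the attribute name, the helper below, and the mvapich2+slurm behavior are assumptions drawn from this thread:

```python
# Hypothetical sketch of the launcher selection proposed above.
# Assumption from this thread: mvapich2 built against Slurm installs
# no mpirun wrapper, so jobs must be launched through srun instead.
def mpi_launcher(mpi_name, slurm=False):
    """Return the parallel launcher command for a given MPI provider."""
    if mpi_name == 'mvapich2' and slurm:
        return 'srun'
    # OpenMPI, and mvapich2 without Slurm, ship an mpirun wrapper.
    return 'mpirun'

print(mpi_launcher('openmpi'))               # mpirun
print(mpi_launcher('mvapich2', slurm=True))  # srun
```

The open question from the comment remains: even with such a helper, the flag for the process count (-np vs. -n) would still have to be abstracted per implementation.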

@davydden
Member Author

davydden commented Jul 1, 2016

@nrichart can you check whether the following test works with your installation:

        if 'mpi' in self.spec:
            with working_dir('src/ksp/ksp/examples/tutorials'):
                env['PETSC_DIR'] = self.prefix
                make("ex50", parallel=False)
                make("runex50", parallel=False)
                make("ex52", parallel=False)
                make("runex52", parallel=False)
                make("runex52_mumps", parallel=False)
                make("runex52_superlu_dist", parallel=False)
                make("ex55", parallel=False)
                make("runex55_hypre", parallel=False)

It uses PETSc's own makefile and its run targets. Those are supposed to pick up mpiexec and the like.

@davydden davydden deleted the pkg/petsc_tests branch May 28, 2017 21:00
olupton pushed a commit to olupton/spack that referenced this pull request Feb 7, 2022
* Add Bluepy==2.3.0, morphio==3.1.1, bluepysnap==0.12.0
* remove depends_on from PyMorphologyRepairWorkflow [temporary]
* Update c++ morphio
* fix brainbuilder tests