Check the installed HDF5 library for consistency #812
Conversation
The Travis failure looks unrelated.
Perhaps we should also add
@adamjstewart Yes,
The Travis failure indicates that the new HDF5 package failed the Python 2.6 compatibility check. Just change the
```python
"-L%s" % join_path(spec.prefix, "lib"), "-lhdf5",
"-lz")
try:
    output = subprocess.check_output("./check")
```
Suggest either using Popen or:

```python
check = Executable('./check')
output = check(return_output=True)
```
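For reference, the `Popen` alternative mentioned above might look like the following minimal sketch. Here `echo` stands in for the package's `./check` binary, since the real test executable only exists inside the build:

```python
import subprocess

# "echo" stands in for the package's ./check binary; in the real package
# the command would be the freshly built test executable.
proc = subprocess.Popen(["echo", "hello"], stdout=subprocess.PIPE)
output, _ = proc.communicate()
if proc.returncode != 0:
    raise RuntimeError("check program failed")
```

`Popen.communicate()` works on Python 2.6, which is what the compatibility check above is complaining about (`subprocess.check_output` was only added in 2.7).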
@eschnett Can we factor out, somewhere in the framework (if not already present), functions like:

```python
def try_compile(sources, ...):
    ...
    return executable

def try_execute(executable, ...):
    ...
    return output
```

and use them here to have a concise description of the tests within a package? Also, can we keep C code in C sources instead of raw string literals? Per the discussion in #806 with @davydden, I like the idea of having tests to ensure that things were compiled correctly, but I think the amount of boilerplate will be too much if we start to repeat (i.e. cut and paste) this all over the place. FWIW, I can help with the implementation if you don't have time to do that.
|
@alalazo Splitting things up into functions is a nice idea. I'm sure they will evolve over time, since people will also want to build C++ and Fortran source code; a third function might be needed for C code in C source files.

Ideally, these tests run after the library has been installed, and after the staging directory has been removed (or renamed), to ensure that nothing accidentally points back into the staging directory. The install tests should run in their own directory. How would I copy the C sources and the expected output into a new directory?

Yes, I might be busy this week with other things. I would be happy if you were to take a stab at refactoring my checking routine. If so, it might be easiest if you opened a pull request against origin, not against my repository.
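On the question of copying the C sources and expected output into a new directory: one stdlib-only way is to stage them into a fresh temporary directory. This is an illustration, not code from this PR, and the throwaway `.c` file below merely stands in for the package's real test inputs:

```python
import os
import shutil
import tempfile

def stage_test_files(files, destdir=None):
    """Copy test sources and expected-output files into a fresh directory,
    so the install test never runs inside the (removed) staging directory."""
    destdir = destdir or tempfile.mkdtemp(prefix="install-test-")
    for path in files:
        shutil.copy(path, destdir)
    return destdir

# Throwaway input file standing in for check.c and its expected output.
tmp = tempfile.NamedTemporaryFile(suffix=".c", delete=False)
tmp.write(b"int main(void) { return 0; }\n")
tmp.close()
testdir = stage_test_files([tmp.name])
staged = os.listdir(testdir)
```

Running the compile and the check from `testdir` guarantees that any lingering reference to the staging directory shows up as a failure rather than silently working.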
👍 |
@tgamblin in which file/where would you put that functionality?
how about
Is the intention of package tests that they run on the same host which builds the packages?
@robertdfrench For now, I would say yes. But I think this discussion belongs in a separate thread/issue.
Once we have the newarch support, it should be easy to get the frontend for the build platform, even if the build itself is for the backend. So we can revisit the tests for Cray and BG then.
I added a routine to the HDF5 install script that checks the installed library. This is not the self-check that HDF5 itself can perform as part of its build; rather, it tests whether Spack installed the library in such a way that its clients (other packages, etc.) find a self-consistent and usable HDF5 library.
In particular, this routine builds and runs a small C program that uses the HDF5 library, thus checking include, library, and run-time path settings. It also checks the version numbers, both via preprocessor constants (from the include headers) and via function calls (from the library found at run time), and compares these to the expected version numbers. A mismatch leads to an error.
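The version comparison can be sketched as follows. The helper below is my illustration, not the exact code in this PR: it only generates the C source for the check program, using the real HDF5 header macros `H5_VERS_MAJOR`/`H5_VERS_MINOR`/`H5_VERS_RELEASE` and the real library call `H5get_libversion`; the 1.8.16 version is just an example value:

```python
def hdf5_check_source(major, minor, release):
    """Generate a small C program that compares the header version
    (preprocessor constants) and the library version (function call)
    against the version Spack expects to have installed."""
    return """\
#include <hdf5.h>
#include <stdio.h>
int main(void) {
    unsigned maj, min, rel;
    /* Run-time version from the library actually found at load time. */
    if (H5get_libversion(&maj, &min, &rel) < 0) return 1;
    /* Compile-time version from the headers found at build time. */
    if (H5_VERS_MAJOR != %d || H5_VERS_MINOR != %d || H5_VERS_RELEASE != %d ||
        maj != %d || min != %d || rel != %d) {
        fprintf(stderr, "HDF5 version mismatch\\n");
        return 1;
    }
    printf("ok\\n");
    return 0;
}
""" % (major, minor, release, major, minor, release)

source = hdf5_check_source(1, 8, 16)
```

Because the macros come from whichever headers the compiler finds and `H5get_libversion` comes from whichever shared library the loader finds, a single run catches mismatched include paths, library paths, and rpaths at once.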
I think such a system is generically useful for many packages. I suggest adding this to HDF5 as a trial balloon; if it works, it can be cleaned up, made part of Spack, and used for many other packages as well.
As a side note, I think packages could define environment variables listing the library names that should be used to link against them; HDF5 could, for example, define the equivalent of `HDF5_LIBS='-lhdf5'` or similar. This is just a note; it is not implemented here.
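To illustrate the idea (nothing here exists in Spack; `HDF5_LIBS` is the hypothetical variable from the note above, and `-lz` is included only because this HDF5 build also links zlib): a dependent build could assemble its link line from such a variable instead of hard-coding library names:

```python
import os

# Hypothetical: the hdf5 package would export this for its dependents.
os.environ["HDF5_LIBS"] = "-lhdf5 -lz"

# A dependent build then picks up the flags without hard-coding them.
link_flags = os.environ["HDF5_LIBS"].split()
link_line = ["cc", "-o", "app", "app.o"] + link_flags
```

This mirrors what `pkg-config --libs` provides for many libraries, but driven by the package that did the install.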