Developer support for CMake-based Projects#474

Closed
citibeth wants to merge 3 commits into spack:develop from citibeth:efischer/160301-CMakeProject

Conversation

@citibeth (Member) commented Mar 2, 2016

New CMakePackage subclass of Package. This has two advantages, without breaking backwards compatibility:

  1. Boilerplate code in CMake projects can now be eliminated (see everytrace/package.py for details on how it can be done now).
  2. The CMakeProject understands a special version 'local', to enlist Spack to help configure CMake properly when developing projects. Consider the following example for usage, in which Spack is used to set up a build and a module, but not actually DO the build.
     git clone https://github.com/citibeth/everytrace.git
     cd everytrace
     spack diy --skip-patch everytrace@local
     mkdir build
     cd build
     /usr/bin/python ../spconfig.py ..     # Runs cmake with Spack-supplied configuration
     make
     make install        # Installs into spack directory
     spack load everytrace@local   # Spack even makes a module

Once you're happy with your project, you can add appropriate version() commands to your package.py, and use Spack normally with it. To make this work, your project has to cooperate with Spack, as follows:

a) You need to use CMake (of course).

b) Your CMakeLists.txt should use the following line, which will ensure that all TRANSITIVE dependencies are added to the include path. If you're not running with Spack, then this line will do nothing.
include_directories($ENV{CMAKE_TRANSITIVE_INCLUDE_PATH})
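
For context, a minimal sketch of how such an environment variable could be assembled (plain Python; the prefix paths are made up and this is not Spack's actual code). CMake's $ENV{...} reads an ordinary environment variable and treats the value as a semicolon-separated list:

```python
import os

# Hypothetical install prefixes for the transitive dependency closure;
# a tool like Spack would compute these from the concretized spec.
transitive_prefixes = [
    "/opt/spack/netcdf-cxx4-4.2",
    "/opt/spack/netcdf-4.3.3",
    "/opt/spack/hdf5-1.8.16",
]

# One include directory per prefix, joined with semicolons for CMake.
include_dirs = [p + "/include" for p in transitive_prefixes]
os.environ["CMAKE_TRANSITIVE_INCLUDE_PATH"] = ";".join(include_dirs)

print(os.environ["CMAKE_TRANSITIVE_INCLUDE_PATH"])
```

If the variable is unset, $ENV{CMAKE_TRANSITIVE_INCLUDE_PATH} expands to an empty string and the include_directories() call is a no-op, which is what makes the line safe outside of Spack.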

Elizabeth F added 3 commits March 1, 2016 23:33
@citibeth (Member, Author) commented Mar 2, 2016

I added a new commit to my branch that should fix the failure. I'm led to believe that if I just do nothing, eventually Travis-CI will re-run and the PR will turn green. But that hasn't happened yet...

homepage = "https://github.com/citibeth/everytrace"
url = "https://github.com/citibeth/everytrace/tarball/dev"

version('devel', '0123456789abcdef0123456789abcdef')
Member (inline review comment):
does this fetch? I guess not -- but what if someone wants to build everytrace non-local?

Member Author (inline review reply):

The URL I put in there (line 12) should fetch; GitHub can give you a tarball directly, if you tell it which branch/tag you want to fetch. Of course, fetching the "dev" branch would be bad practice for anyone other than the developer. Since the "spack diy" usage I outlined above doesn't try to fetch anyway, it would be more useful to put a real version tag in that URL in case someone is running Spack in regular mode. But... if a project is early in development, there won't be such a tag yet; so might as well just fetch master or dev.

The checksum won't match yet, but that can be fixed easily enough.

@tgamblin (Member) commented Mar 2, 2016

Ok this looks pretty cool. Comments:

  1. The local version seems a bit hacky to me. Is there another way we could do this? Maybe with another command instead of hijacking the version and the install function for CMakePackage? Some variant of spack diy seems to me like a better place to put all that logic.
  2. Is CMAKE_TRANSITIVE_INCLUDES necessary? If you are building with Spack's wrappers, they should already be adding -I$prefix/include for each dependency. Also, Spack sets CMAKE_PREFIX_PATH, which causes things like find_file and find_package to search in $prefix/include for everything on the CMAKE_PREFIX_PATH already -- so what does the new variable modification to the CMakeLists.txt file add?
  3. I haven't thought it through much but is there some usage of spack env <spec> <cmd> that would do part of this stuff already? You need spack diy to get the build env set up.

@tgamblin (Member) commented Mar 2, 2016

Looks like it failed again despite the new commit (which you can see above). Not sure why.

Edit: oh wait, it looks like it just hasn't run again. Not sure how it decides that...

@citibeth (Member, Author) commented Mar 2, 2016

The local version seems a bit hacky to me. Is there another way we
could do this? Maybe with another command instead of hijacking the
version and the install function for CMakePackage? Some variant of
spack diy seems to me like a better place to put all that logic.

I agree, the UI needs to be fixed and a new PR submitted. No need to debug Travis-CI at this time. (In the meantime, I have something that at least gets the job done). How about the following usage, assuming a "spack cmake" command:

 git clone https://github.com/citibeth/everytrace.git
 cd everytrace
 mkdir build
 cd build
 spack cmake ..
 make
 make install        # Installs into spack directory
 spack load everytrace@cmake   # Spack even makes a module

Notes/Questions here:

  1. I like the "spack cmake" thing because it's so close to the way one normally uses cmake. "spack cmake" should run cmake with all the arguments you give it, plus the Spack-supplied arguments and environment.
  2. In this PR, Spack runs cmake with a "clean" environment. We should decide whether "spack cmake" should do that, or whether it's better to inherit from the user's environment. Any thoughts here?
  3. In this PR, Spack generates a cmake-calling script, and then you have to run it to call cmake. In the proposal above, spack runs cmake directly without writing out a script first. That's probably a better idea. However, one might still want either:
    a) "spack cmake --verbose ..", which would show what environment and parameters it's passing to cmake
    b) "spack cmake --write-script", which would produce a CMake-calling script as before.
    Any thoughts/preferences on this issue?
  4. "spack cmake" needs to know what package.py file to use, so it can (a) know what dependencies to provide in the environment to CMake, and (b) know what module file to produce. Possible ways to do that:
    a) Look in the directory specified on the "spack cmake" command line. (E.g.: if you do "spack cmake ..", then it will look for ../package.py. Or maybe ../spack-package.py, to be more explicit).
    b) Look in the normally configured Spack repos
    c) Specify on the command line
    How should Spack choose between these methods, and how should they be controlled from the "spack cmake" command?
  5. Should producing a module file be optional with "spack cmake"? Should it default to yes or no? Is anyone worried about "spack cmake" producing a module file before the package is built?
  6. What version number should be used for packages compiled with "spack cmake", if the user doesn't supply one? If the user does supply one, how should it be supplied?

Is CMAKE_TRANSITIVE_INCLUDES necessary?

In general: suppose our project uses netcdf-cxx4. And when you #include something from that project, it in turn #includes something from netcdf. Then we need -I.../netcdf/include to be provided to the compiler, even though netcdf is not a (direct) dependency of our project.

In my experience, we don't need this transitive behavior with libraries, at least as long as RPATH is used. But maybe I'm wrong...

I can't tell for sure from your description... Are Spack wrappers already providing this transitive functionality? Or do they just provide -I for direct dependencies?

Also, Spack sets CMAKE_PREFIX_PATH, which causes
things like find_file and find_package to search in $prefix/include
for everything on the CMAKE_PREFIX_PATH already

I believe that Spack only sets CMAKE_PREFIX_PATH for direct dependencies. That is the correct behavior, since only direct dependencies should be found via the CMake find_package commands.

-- so what do the new
variable modification to the CMakeLists.txt file add?

They add include directories for packages that are NOT direct dependencies of your project. In the old days, that was not such a problem, since most packages were installed into one big tree: chances are, netcdf would be in the same place as netcdf-cxx4, even if you didn't specify it as a dependency. When every project has its own tree, it becomes important to enumerate all transitive dependencies.
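
To make the distinction concrete, a toy sketch (hypothetical package names, plain Python) of enumerating the transitive closure, which is what a variable like CMAKE_TRANSITIVE_INCLUDE_PATH would be built from:

```python
# Toy direct-dependency graph (hypothetical packages).
deps = {
    "myproject": ["netcdf-cxx4"],
    "netcdf-cxx4": ["netcdf"],
    "netcdf": ["hdf5"],
    "hdf5": [],
}

def transitive_deps(pkg, graph):
    """Depth-first walk collecting every package reachable from pkg."""
    seen = []
    def visit(p):
        for d in graph[p]:
            if d not in seen:
                seen.append(d)
                visit(d)
    visit(pkg)
    return seen

# Direct deps alone would miss netcdf and hdf5; the closure finds them.
print(transitive_deps("myproject", deps))
# ['netcdf-cxx4', 'netcdf', 'hdf5']
```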

If you are building with
Spack's wrappers, they should already be adding -I$prefix/include for
each dependency.

I like the Spack wrapper approach better than mine because it requires no Spack-specific elements in the CMakeLists.txt file.

But... Spack's wrappers complain if you try to use them outside of Spack. Spack is not currently set up to just run "make" a zillion times, which is what you do when developing software (after running "cmake" once). I see some possible solutions to this problem, I'd appreciate your thoughts:

  1. Have "spack cmake" set things up to compile WITHOUT Spack's wrappers. This is what I did. But it worries me:

    a) You end up duplicating effort; for example the wrapper -I stuff and TRANSITIVE_DEPENDENCIES are really two mechanisms to get the same -I flags to the compiler in the end.

    b) You don't get Spack's RPATH support. But YOUR project uses RPATH anyway so it doesn't matter, right? Right? I can see both sides of this. Do we want to make CMake builds that are dependent on being run by Spack (probably not)? But if our software is so complex no one can compile our software without Spack anyway, do we care (probably yes; maybe they're using another auto-builder)?

    c) If you don't use the Spack wrappers, the build you do with "spack cmake" could be different from the final build you get with "spack install." Do we want to even allow the user to install things built with "spack cmake; make install" into the Spack tree? If we don't allow it, will that be a pain in the neck for developers?

  2. Change the Spack wrappers so they can work in a sensible way without Spack.

  3. Add a "spack make" command that allows the Spack wrappers to work in a sensible way. Now, you would do:

      mkdir build
      cd build
      spack cmake ..
      spack make
      ... fix errors ...
      spack make
      ... fix errors ...
      spack make install

I haven't thought it through much but is there some usage of spack env
that would do part of this stuff already?

I think it's similar but different. "spack env" sets up the environment for one package. If I'm developing package A that depends on X, Y and Z, then I really want "spack env X Y Z", or "spack env dependencies-of-A". Which is the same as what you get if you do (assuming recursive modules such as lmod):

    spack load A
    spack unload A

@citibeth citibeth closed this Mar 2, 2016
@citibeth citibeth deleted the efischer/160301-CMakeProject branch March 2, 2016 23:39
@mathstuf (Contributor) commented Mar 8, 2016

Is CMAKE_TRANSITIVE_INCLUDES necessary?

In general: suppose our project uses netcdf-cxx4. And when you #include something from that project, it in turn #includes something from netcdf. Then we need -I.../netcdf/include to be provided to the compiler, even though netcdf is not a (direct) dependency of our project.

In my experience, we don't need this transitive behavior with libraries, at least as long as RPATH is used. But maybe I'm wrong...

I can't tell for sure from your description... Are Spack wrappers already providing this transitive functionality? Or do they just provide -I for direct dependencies?

This is part of the thing that is supposed to be solved by pkg-config and CMake's exported target information scripts. I certainly don't think Spack should require a patch to all CMake projects. Libraries are generally handled by the linker (though if you use a symbol directly, linkers have been requiring that you link the providing library directly, in which case you need to add it to your project anyway). Transitive header includes leaking out are usually a bug in the middle project.

@mathstuf (Contributor) commented Mar 8, 2016

Oh, and for static libraries, you need to link the transitive dependencies for executables so that the symbols can be found.

@citibeth (Member, Author) commented Mar 8, 2016

Transitive header includes leaking out are usually a bug in the middle project.

Hmmm... What about a thin C++ header-only wrapper around a C library?

@mathstuf (Contributor) commented Mar 8, 2016

You're going to need to link the C library directly anyways, so you'll need to know where its bits live to link successfully.

@citibeth (Member, Author) commented Mar 8, 2016

Spack RPath-ifies everything, so I don't need to link the C library
directly. Even before I used Spack, so many things did RPath that I didn't
need to link directly.

Overall, I'd like a system in which you only have to list as dependencies
things that you directly use yourself --- either through #include or
linking. Anything else requires too much knowledge of all the zillions of
dependencies below you that developers just don't have.


@tgamblin (Member) commented Mar 8, 2016

Just a thought FYI: one thing we are thinking very strongly of implementing is a "profile" capability, which would work kind of like virtualenv or conda environments, but for anything, not just python packages. It would likely look sort of like the existing python support in Spack, where you can activate/deactivate packages and they are symlinked into a common prefix (in the current Python case, that is the interpreter prefix).

You could do something like:

spack profile create mystack   # creates a profile called `mystack` in a directory somewhere
spack profile add <spec>    # installs <spec> if needed, then links it into mystack
spack profile remove <spec>    # unlinks <spec> from mystack
... etc ...
spack profile activate    # add mystack/bin to $PATH, add mystack to CMAKE_PREFIX_PATH, etc.

This would make it easy to create and save environments and stacks of packages for different teams, and to get them in/out of your environment. I would likely also want to version the profiles. This feature is in package managers like nix and guix. Another nice feature is that it would allow external apps to RPATH the profile directory (or a symlink to it, to allow transactional updates), but you could do seamless upgrades of packages linked into the profile by removing them and adding a new version.
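
The activate/symlink idea can be sketched in a few lines (toy layout, plain Python; nothing here is Spack's real implementation): each added package's bin/ files get linked into the profile's common prefix:

```python
import os
import tempfile

# Toy package prefixes, each with a bin/ directory (hypothetical layout).
root = tempfile.mkdtemp()
profile = os.path.join(root, "mystack")
os.makedirs(os.path.join(profile, "bin"))

def make_pkg(name):
    """Create a fake installed package prefix with one binary in it."""
    prefix = os.path.join(root, name)
    os.makedirs(os.path.join(prefix, "bin"))
    open(os.path.join(prefix, "bin", name), "w").close()
    return prefix

def profile_add(prefix):
    """Link every file under prefix/bin into the profile's bin/."""
    for f in os.listdir(os.path.join(prefix, "bin")):
        os.symlink(os.path.join(prefix, "bin", f),
                   os.path.join(profile, "bin", f))

profile_add(make_pkg("libelf-0.8.13"))
profile_add(make_pkg("cmake-3.4"))

# Activating the profile would then just mean prepending mystack/bin
# to $PATH and mystack to CMAKE_PREFIX_PATH.
print(sorted(os.listdir(os.path.join(profile, "bin"))))
# ['cmake-3.4', 'libelf-0.8.13']
```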

If you had something like this, would you still need spack cmake?

@mathstuf (Contributor) commented Mar 8, 2016

C++ header-only wrapper around a C library

Spack RPath-ifies everything, so I don't need to link the C library directly.

What library are you linking to given a header-only C++ library?

That second make command will fail because it will run the (unwrapped) CMake automatically, instead of the "spack cmake" command. I'm wondering if there's any way to tell CMake what CMake command you want it to use on subsequent invocations of itself.

No, there isn't. CMake sets CMAKE_COMMAND at the top to the full path to the executable that was run.

FWIW, I think that a CMake-specific subcommand is probably not the way to go (What about scons? waf? autotools?). As much as it'd be nice if CMake were the only way to build C and C++ code, it is not.

@tgamblin (Member) commented Mar 8, 2016

I'm not opposed to a cmake-specific subcommand, although speaking (in #506) about clutter, we probably need to get argparse to spit out something shorter and more helpful for arg-less spack -h.

@citibeth (Member, Author) commented Mar 8, 2016

Todd,

I really like the spack profile idea.

If you had something like this, would you still need spack cmake?

I think that's a harder question to answer. I like the idea of "spack
cmake" as a way to manually build a package as close as possible to how
Spack would auto-build it. I want to write mylib/package.py and then use
it with as little muss/fuss as possible to build mylib.

IF Spack profiles are to serve this purpose, then there would need to be a
way to create a Spack profile from a package.py file. But I still don't
think this would work as well as "spack cmake." Because "spack cmake" also
translates variants into CMake build parameters --- a job you would have to
do manually if you're just running plain CMake on a Spack profile.

So in the end... I think that "spack cmake" would be better at what I want
done.

-- Elizabeth


@citibeth (Member, Author) commented Mar 8, 2016

What library are you linking to given a header-only C++ library?

See, for example, this wrapper of the PROJ.4 library:

https://github.com/citibeth/ibmisc/blob/master/slib/ibmisc/Proj.hpp

FWIW, I think that a CMake-specific subcommand is probably not the way to
go (What about scons? waf? autotools?). As much as it'd be nice if CMake
were the only way to build C and C++ code, it is not.

CMake is the only way that I build software, so a CMake-specific subcommand
would be very useful for me. CMake is also second only to Autotools, and
growing. Since this command is for DEVELOPERS, not USERS of the software,
its lack of capability with Autotools is not a problem for Autotools-based
development. Autotools developers can build Spack Autotools helpers.

But point taken... maybe I should think about this in a layered approach.
The bottom layer exports a bunch of stuff from a concretized Spack spec.
And the next layer up actually calls CMake, etc.

@tgamblin (Member) commented Mar 8, 2016

I guess I am still not clear on how spack diy doesn't do what you want. The package files already translate variants into build parameters, e.g.:

if '+foo' in spec:
    std_cmake_args.append('-Dfoo')
cmake(*std_cmake_args)

That could be made more concise, but most packages don't have to do a lot of it so far. With spack diy, you make a package for your software that is in development, and you cd into a checked out source directory, where spack will run the build. And if you want to play in the spack build environment, there is spack env <spec> bash.

The nice thing about diy is it doesn't have to use cmake. It just runs whatever commands are in your package file here, in the current directory. Isn't that what spack cmake would do anyway? The only difference would be that you have to write the cmake invocation in package.py, but that is going to be pretty different depending on which project you check out, anyway...

@mathstuf (Contributor) commented Mar 8, 2016

See, for example, this wrapper of the PROJ.4 library:
https://github.com/citibeth/ibmisc/blob/master/slib/ibmisc/Proj.hpp

And what library provides pj_free if you don't add -lproj4 somehow? You have to know that you're using the C library even with the C++ wrapper here.

@mathstuf (Contributor) commented Mar 8, 2016

if '+foo' in spec:
    std_cmake_args.append('-Dfoo')
cmake(*std_cmake_args)

Modifying std_cmake_args directly isn't a good idea since you're modifying the list for all subsequent CMake-using packages. Luckily, I don't see any modification of it, so there are no existing usages, but I figure it's best to head this off at the pass before it becomes a pattern :) .
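
A minimal illustration of the concern and the safer alternative (stand-in values, plain Python; the real std_cmake_args comes from Spack's build environment): concatenate into a fresh, package-local list rather than appending to the shared one:

```python
# Stand-ins for what Spack would supply inside a package's install().
std_cmake_args = ["-DCMAKE_INSTALL_PREFIX=/opt/pkg"]
spec = {"+foo"}  # toy stand-in for a concretized spec

# Bad:  std_cmake_args.append('-Dfoo') mutates the shared list.
# Good: copy it, then append package-specific flags to the copy.
args = list(std_cmake_args)
if "+foo" in spec:
    args.append("-DFOO=ON")

print(args)            # package-local list with the variant flag
print(std_cmake_args)  # shared list is untouched
```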

@tgamblin (Member) commented Mar 8, 2016

No, I'm not. Every package build in Spack gets its own process, which was a very intentional design decision to prevent mayhem like this.

Package authors are free to do whatever they want in that sandbox -- this is why they're allowed to, e.g., set whatever env vars they want. The root spack process stays clean, and it forks before it enters any package code.

I was actually debating changing std_cmake_args to cmake_args.

@mathstuf (Contributor) commented Mar 8, 2016

Ah, I had missed that detail (I was even in the code right above that section of code). That's going to make Windows support interesting…

@tgamblin (Member) commented Mar 8, 2016

@citibeth: Since it's not particularly well documented (sorry!) maybe an example will help.

Try this on your command line with the latest develop. I'm assuming you have run setup-env.sh so that spack cd will work.

$ spack stage libelf

   <spack downloads and expands libelf>

$ spack cd libelf    # cd to libelf stage
$ ls
COPYING.LIB  Makefile.in  aclocal.m4     configure*    libelf.pc.in
ChangeLog    README       config.guess*  configure.in  mkinstalldirs*
INSTALL      VERSION      config.h.in    install-sh*   po/
MANIFEST     acconfig.h   config.sub*    lib/          stamp-h.in

   ... hack source code ...

$ spack diy libelf@my-special-version

   <spack builds & installs source in current working dir>
   ... hack some more ...

$ spack diy libelf@my-special-version-2

   <spack builds & installs source in current working dir>
   ... hack some more ...

$ spack diy libelf@my-special-version-3

   <spack builds & installs source in current working dir>

$ spack find libelf
==> 3 installed packages.
-- darwin-x86_64 / [email protected] ------------------------------
libelf@my-special-version  libelf@my-special-version-2  libelf@my-special-version-3

This is both a decent way to test out how a spack package will work, and a way to generate lots of dev versions of something. Spack will auto-fetch and install deps before it runs the DIY build, so you basically get the env you want and it runs the same package you would use with spack.

Note that you must provide a version after the @, and DIY trusts you on the version number (it cannot verify it the way it can with a checksum).

@tgamblin (Member) commented Mar 8, 2016

@mathstuf: it's not a chroot (alas) but at least you can fork as a user. I think this could still be implemented on Windows via subprocess. You could spawn a worker, send it a package object, and run the same code pretty easily. More complicated might be the I/O redirection in llnl.util.tty.log.

@citibeth (Member, Author) commented Mar 8, 2016

@tgamblin:

Try this on your command line with the latest develop. I'm assuming you
have run setup-env.sh so that spack cd will work.

Other auto-build systems (MacPorts, EasyBuild) separate a build into an
unbelievably large number of stages, some of which are no-ops for some
kinds of builds. They then make a class hierarchy in which all the stages
eventually get implemented, providing a complete build process. Things
like fetching the tarball are implemented high in the class hierarchy,
whereas package-specific details are implemented in the leaves.

I do not see such a structure in Spack, which seems to be both good and
bad. Good because it's easier to follow the code in a package.py, since
it's not cut up into 100 different methods and scattered around 17
different superclasses. Bad because it's harder to override with
non-default behavior. And also harder to do certain interesting things
with the class defined in a package.py file. For example...

Suppose we have a hypothetical stage-separated Spack with the following
stages:

  1. Concretize (convert the spec you give to a full concretized spec).
  2. InstallDeps (Run all stages on all dependencies)
  3. Stage: Fetch the tarball, create the stage, untar, etc.
  4. ConfigSetup: Get arguments and environment that will be passed to the
    configure stage.
  5. Configure: Run "cmake," "configure" or equivalent with args and
    environment from Step 4.
  6. Build: Run "make" or equivalent
  7. Install: Run "make install" or equivalent.
  8. Module: Create the module file.

If Spack were structured in this way, then what I want in "spack cmake"
would be pretty easy to describe (and implement): I want "spack cmake" to
run stages 1, 2, 4, 5, and 8.

Similarly, "spack diy" can be described as running stages 1, 2, 4, 5, 6, 7,
8.
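
The stage-subset idea can be sketched as a tiny pipeline (stage names taken from the list above, plain Python; purely illustrative):

```python
# Hypothetical build stages, in the order listed above (1-indexed).
STAGES = ["concretize", "install_deps", "stage", "config_setup",
          "configure", "build", "install", "module"]

def run(stage_numbers):
    """Return the named subset of stages a command would execute."""
    return [STAGES[n - 1] for n in stage_numbers]

# "spack cmake" would run stages 1, 2, 4, 5, 8:
print(run([1, 2, 4, 5, 8]))
# "spack diy" would run stages 1, 2, 4, 5, 6, 7, 8:
print(run([1, 2, 4, 5, 6, 7, 8]))
```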

The simple answer to why "spack diy" doesn't do what I want is, I want to
run steps 6 and 7 manually. Why:

a) Spack is slow to startup, concretize, etc. I don't want to incur that
overhead every time I run "make."

b) In practice, Spack involves long, complex command lines with lots of
"^" options. That might change with PR#120. But for now, I don't even
want to run Spack every time I run CMake (which happens every time a
CMakeLists.txt file changes) because I don't want to carry around that long
command line. That's why I've hacked Spack to make a script for me that
runs CMake, eliminating the need to run Spack every time I build.

The downside of (b) is that now, the two-stage build (configure,
make/install) has been lengthened to a three-stage build (configure
external dependencies, configure this project, make/install). Maybe that's
not so bad: the two-stage build promoted by Autotools/CMake eventually won
out over the single-stage build in which all that is done inside one "make"
command. Maybe the three-stage build will eventually win out as well, when
one takes external deps into account.

Rather than talking about "spack cmake" vs "spack diy," maybe we should
focus on defining discrete stages for Spack --- if not for the entire Spack
process, then for the things that currently happen inside the install()
method. This could be done by subclassing spack.Package, and then building
the stages inside that subclass. Existing un-staged Spack packages would
still work as always, but we'd also be able to write Spack packages that
work in discrete stages.

With that change, one would write a version of what I want with "spack
cmake" that should work with all build systems.

And what library provides pj_free if you don't add -lproj4 somehow? You
have to know that you're using the C library even with the C++ wrapper here.

The ibmisc project depends_on('lproj'), and the proj/lib directory gets
written into libibmisc.so's RPATH. Therefore, when I use the ibmisc
library, there is no need to add -lproj.

Modifying std_cmake_args directly isn't a good idea since you're modifying
the list for all subsequent CMake-using packages. Luckily, I don't see any
modification of it, so there are no existing usages, but I figure it's best
to head this off at the pass before it becomes a pattern :) .

I use the following pattern, of concatenating std_cmake_args with
package-specific args:

        options = self.config_args(spec, prefix) + \
            spack.build_environment.get_std_cmake_args(self)

        build_directory = join_path(self.stage.path, 'spack-build')
        source_directory = self.stage.source_path

        with working_dir(build_directory, create=True):
            cmake(source_directory, *options)
            make()
            make("install")

Package authors are free to do whatever they want in that sandbox -- this
is why they're allowed to, e.g., set whatever env vars they want. I was
actually debating changing std_cmake_args to cmake_args.

I've been wondering why packages set os.environ at all, since it's quite
easy to set up your own env dict() and then pass that to a process that
you spawn.

@tgamblin (Member) commented Mar 8, 2016

Adding stages seems like a good idea to me. We've got an internal feature request to separate out configure, make, and install. It would help for building packages and for use cases like this. Suggested command breakdown would be spack fetch/stage/configure/build/install (the existing ones) and spack diy fetch/stage/configure/build/install, where adding diy just says "use the current directory as the stage."

For the environment question, the reason is that there are builds that really want particular environment variables set, either for makefiles, or for whatever. The Python Mac OS X build requires an env var set during build.

@mathstuf
Contributor

mathstuf commented Mar 8, 2016

The ibmisc project depends_on('lproj'), and the proj/lib directory gets written into libibmisc.so's RPATH. Therefore, when I use the ibmisc library, there is no need to add -lproj.

The linker in newer GNU binutils requires that if you use a symbol provided by a library, that library must be explicitly listed on the command line (either as -Lpath/to/proj/lib -lproj or as a full path to the library). The fact that some intermediate library happens to bring in the proj library doesn't matter: if you use a symbol, you must link its provider. I also don't believe that the rpath of libX matters when doing -lX, but that's a detail I've never encountered before.

@citibeth
Member Author

citibeth commented Mar 8, 2016

In that case, we won't be able to build reliably without something that provides transitive, pkgconfig-like functionality for determining what to link. Sigh...
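To make the problem concrete, here is a toy sketch of the transitive walk such functionality would have to do to emit an explicit link line. Spec here is a stand-in namedtuple, not Spack's real Spec class:

```python
from collections import namedtuple

# Toy stand-in for a package spec; not Spack's real Spec class.
Spec = namedtuple('Spec', ['name', 'prefix', 'dependencies'])

def transitive_link_flags(spec):
    """Collect -L/-l flags for spec and everything it depends on."""
    flags, seen = [], set()

    def visit(s):
        if s.name in seen:
            return
        seen.add(s.name)
        flags.append('-L%s/lib' % s.prefix)
        flags.append('-l%s' % s.name)
        for dep in s.dependencies:
            visit(dep)   # recurse: use a symbol, link its provider

    visit(spec)
    return flags

proj = Spec('proj', '/opt/proj', [])
ibmisc = Spec('ibmisc', '/opt/ibmisc', [proj])
print(transitive_link_flags(ibmisc))
# -> ['-L/opt/ibmisc/lib', '-libmisc', '-L/opt/proj/lib', '-lproj']
```

This is exactly what pkg-config's Libs.private / --static chasing does; the `seen` set matters because real dependency graphs are DAGs, not trees.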


@citibeth
Member Author

citibeth commented Mar 8, 2016

> Suggested command breakdown would be spack fetch/stage/configure/build/install (the existing ones) and spack diy fetch/stage/configure/build/install, where adding diy just says "use the current directory as the stage."

OK. I'm suggesting one additional step (or sub-step): configure-cmd. This produces the command needed to configure, but does not actually run it. Then we can save that command to a script if we like and re-run it later without Spack. See config_args() for a prototype example of how this can work in a CMake context:

https://github.com/citibeth/spack/blob/efischer/develop/var/spack/repos/builtin/packages/ibmisc/package.py

I can take a go at separating the current install() method into these separate stages.
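As a rough sketch of what configure-cmd could emit, something like the following would write a standalone spconfig-style script. The file name, helper, and argument sources here are hypothetical, not Spack's actual API:

```python
import os
import stat

def write_configure_script(path, cmake_args, env_vars):
    """Write a standalone script that re-runs the resolved cmake command."""
    with open(path, 'w') as f:
        f.write('#!/usr/bin/env python\n')
        f.write('import os, subprocess, sys\n')
        # Bake in the environment Spack would have set up.
        for var, value in sorted(env_vars.items()):
            f.write('os.environ[%r] = %r\n' % (var, value))
        # Bake in the fully resolved cmake command line; extra args
        # (like the source directory) are forwarded from sys.argv.
        args = ['cmake'] + list(cmake_args)
        f.write('sys.exit(subprocess.call(%r + sys.argv[1:]))\n' % args)
    # Mark the script executable so it can be run directly later.
    os.chmod(path, os.stat(path).st_mode | stat.S_IEXEC)

write_configure_script('spconfig_demo.py',
                       ['-DCMAKE_INSTALL_PREFIX=/opt/mypkg'],
                       {'CC': 'gcc', 'CXX': 'g++'})
```

Running ./spconfig_demo.py .. later re-invokes cmake with the same arguments and environment, with no Spack needed.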

> For the environment question, the reason is that there are builds that really want particular environment variables set, either for makefiles, or for whatever. The Python Mac OS X build requires an env var set during build.

Not sure I understand. I'm suggesting we replace something like this:

    os.environ['FOO'] = 'bar'
    make()

with something like this:

    env = dict()
    env['CC'] = os.environ['CC']
    env['FOO'] = 'bar'
    make(env=env)   # env= specifies the environment used for this command

matz-e added a commit to matz-e/spack that referenced this pull request Apr 27, 2020
The folks at the Apache mirrors were so nice to purge old versions.