Closed
33 commits
a733857
Added a simple test for the tokenization of the new syntax
alalazo Mar 12, 2020
0d6c52d
Spec with virtual dependency bindings parse correctly
alalazo Mar 12, 2020
24a9431
Specs can pick which virtual they want to bind
alalazo Mar 17, 2020
cb6a9a5
Vendor ChainMap for Python 2.X
alalazo Mar 17, 2020
014599e
Added tests for more complicated cases of bindings
alalazo Mar 17, 2020
1a821b7
Declare virtual dependencies that needs to be provided together
alalazo Mar 18, 2020
353a89a
Added further tests of consistency with the proposed semantics
alalazo Mar 18, 2020
1bc9c68
Dict comprehension not allowed in Python 2.6
alalazo Mar 19, 2020
72d4dce
Added documentation for explicit bindings of vdeps
alalazo Mar 24, 2020
bbaf81b
Removed leftover TODOs and FIXMEs
alalazo Mar 25, 2020
5627539
Improved documentation
alalazo Apr 15, 2020
424418c
chainmap: added a link to the license
alalazo Apr 15, 2020
bb3b938
DependencySpec: added "virtuals" as a new edge attribute
alalazo Apr 16, 2020
ee65bbc
provider_index: improved docstrings and variable names
alalazo Apr 16, 2020
ac44f4f
Modified the parser to allow for multiple bindings
alalazo Apr 16, 2020
b0d1e17
DependencySpec: remove default value for "virtuals" argument
alalazo Apr 17, 2020
38ea0fc
Spec._dup_deps now correctly duplicates information on virtuals
alalazo Apr 17, 2020
44d12b1
Keep track of virtual dependencies on DAG edges
alalazo Apr 20, 2020
dd11a89
Renamed dictionary from "provided_together" to "used_together"
alalazo Apr 20, 2020
3bb6fbd
Handle conditionals when selecting virtual dependencies
alalazo Apr 20, 2020
cd54221
Renamed "virtuals" to "provides" in spec YAML dictionary
alalazo Apr 20, 2020
0e4ee81
Fixed issue with test/ci.py
alalazo Apr 21, 2020
b2e044f
Renamed "_explicit_providers" to "_user_requested_providers"
alalazo Apr 21, 2020
5cd7027
Fixed issue with reindex and a failing unit test
alalazo May 8, 2020
ef93be9
Updated documentation according to review comments
alalazo May 18, 2020
e718b3b
Spec._providers is now a private method
alalazo May 18, 2020
aedaa78
flake8: comply to new rules on variable naming
alalazo May 18, 2020
9a18d56
Removed spurious FIXME
alalazo May 18, 2020
8adea7a
Removed ^mpich from test spec
alalazo May 18, 2020
263de40
Properly reconstruct virtuals on edges for old DBs
alalazo May 18, 2020
da431ce
Preserve monkeypatching of objects if name is virtual
alalazo Jul 13, 2020
22a20f4
Copy specs when parsing user requested providers
alalazo Jul 17, 2020
4622489
__contains__ now respect edge properties
alalazo Aug 10, 2020
3 changes: 3 additions & 0 deletions bin/spack
@@ -42,6 +42,9 @@ sys.path.insert(0, spack_lib_path)
# Add external libs
spack_external_libs = os.path.join(spack_lib_path, "external")

if sys.version_info[:1] == (2,):
sys.path.insert(0, os.path.join(spack_external_libs, 'py2'))

if sys.version_info[:2] == (2, 6):
sys.path.insert(0, os.path.join(spack_external_libs, 'py26'))
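These guards rely on tuple slicing of ``sys.version_info``; as a standalone sketch of the comparison logic (outside Spack, variable names are illustrative):

```python
import sys

# sys.version_info is a named tuple like (3, 11, 4, 'final', 0);
# slicing it yields a plain tuple that supports equality checks.
major = sys.version_info[:1]        # e.g. (3,)
major_minor = sys.version_info[:2]  # e.g. (3, 11)

# The same comparisons used in bin/spack:
needs_py2_vendored_libs = major == (2,)
needs_py26_vendored_libs = major_minor == (2, 6)

print(needs_py2_vendored_libs, needs_py26_vendored_libs)
```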

50 changes: 50 additions & 0 deletions lib/spack/docs/basic_usage.rst
@@ -1167,6 +1167,56 @@ any MPI implementation will do. If another package depends on
error. Likewise, if you try to plug in some package that doesn't
provide MPI, Spack will raise an error.

""""""""""""""""""""""""""""""""""""""""
Explicit binding of virtual dependencies
""""""""""""""""""""""""""""""""""""""""

Some packages provide more than one virtual dependency. When interacting
with them, users might want to pick a single virtual dependency from one
package and use different providers for the others. For example, consider
these two packages:

* ``intel-parallel-studio``, which provides ``mpi``, ``lapack``, and ``blas``
* ``openblas``, which provides ``lapack`` and ``blas``

We might want to use ``intel-parallel-studio`` for ``mpi`` and ``openblas``
for ``lapack`` and ``blas``:

.. code-block:: console

$ spack spec netlib-scalapack ^mpi=intel-parallel-studio@cluster+mpi ^openblas
Input spec
--------------------------------
netlib-scalapack
^intel-parallel-studio@cluster+mpi
^openblas

Concretized
--------------------------------
[email protected]%[email protected] build_type=RelWithDebInfo patches=f2baedde688ffe4c20943c334f580eb298e04d6f35c86b90a1f4e8cb7ae344a2 ~pic+shared arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]~symlinks+termlib arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected] arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]+systemcerts arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]+cpanm+shared+threads arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected] arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected] arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]+optimize+pic+shared arch=linux-ubuntu18.04-broadwell
^intel-parallel-studio@cluster%[email protected]~advisor auto_dispatch=none ~clck+daal~gdb~ilp64~inspector+ipp~itac+mkl+mpi~newdtags+rpath+shared+tbb threads=none ~vtune arch=linux-ubuntu18.04-broadwell
^[email protected]%[email protected]~consistent_fpcsr~ilp64+pic+shared threads=none arch=linux-ubuntu18.04-broadwell

The ``^<virtual>=<spec>`` syntax tells Spack to use ``<spec>`` to satisfy the
requested virtual (so ``^mpi=intel-parallel-studio`` tells Spack to use
``intel-parallel-studio`` as the ``mpi`` provider) and to consider its other virtual
dependencies after any other explicitly mentioned package. Adding ``^openblas``
to the spec thus means that ``openblas`` will be used for ``lapack`` and
``blas``, because it is specified explicitly.

In this example that covers all the virtual dependencies, but if there were
others, Spack could try to satisfy them with ``intel-parallel-studio``.
For instance, if another package depended on ``tbb`` or ``daal`` (which ``intel-parallel-studio``
also provides), Spack could use ``intel-parallel-studio`` to satisfy them as a last resort.
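The precedence described above can be sketched as a toy model (hypothetical function and data names, not Spack's actual concretizer, which is far more involved):

```python
# Toy model of provider selection precedence (hypothetical, not Spack's API):
# 1. a user-requested binding (^mpi=intel-parallel-studio) wins outright;
# 2. otherwise, another explicitly mentioned package providing the virtual
#    is preferred;
# 3. otherwise, any package in the spec that provides it is a last resort.

def pick_provider(virtual, user_bindings, explicit_packages, providers_of):
    """Return the chosen provider name for `virtual`, or None."""
    if virtual in user_bindings:                  # rule 1
        return user_bindings[virtual]
    for pkg in explicit_packages:                 # rule 2
        if virtual in providers_of.get(pkg, ()):
            return pkg
    for pkg, provided in providers_of.items():    # rule 3 (last resort)
        if virtual in provided:
            return pkg
    return None

providers_of = {
    'intel-parallel-studio': {'mpi', 'blas', 'lapack', 'tbb', 'daal'},
    'openblas': {'blas', 'lapack'},
}
bindings = {'mpi': 'intel-parallel-studio'}
explicit = ['openblas']

print(pick_provider('lapack', bindings, explicit, providers_of))  # openblas
print(pick_provider('tbb', bindings, explicit, providers_of))
```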

^^^^^^^^^^^^^^^^^^^^^^^^
Specifying Specs by Hash
^^^^^^^^^^^^^^^^^^^^^^^^
35 changes: 35 additions & 0 deletions lib/spack/docs/packaging_guide.rst
@@ -2414,6 +2414,41 @@ The ``provides("mpi")`` call tells Spack that the ``mpich`` package
can be used to satisfy the dependency of any package that
``depends_on('mpi')``.

""""""""""""""""""""""""""""""""""""""""""""""
Providing multiple dependencies simultaneously
""""""""""""""""""""""""""""""""""""""""""""""

Packages can provide multiple virtual dependencies and sometimes, mainly due
to implementation details, they need to provide them simultaneously. A good
example of such a case is ``openblas``. This package provides both the ``lapack``
and ``blas`` APIs in a single library called ``libopenblas``; therefore, if a
spec uses ``openblas`` for one of the two virtuals, it must use it
for the other as well.

To express this constraint in a package, the two virtual dependencies must be
listed in the same ``provides`` directive:

.. code-block:: python

provides('blas', 'lapack')

This makes it impossible to select ``openblas`` as a provider for one of the two
virtual dependencies and not for the other. Any request to do so results in an
error message:

.. code-block:: console

$ spack spec netlib-scalapack ^openblas ^blas=atlas
Input spec
--------------------------------
netlib-scalapack
^atlas
^openblas

Concretized
--------------------------------
==> Error: "openblas" needs to be a provider for "blas" if used as a provider for "lapack"
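The consistency check behind this error can be sketched as a toy model (hypothetical names and data layout, not Spack's actual implementation):

```python
# Toy check: virtuals listed in one provides() directive must all be bound
# to the same provider. `used_together` maps a provider to the groups of
# virtuals it must provide jointly (hypothetical data layout).

used_together = {
    'openblas': [{'blas', 'lapack'}],
}

def check_bindings(bindings):
    """bindings: virtual name -> chosen provider. Raise on inconsistency."""
    for provider, groups in used_together.items():
        for group in groups:
            chosen = {v for v in group if bindings.get(v) == provider}
            if chosen and chosen != group:
                missing = group - chosen
                raise ValueError(
                    '"%s" needs to be a provider for "%s" if used as a '
                    'provider for "%s"'
                    % (provider, ', '.join(sorted(missing)),
                       ', '.join(sorted(chosen))))

check_bindings({'blas': 'openblas', 'lapack': 'openblas'})  # consistent
try:
    check_bindings({'blas': 'atlas', 'lapack': 'openblas'})
except ValueError as err:
    print(err)
```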

^^^^^^^^^^^^^^^^^^^^
Versioned Interfaces
^^^^^^^^^^^^^^^^^^^^
8 changes: 8 additions & 0 deletions lib/spack/external/__init__.py
@@ -17,6 +17,14 @@
vendored copy ever needs to be updated again:
https://github.com/spack/spack/pull/6786/commits/dfcef577b77249106ea4e4c69a6cd9e64fa6c418

chainmap
--------

* Homepage: https://bitbucket.org/jeunice/chainmap/src/default/
* Usage: Port of collections.ChainMap for Python 2.6 and 2.7
* Version: 1.0.3
* License: https://bitbucket.org/jeunice/chainmap/src/default/LICENSE.txt

ctest_log_parser
----------------

154 changes: 154 additions & 0 deletions lib/spack/external/py2/chainmap.py
@@ -0,0 +1,154 @@
try:
from collections.abc import MutableMapping
except ImportError:
from collections import MutableMapping


try:
from thread import get_ident
except ImportError:
try:
from threading import _get_ident as get_ident
except ImportError:
from threading import get_ident


def _recursive_repr(fillvalue='...'):
'Decorator to make a repr function return fillvalue for a recursive call'

def decorating_function(user_function):
repr_running = set()

def wrapper(self):
key = id(self), get_ident()
if key in repr_running:
return fillvalue
repr_running.add(key)
try:
result = user_function(self)
finally:
repr_running.discard(key)
return result

# Can't use functools.wraps() here because of bootstrap issues
wrapper.__module__ = getattr(user_function, '__module__')
wrapper.__doc__ = getattr(user_function, '__doc__')
wrapper.__name__ = getattr(user_function, '__name__')
wrapper.__annotations__ = getattr(user_function, '__annotations__', {})
return wrapper

return decorating_function


class ChainMap(MutableMapping):
"""
A ChainMap groups multiple dicts (or other mappings) together to create a
single, updateable view.

The underlying mappings are stored in a list. That list is public and can
be accessed or updated using the *maps* attribute. There is no other state.

Lookups search the underlying mappings successively until a key is found. In
contrast, writes, updates, and deletions only operate on the first mapping.
"""

def __init__(self, *maps):
"""
Initialize a ChainMap by setting *maps* to the given mappings.
If no mappings are provided, a single empty dictionary is used.
"""
self.maps = list(maps) or [{}] # always at least one map

def __missing__(self, key):
raise KeyError(key)

def __getitem__(self, key):
for mapping in self.maps:
try:
return mapping[key] # can't use 'key in mapping' with defaultdict
except KeyError:
pass
return self.__missing__(key) # support subclasses that define __missing__

def get(self, key, default=None):
return self[key] if key in self else default

def __len__(self):
return len(set().union(*self.maps)) # reuses stored hash values if possible

def __iter__(self):
return iter(set().union(*self.maps))

def __contains__(self, key):
return any(key in m for m in self.maps)

def __bool__(self):
return any(self.maps)

@_recursive_repr()
def __repr__(self):
return '{0.__class__.__name__}({1})'.format(
self, ', '.join(map(repr, self.maps)))

@classmethod
def fromkeys(cls, iterable, *args):
"Create a ChainMap with a single dict created from the iterable."
return cls(dict.fromkeys(iterable, *args))

def copy(self):
"""
New ChainMap or subclass with a new copy of ``maps[0]`` and refs
to ``maps[1:]``
"""
return self.__class__(self.maps[0].copy(), *self.maps[1:])

__copy__ = copy

def new_child(self, m=None): # like Django's Context.push()
"""
New ChainMap with a new map followed by all previous maps. If no
map is provided, an empty dict is used.
"""
if m is None:
m = {}
return self.__class__(m, *self.maps)

@property
def parents(self): # like Django's Context.pop()
"New ChainMap from ``maps[1:]``."
return self.__class__(*self.maps[1:])

def __setitem__(self, key, value):
self.maps[0][key] = value

def __delitem__(self, key):
try:
del self.maps[0][key]
except KeyError:
raise KeyError('Key not found in the first mapping: {0!r}'.format(key))

def popitem(self):
"""
Remove and return an item pair from ``maps[0]``. Raise ``KeyError`` if
``maps[0]`` is empty.
"""
try:
return self.maps[0].popitem()
except KeyError:
raise KeyError('No keys found in the first mapping.')

def pop(self, key, *args):
"""
Remove ``key`` from ``maps[0]`` and return its value. Raise ``KeyError``
if ``key`` not in ``maps[0]``.
"""
try:
return self.maps[0].pop(key, *args)
except KeyError:
raise KeyError('Key not found in the first mapping: {0!r}'.format(key))

def clear(self):
"""
Clear ``maps[0]``, leaving ``maps[1:]`` intact.
"""
self.maps[0].clear()
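On Python 3 this vendored class is redundant with ``collections.ChainMap``, whose semantics it mirrors, so its behavior can be exercised with the stdlib version (toy data):

```python
from collections import ChainMap

overrides = {'threads': 4}
defaults = {'threads': 1, 'verbose': False}
cm = ChainMap(overrides, defaults)

# Lookups search the maps in order; the first hit wins.
print(cm['threads'])   # 4 (from overrides)
print(cm['verbose'])   # False (falls through to defaults)

# Writes and deletions only touch maps[0].
cm['verbose'] = True
print(defaults['verbose'])   # still False
print(overrides['verbose'])  # True

# new_child() pushes a fresh map; parents drops maps[0].
child = cm.new_child({'threads': 8})
print(child['threads'])          # 8
print(child.parents['threads'])  # 4
```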
2 changes: 1 addition & 1 deletion lib/spack/spack/cmd/__init__.py
@@ -168,7 +168,7 @@ def parse_specs(args, **kwargs):
return specs

except spack.spec.SpecParseError as e:
msg = e.message + "\n" + str(e.string) + "\n"
msg = str(e) + "\n" + str(e) + "\n"
msg += (e.pos + 2) * " " + "^"
raise spack.error.SpackError(msg)

18 changes: 14 additions & 4 deletions lib/spack/spack/database.py
@@ -673,9 +673,10 @@ def _assign_dependencies(self, hash_key, installs, data):
spec = data[hash_key].spec
spec_dict = installs[hash_key]['spec']
if 'dependencies' in spec_dict[spec.name]:
reconstruct_virtuals_on_edges = False
yaml_deps = spec_dict[spec.name]['dependencies']
for dname, dhash, dtypes in spack.spec.Spec.read_yaml_dep_specs(
yaml_deps):
for item in spack.spec.Spec.read_yaml_dep_specs(yaml_deps):
dname, dhash, dtypes, virtuals = item
# It is important that we always check upstream installations
# in the same order, and that we always check the local
# installation first: if a downstream Spack installs a package
@@ -697,7 +698,14 @@ def _assign_dependencies(self, hash_key, installs, data):
tty.warn(msg)
continue

spec._add_dependency(child, dtypes)
if virtuals is None:
reconstruct_virtuals_on_edges = True
virtuals = []

spec._add_dependency(child, dtypes, virtuals=virtuals)

if reconstruct_virtuals_on_edges:
spec.reconstruct_virtuals_on_edges()

def _read_from_file(self, filename):
"""Fill database from file, do not maintain old data.
@@ -1104,7 +1112,9 @@ def _add(
):
dkey = dep.spec.dag_hash()
upstream, record = self.query_by_spec_hash(dkey)
new_spec._add_dependency(record.spec, dep.deptypes)
new_spec._add_dependency(
record.spec, dep.deptypes, virtuals=dep.virtuals
)
if not upstream:
record.ref_count += 1
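The idea of storing virtuals on DAG edges, and of backfilling them when reading an old database that never recorded them, can be sketched with a simplified model (hypothetical classes and toy data, not Spack's real structures):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class Edge:
    """A dependency edge; `virtuals` records which virtual specs
    the child satisfies for the parent (empty if unknown)."""
    parent: str
    child: str
    deptypes: Tuple[str, ...]
    virtuals: List[str] = field(default_factory=list)

# Packages and the virtuals they provide (toy data).
PROVIDES: Dict[str, Set[str]] = {
    'mpich': {'mpi'},
    'openblas': {'blas', 'lapack'},
}

def reconstruct_virtuals(edges, requested):
    """Backfill `virtuals` on edges loaded from an old DB: an edge
    carries a virtual if the child provides it and the parent
    requested it."""
    for edge in edges:
        if not edge.virtuals:
            provided = PROVIDES.get(edge.child, set())
            edge.virtuals = sorted(provided & requested.get(edge.parent, set()))

edges = [Edge('netlib-scalapack', 'mpich', ('build', 'link'))]
requested = {'netlib-scalapack': {'mpi', 'blas', 'lapack'}}
reconstruct_virtuals(edges, requested)
print(edges[0].virtuals)  # ['mpi']
```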

35 changes: 23 additions & 12 deletions lib/spack/spack/directives.py
@@ -408,11 +408,19 @@ def _execute_extends(pkg):
return _execute_extends


@directive('provided')
@directive(dicts=('provided', 'used_together'))
def provides(*specs, **kwargs):
"""Allows packages to provide a virtual dependency. If a package provides
'mpi', other packages can declare that they depend on "mpi", and spack
can use the providing package to satisfy the dependency.
"""Allows packages to provide a virtual dependency.

If a package provides 'mpi', other packages can declare that they
depend on "mpi", and spack can use the providing package to satisfy
the dependency.

Args:
*specs: virtual specs provided by this package
**kwargs:

when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg):
when = kwargs.get('when')
@@ -424,15 +432,18 @@ def _execute_provides(pkg):
# to build the ProviderIndex.
when_spec.name = pkg.name

for string in specs:
for provided_spec in spack.spec.parse(string):
if pkg.name == provided_spec.name:
raise CircularReferenceError(
"Package '%s' cannot provide itself.")
spec_objs = [spack.spec.Spec(x) for x in specs]
spec_names = [x.name for x in spec_objs]
pkg.used_together.setdefault(when_spec, []).append(set(spec_names))
for provided_spec in spec_objs:
if pkg.name == provided_spec.name:
raise CircularReferenceError(
"Package '%s' cannot provide itself." % pkg.name)

if provided_spec not in pkg.provided:
pkg.provided[provided_spec] = set()
pkg.provided[provided_spec].add(when_spec)

if provided_spec not in pkg.provided:
pkg.provided[provided_spec] = set()
pkg.provided[provided_spec].add(when_spec)
return _execute_provides


11 changes: 9 additions & 2 deletions lib/spack/spack/environment.py
@@ -1438,10 +1438,17 @@ def _read_lockfile_dict(self, d):
specs_by_hash[dag_hash] = Spec.from_node_dict(node_dict)

for dag_hash, node_dict in json_specs_by_hash.items():
for dep_name, dep_hash, deptypes in (
reconstruct_virtuals_on_edges = False
for dep_name, dep_hash, deptypes, virtuals in (
Spec.dependencies_from_node_dict(node_dict)):
if virtuals is None:
virtuals = []
reconstruct_virtuals_on_edges = True
specs_by_hash[dag_hash]._add_dependency(
specs_by_hash[dep_hash], deptypes)
specs_by_hash[dep_hash], deptypes, virtuals=virtuals)

if reconstruct_virtuals_on_edges:
specs_by_hash[dag_hash].reconstruct_virtuals_on_edges()

# If we are reading an older lockfile format (which uses dag hashes
# that exclude build deps), we use this to convert the old