
Add latest version of MPFR and patches for older versions #7545

Merged
adamjstewart merged 3 commits into spack:develop from adamjstewart:features/mpfr on Mar 23, 2018

Conversation

@adamjstewart (Member)

Fixes #7543. @peetsv

Adds the latest version of MPFR, which builds and passes all tests for me on macOS 10.13.3 with Clang 9.0.0.

The MPFR developers are some of the few people I've seen who actually release patches to fix bugs in older versions of software. They release these patches in two forms: individual patches and cumulative patches. I was using individual patches in #3506, but after #7193 was merged, the order in which patches are applied is no longer the order in which they are listed in the package, causing them to conflict (see #7543). This PR replaces the individual patches with cumulative patches and adds patches for older releases as well.

Note that some of these patches are fairly large. Spack does support fetching patches from the internet, but I don't think I'm comfortable fetching patches remotely without checksumming them. And since the cumulative patches are updated continuously as bugs are discovered, checksumming them would just lead to headaches.

Pinging @alalazo and @davydden who have had opinions on long patches in the past.

@davydden (Member)

Note that some of these patches are fairly large. Spack does support fetching patches from the internet, but I don't think I'm comfortable fetching patches remotely without checksumming them. And since the cumulative patches are updated continuously as bugs are discovered, checksumming them would just lead to headaches.

I don't have anything against this. We fetch it in some scenarios but also store them locally in Spack for other packages.

@peetsv (Contributor) commented Mar 21, 2018

Consistency (the dubious bugaboo) would suggest that having patch files stored with tarballs in a local mirror would not be too out of place. That said, checksum support of those files would also add some fun.

As a complete non sequitur, I would like to have a spack function to populate my local mirror based upon the package definitions - source tarballs and patches too if that's the way this gets played. So, assuming your mirrors.yaml file is defined somewhere ...
./bin/spack mirror populate [email protected]

yeah, I know, "Go away kid, ya bother me"

@adamjstewart (Member, Author)

I would like to have a spack function to populate my local mirror based upon the package definitions

You mean like spack mirror create? http://spack.readthedocs.io/en/latest/mirrors.html

@peetsv (Contributor) commented Mar 21, 2018

Almost. I already have a mirror that was established long ago. What I really need is a simple way to add things as package prerequisites change... Maybe a flag or something that tells spack to put a copy of anything it downloads into the local mirror...

Right now, I tend to look through the build log for curl failures for those instances when the local mirror does not have the file required, then I use wget to fetch the same tarball spack retrieved, then compare md5sums. I guess I could take a look at the cache directory, as well, but, the build log tells me which tarball spack was looking for in the mirror and sometimes the filenames do not match what got downloaded. Especially so for packages like sqlite.
wget https://www.sqlite.org/2017/sqlite-autoconf-3200000.tar.gz -O MIRROR_DIR/sqlite/sqlite-3.20.0.tar.gz

@alalazo (Member) commented Mar 22, 2018

Spack does support fetching patches from the internet, but I don't think I'm comfortable fetching patches remotely without checksumming them.

You can checksum them, according to the directive signature:

def patch(url_or_filename, level=1, when=None, working_dir=".", **kwargs):
    """Packages can declare patches to apply to source.  You can
    optionally provide a when spec to indicate that a particular
    patch should only be applied when the package's spec meets
    certain conditions (e.g. a particular version).

    Args:
        url_or_filename (str): url or filename of the patch
        level (int): patch level (as in the patch shell command)
        when (Spec): optional anonymous spec that specifies when to apply
            the patch
        working_dir (str): dir to change to before applying

    Keyword Args:
        sha256 (str): sha256 sum of the patch, used to verify the patch
            (only required for URL patches)
        archive_sha256 (str): sha256 sum of the *archive*, if the patch
            is compressed (only required for compressed URL patches)

    """
    ...
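For illustration, a hedged sketch of what a checksummed URL patch declaration might look like in a package, per the signature above (the URL, version constraint, and sha256 below are placeholders, not real values):

```python
# Hypothetical usage of the patch directive with a sha256-checksummed
# URL patch; the URL and checksum are placeholders, not real values.
patch('http://www.mpfr.org/mpfr-3.1.5/allpatches',
      when='@3.1.5',
      sha256='0000000000000000000000000000000000000000000000000000000000000000')
```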

@alalazo (Member) commented Mar 22, 2018

I would like to have a spack function to populate my local mirror based upon the package definitions

@adamjstewart @peetsv
OT: I think there's another small feature that has been missing for a long time, i.e. the possibility to refer to a mirror by name. Right now you can do:

$ spack mirror create -d <DIR> <specs>

but given that when you add a mirror you name it, I'd like to be able to say also:

$ spack mirror create -name <MIRROR_NAME> <specs>

@adamjstewart (Member, Author)

Spack does support fetching patches from the internet, but I don't think I'm comfortable fetching patches remotely without checksumming them.

You can checksum them, according to the directive signature:

I know, but:

since the cumulative patches are updated continuously as bugs are discovered, checksumming them would just lead to headaches.

Do you think it would be worth fetching anyway? The package would break every time a bug fix came out. I don't want to deal with that many issues.

@alalazo (Member) commented Mar 22, 2018

I left a comment in #7193. Personally I'd prefer fixing the sorting and continue with individual patches. There are other packages, like nwchem, for which patching order might be important, e.g. if they don't provide cumulative patches.

@adamjstewart (Member, Author)

I agree that we should fix patching order. But there are 5-10 individual patches per version of this package, so that may lead to clutter.

@alalazo (Member) commented Mar 22, 2018

Off the top of my head, I think it makes sense to store the cumulative patch if we are dealing with an old version for which it's not going to change anymore. If the version is recent enough to be maintained (and thus could be subject to further patching), probably we should apply individual patches. Does that seem sensible to you?

that may lead to clutter

I think we have a similar problem with resources in llvm. I'll try to move them to a yaml file within the package directory and see if this looks better. In case the result is satisfactory I'll submit a PR.

@adamjstewart (Member, Author)

@alalazo I was thinking something along the same lines. I just checked and it looks like they don't release patches for old releases after a new release comes out. So it should be safe to fetch and checksum patches for old releases. For the current release, we could either fetch individual patches or store cumulative patches. There are no patches for the current release so far, so we can dodge this question for now.

I still vote we use cumulative patches wherever possible. Version 3.1.3 for example includes 17 individual patches.

@adamjstewart adamjstewart changed the title Add latest version of MPFR and patches for older versions [WIP] Add latest version of MPFR and patches for older versions Mar 22, 2018
@adamjstewart (Member, Author) commented Mar 22, 2018

Well, that saves us from adding 8,000 lines of patches. Unfortunately (fortunately?) it has uncovered a much more severe bug:

$ spack install [email protected]
...
==> Installing mpfr
==> Warning: There is no checksum on file to fetch [email protected] safely.
==>   Fetch anyway? [y/N] y
==> Fetching with no checksum.
  Add a checksum or use --no-checksum to skip this check.
==> Fetching https://ftp.gnu.org/gnu/mpfr/mpfr-3.1.2.tar.bz2
######################################################################## 100.0%
==> Staging archive: /Users/Adam/spack/var/spack/stage/mpfr-3.1.6-xvkli4eiurcd6dz5fct2ktfbocpuux3c/mpfr-3.1.2.tar.bz2
==> Created stage in /Users/Adam/spack/var/spack/stage/mpfr-3.1.6-xvkli4eiurcd6dz5fct2ktfbocpuux3c
==> Fetching http://www.mpfr.org/mpfr-3.1.6/allpatches
######################################################################## 100.0%
1 out of 1 hunk FAILED -- saving rejects to file VERSION.rej
1 out of 1 hunk FAILED -- saving rejects to file src/mpfr.h.rej
1 out of 1 hunk FAILED -- saving rejects to file src/version.c.rej
1 out of 1 hunk FAILED -- saving rejects to file VERSION.rej
1 out of 1 hunk FAILED -- saving rejects to file src/mpfr.h.rej
2 out of 3 hunks FAILED -- saving rejects to file src/root.c.rej
1 out of 1 hunk FAILED -- saving rejects to file src/version.c.rej
1 out of 2 hunks FAILED -- saving rejects to file tests/troot.c.rej
==> Patch http://www.mpfr.org/mpfr-3.1.6/allpatches failed.
==> Error: ProcessError: Command exited with status 1:
    '/usr/bin/patch' '-s' '-p' '1' '-i' '/Users/Adam/spack/var/spack/stage/spack-stage-0lf6pvr5/allpatches' '-d' '.'
==> Error: [Errno 2] No such file or directory: '/Users/Adam/spack/var/spack/stage/mpfr-3.1.6-xvkli4eiurcd6dz5fct2ktfbocpuux3c/mpfr-3.1.2/spack-build.out'

There are a couple of terrifying things to note here.

  1. We do have a checksum for 3.1.6!
  2. It is fetching 3.1.2 instead of 3.1.6 (we also have a checksum for 3.1.2)!
  3. It is fetching the correct patch for 3.1.6, but applying it obviously fails because it is being applied to the 3.1.2 sources.
  4. The error that makes it quit is that spack-build.out is missing. Spack should know better than to look for this file and should fail gracefully with a PatchError or something like that.

4 is a long-standing bug that I haven't really paid that much attention to, but we should probably fix it.

1 and 2 don't make any sense to me, but they remind me of #3129. I suspect that the same bug is causing both of these problems.

Inline review comments on this diff hunk:

'3.1.2': '9f96a5c7cac1d6cd983ed9cf7d997074',
}

for version, checksum in patches.items():
(Member)

And here's the culprit! A for loop here using the name "version" leaves a version attribute defined at class level, which shadows the version property in the base class. The last version you iterate over is 3.1.2, and the discrepancy between self.spec.versions[0] and self.version does the rest.
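A minimal standalone sketch of the shadowing effect (the class names and patch table below are hypothetical, not Spack's actual code):

```python
# Hypothetical illustration: a loop in a class body leaves its loop
# variable behind as a class attribute, shadowing a same-named property
# inherited from the base class.
class BasePackage:
    @property
    def version(self):
        return '3.1.6'

class MpfrLike(BasePackage):
    patches = {'3.1.5': 'aaa', '3.1.2': 'bbb'}
    for version, checksum in patches.items():
        pass  # 'version' is now a plain class attribute

# The inherited property is shadowed; whichever key was iterated last
# wins, which is how 3.1.2 sources got fetched for a 3.1.6 install.
print(MpfrLike().version)
```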

(Member)

Using:

for v, c in patches.items():
    ...

should help avoid trouble, as I am pretty sure we don't define those attributes in base classes (the joy and pain of dynamic typing).

(Member)

In #7563 I ended up adding a double underscore to loop variables. Not sure which is the worse choice between very short names and names that start with a double underscore. Should we set an informal style policy for the few packages that have logic at class level?

(Member, Author)

How silly of me, fixed in the latest commit. I wonder if there's any way we can make this idiot proof.

@alalazo (Member) Mar 22, 2018

I wonder if there's any way we can make this idiot proof.

Maybe using a descriptor and intercepting the redefinition at class level. Not sure it's worth the effort though. I just refreshed my memory of the __set__ signature, and that's not possible: descriptors only intercept assignment on instances. If we want to intercept rebinding at class level we probably need to add some logic in a metaclass - definitely too much for a bug like this 😄
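For the curious, a rough sketch of what such a metaclass guard could look like (the class names and the reserved-name set are hypothetical; nothing like this exists in Spack, and it is probably overkill as noted above):

```python
# Hypothetical sketch: a metaclass that rejects class bodies which
# rebind a reserved name such as 'version'. Not actual Spack code.
class NoShadowMeta(type):
    RESERVED = {'version'}

    def __new__(mcls, name, bases, namespace):
        clashes = mcls.RESERVED & set(namespace)
        # Allow the base class itself to define the reserved names.
        if bases and clashes:
            raise TypeError(
                '%s shadows reserved attribute(s): %s' % (name, clashes))
        return super().__new__(mcls, name, bases, namespace)

class GuardedBase(metaclass=NoShadowMeta):
    @property
    def version(self):
        return 'from-property'

# A subclass whose body (including a leaked loop variable) defines
# 'version' now fails loudly at class-definition time.
try:
    class Bad(GuardedBase):
        for version in ['3.1.2']:
            pass
except TypeError as e:
    print('caught:', e)
```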

@alalazo (Member) commented Mar 22, 2018

1 and 2 don't make any sense to me, but they remind me of #3129. I suspect that the same bug is causing both of these problems.

👏 for remembering things like that more than one year later 😅

@adamjstewart adamjstewart changed the title [WIP] Add latest version of MPFR and patches for older versions Add latest version of MPFR and patches for older versions Mar 22, 2018
@adamjstewart adamjstewart merged commit 32a78ea into spack:develop Mar 23, 2018
@adamjstewart adamjstewart deleted the features/mpfr branch March 23, 2018 00:37
ch741 pushed a commit to ch741/spack that referenced this pull request Apr 20, 2018
* Add latest version of MPFR and patches for older versions

* Fetch and checksum patches instead of storing them

* Fix bug, version attribute was being overridden


Development

Successfully merging this pull request may close these issues:

- gcc v5.4.0 build fails due to mpfr patching problem

4 participants