CI: Add job that runs tests with pip pre-release dependencies#1852
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##             main    #1852   +/- ##
=======================================
  Coverage   79.12%   79.12%
=======================================
  Files          15       15
  Lines         915      915
  Branches      194      194
=======================================
  Hits          724      724
  Misses        168      168
  Partials       23       23
```

View full report in Codecov by Sentry.
Adds a job that installs the package and runs the tests with pre-release pip dependencies (using `pip install --pre`). This makes it possible to detect deprecation warnings, failures, and other incompatibilities with future dependency versions earlier.
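A sketch of what such a job boils down to (the `.[dev]` extra and the pytest runner are assumptions about the project layout, not taken from the workflow):

```shell
# Hypothetical core steps of the pre-release CI job.
pip install --pre -e ".[dev]"   # --pre lets pip resolve pre-release versions of dependencies
pytest                          # run the suite against those pre-releases
```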
This makes sense with the current setup.py, which doesn't pin the dependencies. But we should pin at least the major version of each dependency. As it stands, people who want to reproduce simulations using an earlier version of Mesa won't be able to do so, because the versions are not pinned. On the other hand, if major versions are pinned, then pre-releases that bump a major version won't get tested. And the deprecation warnings will only show up in the CI logs; the overall result is still a pass.
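On the last point (warnings not failing CI), one mitigation is to escalate `DeprecationWarning` to an error during the test run; in pytest this is the `filterwarnings = ["error::DeprecationWarning"]` setting. A minimal sketch, with an illustrative function name:

```python
import warnings

def check_for_deprecations():
    # Escalate DeprecationWarning to an error, so a pre-release dependency
    # that deprecates an API fails loudly instead of only printing a
    # warning in the CI log.
    with warnings.catch_warnings():
        warnings.simplefilter("error", DeprecationWarning)
        warnings.warn("old API", DeprecationWarning)  # stand-in for a real call

try:
    check_for_deprecations()
except DeprecationWarning as exc:
    print(f"caught: {exc}")  # → caught: old API
```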
Corvince left a comment:
This is something that has bitten us before, but it depends somewhat on other libraries publishing pre-releases, so I am not sure about its usefulness. Still, I am approving, and we can always re-evaluate later.
.github/workflows/build_lint.yml (outdated)

```yaml
matrix:
  os: [windows, ubuntu, macos]
  python-version: ["3.12"]
  name: [""]
```
is this somehow required for it to work?
As far as I understand / have tested, weirdly, yes.
Apologies @EwoutH, I feel like we failed to provide you with a definitive answer on this. My understanding is that we are not merging this because of the dependency issue, where someone may pin
Pinning dependencies in library development has severe downsides and is generally advised against. The primary concern is the limitation it places on user flexibility. Libraries like Mesa are often integrated into complex systems alongside a variety of other libraries. By fixing dependencies to specific versions, you risk creating conflicts with other parts of a user's project that may require different versions of those dependencies. This is particularly problematic in fast-moving fields where dependencies are frequently updated.

Another critical issue is the maintenance burden. Pinning dependencies means the library must be constantly updated to keep pace with its dependencies' releases. This can be a significant drain on resources, especially for open-source projects that rely on community contributions.

Furthermore, there is the problem of reproducibility. While pinning dependencies seems like it would enhance reproducibility by ensuring everyone uses the same versions, it can actually do the opposite. Over time, older versions of dependencies may no longer be supported or may have known vulnerabilities, making it difficult to reproduce older work or forcing users to rely on potentially insecure software.

For instance, in the Python ecosystem, a project like NumPy frequently ships improvements and bug fixes. If Mesa pinned a specific version of NumPy, it would prevent users from benefiting from those updates unless Mesa updated its own dependencies in lockstep. This can be a significant hindrance in environments where the latest features or security patches are essential.

So while pinning dependencies might seem like a way to ensure stability, it leads to significant drawbacks in flexibility, maintenance burden, and even reproducibility. It is more beneficial for libraries like Mesa to maintain compatibility with a range of dependency versions rather than pinning specific releases.
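To make the range-versus-pin distinction concrete, here is a small sketch using the `packaging` library (a common pip dependency); the version numbers are illustrative:

```python
from packaging.specifiers import SpecifierSet

# A loose range keeps users free to upgrade; an exact pin does not.
loose = SpecifierSet(">=1.26,<3")   # compatible-range constraint
pinned = SpecifierSet("==1.26.4")   # exact pin

print("2.1.0" in loose)    # True: a later release still satisfies the range
print("2.1.0" in pinned)   # False: the pin blocks every other version
```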
I agree that pinning down to the minor version or patch version is indeed problematic. But I said in #1852 (comment) that we should at least pin for the major version. Do you agree that we should have a non-specific requirements list for interoperability with other libraries, and a lockfile (poetry.lock, the output of pip-tools, ...) for reproducibility?
The occasional auto-updater should handle this just fine.
You can't be 100% sure it's an exact reproduction of a simulation result (which may matter for archaeology or bisection) unless you use the older versions, with the old bugs included as a "feature".
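The loose-requirements-plus-lockfile split proposed above could be sketched with pip-tools like this (file names are illustrative, not from the repository):

```shell
# requirements.in holds loose constraints, e.g. "numpy>=1.26".
pip install pip-tools
pip-compile requirements.in -o requirements.lock   # resolve and pin everything transitively
pip install -r requirements.lock                   # reproducible environment from the lockfile
```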
Only for minimum versions. For new versions, if it breaks, it breaks; we fix it and release a patch. But if we just follow SPEC 0, I don't think minimum versions are needed for most packages, unless we specifically want to use a newer feature from a package.
That’s a user problem. If they want perfect reproducibility, they can add a file with pinned requirements to their project/model. As a library, we should offer broad support where possible.
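One way a user could pin on their side, as suggested, is a simple environment snapshot (a sketch; the file name is illustrative):

```shell
# Record the exact versions currently installed in the user's environment.
pip freeze > requirements.txt
# Later, recreate the same environment to reproduce a simulation run.
pip install -r requirements.txt
```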
I’m closing this PR; it has been open too long. This was meant to be a quick “just in case” feature to make our internal testing a little more robust. Somehow we got into a huge, totally unrelated discussion about pinning dependencies. Even if that discussion is worth having, can we please have it in dedicated discussion threads, or open a separate PR for dependency management/policy? You can be inspired by a PR to start a discussion, but please do not let it block the PR, or hold it in the PR thread, unless it’s directly related. Open a new one. (CC @tpike3 @jackiekazil)
Might be useful to start testing against the NumPy 2.0 beta. Pandas 3.0 is also coming at some point.