use prepare_metadata_for_build_wheel instead of parsing setup.py #2296
finswimmer wants to merge 12 commits into python-poetry:develop
Conversation
This seems to be reimplementing a lot of the functionality from …
@pradyunsg Your comment was fast :) At first I was a bit irritated by it, unsure what you meant by …
And then I found this: https://pypi.org/project/pep517/ I guess this is what you meant? Looks promising. Will have a look at this. Thanks a lot! 👍
We're all kinds of bad at naming things. :) I was on mobile, so linking to the PyPI page was... possible but annoyingly difficult. Sorry!
abn left a comment
Overall, I think this is a step in the right direction. The AST parser was too fragile anyway. One minor thing to note is that this will remove the improvements from #2257, which I think is okay because we are getting more accurate results.
A few minor questions and suggestions in the code. Leaving this as a comment since this is not ready for review yet afaict.
command._pool = poetry.pool

- mocker.patch("poetry.utils._compat.Path.open")
+ # mocker.patch("poetry.utils._compat.Path.open")
If you have not already, consider adding a test fixture for this, since it is reused. Something like this might be useful:
@pytest.fixture
def mocked_pyproject_toml(mocker):
    # keep a reference to the real open so non-matching paths still work
    original_open = Path.open

    def side_effect(self, *args, **kwargs):
        if self.name == "pyproject.toml":
            return MagicMock()
        return original_open(self, *args, **kwargs)

    patched_open = mocker.patch("poetry.utils._compat.Path.open")
    patched_open.side_effect = side_effect
    yield patched_open
- def test_search_for_vcs_read_setup_raises_error_if_no_version(provider, mocker):
+ def test_search_for_vcs_read_setup_with_dynamic_version(provider, mocker):
Should we consider retaining the old case as well since it is an expected failure?
install_requires = ["python-dateutil>=2.6,<3.0", "pytzdata>=2018.3"]

- extras_require = {':python_version < "3.5"': ["typing>=3.6,<4.0"]}
+ extras_require = {'typing:python_version < "3.5"': ["typing>=3.6,<4.0"]}
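For reference, setuptools encodes conditional dependencies in the extras_require key itself: an empty name before the colon means a marker applied to install_requires, while a name (here typing) declares a real extra gated by the marker. A stdlib-only sketch of splitting such a key (the helper name is illustrative, not from the PR):

```python
def parse_extras_key(key):
    """Split a setuptools extras_require key into (extra_name, marker).

    ':python_version < "3.5"'        -> (None, marker)   # conditional install_requires
    'typing:python_version < "3.5"'  -> ('typing', marker)
    'docs'                           -> ('docs', None)   # plain extra
    """
    extra, sep, marker = key.partition(":")
    return (extra or None, marker if sep else None)
```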
@pytest.mark.skipif(not PY35, reason="AST parsing does not work for Python <3.4")
def test_setup_reader_read_setup_call_in_main(setup):
Does this case still work as expected with this change? I.e., a setup file with the setup() call inside main.
    from importlib.metadata import PathDistribution
except ImportError:
    from ConfigParser import ConfigParser
    from importlib_metadata import PathDistribution
    not result["install_requires"]
    and not result["extras_require"]
    and not result["python_requires"]

def read_from_pep517_hook(cls, directory):
Fwiw, I think it might be great to retain the name for now, since we are reading from the given directory.
    setup_call, body, "python_requires"
)
with pep517.envbuild.BuildEnvironment() as env, temporary_directory() as tmp_dir:
    env.pip_install([cls.build_requires])
Does pep517 do something like python -m ensurepip when using pip commands?
if distribution.requires:
    for record in distribution.requires:
        requirements = record.split(";", 1)
Can we simply use poetry.core.packages.dependency_from_pep_508(record) here?
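For context, the records in distribution.requires are PEP 508 strings such as 'typing (>=3.6,<4.0); python_version < "3.5"'. A stdlib-only sketch of the split-based handling the diff uses (the helper name is illustrative; dependency_from_pep_508 would subsume this logic):

```python
def split_record(record):
    """Split a PEP 508 Requires-Dist record into (requirement, marker)."""
    parts = record.split(";", 1)
    requirement = parts[0].strip()
    marker = parts[1].strip() if len(parts) > 1 else None
    return requirement, marker
```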
except IndexError:
    result["install_requires"].append(project_name)

egg_info = Path(directory).glob("*.egg-info")
We should not need this anymore, correct? Since the metadata will be generated in the isolated environment?
    "pendulum.tz.data",
    "pendulum.tz.zoneinfo",
    "pendulum.utils",
    # "pendulum._extensions",
Is this because we use a truncated source in the fixture?
Thanks for the thorough and comprehensive work on this PR! However, I think the purpose of the … But for projects using … By going this route, we might end up losing performance (using the …). I am not too keen on executing anything just for the sake of getting metadata (even using the …). I know it's a tricky subject and not everyone agrees but, in my opinion, the build backend should only be used to build a wheel for installation and nothing else.

That being said, I appreciate the work you put into it, don't get me wrong, and if someone can guarantee me that nothing will ever be executed by going this PR route, I am willing to do it. But for now I would prefer not to go this way, even if it follows the current packaging guidelines.

And, again, this is great work and I appreciate the time put into it. And I am open for discussion :-)
@sdispater the AST parsing solution can still be retained. In our current implementation, as I understand it, we do execute the …
@abn This is true only for git and directory dependencies (since in this case we assume people know what they are doing). So if we remove the change from the …

The other thing that bothers me (even though I must admit it's already the case) is that it will build a new environment each time. A better solution would be to create a unique environment lazily that will be reused during the dependency resolution and cleaned up after.

I know the AST parser is flaky (I wrote most of it 😅) but it works reasonably well, so we should try to rely on it as much as possible.
Ha, I agree that the AST parser is a decent fallback - we can obviously improve it as we go along. Considering we can cache this information for packages and that these would be the exception cases, I do not consider the performance hit of spinning up an isolated environment to be considerable or unmanageable (we can always improve this later). I guess there are different views on that one. Fwiw, if we can get better metadata, I feel that this or a similar solution might be worth it. It will also mean we can support non-poetry PEP-517 source packages too (the ones without the metadata). Also, this discussion is relevant for #2301 as well.
Hello @sdispater, I can understand your reservation about executing something to get the necessary metadata, and I would also prefer something static. That's why I started the discussion here. The current result of that discussion is that using …

The biggest plus of my implementation I see in the fact that we can support other packaging tools aside from poetry and …

Maybe we can use …

@abn: Thanks a lot for your code comments. I will go through them later.
I agree that using the AST parser, when the backend is setuptools, is a good idea. Maybe the setuptools backend should be doing these AST shenanigans instead, but that's not something under poetry's control, so it's definitely a good idea to use the AST parser for situations where it's pretty unambiguous what the results would be. For non-setuptools backends though, I do think it'd be good to have PEP 517 support in poetry. :)
So, I agree with @pradyunsg: If the backend is … For other backends, we use the appropriate PEP-517 hooks. I don't think we can use the …
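The compromise above amounts to a simple dispatch on the build backend declared in pyproject.toml: keep the static parser for setuptools, fall back to PEP 517 hooks otherwise. A sketch of that decision (function name and strategy labels are made up for illustration):

```python
# Backends the existing static/AST parser can keep handling.
# A missing [build-system] table (build_backend is None) implies legacy setuptools.
SETUPTOOLS_BACKENDS = {None, "setuptools.build_meta", "setuptools.build_meta:__legacy__"}

def choose_metadata_strategy(build_backend):
    """Pick a metadata strategy: static parsing for setuptools, PEP 517 hooks otherwise."""
    if build_backend in SETUPTOOLS_BACKENDS:
        return "static-parse"
    return "pep517-hooks"
```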
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
This PR is a complete rework of how poetry gathers metadata about non-poetry projects, which is necessary for dependency resolution in case no sdist or wheel is available.

In the original implementation this was done by trying to parse the setup.py or setup.cfg. This has the strong limitation to situations where version and dependencies are not determined at runtime. And poetry has to run with Python 3, because the ast module for Python 2 works in a different way.

Build tools like setuptools that implement PEP 517 should provide a method prepare_metadata_for_build_wheel for getting access to the metadata one needs for dependency resolution.

To gain access to this API hook, the following steps are needed:
- call prepare_metadata_for_build_wheel for the build backend on the folder containing the package
- read the generated metadata via importlib.metadata

A lot of this work is already done in the pep517 package.
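The reading step boils down to parsing the email-header style METADATA file that prepare_metadata_for_build_wheel writes into the generated dist-info directory; importlib.metadata's PathDistribution does this for you. A stdlib sketch of that parsing (the helper name is illustrative, not from the PR):

```python
from email.parser import Parser

def parse_core_metadata(text):
    """Parse the RFC 822 style core metadata produced by the build backend."""
    msg = Parser().parsestr(text)
    return {
        "name": msg["Name"],
        "version": msg["Version"],
        # Requires-Dist may appear multiple times, once per dependency
        "requires_dist": msg.get_all("Requires-Dist") or [],
        "requires_python": msg["Requires-Python"],
    }
```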
This implementation has these advantages:

- metadata for packages with a dynamic setup.py can be obtained
- it is not limited to setuptools; virtually any PEP 517 compliant build system can be integrated easily

The only downside is that the whole process needs more time than parsing. In my opinion this is negligible compared to the advantages.
Pull Request Check List
This is just a reminder about the most common mistakes. Please make sure that you tick all appropriate boxes. But please read our contribution guide at least once, it will save you unnecessary review cycles!
- [ ] Updated documentation for changed code.