
Multiple Python restricted dependencies not solved properly #4947

@Erhanjinn

Description

  • I am on the latest Poetry version.
  • I have searched the issues of this repo and believe that this is not a duplicate.
  • If an exception occurs when executing a command, I executed it again in debug mode (-vvv option).
  • OS version and name: Ubuntu 20.04
  • Poetry version: 1.1.12
  • Link of a Gist with the contents of your pyproject.toml file:

Issue

Hello, I have run into issues with dependency solving when using multiple Python-restricted dependencies, as follows:

...
[tool.poetry.dependencies]
python = "^3.8.6"
py4j = [
    {version = "0.10.9", python = "~3.8"},
    {version = "^0.10.9.2", python = "^3.9"}
]
pyspark = [
    {version = "3.1.2", python = "~3.8"},
    {version = "^3.2.0", python = "^3.9"}
]
...

with the error (after running poetry update):

Updating dependencies
Resolving dependencies... (0.1s)

  SolverProblemError

  Because no versions of pyspark match >3.2.0,<4.0.0
   and pyspark (3.2.0) depends on py4j (0.10.9.2), pyspark (>=3.2.0,<4.0.0) requires py4j (0.10.9.2).
  So, because my-custom-package depends on both py4j (0.10.9) and pyspark (^3.2.0), version solving failed.

  at ~/miniconda3/envs/myenv395/lib/python3.9/site-packages/poetry/puzzle/solver.py:241 in _solve
      237│             packages = result.packages
      238│         except OverrideNeeded as e:
      239│             return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
      240│         except SolveFailure as e:
    → 241│             raise SolverProblemError(e)
      242│ 
      243│         results = dict(
      244│             depth_first_search(
      245│                 PackageNode(self._package, packages), aggregate_package_nodes


When I try the versions for Python ~3.8 and for Python ^3.9 separately, as

...
[tool.poetry.dependencies]
python = "^3.8.6"
py4j = "0.10.9"
pyspark = "3.1.2"
...

and

...
[tool.poetry.dependencies]
python = "^3.8.6"
py4j = "^0.10.9.2"
pyspark = "^3.2.0"
...

everything goes fine.

Why does Poetry mix the Python-specific dependencies together instead of respecting the Python version conditions?
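For context, the two Python markers above should partition the supported interpreter range with no overlap, so the solver should never need both py4j constraints at the same time. A quick sketch with the third-party `packaging` library illustrates this (the translation of Poetry's `~3.8` and `^3.9` into the PEP 440 ranges `>=3.8,<3.9` and `>=3.9,<4.0` is my reading of Poetry's constraint syntax, not something the resolver exposes):

```python
# Check that Poetry's "~3.8" and "^3.9" markers describe disjoint
# Python ranges, translated by hand into PEP 440 specifier sets.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

tilde_38 = SpecifierSet(">=3.8,<3.9")  # Poetry "~3.8"
caret_39 = SpecifierSet(">=3.9,<4.0")  # Poetry "^3.9"

for raw in ["3.8.6", "3.8.12", "3.9.5", "3.10.1"]:
    v = Version(raw)
    in_38 = v in tilde_38
    in_39 = v in caret_39
    # Every interpreter version falls into exactly one of the two
    # ranges, so only one py4j/pyspark constraint applies at a time.
    assert in_38 != in_39
    print(raw, "matches ~3.8" if in_38 else "matches ^3.9")
```

Given that the ranges are disjoint, I would expect the resolver to evaluate each constraint only under its own Python marker rather than combining them.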

Metadata

Labels

area/solver (Related to the dependency resolver), kind/bug (Something isn't working as expected)
