
quantile with Dask arrays #3326


Description

@jkmacc-LANL

Currently the quantile method raises an exception when it encounters a Dask array.

        if isinstance(self.data, dask_array_type):
            raise TypeError(
                "quantile does not work for arrays stored as dask "
                "arrays. Load the data via .compute() or .load() "
                "prior to calling this method."
            )

I think this is because computing a quantile needs to see all the data along the dimension being reduced, and blocked/approximate methods weren't on hand when the feature was added. Dask arrays in which the quantile dimension spans exactly one chunk seem like a special case where no blocked algorithm is needed.
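For that single-chunk special case, a workaround that seems possible today is to push np.quantile into each block with xr.apply_ufunc(..., dask="parallelized"). This is only a rough sketch, not xarray's own implementation; the dask_quantile helper name and the scalar-q restriction are my assumptions:

    import numpy as np
    import xarray as xr

    def dask_quantile(da, q, dim):
        # Sketch only: assumes `dim` spans exactly one chunk and that `q`
        # is a scalar, so the core dimension can be reduced inside each block.
        return xr.apply_ufunc(
            np.quantile,
            da,
            kwargs={"q": q, "axis": -1},   # core dim is moved to the last axis
            input_core_dims=[[dim]],       # reduce over `dim`
            dask="parallelized",           # run np.quantile block by block
            output_dtypes=[float],         # np.quantile promotes to float
        )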

The problem with following the exception's suggestion (loading the array into memory) is that "wide and shallow" arrays can be too big to load into memory, yet each chunk is statistically independent as long as the quantile dimension is the "shallow" one.
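To make that layout concrete, here is a small, hypothetical "wide and shallow" array: chunked only along the wide dimension, so every block holds the full extent of the shallow dimension and can be reduced on its own (the sizes and dimension names are made up, and this reuses the dask_quantile sketch above):

    da = xr.DataArray(
        np.random.rand(50_000, 24),
        dims=("space", "time"),            # "space" is wide, "time" is shallow
    ).chunk({"space": 5_000, "time": -1})  # -1 keeps "time" as one chunk

    median = dask_quantile(da, q=0.5, dim="time")
    print(median.compute().shape)          # (50000,)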

I'm not necessarily proposing delegating to Dask's quantile (unless it's super easy), but I wanted to explore the special case described above.
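For reference, Dask does expose dask.array.percentile, but as far as I can tell it only handles 1-d arrays (it combines per-chunk summaries at the end), so delegating to it for multi-dimensional DataArrays would not be trivial. A quick 1-d illustration, with arbitrary values:

    import dask.array as darr

    x = darr.random.random(1_000_000, chunks=100_000)  # 1-d dask array
    p = darr.percentile(x, [50.0])   # the median expressed as a percentile
    print(p.compute())               # roughly [0.5] for uniform random data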

Related links:

Thank you!

EDIT: added stackoverflow link
