TYP: Various typing updates #30041
Conversation
I don't think we should backport.
Force-pushed from 6d0b573 to d76fa43.
Co-authored-by: Joren Hammudoglu <[email protected]>
Co-authored-by: Joren Hammudoglu <[email protected]>
I am afraid your pushes got lost (again).
The mypy failure points at `numpy/__init__.pyi`, lines 5453 to 5455 (in 49e4e89); it can be addressed by changing the `flexible` definition so that it's something like this:

`class flexible(_RealMixin, generic[_FlexibleItemT_co], Generic[_FlexibleItemT_co]): ...  # type: ignore[misc]`
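A minimal sketch of that suggestion, with placeholder definitions standing in for the real stub machinery (the `_RealMixin`, `generic`, and `_FlexibleItemT_co` placeholders below are assumptions for illustration; the real ones live in `numpy/__init__.pyi`):

```python
from typing import Generic, TypeVar

# Placeholders, not the real numpy definitions.
_FlexibleItemT_co = TypeVar("_FlexibleItemT_co", covariant=True)

class _RealMixin: ...
class generic(Generic[_FlexibleItemT_co]): ...

# The suggested shape: also list Generic[_FlexibleItemT_co] as an explicit
# base so the item type parameter is spelled out on `flexible` itself.  The
# trailing ignore mirrors the suggestion above; in the real stub it suppresses
# a [misc] complaint about the bases (this toy may not trigger one).
class flexible(_RealMixin, generic[_FlexibleItemT_co], Generic[_FlexibleItemT_co]): ...  # type: ignore[misc]
```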
I did not push anything, though.
The pyspark primer diff originates from this line: `tmp_val = arg[np._NoValue]  # type: ignore[attr-defined]`, where `arg` is annotated as a `Series`. So the previous `type: ignore[attr-defined]` no longer covers the error that mypy now reports there (`call-overload`), and is flagged as unused.
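That pair of messages can be reproduced with a small, self-contained toy example (this is not the pyspark code; `Box` and its overloads are made up purely to show how an ignore with a stale error code behaves under mypy's `--warn-unused-ignores`):

```python
from __future__ import annotations

from typing import overload


class Box:
    # Toy container whose __getitem__ only accepts int or slice keys.
    @overload
    def __getitem__(self, key: int, /) -> str: ...
    @overload
    def __getitem__(self, key: slice, /) -> list[str]: ...
    def __getitem__(self, key: int | slice, /) -> str | list[str]:
        return "x" if isinstance(key, int) else ["x", "x"]


box = Box()
# The key type matches no overload, so mypy reports a call-overload error and
# notes that the "attr-defined" code on the ignore does not cover it; with
# --warn-unused-ignores the stale ignore is additionally reported as unused.
# That is the same combination mypy_primer shows for
# python/pyspark/pandas/series.py:1210.
val = box[b"key"]  # type: ignore[attr-defined]
```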
- Update numpy/__init__.pyi
- Update numpy/_core/numerictypes.pyi
Co-authored-by: Joren Hammudoglu <[email protected]>
With this, the last two failures should be taken care of.
Diff from mypy_primer, showing the effect of this PR on type check results on a corpus of open source code:

xarray (https://github.com/pydata/xarray)
+ xarray/compat/npcompat.py: note: In function "isdtype":
+ xarray/compat/npcompat.py:54: error: All conditional function variants must have identical signatures [misc]
+ xarray/compat/npcompat.py:54: note: Original:
+ xarray/compat/npcompat.py:54: note: def isdtype(dtype: dtype[Any] | type, kind: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None | tuple[type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None, ...]) -> bool
+ xarray/compat/npcompat.py:54: note: Redefinition:
+ xarray/compat/npcompat.py:54: note: def isdtype(dtype: dtype[Any] | type[Any], kind: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None | tuple[type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None, ...]) -> bool
- xarray/namedarray/dtypes.py: note: In function "maybe_promote":
- xarray/namedarray/dtypes.py:86: error: Too many arguments for "generic" [call-arg]
spark (https://github.com/apache/spark)
+ python/pyspark/pandas/series.py:1210: error: Unused "type: ignore" comment [unused-ignore]
+ python/pyspark/pandas/series.py:1210: error: No overload variant of "__getitem__" of "Series" matches argument type "_NoValueType" [call-overload]
+ python/pyspark/pandas/series.py:1210: note: Error code "call-overload" not covered by "type: ignore" comment
+ python/pyspark/pandas/series.py:1210: note: Possible overload variants:
+ python/pyspark/pandas/series.py:1210: note: def __getitem__(self, list[str] | Index[Any] | Series[Any] | slice[Any, Any, Any] | Series[builtins.bool] | ndarray[tuple[Any, ...], dtype[numpy.bool[builtins.bool]]] | list[builtins.bool] | tuple[Hashable | slice[Any, Any, Any], ...], /) -> Series[Any]
+ python/pyspark/pandas/series.py:1210: note: def __getitem__(self, str | bytes | date | datetime | timedelta | <7 more items> | complex | integer[Any] | floating[Any] | complexfloating[Any, Any], /) -> Any
+ python/pyspark/pandas/series.py:1212: error: Unused "type: ignore" comment [unused-ignore]
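For context on the xarray hit: mypy requires a function that is defined conditionally (for example a fallback branch like the one in xarray/compat/npcompat.py) to have exactly the same signature in every branch, and the updated numpy signature for `isdtype` apparently no longer matches xarray's fallback copy exactly. A toy reproduction of that error class, unrelated to the actual xarray code:

```python
from __future__ import annotations

import os

HAS_NATIVE_ISDTYPE = os.environ.get("HAS_NATIVE_ISDTYPE") == "1"

if HAS_NATIVE_ISDTYPE:
    def isdtype(dtype: type, kind: str) -> bool:
        return True
else:
    # The `kind` parameter is wider than in the branch above, so mypy reports
    # on this definition:
    #   error: All conditional function variants must have identical signatures  [misc]
    def isdtype(dtype: type, kind: str | tuple[str, ...]) -> bool:
        return True
```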
jorenham left a comment
I went over it again, and I think it should be good to merge now.
Thanks @charris