Commit d2e6730

docs: clarify torch.arange floating-point rounding behavior
1 parent 9b89fa4 commit d2e6730

File tree

1 file changed: +6 −0 lines changed


torch/_torch_docs.py

Lines changed: 6 additions & 0 deletions
@@ -9122,6 +9122,12 @@ def merge_dicts(*dicts):
 with values from the interval ``[start, end)`` taken with common difference
 :attr:`step` beginning from `start`.
 
+Note: When using floating-point dtypes (especially reduced precision types like ``bfloat16``),
+the results may be affected by floating-point rounding behavior. Some values in the sequence
+might not be exactly representable in certain floating-point formats, which can lead to
+repeated values or unexpected rounding. For precise sequences, it is recommended to use
+integer dtypes instead of floating-point dtypes.
+
 Note that non-integer :attr:`step` is subject to floating point rounding errors when
 comparing against :attr:`end`; to avoid inconsistency, we advise subtracting a small epsilon from :attr:`end`
 in such cases.
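For context, a minimal sketch (not part of the commit) of the behavior the new note describes. In bfloat16, adjacent representable values in [256, 512) are 2 apart, so start + i*step can round to a repeated value; exact printed outputs may vary across PyTorch versions and backends.

import torch

# bfloat16 keeps only 8 significand bits, so in [256, 512) adjacent
# representable values are 2 apart; each element rounds to the nearest
# one, which can repeat values (257 rounds down to 256).
print(torch.arange(256, 260, 1, dtype=torch.bfloat16))
# e.g. tensor([256., 256., 258., 260.], dtype=torch.bfloat16)

# An integer dtype yields the exact sequence.
print(torch.arange(256, 260, 1, dtype=torch.int64))
# tensor([256, 257, 258, 259])

# Non-integer step can overshoot end due to rounding; per the existing
# note, subtracting a small epsilon from end avoids a stray element.
print(torch.arange(1.0, 1.3, 0.1))         # may include a 4th value near 1.3
print(torch.arange(1.0, 1.3 - 1e-6, 0.1))  # 1.0, 1.1, 1.2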
