
Use dim= instead of axis= in comfort_reward._within_bound #99

Open
lonexreb wants to merge 1 commit into NVlabs:main from lonexreb:refactor/comfort-reward-axis-to-dim

Conversation

Contributor

@lonexreb lonexreb commented May 4, 2026

Problem

_within_bound in finetune/rl/rewards/comfort_reward.py is the only spot in the file that uses NumPy-style axis= rather than PyTorch's dim=:

return torch.all(metric_within_bound, axis=-1).float()

Every other reduction in the same file uses the PyTorch-idiomatic dim=:

| Line | Call |
| --- | --- |
| 47, 49 | `delta = tensor[..., 1:] - tensor[..., :-1]` (slicing, fine) |
| 54 | `torch.diff(yaw, dim=-1)` |
| 91 | `torch.linalg.norm(..., dim=-1)` |
| 128 | `_within_bound(...).mean(2)` (positional dim) |
| 131 | `comfort_metric_dict[name].mean(dim=-1)` |
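For context, here is a minimal sketch of what a helper like `_within_bound` plausibly looks like. The signature, the bound arguments, and the comparison logic are assumptions for illustration; only the `torch.all(metric_within_bound, ...)` reduction is quoted from the file, shown here already using the idiomatic `dim=`:

```python
import torch

def _within_bound(metric: torch.Tensor, low: float, high: float) -> torch.Tensor:
    """Hypothetical sketch: returns 1.0 for each row whose elements all
    fall inside [low, high] along the last dimension, else 0.0."""
    metric_within_bound = (metric >= low) & (metric <= high)
    # This reduction is the line the PR touches: axis=-1 -> dim=-1
    return torch.all(metric_within_bound, dim=-1).float()

scores = _within_bound(torch.tensor([[0.1, 0.2], [0.1, 2.0]]), 0.0, 1.0)
print(scores)  # tensor([1., 0.])
```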

PyTorch accepts axis= as a deprecated alias for dim=, so the call still works — this is purely a consistency fix.

Fix

One-keyword change: `axis=-1` becomes `dim=-1`. Semantics are identical (`torch.all(x, axis=-1) == torch.all(x, dim=-1)`).
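The equivalence is easy to check directly; PyTorch's reduction ops accept `axis=` as an alias for `dim=`, so both spellings produce the same tensor (the example tensor below is illustrative, not from the repository):

```python
import torch

x = torch.tensor([[True, True], [True, False]])

# Deprecated NumPy-style keyword, still accepted by PyTorch reductions
via_axis = torch.all(x, axis=-1)
# PyTorch-idiomatic spelling, used everywhere else in the file
via_dim = torch.all(x, dim=-1)

assert torch.equal(via_axis, via_dim)
print(via_dim)  # tensor([ True, False])
```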

Verification

After the change: grep -n "axis=" finetune/rl/rewards/comfort_reward.py returns nothing.


Signed-off-by: lonexreb <[email protected]>
