iter_patch memory consumption #5611
Closed
Description
Discussed in #5610
Originally posted by schellenchris November 30, 2022
Hello,
I was playing around with the GridPatchDataset / training with smaller slices than the original volume, when I ran into some odd RAM OOM problems. After some investigation I noticed that line 290 in data/utils.py causes the problem:
Lines 288 to 290 in d0db5fd:

```python
# pad image by maximum values needed to ensure patches are taken from inside an image
if padded:
    arrpad = np.pad(arr, tuple((p, p) for p in patch_size_), look_up_option(mode, NumpyPadMode).value, **pad_opts)
```
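To illustrate the scale of the problem, here is a minimal sketch (with hypothetical, smaller-than-real volume and patch sizes) of how padding *every* axis, channel included, by the full patch size inflates memory:

```python
import numpy as np

# Hypothetical 4D volume: 1 channel, 128^3 voxels of float32 (~8 MB).
arr = np.zeros((1, 128, 128, 128), dtype=np.float32)
patch_size_ = (1, 32, 32, 32)  # patch size per dimension, channel included

# Padding every axis by the full patch size, as in the quoted snippet:
arrpad = np.pad(arr, tuple((p, p) for p in patch_size_))

print(arr.nbytes // 2**20, "MB ->", arrpad.nbytes // 2**20, "MB")
print(arrpad.shape)  # channel axis grew from 1 to 3
```

The padded copy is roughly ten times larger than the input here; on realistic clinical volumes the overhead reaches many gigabytes.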
Is there a reason to pad the image that much (in particular, padding the channel dimension doesn't make sense to me)?
I managed to reduce memory usage by 15 GB by modifying the code a bit and padding only the axis that slices are generated from. I was just wondering whether I am missing something.
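For reference, a minimal sketch of the modification described above: pad only the axis that slices are drawn from, leaving the channel and remaining spatial axes untouched. The helper name and arguments are hypothetical, not MONAI API:

```python
import numpy as np

def pad_sliced_axis_only(arr, patch_size, axis, mode="wrap"):
    """Hypothetical variant: pad only the slicing axis by its patch size,
    instead of padding every dimension of the array."""
    pad_width = [(0, 0)] * arr.ndim          # no padding by default
    pad_width[axis] = (patch_size[axis], patch_size[axis])
    return np.pad(arr, pad_width, mode=mode)

# (C, D, H, W) volume; slices are taken along the depth axis (axis=1).
vol = np.zeros((1, 16, 128, 128), dtype=np.float32)
padded = pad_sliced_axis_only(vol, (1, 4, 128, 128), axis=1)
print(padded.shape)  # (1, 24, 128, 128) - only the depth axis grew
```

Since only one axis grows, the padded copy stays close to the size of the input rather than multiplying across every dimension.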