Labels: feature (A request for a proper, new feature), module: dataloader (Related to torch.utils.data.DataLoader and Sampler), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
I am trying to implement dynamic samplers in PyTorch. This seems impossible with the existing DataLoader and Sampler APIs, because WeightedRandomSampler only allows the weights to be set once, at construction time. I would like the sampling weights to change based on the loss value of each individual data point. This can be done without the DataLoader API, but I think it is a feature that could be included in a future release.
Something as simple as loss-based sampling cannot be done right now. A minimal sketch of what I mean is below.
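As a rough illustration (not an existing PyTorch API), one could write a custom sampler whose weights are updated from per-sample losses between epochs. The names `LossWeightedSampler`, `IndexedDataset`, and `update_weights` are hypothetical; the training objects (`dataset`, `model`, `criterion` with `reduction='none'`, `optimizer`) are assumed to exist.

```python
import torch
from torch.utils.data import Dataset, DataLoader, Sampler

class LossWeightedSampler(Sampler):
    """Hypothetical sampler whose per-sample weights can be updated between epochs."""
    def __init__(self, num_samples):
        self.num_samples = num_samples
        # Start from uniform weights; they are overwritten with observed losses later.
        self.weights = torch.ones(num_samples, dtype=torch.double)

    def update_weights(self, indices, losses):
        # Use the latest per-sample loss as the (unnormalized) sampling weight.
        self.weights[indices] = losses.detach().double()

    def __iter__(self):
        # Draw a fresh weighted sample each epoch, after the weights have been updated.
        drawn = torch.multinomial(self.weights, self.num_samples, replacement=True)
        return iter(drawn.tolist())

    def __len__(self):
        return self.num_samples


class IndexedDataset(Dataset):
    """Wraps a dataset so each batch also carries the sample indices."""
    def __init__(self, base):
        self.base = base

    def __getitem__(self, idx):
        x, y = self.base[idx]
        return x, y, idx

    def __len__(self):
        return len(self.base)
```

Usage would look roughly like this, feeding the observed losses back into the sampler so the next epoch over-samples the hard examples:

```python
sampler = LossWeightedSampler(len(dataset))
loader = DataLoader(IndexedDataset(dataset), batch_size=32, sampler=sampler)

for epoch in range(num_epochs):
    for x, y, idx in loader:
        per_sample_loss = criterion(model(x), y)  # criterion assumed to use reduction='none'
        per_sample_loss.mean().backward()
        optimizer.step()
        optimizer.zero_grad()
        sampler.update_weights(idx, per_sample_loss)
```

This works, but it requires wrapping the dataset just to recover the indices; built-in support for updating sampler weights would make it much cleaner.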