Hi,
I recently switched my codebase from TensorFlow to PyTorch. After watching the tutorials and looking at some example projects, I felt quite excited about PyTorch. It is much easier to use and more transparent than TensorFlow. The source code is very enjoyable to read. Thanks for making it public.
I have a feature request stated as follows.
I am using BCELoss to train the discriminator in a GAN. However, I noticed that BCELoss expects probabilities that have already been passed through a sigmoid layer, and taking the log of these saturated probabilities is numerically unstable. In my TensorFlow implementation I used sigmoid_cross_entropy_with_logits, which prevents the overflow that often happens when training from scratch.
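To illustrate the instability, here is a minimal sketch (the logit value -200 is just an arbitrary large-magnitude example of the kind seen early in training): the post-sigmoid loss overflows, while the logits formulation stays finite.

import torch

logit = torch.tensor([-200.0])   # large-magnitude logit
target = torch.tensor([1.0])

# BCE on sigmoid outputs: sigmoid(-200) underflows to 0 in float32,
# so log(p) becomes -inf and the loss is inf.
p = torch.sigmoid(logit)
naive = -(target * p.log() + (1 - target) * (1 - p).log())   # tensor([inf])

# Same loss written in terms of the logit:
# max(x, 0) - x * t + log(1 + exp(-|x|)) -- every term stays finite (~200 here).
stable = logit.clamp(min=0) - logit * target + (1 + (-logit.abs()).exp()).log()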
I wrote a very basic module to replace BCELoss in my code. The code below illustrates the idea and is definitely not meant to be a polished implementation. I feel it is necessary to add a similar loss module to address this numerical stability issue.
import torch.nn as nn

class StableBCELoss(nn.Module):
    def __init__(self):
        super(StableBCELoss, self).__init__()

    def forward(self, input, target):
        # max(x, 0) - x * t + log(1 + exp(-|x|)): the numerically stable form
        # of sigmoid followed by binary cross-entropy, applied to raw logits.
        neg_abs = -input.abs()
        loss = input.clamp(min=0) - input * target + (1 + neg_abs.exp()).log()
        return loss.mean()
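For context, a sketch of how I use it, assuming the discriminator outputs raw logits with no final Sigmoid layer (the network, batch size, and input dimension here are hypothetical placeholders):

import torch
import torch.nn as nn

# Hypothetical discriminator that ends in a linear layer, not nn.Sigmoid().
D = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

criterion = StableBCELoss()          # class defined above
real = torch.randn(16, 784)          # placeholder batch of real samples
d_loss_real = criterion(D(real), torch.ones(16, 1))
d_loss_real.backward()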