
I have a problem calculating binary cross entropy. The way I know works in PyTorch is:

import torch
import torch.nn as nn
import torch.nn.functional as F
def lossFunc():
    return F.binary_cross_entropy

criterion = lossFunc()
input = torch.randn((3, 2), requires_grad=True)
target = torch.rand((3, 2), requires_grad=False)
loss = criterion(torch.sigmoid(input), target)

But how can I complete lossFunc() in the form below? I don't know how to pass the arguments to the function:

# the function that adds a sigmoid to the input and computes the binary cross entropy loss
def lossFunc():
    return

criterion = lossFunc()
input = torch.randn((3, 2), requires_grad=True)
target = torch.rand((3, 2), requires_grad=False)
loss = criterion(input, target)

1 Answer


I think you're confusing the nn API with the functional F API. In the functional API, the loss function F.binary_cross_entropy can be called directly as a function.

In the nn API, you need to create an object of the loss class first, e.g. criterion = nn.BCELoss(). The two styles are contrasted in the short sketch below.
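
Here is a minimal sketch of the two styles side by side; the tensors are just random placeholders, and both calls expect probabilities (i.e. sigmoid already applied):

import torch
import torch.nn as nn
import torch.nn.functional as F

probs = torch.sigmoid(torch.randn(3, 2))   # probabilities in (0, 1)
target = torch.rand(3, 2)

# nn API: instantiate the loss class once, then call the object
criterion = nn.BCELoss()
loss_nn = criterion(probs, target)

# functional API: call the function directly, no object needed
loss_fn = F.binary_cross_entropy(probs, target)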

Thus, you can simply do:

def lossFunc(input, target):
    # apply sigmoid to the raw input, then compute binary cross entropy
    return F.binary_cross_entropy(torch.sigmoid(input), target)

input = torch.randn((3, 2), requires_grad=True)
target = torch.rand((3, 2), requires_grad=False)
loss = lossFunc(input, target)

Also, PyTorch provides nn.BCEWithLogitsLoss() and F.binary_cross_entropy_with_logits(), which combine the sigmoid and the binary cross-entropy in a single, numerically more stable call.
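
For example, a small sketch (with random placeholder tensors) showing that the with-logits variants take the raw, pre-sigmoid input:

import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn((3, 2), requires_grad=True)   # raw scores, no sigmoid applied
target = torch.rand((3, 2))

criterion = nn.BCEWithLogitsLoss()                  # nn API
loss_nn = criterion(logits, target)

loss_f = F.binary_cross_entropy_with_logits(logits, target)  # functional API

# both apply the sigmoid internally, so the two losses agree
assert torch.allclose(loss_nn, loss_f)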
