I am trying to set constraints on the weight parameters in PyTorch, e.g. that the sum of every row of the weight matrix of a fully connected layer is exactly one:
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.layer1 = nn.Linear(28*28, 10*10)
        self.layer2 = nn.Linear(10*10, 5*5)
        self.layer3 = nn.Linear(5*5, 10)

    def forward(self, x):
        x = torch.sigmoid(self.layer1(x))
        x = torch.sigmoid(self.layer2(x))
        x = torch.sigmoid(self.layer3(x))
        return x

model = Net()
The constraint for this example network would be:
torch.sum(model.layer1.weight, dim=1) == 1
torch.sum(model.layer2.weight, dim=1) == 1
torch.sum(model.layer3.weight, dim=1) == 1
A commonly used method for constraining parameters, clamp, works element-wise, but here I need a constraint on every row of the weight matrix rather than on individual elements. Is there a way to implement this kind of constraint?
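For illustration, the only direction I can think of (and I am not sure it is the right way) is to re-normalize each row manually after every optimizer step, along the lines of the sketch below. renormalize_rows is just a made-up helper name for this question, not something from a library:

import torch

def renormalize_rows(model):
    # Rescale each row of every weight matrix so it sums to one again.
    # Done under no_grad so the projection is not part of the autograd graph.
    with torch.no_grad():
        for layer in (model.layer1, model.layer2, model.layer3):
            row_sums = layer.weight.sum(dim=1, keepdim=True)
            layer.weight.div_(row_sums)  # assumes no row sums to (near) zero

I would call this right after optimizer.step() in the training loop, but it feels like a hack, so I would prefer a cleaner or more principled mechanism if one exists.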