Adding L1/L2 Regularization In PyTorch?


Answer:

The following should help for L2 regularization:

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5) 

This is covered in the PyTorch documentation; have a look at http://pytorch.org/docs/optim.html#torch.optim.Adagrad. You can add an L2 penalty through the weight_decay parameter of the optimizer.
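For context, here is a minimal sketch of how that optimizer is used in a training step (the Linear model, dummy batch, and loss function below are placeholders, not part of the original answer):

import torch

model = torch.nn.Linear(10, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)

inputs = torch.randn(32, 10)     # dummy batch, only to make the example runnable
targets = torch.randn(32, 1)

optimizer.zero_grad()
loss = criterion(model(inputs), targets)   # no explicit penalty term in the loss;
loss.backward()                            # weight_decay applies the L2 part in the update
optimizer.step()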


Alternatively, to add the L2 penalty to the loss manually:

l2_lambda = 0.01
l2_reg = torch.tensor(0.)
for param in model.parameters():
    l2_reg += torch.norm(param)
loss += l2_lambda * l2_reg
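Since the question also asks about L1, the same pattern works with an L1 penalty; a minimal sketch (the l1_lambda value here is just an illustrative choice, not from the original answer):

l1_lambda = 0.01
l1_reg = torch.tensor(0.)
for param in model.parameters():
    l1_reg += param.abs().sum()    # L1 norm: sum of absolute parameter values
loss += l1_lambda * l1_reg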

References:

  • https://discuss.pytorch.org/t/how-does-one-implement-weight-regularization-l1-or-l2-manually-without-optimum/7951
  • http://pytorch.org/docs/master/torch.html?highlight=norm#torch.norm
