 

How to initialize the weights of different layers of nn.Sequential block in different styles in pytorch?

Tags: python, pytorch

Suppose I have an nn.Sequential block with 2 linear layers. I want to initialize the weights of the first layer from a uniform distribution, but initialize the weights of the second layer with the constant value 2.0.

net = nn.Sequential()
net.add_module('Linear_1', nn.Linear(2, 5, bias = False))
net.add_module('Linear_2', nn.Linear(5, 5, bias = False))
asked Oct 15 '25 by Akshit Mittal

1 Answer

Here is one way of doing so:

import torch
import torch.nn as nn 

net = nn.Sequential()

ll1 = nn.Linear(2, 5, bias = False)
torch.nn.init.uniform_(ll1.weight, a=0, b=1) # a: lower_bound, b: upper_bound
net.add_module('Linear_1', ll1)
print(ll1.weight)

ll2 = nn.Linear(5, 5, bias = False)
torch.nn.init.constant_(ll2.weight, 2.0)
net.add_module('Linear_2', ll2)
print(ll2.weight)

print(net)

Output:

Parameter containing:
tensor([[0.2549, 0.7823],
        [0.3439, 0.4721],
        [0.0709, 0.6447],
        [0.3969, 0.7849],
        [0.7631, 0.5465]], requires_grad=True)

Parameter containing:
tensor([[2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.]], requires_grad=True)

Sequential(
  (Linear_1): Linear(in_features=2, out_features=5, bias=False)
  (Linear_2): Linear(in_features=5, out_features=5, bias=False)
)
answered Oct 17 '25 by Anubhav Singh


