How can I change Dropout during training? For example
Dropout= [0.1, 0.2, 0.3]
I tried passing it as a list, but I couldn't make it work.
To change the dropout probability during training, you should use the functional version, i.e. torch.nn.functional.dropout.
The input arguments to the functional version of dropout are the input tensor, the dropout probability p, and a training flag (to which you can pass self.training). Thus, you can alter the dropout probability in your forward method, according to your needs.
For example, you can do this in your forward method:
def forward(self, x):
    ...
    # apply some layers to the input
    h = self.my_layers(x)
    # set the value of p
    p = self.get_value_for_p()
    # apply dropout with the new p (only active when self.training is True)
    h = torch.nn.functional.dropout(h, p=p, training=self.training)
    ...
More on the functional version of dropout here: https://pytorch.org/docs/stable/nn.functional.html#dropout-functions
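Putting that together, here is a minimal self-contained sketch of the functional approach (not from the original answer; the toy layer sizes and the idea of passing p as a forward argument are assumptions for illustration):

import torch
import torch.nn.functional as F

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # arbitrary toy layer sizes, just for illustration
        self.fc1 = torch.nn.Linear(10, 20)
        self.fc2 = torch.nn.Linear(20, 2)

    def forward(self, x, p=0.5):
        h = torch.relu(self.fc1(x))
        # p can be different on every call; dropout is only applied
        # while the module is in training mode (after model.train())
        h = F.dropout(h, p=p, training=self.training)
        return self.fc2(h)

model = Net()
model.train()
out = model(torch.rand(4, 10), p=0.3)  # this batch uses p=0.3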
It is also possible to directly set the p attribute of the Dropout module:
def forward(self, x, p=0.5):
    self.dropout_layer.p = p
    x = self.layers1(x)
    x = self.dropout_layer(x)
    x = self.layers2(x)
    return x
Example to show that this works:
layer = torch.nn.Dropout(p=0.2)
x = torch.rand(5)
layer(x)  # tensor([0.0680, 0.0000, 1.4633, 6.5492, 0.0000])
layer.p = 0.9
layer(x)  # tensor([0.0000, 0.0000, 0.0000, 6.5492, 0.0000])
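To get a schedule like the one in the question (p = [0.1, 0.2, 0.3] over three epochs), you can combine this with your training loop and update p once per epoch. A minimal sketch, assuming a toy model with a single Dropout layer (the architecture and the per-epoch update point are assumptions):

import torch

model = torch.nn.Sequential(
    torch.nn.Linear(10, 20),
    torch.nn.ReLU(),
    torch.nn.Dropout(p=0.1),
    torch.nn.Linear(20, 2),
)
model.train()

schedule = [0.1, 0.2, 0.3]  # one dropout probability per epoch
for epoch, p in enumerate(schedule):
    model[2].p = p  # update the Dropout module in place
    # ... run your usual optimization steps for this epoch ...
    out = model(torch.rand(4, 10))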