Changing Dropout value during training

How can I change the dropout probability during training? For example:

Dropout = [0.1, 0.2, 0.3]

I tried passing it as a list, but I couldn't make it work.

manda2 asked Oct 18 '25

2 Answers

To change the dropout probability during training, you should use the functional version, i.e. torch.nn.functional.dropout.

The input arguments to the functional version of dropout are:

  • the input tensor
  • the dropout probability (which you can alter)
  • a boolean indicating whether it is in training mode (you can use self.training)
  • a flag indicating whether the operation should be performed in place

Thus, you can alter the dropout probability in your forward method, according to your needs. For example:

def forward(self, x):
    ...
    # apply some layers to the input
    h = self.my_layers(x)

    # set the value of p
    p = self.get_value_for_p()

    # apply dropout with the new p
    h = torch.nn.functional.dropout(h, p, self.training)
    ...

More on the functional version of dropout can be found here: https://pytorch.org/docs/stable/nn.functional.html#dropout-functions
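Putting this together with the schedule from the question, a minimal runnable sketch might look like the following (the module, layer sizes, and the idea of storing the current probability on the module are illustrative assumptions, not part of the answer):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)
        self.p = 0.1  # current dropout probability, updated from outside

    def forward(self, x):
        h = self.fc(x)
        # functional dropout reads the current value of self.p on every call
        return F.dropout(h, p=self.p, training=self.training)

model = Net()
schedule = [0.1, 0.2, 0.3]  # the list from the question
for epoch, p in enumerate(schedule):
    model.p = p  # change the dropout probability before each epoch
    out = model(torch.rand(4, 10))
```

Because the functional call re-reads `self.p` each time, updating the attribute between epochs is all that is needed to change the dropout rate mid-training.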

Xxxo answered Oct 21 '25


It is also possible to directly access the p attribute of the Dropout module:

def forward(self, x, p=0.5):
    self.dropout_layer.p = p
    x = self.layers1(x)
    x = self.dropout_layer(x)
    x = self.layers2(x)
    return x

Example to show that this works (note that torch.rand needs a size argument, and that the output is random, with surviving elements scaled by 1/(1-p)):

layer = torch.nn.Dropout(p=0.2)
x = torch.rand(5)
layer(x)  # e.g. tensor([0.0680, 0.0000, 1.4633, 6.5492, 0.0000])
layer.p = 0.9
layer(x)  # e.g. tensor([0.0000, 0.0000, 0.0000, 6.5492, 0.0000])
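Applied to the schedule from the question, this attribute-based approach can be sketched as follows (the linear layer and training loop are illustrative assumptions, not from the answer):

```python
import torch
import torch.nn as nn

layer = nn.Linear(8, 8)
dropout = nn.Dropout(p=0.1)

schedule = [0.1, 0.2, 0.3]  # the list from the question
for epoch, p in enumerate(schedule):
    dropout.p = p  # mutate the module's probability in place
    x = torch.rand(2, 8)
    y = dropout(layer(x))
```

Since nn.Dropout reads its p attribute on every forward pass, mutating it between epochs has the same effect as the functional approach, without rewriting the forward method.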
Vilda answered Oct 21 '25
