How could you implement these 2 Keras models (inspired by the DataCamp course 'Advanced Deep Learning with Keras in Python') in PyTorch?
Classification with 1 input, 2 outputs:
from keras.layers import Input, Concatenate, Dense
from keras.models import Model
input_tensor = Input(shape=(1,))
output_tensor = Dense(2)(input_tensor)
model = Model(input_tensor, output_tensor)
model.compile(optimizer='adam', loss='categorical_crossentropy')
X = ... # e.g. a pandas series
y = ... # e.g. a pandas df with 2 columns
model.fit(X, y, epochs=100)
A model with classification and regression:
from keras.layers import Input, Dense
from keras.models import Model
input_tensor = Input(shape=(1,))
output_tensor_reg = Dense(1)(input_tensor)
output_tensor_class = Dense(1, activation='sigmoid')(output_tensor_reg)
model = Model(input_tensor, [output_tensor_reg, output_tensor_class])
model.compile(loss=['mean_absolute_error', 'binary_crossentropy'])
X = ...
y_reg = ...
y_class = ...
model.fit(X, [y_reg, y_class], epochs=100)
This resource was particularly helpful.
Basically, the idea is that, contrary to Keras, you have to say explicitly where each output is computed in your forward function, and how the global loss is computed from them.
For example, for the first model:
def __init__(self, ...):
    ...  # define your model elements
def forward(self, x):
    # do your stuff here
    ...
    x1 = torch.sigmoid(x)  # first output, e.g. class probabilities
    x2 = torch.sigmoid(x)  # second output, e.g. bounding box calculation
    return x1, x2
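As a minimal, self-contained sketch of what such a module could look like (the class name, layer names and hidden size below are illustrative, not from the course):

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoOutputNet(nn.Module):
    def __init__(self, in_features=1, hidden=8):
        super().__init__()
        self.shared = nn.Linear(in_features, hidden)  # shared trunk
        self.head1 = nn.Linear(hidden, 2)             # first output head
        self.head2 = nn.Linear(hidden, 2)             # second output head

    def forward(self, x):
        x = F.relu(self.shared(x))
        x1 = torch.sigmoid(self.head1(x))  # first output
        x2 = torch.sigmoid(self.head2(x))  # second output
        return x1, x2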
Then you compute the losses:
out1, out2 = model(data)
loss1 = criterion1(out1, target1)
loss2 = criterion2(out2, target2)
alpha = ... # define the weights of each sub-loss in the global loss
loss = alpha * loss1 + (1-alpha) * loss2
loss.backward()
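In a full training step, this weighted loss sits between zeroing the gradients and the optimizer update. A sketch, assuming the TwoOutputNet module above and BCE losses for both heads (the dummy data and the alpha value are only for illustration):

model = TwoOutputNet()
criterion1 = nn.BCELoss()                 # probabilities vs. targets in [0, 1]
criterion2 = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters())

data = torch.randn(16, 1)                 # dummy batch: 16 samples, 1 feature
target1 = torch.rand(16, 2)               # dummy targets for each head
target2 = torch.rand(16, 2)
alpha = 0.5                               # illustrative weighting

optimizer.zero_grad()                     # clear old gradients
out1, out2 = model(data)
loss = alpha * criterion1(out1, target1) + (1 - alpha) * criterion2(out2, target2)
loss.backward()                           # backpropagate the combined loss
optimizer.step()                          # update all parameters in one step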
For the second one, it's almost the same, but the two outputs come from different points in the forward pass:
def __init__(self, ...):
    ...  # define your model elements
def forward(self, main_input):
    aux = self.dense_1(main_input)          # regression output (linear, like Dense(1))
    x = torch.sigmoid(self.dense_2(aux))    # classification output built on top of it
    return x, aux
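For completeness, here is a minimal runnable sketch of the second model, with nn.L1Loss standing in for 'mean_absolute_error' and nn.BCELoss for 'binary_crossentropy' (the class name, optimizer choice and dummy data are assumptions for illustration):

import torch
import torch.nn as nn

class RegClassNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.dense_1 = nn.Linear(1, 1)  # regression head, like Dense(1)
        self.dense_2 = nn.Linear(1, 1)  # classification head fed by the regression output

    def forward(self, x):
        aux = self.dense_1(x)                   # regression output (linear)
        out = torch.sigmoid(self.dense_2(aux))  # like Dense(1, activation='sigmoid')
        return out, aux

model = RegClassNet()
criterion_reg = nn.L1Loss()      # mean absolute error
criterion_class = nn.BCELoss()   # binary cross-entropy on probabilities
optimizer = torch.optim.Adam(model.parameters())

X = torch.randn(64, 1)                          # dummy inputs
y_reg = torch.randn(64, 1)                      # dummy regression targets
y_class = torch.randint(0, 2, (64, 1)).float()  # dummy 0/1 class targets

for epoch in range(100):
    optimizer.zero_grad()
    out_class, out_reg = model(X)
    loss = criterion_reg(out_reg, y_reg) + criterion_class(out_class, y_class)
    loss.backward()
    optimizer.step()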