Can Neural Network model use Weighted Mean (Sum) Squared Error as its loss function?

I am a newbie in this field, so this may be a silly question. I want to build an ordinary ANN, but I am not sure whether I can use a weighted mean squared error as the loss function. Suppose we do not treat every sample equally: we care about prediction accuracy more for some categories of samples than for others, so we want to form a weighted loss function. Say each sample i has a categorical feature c_i, which for simplicity takes a binary value, either 0 or 1. We can then form the loss as

(c_i + 1) * (y_hat_i - y_i)^2,  summed over all i

Will this cause any problems with back-propagation? I don't see any issue with calculating the gradient or updating the weights between layers. And if there is no issue, how can I implement this loss function in Keras? The loss function seems to take only two parameters, y_true and y_pred, so how can I plug in the vector c?
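To convince myself that the gradient is well-behaved, I ran a quick finite-difference check in plain NumPy (toy numbers, purely illustrative): differentiating sum_i (c_i + 1)(y_hat_i - y_i)^2 with respect to y_hat_i should give 2(c_i + 1)(y_hat_i - y_i), and the numerical gradient agrees.

```python
import numpy as np

# Toy data: y is the target, y_hat the prediction, c the binary feature
y = np.array([1.0, 2.0, 3.0])
y_hat = np.array([1.5, 1.8, 2.0])
c = np.array([0.0, 1.0, 0.0])

def loss(y_hat):
    # sum_i (c_i + 1) * (y_hat_i - y_i)^2
    return np.sum((c + 1.0) * (y_hat - y) ** 2)

# Analytic gradient: d loss / d y_hat_i = 2 * (c_i + 1) * (y_hat_i - y_i)
grad = 2.0 * (c + 1.0) * (y_hat - y)

# Central finite differences along each coordinate
eps = 1e-6
fd = np.array([(loss(y_hat + eps * e) - loss(y_hat - eps * e)) / (2 * eps)
               for e in np.eye(3)])

print(np.allclose(grad, fd))  # prints True
```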

asked Oct 20 '25 by Aaron_Geng


1 Answer

There is absolutely nothing wrong with that. A loss function can declare constants inside itself or capture them from an outside scope:

import keras
import keras.backend as K

# weights captured from the enclosing scope
c = K.constant([c1, c2, c3, c4, ..., cn])

def weighted_loss(y_true, y_pred):
    loss = keras.losses.get('mse')   # the standard mean squared error
    return c * loss(y_true, y_pred)

Exactly like yours:

def weighted_loss(y_true, y_pred):
    # sum over all samples of (c + 1) * squared error
    weighted = (c + 1) * K.square(y_true - y_pred)
    return K.sum(weighted)
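Side note: since your weights are per sample, Keras can also do this natively via the `sample_weight` argument of `model.fit`, which scales each sample's loss by the given factor — passing `sample_weight=c + 1` gives the same total loss as the custom function. A pure-NumPy sketch (toy numbers, purely illustrative) of why the two are equivalent:

```python
import numpy as np

# Toy data; c is the binary per-sample feature from the question
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 1.8, 2.0])
c = np.array([0.0, 1.0, 0.0])

# The custom loss above: sum_i (c_i + 1) * (y_true_i - y_pred_i)^2
custom = np.sum((c + 1.0) * (y_true - y_pred) ** 2)

# What sample weighting does: each sample's unweighted squared error
# is multiplied by its weight w_i = c_i + 1, then summed
w = c + 1.0
sample_weighted = np.sum(w * (y_true - y_pred) ** 2)

print(np.isclose(custom, sample_weighted))  # prints True
```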
answered Oct 27 '25 by Daniel Möller


