 

Pearson's linear correlation coefficient as a Keras metric

I have tried to implement Pearson's linear correlation coefficient as a metric in Keras; however, because of placeholders I cannot compile my model with this metric.

def CC(y_true, y_pred):
    y_true = K.clip(y_true, K.epsilon(), 1)
    y_pred = K.clip(y_pred, K.epsilon(), 1)
    n_y_true = y_true / (K.sum(y_true) + K.epsilon())
    n_y_pred = y_pred / (K.sum(y_pred) + K.epsilon())
    y_true_average = K.mean(y_true)
    y_pred_average = K.mean(y_pred)
    print((K.map_fn(lambda x: x - y_pred_average, n_y_pred)).shape[0])
    if not (K.map_fn(lambda x: x - y_pred_average, n_y_pred)).shape[0] == None:
        return (K.sum(K.dot((K.map_fn(lambda x: x - y_pred_average, n_y_pred)),
                            (K.map_fn(lambda x: x - y_true_average, n_y_true))))
                / (K.count_params(n_y_true) - 1)) \
               / (K.dot(K.std(n_y_pred), K.std(n_y_true)))
    else:
        return 0

I tried using K.dot instead of *, but the same error remains. During compile I get the error unsupported operand type(s) for *: 'NoneType' and 'NoneType', and I cannot figure out how to solve it. It happens because I want to element-wise multiply two tensors, but the batch size is not defined during compile and shows up as ? in the shape (?, 224, 224, 3). Is there a way to set this or work around it?
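To see where the None comes from, here is a minimal sketch using the Keras backend directly (the placeholder below is only an illustration of the shape from the question): the batch dimension of a symbolic tensor has no value while the graph is being built, so any Python-level arithmetic on the static dimensions runs into None.

from keras import backend as K

# Placeholder standing in for a batch of 224x224x3 inputs; the batch
# size is unknown at graph-construction time.
x = K.placeholder(shape=(None, 224, 224, 3))

# The static shape reports None for the batch dimension, which is what
# shows up as "?" when the shape is printed.
print(K.int_shape(x))  # (None, 224, 224, 3)

# Python arithmetic on a None dimension (e.g. counting elements by
# multiplying the static shape entries) fails with a NoneType
# multiplication error like the one from the question.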

asked Dec 10 '25 by azteks
1 Answer

The problem lies in two facts:

  1. The first dimension of a tensor is the batch dimension (that's why it is set to None during model compilation).
  2. You are using sum and mean in such a manner that this additional dimension is also included in your computations.

Your Pearson's correlation loss should look like the following:

from keras import backend as K

def pearson_loss(y_true, y_pred):
    y_true = K.clip(y_true, K.epsilon(), 1)
    y_pred = K.clip(y_pred, K.epsilon(), 1)
    # reshape stage - flatten everything except the batch dimension
    y_true = K.reshape(y_true, shape=(-1, 224 * 224 * 3))
    y_pred = K.reshape(y_pred, shape=(-1, 224 * 224 * 3))
    # normalizing stage - setting a 0 mean per sample, so the batch
    # dimension is left out of the reduction
    y_true = y_true - K.mean(y_true, axis=-1, keepdims=True)
    y_pred = y_pred - K.mean(y_pred, axis=-1, keepdims=True)
    # normalizing stage - setting a 1 variance
    y_true = K.l2_normalize(y_true, axis=-1)
    y_pred = K.l2_normalize(y_pred, axis=-1)
    # final result - per-sample Pearson correlation
    pearson_correlation = K.sum(y_true * y_pred, axis=-1)
    return pearson_correlation
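As a quick usage sketch (the small convolutional model below is a hypothetical stand-in; any model whose output has the assumed 224x224x3 shape would work the same way):

from keras.models import Sequential
from keras.layers import Conv2D

# Hypothetical stand-in model whose output matches the (224, 224, 3)
# shape that pearson_loss reshapes to.
model = Sequential([
    Conv2D(3, (3, 3), padding='same', activation='sigmoid',
           input_shape=(224, 224, 3)),
])

# The function can be passed directly as a loss (or under metrics=[...]).
model.compile(optimizer='adam', loss=pearson_loss)

Note that as written the function returns the correlation itself; since optimizers minimize the loss, you would typically return its negative (or 1 minus it) if the goal is to maximize correlation, or pass it as a metric instead.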
answered Dec 12 '25 by Marcin Możejko

