I want to use the Keras layer Flatten() or Reshape((-1,)) at the end of my model to output a 1D vector like [0,0,1,0,0, ..., 0,0,1,0].
Unfortunately, there is a problem because my input shape is partially unknown: input_shape=(4, None, 1).
Typically the input shape is something between [batch_size, 4, 64, 1] and [batch_size, 4, 256, 1], and the output should be batch_size x unknown dimension (for the first example above: [batch_size, 64], and for the second: [batch_size, 256]).
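To make the shape requirement concrete, here is a small NumPy sketch of the desired mapping (the widths 64 and 256 are just the two examples above):

```python
import numpy as np

# Two example batches with different widths; the model should map
# [batch_size, 4, width, 1] inputs to [batch_size, width] outputs.
target_shapes = []
for width in (64, 256):
    x = np.random.rand(32, 4, width, 1)
    target_shapes.append((x.shape[0], x.shape[2]))  # (batch_size, width)

print(target_shapes)  # [(32, 64), (32, 256)]
```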
My model looks like:
from keras.layers import Activation, BatchNormalization, Convolution2D, LeakyReLU
from keras.models import Sequential

model = Sequential()
model.add(Convolution2D(32, (4, 32), padding='same', input_shape=(4, None, 1)))
model.add(BatchNormalization())
model.add(LeakyReLU())
model.add(Convolution2D(1, (1, 2), strides=(4, 1), padding='same'))
model.add(Activation('sigmoid'))
# model.add(Reshape((-1,))) raises:
# TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'
model.compile(loss='binary_crossentropy', optimizer='adadelta')
So my current output shape is [batch_size, 1, unknown dimension, 1], which, for example, does not allow me to use class_weight: "ValueError: class_weight not supported for 3+ dimensional targets."
Is it possible to use something like Flatten() or Reshape((-1,)) to flatten my 3-dimensional output in Keras (2.0.4 with the TensorFlow backend) when I use a flexible input shape?
Thanks a lot!
You can try K.batch_flatten() wrapped in a Lambda layer. The output shape of K.batch_flatten() is determined dynamically at runtime.

from keras import backend as K
from keras.layers import Lambda

model.add(Lambda(lambda x: K.batch_flatten(x)))
model.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_5 (Conv2D) (None, 4, None, 32) 4128
_________________________________________________________________
batch_normalization_3 (Batch (None, 4, None, 32) 128
_________________________________________________________________
leaky_re_lu_3 (LeakyReLU) (None, 4, None, 32) 0
_________________________________________________________________
conv2d_6 (Conv2D) (None, 1, None, 1) 65
_________________________________________________________________
activation_3 (Activation) (None, 1, None, 1) 0
_________________________________________________________________
lambda_5 (Lambda) (None, None) 0
=================================================================
Total params: 4,321
Trainable params: 4,257
Non-trainable params: 64
_________________________________________________________________
import numpy as np

X = np.random.rand(32, 4, 256, 1)
print(model.predict(X).shape)
(32, 256)
X = np.random.rand(32, 4, 64, 1)
print(model.predict(X).shape)
(32, 64)
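For intuition, K.batch_flatten() keeps the batch axis and collapses all remaining axes into one, so its effect on the model's (batch_size, 1, width, 1) output is equivalent to this NumPy reshape (a sketch of the semantics, not the Keras implementation):

```python
import numpy as np

def batch_flatten(x):
    # Keep the batch axis, collapse everything else into one axis,
    # mirroring what K.batch_flatten() does to a tensor.
    return x.reshape(x.shape[0], -1)

# The sigmoid layer outputs shape (batch_size, 1, width, 1);
# flattening per sample yields (batch_size, width) for any width.
print(batch_flatten(np.zeros((32, 1, 256, 1))).shape)  # (32, 256)
print(batch_flatten(np.zeros((32, 1, 64, 1))).shape)   # (32, 64)
```

Because the reshape only fixes the batch axis, no static width needs to be known when the layer is built, which is why this sidesteps the Reshape((-1,)) error.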