Here is my code:
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import EarlyStopping

model = Sequential()
model.add(Dense(50, input_dim=33, kernel_initializer='uniform', activation='relu'))
for u in range(3):  # how to efficiently add more layers
    model.add(Dense(33, kernel_initializer='uniform', activation='relu'))
model.add(Dense(122, kernel_initializer='uniform', activation='sigmoid'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# This line of code is an update to the question and may be responsible
model.fit(X_train, Y_train, epochs=35, batch_size=20, validation_split=0.2,
          callbacks=[EarlyStopping(monitor='val_loss', patience=10)])
Training ran for a number of epochs and the accuracy kept improving, but then the loss became nan and the accuracy dropped sharply. Calling model.predict afterwards raised an error as well.
Anyone got a fix?
If you are using categorical_crossentropy as the loss function, the last layer of the model should use a softmax activation.
Here you are using sigmoid, which treats each of the 122 output units independently and can push all of them close to 0. categorical_crossentropy takes the negative log of the predicted probability for the true class, so a probability near 0 drives the loss toward infinity and it eventually becomes nan.
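A minimal sketch of that fix, keeping the layer sizes, initializer, and optimizer from your code and changing only the output activation:

model.add(Dense(122, kernel_initializer='uniform', activation='softmax'))  # outputs now sum to 1 across the 122 classes
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

If the 122 targets are actually independent multi-label flags rather than one-hot classes, the other consistent combination is to keep the sigmoid output and switch the loss to binary_crossentropy instead.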