Is it possible to continue training a Keras estimator with all the hyperparameters (including a decayed learning rate) and the weights saved from previous epochs, as one does in scikit-learn with the warm_start parameter? Something like this:
estimator = KerasRegressor(build_fn=create_model, epochs=20, batch_size=40, warm_start=True)
Specifically, warm start should do this:
warm_start : bool, optional, default False
When set to True, reuse the solution of the previous call to fit as initialization; otherwise, just erase the previous solution.
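For reference, here is a minimal sketch of how this behaves in scikit-learn (MLPRegressor is chosen only for illustration, and the data is dummy data):

import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(100, 4)  # dummy data for illustration
y = np.random.rand(100)

reg = MLPRegressor(max_iter=20, warm_start=True)
reg.fit(X, y)  # first 20 iterations
reg.fit(X, y)  # continues from the previous solution instead of re-initializing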
Is there anything like that in Keras?
Yes, it's possible, but rather cumbersome. You need to use the train_on_batch function, which keeps all model parameters (including the optimizer's state).
This is cumbersome because you have to split your dataset into batches yourself, and you lose the ability to use Callbacks and the automatic progress bar. I hope this option will be added to the fit method in a future Keras version.
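For illustration, a minimal sketch of such a manual loop. create_model is your build function from the question; a trivial stand-in and dummy data are used here so the sketch is runnable:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Trivial stand-in for your create_model build function
def create_model():
    model = Sequential([Dense(10, activation='relu', input_shape=(4,)),
                        Dense(1)])
    model.compile(optimizer='adam', loss='mse')
    return model

X_train = np.random.rand(200, 4)  # dummy data for illustration
y_train = np.random.rand(200)

model = create_model()
batch_size = 40

for epoch in range(20):
    idx = np.random.permutation(len(X_train))  # reshuffle each epoch
    for start in range(0, len(X_train), batch_size):
        batch = idx[start:start + batch_size]
        model.train_on_batch(X_train[batch], y_train[batch])

# A later call to train_on_batch (or another loop like the one above)
# continues from the current weights and optimizer state, i.e. a warm start.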