I implemented hyperparameter tuning with KerasTuner. I would like to have the option to skip the tuning step and use the default hyperparameter values instead.
My code currently looks like this (after the search it builds the model with the best parameters found):
import keras_tuner as kt
from keras_tuner import HyperModel

class MyHyperModel(HyperModel):
    def build(self, hp):
        ...
        hp.Choice('hyperparameter', [1, 2, 3], default=3)
        return model

tuner = kt.Hyperband(
    MyHyperModel(),
    ...
)

tuner.search(
    train_inputs,
    train_targets,
    ...
)

best_hp = tuner.get_best_hyperparameters()[0]
model = tuner.hypermodel.build(best_hp)
I would like to have something like
default_model = tuner.hypermodel.build(use_default_parameter=True)
which would return the Keras model with the default values for all hyperparameters so it can then be trained, but I can't figure out how to do this.
Calling the build function with an empty HyperParameters container as its argument returns the model with the default hyperparameter values:
import keras_tuner as kt

hypermodel = MyHyperModel()
hp = kt.HyperParameters()  # empty container: every hp.Choice(...) inside build() falls back to its default
model = hypermodel.build(hp)
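The default-built model can then be trained directly, skipping the search entirely. A minimal sketch, assuming your build method returns a compiled model and that train_inputs / train_targets are the same data you would pass to tuner.search (the epochs value is just a placeholder):

import keras_tuner as kt

hypermodel = MyHyperModel()

# Empty container, so every hyperparameter defined in build() takes its default value
default_hp = kt.HyperParameters()
default_model = hypermodel.build(default_hp)

# Train with the default hyperparameters, no tuner.search() needed
default_model.fit(train_inputs, train_targets, epochs=10)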