A custom callback for the learning rate
Define a function that adjusts the learning rate on a fixed schedule.
For example, set lr = [0.01, 0.01, 0.001, 0.0001]
and epochs = [5, 10, 15, 19]; the two lists must be the same length, because each epoch is paired with a learning rate by index.
This is a little different from TensorFlow's built-in piecewise schedule, which requires one more lr value than epoch boundaries.
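For contrast, a minimal sketch of that built-in schedule (tf.keras.optimizers.schedules.PiecewiseConstantDecay); note it is driven by optimizer steps rather than epochs, and the boundary/value numbers here are illustrative:

import tensorflow as tf

# values must have exactly one more entry than boundaries
boundaries = [5, 10, 15, 19]                   # step boundaries (illustrative)
values = [0.01, 0.01, 0.001, 0.0001, 0.00001]  # len(boundaries) + 1 learning rates
schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(boundaries, values)
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule)

The custom scheduler below instead pairs the two lists by index: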
from tensorflow.keras import backend as K
from tensorflow.keras.callbacks import LearningRateScheduler

def scheduler(epoch):
    # model is assumed to be a compiled Keras model in the enclosing scope
    lr_inputs = [0.01, 0.01, 0.001, 0.0001]
    epochs = [5, 10, 15, 19]  # must be the same length as lr_inputs
    if epoch in epochs:
        index_lr = epochs.index(epoch)
        lr_now = lr_inputs[index_lr]
        pre_lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr_now)
        print("pre_lr {}".format(pre_lr))
        print("lr changed to {}".format(lr_now))
    return K.get_value(model.optimizer.lr)

reduce_lr = LearningRateScheduler(scheduler)
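Finally, a usage sketch: pass the callback to model.fit. Here model, x_train, and y_train are placeholders for your own compiled model and training data:

# placeholder model/data; epochs chosen to pass the last boundary at 19
model.fit(x_train, y_train,
          epochs=20,
          callbacks=[reduce_lr])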