Python Question

Print learning rate every epoch in SGD

I tried to print the learning rate during mini-batch gradient descent, but the lr remains unchanged (always 0.10000000149) across many epochs. It was supposed to change every mini-batch. The code is as follows:

from keras.optimizers import SGD
from keras.callbacks import Callback

# set the decay to 1e-1 to see the lr change between epochs
sgd = SGD(lr=0.1, decay=1e-1, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])

class LossHistory(Callback):
    def on_epoch_begin(self, epoch, logs={}):
        # get_value() reads the backend variable (Theano backend)
        lr = self.model.optimizer.lr.get_value()
        print('lr:', lr)

history = LossHistory()
model.fit(X_train, Y_train,
          batch_size=batch_size,
          nb_epoch=nb_epoch,
          callbacks=[history])

Answer

What you are printing is the initial learning rate, not the actual one. The optimizer stores lr as a backend variable holding the initial value (0.10000000149 is just 0.1 in float32 precision), and the decayed rate is computed on the fly at each update step:

lr = self.lr * (1. / (1. + self.decay * self.iterations))
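
If you want to print the decayed rate yourself, you can evaluate the same expression in a callback. Here is a minimal sketch, assuming Keras's backend module K and that your optimizer exposes lr, decay, and iterations as backend variables (which Keras's SGD does); the class name SGDLearningRatePrinter is just illustrative:

from keras import backend as K
from keras.callbacks import Callback

class SGDLearningRatePrinter(Callback):
    def on_epoch_begin(self, epoch, logs=None):
        opt = self.model.optimizer
        # read the current values of the optimizer's backend variables
        lr = K.get_value(opt.lr)
        decay = K.get_value(opt.decay)
        iterations = K.get_value(opt.iterations)
        # reproduce the decay formula SGD applies at each update
        lr_t = lr * (1. / (1. + decay * iterations))
        print('effective lr at epoch %d: %.10f' % (epoch, lr_t))

Pass an instance of this callback to model.fit via callbacks=[...], just like your LossHistory. Since iterations counts mini-batch updates, the printed value will now drop between epochs as expected.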