sha li sha li - 4 months ago 57
Python Question

Print learning rate every epoch in SGD

I tried to print the learning rate during mini-batch gradient descent, but the lr remains unchanged (always 0.10000000149) across many epochs, even though it was supposed to change every mini-batch. The code is as follows:

# set the decay to 1e-1 to see the lr change between epochs
sgd = SGD(lr=0.1, decay=1e-1, momentum=0.9, nesterov=True)

class LossHistory(Callback):
    def on_epoch_begin(self, epoch, logs={}):
        # prints the optimizer's stored lr variable
        print('lr:', K.get_value(self.model.optimizer.lr))

history = LossHistory()
model.fit(X_train, Y_train,
          batch_size=batch_size,
          nb_epoch=nb_epoch,
          callbacks=[history])


What you are printing is the initial learning rate, not the actual one, which is computed on the fly each iteration:

lr = self.lr * (1. / (1. + self.decay * self.iterations))
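To see this in action, here is a minimal sketch (plain Python, no Keras required) that reproduces the time-based decay formula above. The stored lr attribute never changes; only the effective rate, recomputed from the iteration count, decays. `batches_per_epoch` is a hypothetical value that in practice depends on your dataset size and batch_size:

initial_lr = 0.1       # matches SGD(lr=0.1, ...) in the question
decay = 1e-1           # matches decay=1e-1
batches_per_epoch = 5  # hypothetical: len(X_train) // batch_size

def effective_lr(iterations):
    # Keras' time-based decay: lr * 1 / (1 + decay * iterations)
    return initial_lr * (1. / (1. + decay * iterations))

for epoch in range(3):
    it = epoch * batches_per_epoch
    print('epoch %d: stored lr = %s, effective lr = %.6f'
          % (epoch, initial_lr, effective_lr(it)))

The stored value printed by the callback in the question stays at 0.1 (shown as 0.10000000149 because it is a float32 variable), while `effective_lr` shrinks each iteration.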