- class mmengine.optim.ExponentialLR(optimizer, *args, **kwargs)
Decays the learning rate of each parameter group by gamma every epoch.
Parameters:
- optimizer (Optimizer or OptimWrapper) – Wrapped optimizer.
- gamma (float) – Multiplicative factor of learning rate decay.
- begin (int) – Step at which to start updating the learning rate. Defaults to 0.
- end (int) – Step at which to stop updating the learning rate. Defaults to INF.
- last_step (int) – The index of the last step, used for resuming training without a state dict. Defaults to -1.
- by_epoch (bool) – Whether the scheduled learning rate is updated by epochs. Defaults to True.
- verbose (bool) – Whether to print the learning rate for each update. Defaults to False.
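The decay rule described above can be sketched in plain Python. This is a minimal illustration of the schedule's arithmetic, not mmengine's actual implementation; the function name `scheduled_lr` is hypothetical, and the handling of `begin`/`end` here assumes decay is applied only on steps inside that window.

```python
def scheduled_lr(base_lr, gamma, epoch, begin=0, end=float("inf")):
    """Hypothetical sketch: learning rate at ``epoch`` under exponential decay.

    Before ``begin`` the base learning rate is returned unchanged; after
    ``end`` the rate stays frozen at its last decayed value.
    """
    if epoch < begin:
        return base_lr                       # schedule has not started yet
    decay_steps = min(epoch, end) - begin    # number of decays actually applied
    return base_lr * gamma ** decay_steps

# With base_lr=0.1 and gamma=0.9, the rate shrinks by 10% each epoch:
lrs = [scheduled_lr(0.1, 0.9, e) for e in range(4)]
```

With `begin=2`, for example, epochs 0 and 1 keep the base rate, and decay only starts counting from epoch 2; with a finite `end`, the rate stops changing once that step is reached.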