- class mmengine.optim.MultiStepLR(optimizer, *args, **kwargs)¶
Decays the learning rate of each parameter group by gamma once the number of epochs reaches one of the milestones. Notice that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler.
Parameters:
- optimizer (Optimizer or OptimWrapper) – Wrapped optimizer.
- milestones (list) – List of epoch indices. Must be increasing.
- gamma (float) – Multiplicative factor of learning rate decay. Defaults to 0.1.
- begin (int) – Step at which to start updating the learning rate. Defaults to 0.
- end (int) – Step at which to stop updating the learning rate. Defaults to INF.
- last_step (int) – The index of the last step. Used for resuming without a state dict. Defaults to -1.
- by_epoch (bool) – Whether the scheduled learning rate is updated by epochs. Defaults to True.
- verbose (bool) – Whether to print the learning rate for each update. Defaults to False.
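The decay rule above can be sketched without mmengine itself: the learning rate at a given step is the base rate multiplied by gamma once for every milestone already reached. A minimal stand-alone illustration (the function name and milestone values are illustrative, not part of the mmengine API):

```python
import bisect

def multistep_lr(base_lr, step, milestones, gamma=0.1):
    """Return the learning rate at `step` under a MultiStep-style schedule.

    `milestones` must be an increasing list of step indices; each milestone
    that has been reached multiplies the base rate by `gamma` once.
    """
    n_decays = bisect.bisect_right(milestones, step)
    return base_lr * gamma ** n_decays

# With base_lr=0.1 and milestones=[8, 11]:
#   steps 0..7  -> 0.1            (no milestone reached)
#   steps 8..10 -> 0.1 * 0.1     (one milestone reached)
#   steps 11+   -> 0.1 * 0.1**2  (both milestones reached)
```

In an mmengine config, the equivalent scheduler would typically be declared as `param_scheduler = dict(type='MultiStepLR', by_epoch=True, milestones=[8, 11], gamma=0.1)` (milestone values here are placeholders).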