StepLR

class mmengine.optim.StepLR(optimizer, *args, **kwargs)[source]

Decays the learning rate of each parameter group by gamma every step_size epochs. Notice that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler.

Parameters:
  • optimizer (Optimizer or OptimWrapper) – Wrapped optimizer.

  • step_size (int) – Period of learning rate decay.

  • gamma (float) – Multiplicative factor of learning rate decay. Defaults to 0.1.

  • begin (int) – Step at which to start updating the learning rate. Defaults to 0.

  • end (int) – Step at which to stop updating the learning rate. Defaults to INF.

  • last_step (int) – The index of the last step. Used when resuming training without a state dict. Defaults to -1.

  • by_epoch (bool) – Whether the scheduled learning rate is updated by epochs. Defaults to True.

  • verbose (bool) – Whether to print the learning rate for each update. Defaults to False.
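
The following is a minimal usage sketch, not taken from the MMEngine documentation: the toy model, the SGD optimizer, and the chosen step_size/gamma values are assumptions for illustration only.

    import torch
    from mmengine.optim import StepLR

    # A toy model and optimizer, used only to demonstrate the scheduler.
    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Multiply the learning rate by gamma=0.5 every 3 epochs.
    scheduler = StepLR(optimizer, step_size=3, gamma=0.5, by_epoch=True)

    for epoch in range(10):
        # ... run one training epoch here ...
        scheduler.step()
        # The learning rate stays at 0.1 for the first 3 epochs,
        # then drops to 0.05, then 0.025, and so on.
        print(epoch, optimizer.param_groups[0]['lr'])

In a typical MMEngine workflow the scheduler is usually not stepped by hand as above; it is declared in the config (for example, under the Runner's param_scheduler field as dict(type='StepLR', step_size=3, gamma=0.5, by_epoch=True)) and the Runner steps it automatically.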