MultiStepLR

class mmengine.optim.MultiStepLR(optimizer, *args, **kwargs)[source]

Decays the specified learning rate in each parameter group by gamma once the number of epochs reaches one of the milestones. Notice that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler.

Parameters:
  • optimizer (Optimizer or OptimWrapper) – Wrapped optimizer.

  • milestones (list) – List of epoch indices. Must be increasing.

  • gamma (float) – Multiplicative factor of learning rate decay. Defaults to 0.1.

  • begin (int) – Step at which to start updating the learning rate. Defaults to 0.

  • end (int) – Step at which to stop updating the learning rate. Defaults to INF.

  • last_step (int) – The index of last step. Used for resume without state dict. Defaults to -1.

  • by_epoch (bool) – Whether the scheduled learning rate is updated by epochs. Defaults to True.

  • verbose (bool) – Whether to print the learning rate for each update. Defaults to False.
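A minimal usage sketch is shown below. It assumes a plain torch optimizer wrapped directly by the scheduler and a per-epoch training loop; the model, milestone values, and loop length are illustrative only.

```python
import torch
from mmengine.optim import MultiStepLR

# Illustrative model and optimizer; any torch Optimizer (or OptimWrapper) works.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Multiply the lr by gamma=0.1 when the epoch count reaches 8 and again at 11.
scheduler = MultiStepLR(optimizer, milestones=[8, 11], gamma=0.1, by_epoch=True)

for epoch in range(12):
    # ... run one epoch of training with `optimizer` ...
    scheduler.step()  # advance the scheduler by one epoch
```

When training with MMEngine's Runner, the same scheduler is more commonly declared in the config, e.g. `param_scheduler = dict(type='MultiStepLR', milestones=[8, 11], gamma=0.1, by_epoch=True)`, and the runner calls `step()` for you.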
