ExponentialLR

class mmengine.optim.ExponentialLR(optimizer, *args, **kwargs)[source]

Decays the learning rate of each parameter group by gamma every epoch (or every iteration, if by_epoch=False).

Parameters:
  • optimizer (Optimizer or OptimWrapper) – Wrapped optimizer.

  • gamma (float) – Multiplicative factor of learning rate decay.

  • begin (int) – Step at which to start updating the learning rate. Defaults to 0.

  • end (int) – Step at which to stop updating the learning rate. Defaults to INF.

  • last_step (int) – The index of last step. Used for resume without state dict. Defaults to -1.

  • by_epoch (bool) – Whether the scheduled learning rate is updated by epochs. Defaults to True.

  • verbose (bool) – Whether to print the learning rate for each update. Defaults to False.
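The schedule's arithmetic can be sketched in plain Python (this is a hypothetical helper illustrating the decay rule, not mmengine's implementation): within the active window [begin, end), the learning rate at step t is the base rate multiplied by gamma raised to (t - begin), and it stays frozen outside that window.

```python
def exponential_lr(base_lr, gamma, step, begin=0, end=float("inf")):
    """Sketch of ExponentialLR's decay rule (assumed semantics).

    Before `begin` the base rate is untouched; between `begin` and `end`
    each step multiplies the rate by `gamma`; after `end` the rate is
    held at its final decayed value.
    """
    # Clamp the step into the scheduler's active window [begin, end).
    effective = min(max(step, begin), end) - begin
    return base_lr * gamma ** effective


# With gamma=0.9, the rate after 2 active steps is 0.1 * 0.9**2 = 0.081.
print(exponential_lr(0.1, 0.9, 2))
```

In practice you would not call such a helper yourself: following mmengine's usual convention, the scheduler is either constructed directly around an optimizer (or OptimWrapper) or declared in a config, e.g. `param_scheduler = dict(type='ExponentialLR', gamma=0.9, by_epoch=True)`, and the runner invokes it each epoch or iteration.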
