
ColossalAIOptimWrapper

class mmengine._strategy.colossalai.ColossalAIOptimWrapper(optimizer, booster=None, accumulative_counts=1)[source]

OptimWrapper for ColossalAI.

The available optimizers are:
  • CPUAdam

  • FusedAdam

  • FusedLAMB

  • FusedSGD

  • HybridAdam

  • Lamb

  • Lars

You can find more details in the ColossalAI tutorial.

Parameters:
  • optimizer (dict or torch.optim.Optimizer) – The optimizer to be wrapped.

  • booster (Booster or None) – The ColossalAI Booster used to run the backward pass. Defaults to None.

  • accumulative_counts (int) – The number of iterations over which gradients are accumulated. Parameters are updated once every accumulative_counts iterations.
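A minimal configuration sketch (not taken from the reference above): wrapping one of the ColossalAI optimizers listed earlier through MMEngine's config system. The optimizer type, learning rate, weight decay, and accumulative_counts values are illustrative assumptions, not prescribed defaults.

# Hypothetical config sketch: ColossalAIOptimWrapper around HybridAdam.
# lr, weight_decay and accumulative_counts are illustrative values only.
optim_wrapper = dict(
    type='ColossalAIOptimWrapper',
    optimizer=dict(type='HybridAdam', lr=1e-3, weight_decay=0.01),
    # parameters are updated once every 4 iterations
    accumulative_counts=4,
)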

backward(loss, **kwargs)[source]

Perform gradient back propagation.

Provide a unified backward interface compatible with automatic mixed precision training. Subclasses can override this method to implement the required logic. For example, torch.cuda.amp requires some extra operations on the GradScaler during the backward process.

Note

If a subclass of OptimWrapper overrides backward, it must also increment _inner_count by 1.

Parameters:
  • loss (torch.Tensor) – The loss of the current iteration.

  • kwargs – Keyword arguments passed to torch.Tensor.backward().

Return type:

None
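A hedged usage sketch, assuming model, data_batch, and optim_wrapper already exist and that the model's forward pass returns a dict containing a 'loss' tensor. In typical MMEngine training loops, backward() is driven through update_params(), which also calls step() and zero_grad() according to accumulative_counts.

# Common path: let update_params() wrap backward(), step() and zero_grad()
loss = model(**data_batch)['loss']   # illustrative forward pass
optim_wrapper.update_params(loss)

# Manual equivalent when finer control over the schedule is needed
loss = model(**data_batch)['loss']
optim_wrapper.backward(loss)   # backward pass executed through ColossalAI
optim_wrapper.step()
optim_wrapper.zero_grad()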

optim_context(model)[source]

A context manager for gradient accumulation and automatic mixed precision training.

If subclasses need to enable the context for mixed precision training, e.g. AmpOptimWrapper, the corresponding context should be enabled in optim_context. Since OptimWrapper defaults to fp32 training, optim_context only enables the context that blocks unnecessary gradient synchronization during gradient accumulation.

If model has a no_sync method (which blocks gradient synchronization) and self._accumulative_counts != 1, the model will not automatically synchronize gradients when cur_iter is divisible by self._accumulative_counts. Otherwise, this method enables an empty context.

Parameters:
  • model (nn.Module) – The training model.
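A hedged sketch of wrapping the forward pass in optim_context() during gradient accumulation, so gradient synchronization can be skipped on non-update iterations. dataloader, model, and optim_wrapper are assumed to exist, and the forward pass returning a dict with a 'loss' key is an illustrative assumption.

for data_batch in dataloader:
    # forward pass runs inside the context; sync may be blocked on
    # accumulation-only iterations
    with optim_wrapper.optim_context(model):
        loss = model(**data_batch)['loss']
    optim_wrapper.update_params(loss)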
