
mmengine.model

Module

BaseModule

Base module for all modules in OpenMMLab.

ModuleDict

ModuleDict in OpenMMLab.

ModuleList

ModuleList in OpenMMLab.

Sequential

Sequential module in OpenMMLab.

Model

BaseModel

Base class for all algorithmic models.

BaseDataPreprocessor

Base data pre-processor used for copying data to the target device.

ImgDataPreprocessor

Image pre-processor for normalization and BGR to RGB conversion.

BaseTTAModel

Base model for inference with test-time augmentation.

EMA

BaseAveragedModel

A base class for averaging model weights.

ExponentialMovingAverage

Implements the exponential moving average (EMA) of the model.

MomentumAnnealingEMA

Exponential moving average (EMA) with momentum annealing strategy.

StochasticWeightAverage

Implements the stochastic weight averaging (SWA) of the model.

Model Wrapper

MMDistributedDataParallel

A distributed model wrapper used for training, testing and validation in the loop.

MMSeparateDistributedDataParallel

A DistributedDataParallel wrapper for models in MMGeneration.

MMFullyShardedDataParallel

A wrapper for sharding Module parameters across data parallel workers.

is_model_wrapper

Check if a module is a model wrapper.

Weight Initialization

BaseInit

Caffe2XavierInit

ConstantInit

Initialize module parameters with constant values.

KaimingInit

Initialize module parameters according to the method described in `Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification` (He, K. et al., 2015).

NormalInit

Initialize module parameters with the values drawn from the normal distribution \(\mathcal{N}(\text{mean}, \text{std}^2)\).

PretrainedInit

Initialize module by loading a pretrained model.

TruncNormalInit

Initialize module parameters with the values drawn from the normal distribution \(\mathcal{N}(\text{mean}, \text{std}^2)\), truncated so that values lie within \([a, b]\).

UniformInit

Initialize module parameters with values drawn from the uniform distribution \(\mathcal{U}(a, b)\).

XavierInit

Initialize module parameters according to the method described in `Understanding the difficulty of training deep feedforward neural networks` (Glorot, X. & Bengio, Y., 2010).

bias_init_with_prob

Initialize conv/fc bias values according to a given probability value.

caffe2_xavier_init

constant_init

initialize

Initialize a module.

kaiming_init

normal_init

trunc_normal_init

uniform_init

update_init_info

Update the _params_init_info in the module if the values of parameters have changed.

xavier_init

Utils

detect_anomalous_params

merge_dict

Merge all dictionaries into one dictionary.

stack_batch

Stack multiple tensors to form a batch, padding each tensor to the maximum shape using bottom-right padding.

revert_sync_batchnorm

Helper function to convert all `SyncBatchNorm` (SyncBN) and `mmcv.ops.sync_bn.SyncBatchNorm` (MMSyncBN) layers in the model to `BatchNormXd` layers.

convert_sync_batchnorm

Helper function to convert all BatchNorm layers in the model to `SyncBatchNorm` (SyncBN) or `mmcv.ops.sync_bn.SyncBatchNorm` (MMSyncBN) layers. Adapted from <https://pytorch.org/docs/stable/generated/torch.nn.SyncBatchNorm.html#torch.nn.SyncBatchNorm.convert_sync_batchnorm>_.
