Methods in ResNet module for modifying momentum and dropout parameters
This merge request adds a couple of useful methods to the `resnet.py` module:
- Method `set_batch_norm_momentum` in `ResNetArch` for modifying the momentum parameter of the batch normalization layers in the network (a minimal sketch of the new setters follows this list).
- Method `set_dropout_rate` in `ResNetArch` for modifying the dropout rate parameter of the dropout layers in the network.
- Equivalent methods in `ResNetBlock`.
- Possibility to specify the above parameters at initialization.
- Added `training=training` in all calls to the dropout layers.
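
As a rough illustration of how the pieces fit together, here is a minimal sketch assuming a `tf.keras`-style implementation. The class and method names (`ResNetArch`, `ResNetBlock`, `set_batch_norm_momentum`, `set_dropout_rate`) mirror the ones described above, but the layer layout and constructor arguments (`num_blocks`, `filters`) are simplified placeholders, not the actual contents of `resnet.py`:

```python
import tensorflow as tf


class ResNetBlock(tf.keras.layers.Layer):
    """Simplified residual block: conv -> batch norm -> dropout (illustrative only)."""

    def __init__(self, filters, bn_momentum=0.99, dropout_rate=0.0, **kwargs):
        super().__init__(**kwargs)
        self.conv = tf.keras.layers.Conv2D(filters, 3, padding="same")
        self.bn = tf.keras.layers.BatchNormalization(momentum=bn_momentum)
        self.dropout = tf.keras.layers.Dropout(dropout_rate)

    def set_batch_norm_momentum(self, momentum):
        # Update the momentum of the batch normalization layer(s) in this block.
        self.bn.momentum = momentum

    def set_dropout_rate(self, rate):
        # Update the rate of the dropout layer(s) in this block.
        self.dropout.rate = rate

    def call(self, inputs, training=None):
        x = self.conv(inputs)
        x = self.bn(x, training=training)
        # Pass the training flag through so dropout is only active during training.
        x = self.dropout(x, training=training)
        # Add the residual connection only when channel counts match (simplification).
        if inputs.shape[-1] == x.shape[-1]:
            x = x + inputs
        return tf.nn.relu(x)


class ResNetArch(tf.keras.Model):
    """Minimal stand-in for the 2D ResNet architecture."""

    def __init__(self, num_blocks=3, filters=64, bn_momentum=0.99, dropout_rate=0.0):
        super().__init__()
        # The parameters can also be set directly at initialization.
        self.blocks = [
            ResNetBlock(filters, bn_momentum=bn_momentum, dropout_rate=dropout_rate)
            for _ in range(num_blocks)
        ]

    def set_batch_norm_momentum(self, momentum):
        # Delegate to the equivalent method on each block.
        for block in self.blocks:
            block.set_batch_norm_momentum(momentum)

    def set_dropout_rate(self, rate):
        for block in self.blocks:
            block.set_dropout_rate(rate)

    def call(self, inputs, training=None):
        x = inputs
        for block in self.blocks:
            x = block(x, training=training)
        return x
```

The intent is that `ResNetArch` simply delegates to the equivalent methods on its blocks, so a single call such as `model.set_dropout_rate(0.5)` updates every dropout layer in the network, and the `training=training` forwarding ensures dropout is disabled at inference time.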
The above changes have only been implemented for ResNet (2D). The same methods should be implemented for ResNet1D before merging this.