Remove dropout from ResNet? (And other ketos architectures)
In my own experience, training the ketos ResNet architecture with dropout > 0 results in (far) worse performance than with dropout = 0. This appears to be consistent with observations made by others; see for example: https://www.kdnuggets.com/2018/09/dropout-convolutional-networks.html . I'm wondering if we should simply remove the dropout argument from the definitions of ResNetBlock and ResNetArch in ketos: https://gitlab.meridian.cs.dal.ca/public_projects/ketos/-/blob/master/ketos/neural_networks/resnet.py#L100 ?
@bpadovese , @fsfrazao , your thoughts/experiences?
I'm not sure if dropout has a similar negative impact on the other ketos architectures (cnn, densenet, inception) ...
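For what it's worth, removing the argument and keeping it with a default of 0 are behaviourally equivalent: with standard (inverted) dropout, a rate of 0 is exactly the identity. A minimal numpy sketch (not the ketos implementation, just an illustration of the mechanism):

```python
import numpy as np

def dropout(x, rate, training=True, rng=None):
    """Inverted dropout: zero a fraction `rate` of the activations and
    rescale the survivors by 1/(1-rate) so the expected value is unchanged.
    With rate == 0 (or outside training) this is a pure no-op."""
    if not training or rate == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # keep each unit with prob 1-rate
    return x * mask / (1.0 - rate)

x = np.ones((4, 8))
assert np.array_equal(dropout(x, 0.0), x)  # rate 0 leaves activations untouched
```

So dropping the argument would simplify the API without changing behaviour for anyone currently training with dropout = 0; the only users affected are those setting it > 0, which is exactly the setting that seems to hurt.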