layers = importCaffeLayers('MemNet_M6R6_80C64_train.prototxt')
Gives the error:
Error using nnet.internal.cnn.caffe.CaffeModelReader/importLayers (line 294)
The layer 'weight_output_end_01' specifies a Scale layer without a preceding BatchNorm layer. Scale layers are only supported when preceded by a BatchNorm layer.
Error in importCaffeLayers (line 73)
layers = importLayers(readerObj);
Using MATLAB R2021a
Checked the MATLAB forums and came across this explanation: "If we look at the original Batch Normalization paper, the author mentions that 'we make sure that the transformation inserted in the network can represent the identity transform'. A BatchNorm layer without a following Scale layer will not work, since the Caffe BatchNorm layer has no learnable parameters."
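For reference, importCaffeLayers only accepts a Scale layer that sits immediately after a BatchNorm layer, whereas MemNet (judging by the name weight_output_end_01) appears to use standalone Scale layers to weight its intermediate outputs. A rough illustration of the two patterns, with placeholder blob names rather than the actual MemNet prototxt contents:

# Pattern the MATLAB importer accepts: Scale immediately after BatchNorm
layer { name: "bn1"    type: "BatchNorm" bottom: "conv1" top: "conv1" }
layer { name: "scale1" type: "Scale"     bottom: "conv1" top: "conv1"
        scale_param { bias_term: true } }

# Pattern that triggers the error: a standalone Scale layer with no
# preceding BatchNorm (bottoms/tops here are made up for illustration)
layer { name: "weight_output_end_01" type: "Scale"
        bottom: "output_end_01" top: "weighted_output_01" }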
So it looks like MATLAB's Caffe importer doesn't support the MemNet implementation, unfortunately, which makes it much more of a pain to port to MATLAB, sigh. Oh well, I was looking forward to playing around with this network! Maybe someone will find time to update this, or try a different wrapper (a rough idea is sketched below)... Cheers!
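For anyone else who lands here, one possible workaround (untested, purely a sketch) is to convert the Caffe model to ONNX with an external converter and then import the ONNX file, which may sidestep the Scale-layer restriction since that check belongs to the Caffe importer; whether a converter handles the recursive MemNet structure cleanly is an open question. The file names below are placeholders:

% Untested sketch: convert the Caffe model to ONNX with an external
% Caffe-to-ONNX tool first, then import the ONNX file instead of the
% prototxt. File names here are placeholders, not shipped with the repo.
onnxFile = 'MemNet_M6R6_80C64.onnx';      % produced outside MATLAB
if isfile(onnxFile)
    lgraph = importONNXLayers(onnxFile);  % requires the ONNX support package
    analyzeNetwork(lgraph);               % inspect how the scale ops were mapped
else
    warning('Convert the Caffe model to ONNX first, then re-run this script.');
end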