Is it intended that the default behavior, when `blobs_lr` is not specified in a layer, is to not use backpropagation for that layer's parameters?
I just spent a couple of hours trying to figure out why my network wasn't learning until I realized that this was the cause.
I personally think this is very dangerous behavior, and the default should be a learning rate multiplier of 1 for each parameter blob. Backpropagation should only be deactivated if someone explicitly sets `blobs_lr` to 0.
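For anyone hitting the same issue, a workaround is to set the multipliers explicitly in the layer definition. A minimal sketch in the legacy prototxt syntax (layer name and convolution parameters here are made up for illustration):

```protobuf
layers {
  name: "conv1"
  type: CONVOLUTION
  bottom: "data"
  top: "conv1"
  # Explicit learning-rate multipliers, one per parameter blob:
  blobs_lr: 1  # weights: train at the base learning rate
  blobs_lr: 2  # biases: commonly set to twice the base rate
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
```

Setting `blobs_lr: 0` instead would freeze the corresponding blob, which is the behavior I'd expect only when requested explicitly.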