Behavior when not specifying the blobs_lr in a layer #100

@tdomhan

Description

@tdomhan

Is it intended that the default behavior, when blobs_lr is not specified in a layer, is to skip backpropagation for that layer?
I just spent a couple of hours trying to figure out why my network wasn't working until I realized that this was the cause.
I personally think this is very dangerous behavior; the default should be to set the learning rate multiplier for the blob to 1. Backpropagation should only be deactivated if someone explicitly sets blobs_lr to 0.
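For reference, a minimal sketch of a layer definition with the multipliers set explicitly (the layer name and values here are illustrative, and the exact prototxt schema depends on the Caffe version in use):

```
layers {
  name: "conv1"
  type: CONVOLUTION
  blobs_lr: 1    # learning-rate multiplier for the weights; 0 would disable their update
  blobs_lr: 2    # learning-rate multiplier for the bias
}
```

With the behavior described above, omitting the blobs_lr lines entirely silently disables the parameter updates for this layer, rather than defaulting the multipliers to 1.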
