
Supporting arbitrary directed acyclic networks #92

@aravindhm


Backpropagation can work on arbitrary directed acyclic networks. Does the current implementation support a blob being used as input by two different layers? I see that each layer initializes bottom_diff to zero and then accumulates its gradient into it; this would overwrite the gradient contributed by a second layer acting on the same blob. Or am I missing something? If not, is simply changing = to += a good way to solve the problem?
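By the chain rule, the gradient with respect to a blob consumed by several layers is the sum of each consumer's contribution. A minimal sketch in plain Python (not Caffe's actual C++ API; function names are hypothetical) contrasting the `=` and `+=` behaviors:

```python
def backward_overwrite(layer_grads):
    """Buggy '=' behavior: each consumer layer overwrites bottom_diff."""
    bottom_diff = [0.0] * len(layer_grads[0])
    for grad in layer_grads:
        bottom_diff = list(grad)          # later layers clobber earlier ones
    return bottom_diff

def backward_accumulate(layer_grads):
    """'+=' behavior: contributions from all consumer layers are summed."""
    bottom_diff = [0.0] * len(layer_grads[0])
    for grad in layer_grads:
        bottom_diff = [b + g for b, g in zip(bottom_diff, grad)]
    return bottom_diff

# Two layers both consume the same blob and report gradients [1, 2] and [10, 20].
grads = [[1.0, 2.0], [10.0, 20.0]]
print(backward_overwrite(grads))    # [10.0, 20.0] -- first layer's gradient is lost
print(backward_accumulate(grads))   # [11.0, 22.0] -- correct sum over consumers
```

Note that `+=` alone only works if bottom_diff is zeroed once per backward pass (not once per layer), so the first consumer does not accumulate onto stale diffs from the previous iteration.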
