This repository was archived by the owner on Jan 7, 2025. It is now read-only.

Training an autoencoder #117

Description

@mainak124

Hi,
I am trying to pretrain an auto-encoder (similar to the one in the Caffe MNIST autoencoder example). I have the last 3 layers as follows:

layer {
  name: "loss"
  type: "SigmoidCrossEntropyLoss"
  bottom: "decode1"
  bottom: "flatdata"
  top: "cross_entropy_loss"
  loss_weight: 1
}
layer {
  name: "decode1neuron"
  type: "Sigmoid"
  bottom: "decode1"
  top: "decode1neuron"
}
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "decode1neuron"
  bottom: "flatdata"
  top: "l2_error"
  loss_weight: 0
}

Although these two loss layers are specified, when I start training, it immediately fails with this error message:

Traceback (most recent call last):
  File "/home/mainak/digits/digits/scheduler.py", line 394, in task_thread
    task.run(**options)
  File "/home/mainak/digits/digits/task.py", line 161, in run
    self.before_run()
  File "/home/mainak/digits/digits/model/tasks/caffe_train.py", line 98, in before_run
    self.save_prototxt_files()
  File "/home/mainak/digits/digits/model/tasks/caffe_train.py", line 158, in save_prototxt_files
    assert len(loss_layers) > 0, 'must specify a loss layer'
AssertionError: must specify a loss layer
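For reference, the assertion comes from `save_prototxt_files` in DIGITS' `caffe_train.py`, which scans the network description for loss layers before training starts. A minimal sketch of such a check — assuming (this is a guess, not DIGITS' actual code) that loss layers are identified by their `type` string ending in `"Loss"` — would be:

```python
import re

def find_loss_layers(prototxt_text):
    """Return the types of all loss layers found in a prototxt string.

    Assumption (hypothetical heuristic, not necessarily what DIGITS
    does): a layer counts as a loss layer when its type ends in "Loss",
    e.g. "SigmoidCrossEntropyLoss" or "EuclideanLoss".
    """
    # Pull out every type: "..." field from the prototxt text.
    types = re.findall(r'type:\s*"([^"]+)"', prototxt_text)
    return [t for t in types if t.endswith("Loss")]

network = '''
layer { name: "loss"  type: "SigmoidCrossEntropyLoss" }
layer { name: "decode1neuron" type: "Sigmoid" }
layer { name: "loss"  type: "EuclideanLoss" }
'''

# Both loss layers in the snippet above are found, so a check like
# `assert len(loss_layers) > 0` should pass on this network.
print(find_loss_layers(network))
```

By this heuristic the snippet contains two loss layers, which is what makes the `must specify a loss layer` assertion surprising — it suggests DIGITS is inspecting a modified or regenerated copy of the network rather than the prototxt as written.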

Any idea where I might be going wrong? Also, could it be that DIGITS removes the loss layers itself and then complains that one is missing? And is it possible at all to pre-train an auto-encoder in DIGITS?

Thanks in advance!
