
Conversation

@shelhamer (Member)

To feed inputs of varying dimension, the DATA layer reshapes its prefetch and top blobs when the batch size is 1. This is useful for models of variable input size, such as fully convolutional models.

By the grace of #594 this is a simple change.

  • simplify to reshape always (there's no cost)
  • make compatible with crop thanks to @philkr
  • test data layer reshaping
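For readers without the diff at hand, here is a rough sketch of the idea; the member names (`cursor_`, `prefetch_data_`, `transformed_data_`) and the method it lands in are assumptions patterned on the data layer of this period, not the actual patch:

```cpp
#include "caffe/data_layers.hpp"   // DataLayer and friends (assumed layout)
#include "caffe/proto/caffe.pb.h"  // Datum

namespace caffe {

// Sketch: when batch_size == 1, reshape the prefetch blob to each datum's
// dimensions before filling it, so inputs of varying size pass through.
template <typename Dtype>
void DataLayer<Dtype>::InternalThreadEntry() {
  Datum datum;
  datum.ParseFromString(cursor_->value());  // read the next record from the DB

  if (this->layer_param_.data_param().batch_size() == 1) {
    // Single-instance batches may change shape from iteration to iteration.
    this->prefetch_data_.Reshape(
        1, datum.channels(), datum.height(), datum.width());
    this->transformed_data_.Reshape(
        1, datum.channels(), datum.height(), datum.width());
  }
  // ... apply the DataTransformer and copy the datum into prefetch_data_ ...
}

}  // namespace caffe
```

On the next forward pass the top blob is reshaped from the prefetch blob, so the rest of the net sees the new dimensions.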

Contributor

Why not `const Datum& datum = iter_->value;`?

Contributor

By the way, you forgot to add `DecodeDatum(&datum);` just in case the datum was encoded.

Member Author

So the Datum isn't stored with the proper dimensions, and they're only set right when decoding? Shouldn't encoding / decoding be isolated to the `data` and `float_data` fields?

@bhack (Contributor) commented Dec 22, 2014

@shelhamer Reading this ticket's comments, the status is unclear. It was self-assigned, but there is no indication of what further steps are needed: no todo list, no bullet points.

Contributor

Does this work if crop_size is nonzero?

@longjon (Contributor) commented Dec 24, 2014

So I would like to merge this soon. Why not lose Reshape, move the current shaping code from DataLayerSetUp to InternalThreadEntry, and add an unconditional reshape to Forward (in BaseDataLayer?).

Then cropping should work fine (it looks broken, no?), and the patch should only add ~2 net lines of code. Unless I'm forgetting something.
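For concreteness, a minimal sketch of what that unconditional reshape in Forward could look like, shown here on BasePrefetchingDataLayer; the class, method, and member names follow the Caffe layers of this era and are assumptions rather than the merged code:

```cpp
#include "caffe/data_layers.hpp"          // BasePrefetchingDataLayer (assumed)
#include "caffe/util/math_functions.hpp"  // caffe_copy

namespace caffe {

template <typename Dtype>
void BasePrefetchingDataLayer<Dtype>::Forward_cpu(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  this->JoinPrefetchThread();  // wait for the prefetch thread to finish this batch
  // Reshape unconditionally; this is a no-op when the shape is unchanged.
  top[0]->ReshapeLike(this->prefetch_data_);
  caffe_copy(this->prefetch_data_.count(), this->prefetch_data_.cpu_data(),
             top[0]->mutable_cpu_data());
  if (this->output_labels_) {
    top[1]->ReshapeLike(this->prefetch_label_);
    caffe_copy(this->prefetch_label_.count(), this->prefetch_label_.cpu_data(),
               top[1]->mutable_cpu_data());
  }
  this->CreatePrefetchThread();  // start loading the next batch
}

}  // namespace caffe
```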

longjon added a series of commits to longjon/caffe that referenced this pull request between Dec 29, 2014 and Jan 3, 2015, each titled:
Reshape single input batches for inputs of varying dimension
@shelhamer (Member Author)

@longjon I'll take care of this once #1568 is reviewed and finished.

@bhack (Contributor) commented Jan 8, 2015

@longjon Can you check some of @sguada's latest comments on our #1416? I think that if we want to make top reshaping a Forward responsibility, we need to handle it the same way in both of these PRs.

@shelhamer (Member Author)

@longjon re: #1313 (comment) the DataLayerSetUp reshaping is needed if the net is to have any dimensions before a call to Forward, which matters for diagnostic output during initialization and for the data layer tests.

It does seem like this could be simplified so that data layers unconditionally reshape, but we need to decide what to do about Net::Init() and I'd rather defer that.
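To make the constraint above concrete, the setup-time reshape only has to peek at one record so the blobs carry valid shapes at Net::Init(); a rough sketch, with the usual caveat that the member names are assumptions rather than the actual patch:

```cpp
// In DataLayerSetUp (sketch): read one record to give the top and prefetch
// blobs valid dimensions before any Forward pass, which the initialization
// log and the data layer tests rely on.
Datum datum;
datum.ParseFromString(cursor_->value());  // peek at the first record
const int batch_size = this->layer_param_.data_param().batch_size();
top[0]->Reshape(batch_size, datum.channels(), datum.height(), datum.width());
this->prefetch_data_.Reshape(
    batch_size, datum.channels(), datum.height(), datum.width());
```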

@shelhamer (Member Author)

By the way, this is still broken w.r.t. crop. The reshaping works and has tests, but could use a clearer check to catch the case where batch size > 1 and the source data has different shapes instance by instance. That case fails a dimension check in the data transformer, but the error obscures the issue.

It can wait until a more general data reformation though.
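A hypothetical version of that clearer check (not part of this patch; `CHECK` is the usual glog macro and the blob accessors follow the old NCHW API):

```cpp
// Fail early with an explicit message when batch_size > 1 but the source data
// varies in shape, instead of tripping an opaque dimension CHECK deeper in
// the data transformer.
if (batch_size > 1) {
  CHECK(datum.channels() == this->prefetch_data_.channels() &&
        datum.height() == this->prefetch_data_.height() &&
        datum.width() == this->prefetch_data_.width())
      << "Inputs of varying dimension require batch_size == 1; "
      << "crop or resize the data to a fixed shape for larger batches.";
}
```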

shelhamer force-pushed the reshape-data-layer branch 6 times, most recently from 7fa470e to ba39b58, on February 6, 2015.
@shelhamer (Member Author)

@longjon DATA + IMAGE_DATA now reshape with tests. Could you review + merge?

Contributor

Should this be Reshape(this->prefetch_data_.num()? This is unconditional, not just batch size one, right?

philkr added a commit to philkr/caffe that referenced this pull request Feb 15, 2015
Reshape single input batches for inputs of varying dimension
To feed inputs of varying dimension, the `DATA` and `IMAGE_DATA` layers reshape their prefetch and top blobs when the batch size is 1.

The `BasePrefetchingDataLayer` always reshapes on forward.
@dxj19831029

Em... I think the Python wrapper needs to be adjusted as well, otherwise preprocessing will always resize the input image.
