
Conversation

@iamshnoo (Member) commented Mar 16, 2020

As per the suggestions made in the last few comments in #2269, I am opening this PR. As far as I can tell, this only has the 7 commits with 22 files changed by me, and the rest is the same as the current master branch of mlpack. I hope I am doing this right :)

@zoq @sreenikSS If you guys find some time, could you please review the changes?

I basically did all the work in my feature branch, then updated the master branch of my fork today and merged the feature branch into my master. Now I am trying to merge this master into mlpack:master. I was keeping this PR as a draft until it was reviewed. Never mind, I am switching from draft to review mode. I guess I just wanted to use the draft feature badly :)

@geekypathak21 (Member)

@iamshnoo Always keep the master branch for rebasing. If you make changes on master, it will cause problems when you later want to rebase your PR. Make sure any changes you make are on a feature branch.
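The workflow being suggested can be sketched roughly as follows; this is a runnable demo in a throwaway directory, and the repository, remote, and branch names are illustrative stand-ins, not mlpack's actual remotes:

```shell
# Keep master as a clean mirror of upstream; do all work on a feature branch.
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Stand-in for the upstream repo (mlpack/mlpack).
git init -q upstream
(cd upstream \
  && echo base > file.txt && git add file.txt \
  && git -c user.email=demo@example.com -c user.name=demo commit -q -m "upstream: initial")

# Your fork's local clone; "origin" points at the upstream stand-in here.
git clone -q upstream fork
cd fork
base=$(git symbolic-ref --short HEAD)   # default branch name of this git version

# All changes go on a feature branch, never on master.
git checkout -q -b my-feature
echo feature > feature.txt && git add feature.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "feature: add getters"

# Upstream moves on in the meantime...
(cd ../upstream \
  && echo more >> file.txt && git add file.txt \
  && git -c user.email=demo@example.com -c user.name=demo commit -q -m "upstream: new work")

# ...so refresh master from upstream, then rebase the feature branch onto it.
git checkout -q "$base"
git pull -q origin "$base"
git checkout -q my-feature
git rebase -q "$base"
```

Because master is never touched directly, the pull fast-forwards cleanly and the rebase simply replays the feature commits on top of the fresh master.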

@iamshnoo (Member Author) commented Mar 16, 2020

@himanshupathak21061998 I am kind of new to the git workflow; I had never heard of rebasing before this. But thank you, I will keep this in mind for any new PRs I create. Unless, of course, you are suggesting that I should do that for this PR itself?

Comment on lines +127 to +131
//! Get the value of the includeKl parameter.
bool IncludeKL() const { return includeKl; }

//! Get the value of the beta hyperparameter.
double Beta() const { return beta; }
Contributor

Nice detail with distinguishing parameter and hyperparameter :)

Member Author

😅

@sreenikSS (Contributor)

@iamshnoo could you add a simple test to check if the returned values are correct?

You'll need to make changes to the file src/mlpack/tests/ann_layer_test.cpp.

@iamshnoo (Member Author)

I am going to try some experiments with git now, and in case I mess up, I will probably delete my fork and restart. If that happens, I might have to re-open this PR separately, so just a heads-up about this!

And @sreenikSS sure, I will write down those tests as soon as I can. Thanks

Update forked master with upstream changes.
@zoq (Member) left a comment


That should at least resolve the first error; maybe there are others as well.

OutputDataType& Gradient() { return gradient; }

//! Get the input width.
size_t const& InputWidth() const { return width; }
Member

Some modules in mlpack only check for InputWidth and expect that OutputWidth is there as well; either we remove the getter here, or we add a pseudo OutputWidth method as well.

Member Author

I see!

@iamshnoo (Member Author)

> I am going to try some experiments with git now and in case I mess up, I will probably delete my fork and re-start. If that happens I might have to re-open this PR separately, so just a heads up about this!

I messed up. I am going to delete my fork now!


5 participants