Accessor method implementations (continued) #2307
Conversation
Accessor implementation
@iamshnoo Always keep the master branch clean for rebasing. If you rebase your PR now, it will cause problems for you. Make sure that any changes you make are on a feature branch.
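For context, a minimal sketch of the workflow being suggested; the remote and branch names here are assumptions, not taken from the PR:

# Keep the fork's master untouched by PR work; it only tracks upstream.
git checkout master
git pull upstream master          # assumes a remote named 'upstream' for mlpack/mlpack
git checkout -b accessor-work     # hypothetical feature branch; commit PR changes here
# ...later, to pick up new upstream commits:
git checkout master && git pull upstream master
git checkout accessor-work && git rebase master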
@himanshupathak21061998 I am kind of new to the git workflow. I had never heard of rebasing before this. But thank you, I will keep this in mind for any new PRs I create. Unless, of course, you are suggesting that I should do that for this PR itself?
//! Get the value of the includeKl parameter.
bool IncludeKL() const { return includeKl; }

//! Get the value of the beta hyperparameter.
double Beta() const { return beta; }
Nice detail with distinguishing parameter and hyperparameter :)
😅
@iamshnoo could you add a simple test to check if the returned values are correct? You'll need to make changes to the file src/mlpack/tests/ann_layer_test.cpp.
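A minimal sketch of such an accessor test, in the Boost.Test style that ann_layer_test.cpp used at the time; the layer name SomeLayer and its constructor arguments are placeholders, not the actual API:

// To be added to src/mlpack/tests/ann_layer_test.cpp, which already
// pulls in the headers and test framework it needs.
BOOST_AUTO_TEST_CASE(SomeLayerAccessorTest)
{
  // Construct a layer with known hyperparameter values (illustrative).
  SomeLayer<> layer(2.0 /* beta */, true /* includeKl */);

  // The getters should hand back exactly what was passed in.
  BOOST_REQUIRE_EQUAL(layer.IncludeKL(), true);
  BOOST_REQUIRE_CLOSE(layer.Beta(), 2.0, 1e-5);
}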
I am going to try some experiments with git now, and in case I mess up, I will probably delete my fork and restart. If that happens I might have to re-open this PR separately, so just a heads up! And @sreenikSS sure, I will write those tests as soon as I can. Thanks!
Update forked master with upstream changes.
zoq left a comment
That should at least resolve the first error; maybe there are others as well.
OutputDataType& Gradient() { return gradient; }

//! Get the input width.
size_t const& InputWidth() const { return width; }
Some modules in mlpack only check for InputWidth and expect that OutputWidth is there as well, so either we remove the getter here or add a pseudo OutputWidth method.
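For illustration, a minimal sketch of the second option, assuming the layer stores its width in a member named width, as the getter above suggests:

//! Get the output width (pseudo accessor mirroring InputWidth, so that
//! modules probing both getters find a matching pair).
size_t const& OutputWidth() const { return width; }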
I see!
I messed up. I am going to delete my fork now!
As per the suggestions made in the last few comments in #2269, I am opening this PR. As far as I can tell, this only has the 7 commits with 22 files changed by me, and the rest is the same as the master branch of mlpack as of now. I hope I am doing this right :)
@zoq @sreenikSS If you find some time, could you please review the changes?
I basically did all the work in my feature branch, then updated the master branch of my fork today, and merged the feature branch into my master. Now I am trying to merge this master into mlpack:master (a rough sketch of this workflow follows below).
I am keeping this PR as a draft for now, until it is reviewed. Never mind, I am switching from draft to review mode. I guess I just wanted to use the draft feature badly :)
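A rough sketch of the workflow described above; the remote and branch names are assumptions:

git checkout master
git fetch upstream                 # 'upstream' assumed to point at mlpack/mlpack
git merge upstream/master          # bring the fork's master up to date
git merge accessor-work            # fold the hypothetical feature branch into master
git push origin master             # the PR is then opened from the fork's master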