Added accessor method implementations for ANN layers #2269
Conversation
Thanks for opening your first pull request in this repository! Someone will review it when they have a chance. In the meantime, please be sure that you've handled the following things, to make the review process quicker and easier:
Thank you again for your contributions! 👍
typo fix in recurrent_attention layer
Improving Contributing guidelines
Removed r-value references from remaining functions [and some style fixes]
Fixed markdown pipe symbol issue
src/mlpack/methods/ann/layer/add.hpp
Outdated
    OutputDataType& Gradient() { return gradient; }

    //! Get the output size.
    size_t const& OutputSize() const { return outSize; }
Can you use the same style as used here:
    OutputDataType TrainingVariance() { return runningVariance / count; }

    //! Get the number of input units.
    size_t const& InSize() const { return size; }
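If the intent is the getter/modifier pair used elsewhere in the codebase, the accessor in add.hpp might end up looking like this (a minimal sketch; the modifier is an assumption about the requested style, and outSize is the member from the diff above):

    //! Get the output size.
    size_t const& OutputSize() const { return outSize; }
    //! Modify the output size.
    size_t& OutputSize() { return outSize; }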
See comment above, I think we should use the same style across the codebase.
Okay! Will update these changes in a commit probably a few hours from now.
* soft shrinkage activation function
* changes
* new changes
* new changes
* void Fn complete
* comments
* changes
* changes
* lambda as param
* added derivate also
* changes
* complete function
* changes in ss function
* new changes
* style corrections
* style corrections 2
* corrections
* corrections
* final corrections
* base layer changes
* cmake softshrink function
* softshrink.hpp in layer
* softshrink_impl.hpp added
* layer_types.hpp updated
* updating activation_functions_test.cpp
* soft shrink changes
* commit
* style corrections
* style changes
* changes style
* small style changes
* changes
* style checks
* final checks
* changes
* doxygen syntax for comment
* soft shrink comments
* style checks for 80 characters
* dummy commit
* softshrink.hpp to softshrink_impl.hpp fn to forward
* bug fixes in build
* style fixes
* updating lambda definition Co-Authored-By: Ryan Birmingham <[email protected]>
* Use _Internal module, not util.
* Print code snippets in ```julia blocks.
* Always pass 64-bit integers back and forth.
* Additional i386 fixes.
* Remove unneeded REQUIRE.
* Add docstrings to the mlpack module.
* New NumFOCUS badge mockup
* Match style of other badges
* Minor changes reflecting discussion
* Use Csize_t instead of uint64_t, and UInt/Int.
* Oops, fix declaration of s.
* Switch int64_t to int.
* Remove inaccurate comments.
* Fix a few other minor issues.
* Redirected badge and removed info about NUMfocus
* Reverted deletion and added attribution line
* added @tparam and rvalue correction according to recent refactor pr

Co-authored-by: Ryan Birmingham <[email protected]>
Co-authored-by: Ryan Curtin <[email protected]>
Co-authored-by: Sriram <[email protected]>
Use Cint to match C++'s int type, not Julia's Int.
sreenikSS left a comment
Good job. However, keep in mind, as Ryan has already said, that you'll need to pull master and then make these changes to avoid a conflict.
    //! Get the value of model parameter.
    bool const& Model() const { return model; }
You may consider changing the name of the function, as Model() already exists and returns the network variable.
I also don't think that having a getter function for the private variable model is absolutely necessary.
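To illustrate the clash (a sketch; the signature of the existing getter is an assumption based on the surrounding discussion): the two overloads would differ only in const qualification, so which one is called depends on the constness of the object.

    // Existing getter (assumed signature): returns the layer's network.
    std::vector<LayerTypes<>>& Model() { return network; }
    // Proposed getter: returns the boolean flag with the same name.
    bool const& Model() const { return model; }
    // This compiles (the overloads differ in const qualification), but a
    // const object would yield the flag while a non-const one yields the
    // network, which is confusing for callers.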
Okay! I see your point. I will remove the Model() getters in the next commit.
    //! Get the value of model parameter.
    bool const& Model() const { return model; }
Same as above.
    //! Get the value of run parameter.
    bool const& Run() const { return run; }
Looks like run is not something that might benefit from a getter function.
On the other hand, having a getter coupled with a modifier might make sense if the user wants to manually shift between forward and backward. Any thoughts on this @zoq?
I am going ahead and adding a setter for this now. If required I can remove it in a later commit.
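A minimal sketch of the getter/modifier pair under discussion, in the reference-returning style used for the other accessors (the member name run comes from the diff above):

    //! Get the value of run parameter.
    bool const& Run() const { return run; }
    //! Modify the value of run parameter.
    bool& Run() { return run; }

With the modifier in place, a user can switch between forward and backward behavior manually, e.g. layer.Run() = false;.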
    //! Get the number of steps to backpropagate through time.
    size_t const& Rho() const { return rho; }
This is fine; however, little can be known from rho alone. I think obtaining the other constructor parameters would be useful too.
So, there are four other parameters in the constructor, namely const StartModuleType& start, const InputModuleType& input, const FeedbackModuleType& feedback, and const TransferModuleType& transfer. I did not write getter functions for them because the types they would return are not returned by any other layer. Most probably this is because these modules are pushed into the network and then returned via the getter for the network. Let me know if I should still be returning them via getters and, if so, what style I should follow!
Yes you are right, I had overlooked that. No need to write separate getters for these since we have a getter for the network itself.
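To illustrate the point (a sketch with assumed member names; the parameter types are the ones quoted above), the modules are stored in the network container at construction, so the existing network getter already exposes them:

    // Hypothetical constructor body: the four modules are pushed into
    // `network`, so the network getter returns them along with
    // everything else.
    network.push_back(start);
    network.push_back(input);
    network.push_back(feedback);
    network.push_back(transfer);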
    //! Get the module output size.
    size_t const& OutSize() const { return outSize; }

    //! Get the number of steps to backpropagate through time.
    size_t const& Rho() const { return rho; }
Same as above, i.e., the other constructor parameters can have getters too.
Same as above
    //! Get the value of the model parameter.
    bool const& Model() const { return model; }
Same as the changes suggested in highway.hpp.
    //! Get the width of each sample.
    size_t const& InSize() const { return inSize; }

    //! Get the starting row index of subview vector or matrix.
    size_t const& BeginRow() const { return beginRow; }

    //! Get the ending row index of subview vector or matrix.
    size_t const& EndRow() const { return endRow; }

    //! Get the starting column index of subview vector or matrix.
    size_t const& BeginCol() const { return beginCol; }

    //! Get the ending column index of subview vector or matrix.
    size_t const& EndCol() const { return endCol; }
Might as well add modifiers for these.
Will do!
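For reference, the modifiers would mirror the getters above (a sketch showing two of them; the rest follow the same pattern):

    //! Modify the starting row index of subview vector or matrix.
    size_t& BeginRow() { return beginRow; }
    //! Modify the ending row index of subview vector or matrix.
    size_t& EndRow() { return endRow; }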
Just a comment to avoid possible confusion: I am new to git, so in case the commit history looks weird, my bad :) Also, I am not yet done with implementing all the changes requested; I am working on that commit as of now, and it shouldn't be long before I complete it and push.
@iamshnoo it is currently very difficult to review the changes because the changes that you have pulled have made it into your commits as well. Reviewing your code separately would be a big hassle, since a lot of files have been changed (you can take a look at the Files Changed tab). Could you create a new branch, copy-paste-replace the files where you have made changes, and submit a new PR (just make sure there are no new changes to those files by other people after the last time you modified them)? Git might offer a better way to do it, but this is what comes to mind right now.
@sreenikSS I think updating the master branch will also work for this problem. I was also dealing with the same issue, but when I updated the master branch of my forked repo, the issue was solved.
Thanks @himanshupathak21061998 @sreenikSS for your suggestions. Also, should I be closing this PR, as only the other one is now relevant?
@iamshnoo you can close this one.
Okay! Sure.
Solves the issue described in #2258.
The final comment by me in that issue gives a high-level overview of the changes I have made here.
@zoq @sreenikSS Could you guys give me some feedback about this PR? Thanks.