
Conversation

@iamshnoo
Member

@iamshnoo iamshnoo commented Mar 9, 2020

Solves the issue described in #2258
The final comment by me in that issue gives a high-level overview of the changes I have made here.

@zoq @sreenikSS Could you guys give me some feedback about this PR? Thanks.

@mlpack-bot

mlpack-bot bot commented Mar 9, 2020

Thanks for opening your first pull request in this repository! Someone will review it when they have a chance. In the meantime, please be sure that you've handled the following things, to make the review process quicker and easier:

  • All code should follow the style guide
  • Documentation added for any new functionality
  • Tests added for any new functionality
  • Tests that are added follow the testing guide
  • Headers and license information added to the top of any new code files
  • HISTORY.md updated if the changes are big or user-facing
  • All CI checks should be passing

Thank you again for your contributions! 👍

OutputDataType& Gradient() { return gradient; }

//! Get the output size.
size_t const& OutputSize() const { return outSize; }
Member

OutputDataType TrainingVariance() { return runningVariance / count; }

//! Get the number of input units.
size_t const& InSize() const { return size; }
Member

See comment above, I think we should use the same style across the codebase.
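For reference, a minimal sketch of the accessor style being suggested, where const getters return a const reference (a hypothetical layer written for illustration, not mlpack's actual code):

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical layer fragment illustrating one consistent accessor style:
// const getters that return a const reference to the stored member.
class ExampleLayer
{
 public:
  ExampleLayer(const size_t inSize, const size_t outSize) :
      inSize(inSize), outSize(outSize) { }

  //! Get the number of input units.
  size_t const& InSize() const { return inSize; }

  //! Get the output size.
  size_t const& OutputSize() const { return outSize; }

 private:
  //! Locally-stored number of input units.
  size_t inSize;
  //! Locally-stored output size.
  size_t outSize;
};
```

Using the same signature shape for every getter keeps the API predictable across layers.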

Member Author

Okay! Will update these changes in a commit probably a few hours from now.

ojhalakshya and others added 3 commits March 13, 2020 15:37
* soft shrinkage activation function

* changes

* new changes

* new changes

* void Fn complete

* comments

* changes

* changes

* lambda as param

* added derivate also

* changes

* complete function

* changes in ss function

* new changes

* soft shrinkage activation function

* changes

* new changes

* new changes

* void Fn complete

* comments

* changes

* changes

* lambda as param

* added derivate also

* changes

* complete function

* changes in ss function

* new changes

* style corrections

* style corrections 2

* corrections

* corrections

* final corrections

* base layer changes

* cmake softshrink function

* softshrink.hpp in layer

* softshrink_impl.hpp added

* layer_types.hpp updated

* updating activation_functions_test.cpp

* soft shrink changes

* commit

* style corrections

* style changes

* changes style

* small style changes

* changes

* style checks

* final checks

* changes

* doxygen syntax for comment

* soft shrink comments

* style checks for 80 characters

* dummy commit

* softshrink.hpp to softshrink_impl.hpp fn to forward

* bug fixes in build

* style fixes

* updating lambda definition

Co-Authored-By: Ryan Birmingham <[email protected]>

* Use _Internal module, not util.

* Print code snippets in ```julia blocks.

* Always pass 64-bit integers back and forth.

* Additional i386 fixes.

* Remove unneeded REQUIRE.

* Add docstrings to the mlpack module.

* New NumFOCUS badge mockup

* Match style of other badges

* Minor changes reflecting discusssion

* Use Csize_t instead of uint64_t, and UInt/Int.

* Oops, fix declaration of s.

* Switch int64_t to int.

* Remove inaccurate comments.

* Fix a few other minor issues.

* Redirected badge and removed info about NUMfocus

* Reverted deletion and added attribution line

* added @tparam and rvalue correction according to recent refactor pr

Co-authored-by: Ryan Birmingham <[email protected]>
Co-authored-by: Ryan Curtin <[email protected]>
Co-authored-by: Sriram <[email protected]>
Use Cint to match C++'s int type, not Julia's Int.
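Several of the commits above implement the soft shrinkage activation function. A minimal scalar sketch, assuming the usual definition with threshold lambda (the 0.5 default here is an illustrative choice, not necessarily mlpack's):

```cpp
#include <cassert>
#include <cmath>

// Soft shrinkage: f(x) = x - lambda for x > lambda,
//                 f(x) = x + lambda for x < -lambda,
//                 f(x) = 0          otherwise.
double SoftShrink(const double x, const double lambda = 0.5)
{
  if (x > lambda)
    return x - lambda;
  else if (x < -lambda)
    return x + lambda;
  return 0.0;
}

// Derivative of soft shrinkage: 1 where |x| > lambda, 0 otherwise.
double SoftShrinkDeriv(const double x, const double lambda = 0.5)
{
  return (std::abs(x) > lambda) ? 1.0 : 0.0;
}
```

The element-wise layer versions simply apply these scalar functions over the input matrix.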
Contributor

@sreenikSS sreenikSS left a comment

Good job. However, keep in mind, as Ryan has already said, that you'll need to pull master and then make these changes to avoid a conflict.

Comment on lines 177 to 178
//! Get the value of model parameter.
bool const& Model() const { return model;}
Contributor

You may consider changing the name of the function as Model() already exists and returns the network variable.

I also don't think that having a getter function for the private variable model is absolutely necessary.

Member Author

Okay! I see your point. I will remove the Model() getters in the next commit.

Comment on lines 138 to 139
//! Get the value of model parameter.
bool const& Model() const { return model; }
Contributor

Same as above.

Comment on lines 141 to 143
//! Get the value of run parameter.
bool const& Run() const { return run; }

Contributor

Looks like run is not something that would benefit from a getter function on its own.
On the other hand, having a getter coupled with a modifier might make sense if the user wants to manually switch between forward and backward. Any thoughts on this @zoq?

Member Author

I am going ahead and adding a setter for this now. If required I can remove it in a later commit.
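A minimal sketch of the getter-plus-modifier pairing under discussion (a hypothetical layer for illustration; the real layer stores more state):

```cpp
#include <cassert>

// Hypothetical fragment: a const getter paired with a non-const modifier,
// so callers can both inspect the flag and flip it, e.g. layer.Run() = false;
class ExampleRecurrentLayer
{
 public:
  ExampleRecurrentLayer() : run(true) { }

  //! Get the value of the run parameter.
  bool const& Run() const { return run; }
  //! Modify the value of the run parameter.
  bool& Run() { return run; }

 private:
  //! Whether the layer runs its forward pass.
  bool run;
};
```

The overload pair lets the compiler pick the const version for read-only access and the reference-returning version for assignment.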

Comment on lines +142 to +144
//! Get the number of steps to backpropagate through time.
size_t const& Rho() const { return rho; }

Contributor

This is fine; however, little can be known by getting rho alone. I think getters for the other constructor parameters would be useful too.

Member Author

So, there are 4 other variables in the constructor, namely const StartModuleType& start, const InputModuleType& input, const FeedbackModuleType& feedback and const TransferModuleType& transfer. I did not write getter functions for them because the types they would return are not returned by getters in any other layer. Most probably this is because these modules are pushed into the network and then returned via the getter for the network. Let me know if I should still return them via getters and, if so, what style I should follow!

Contributor

Yes you are right, I had overlooked that. No need to write separate getters for these since we have a getter for the network itself.
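The pattern described above can be sketched as follows (a toy stand-in using strings in place of the real module types, purely for illustration):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch: constructor modules are pushed into one network
// container, so a single Model() getter exposes them all and no
// per-module getters are needed.
class ExampleRecurrentAttention
{
 public:
  ExampleRecurrentAttention()
  {
    // In the real layer these would be the start, input, feedback, and
    // transfer modules; strings stand in for them here.
    network.push_back("start");
    network.push_back("input");
    network.push_back("feedback");
    network.push_back("transfer");
  }

  //! Get the model modules.
  std::vector<std::string> const& Model() const { return network; }

 private:
  //! Locally-stored network modules.
  std::vector<std::string> network;
};
```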

Comment on lines 142 to 147
//! Get the module output size.
size_t const& OutSize() const { return outSize; }

//! Get the number of steps to backpropagate through time.
size_t const& Rho() const { return rho; }

Contributor

Same as above, i.e., the other constructor parameters can have getters too.

Member Author

Same reason as the one I specified above: rnnModule and actionModule are pushed back into the network, and then we return the network via its getter.

Contributor

Same as above

Comment on lines 176 to 178
//! Get the value of the model parameter.
bool const& Model() const { return model; }

Contributor

Same as the changes suggested in highway.hpp.

Comment on lines 132 to 146
//! Get the width of each sample.
size_t const& InSize() const { return inSize; }

//! Get the starting row index of subview vector or matrix.
size_t const& BeginRow() const { return beginRow; }

//! Get the ending row index of subview vector or matrix.
size_t const& EndRow() const { return endRow; }

//! Get the starting column index of subview vector or matrix.
size_t const& BeginCol() const { return beginCol; }

//! Get the ending column index of subview vector or matrix.
size_t const& EndCol() const { return endCol; }

Contributor

Might as well add modifiers for these.

Member Author

Will do!

@iamshnoo
Member Author

Just a comment to avoid possible confusion. I am new to git. So, in case the commit history looks weird, my bad :) Also, I am not yet done with implementing all the changes requested. I am working on that commit as of now. And it shouldn't be long before I complete it and push.

@iamshnoo iamshnoo requested review from sreenikSS and zoq March 16, 2020 13:32
@sreenikSS
Contributor

@iamshnoo it is currently very difficult to review the changes because the changes that you have pulled have made it to your commits as well. So reviewing your code separately would be a big hassle since there are a lot of files that have been changed (you can take a look at the Files Changed tab).

Could you just do something like create a new branch and copy-paste-replace the files where you have made changes and submit a new PR (just make sure there are no new changes to those files by other people after the last time you modified them)? Git might offer a better way to do it but this is what comes to my mind right now.

@geekypathak21
Member

@sreenikSS I think updating the master branch will also work for this problem. I was dealing with the same issue too, and it was solved when I updated the master branch of my forked repo. Just do:

git checkout master
git remote add upstream https://github.com/mlpack/mlpack.git
git pull upstream master
git push

@iamshnoo
Member Author

Thanks @himanshupathak21061998 @sreenikSS for your suggestions.
I have created #2307 and hopefully it is correct this time :)
Maybe you guys would like to have a look at the changes there?

Also, should I be closing this PR as only the other one is now relevant?

@sreenikSS
Contributor

@iamshnoo you can close this one.

@iamshnoo
Member Author

Okay! Sure.

@iamshnoo iamshnoo closed this Mar 17, 2020
