TF: Unpack model inputs through a decorator #15907
Conversation
No more need for this ugly work-around for non-NLP models 😎
sgugger left a comment:
So nice and elegant! Love how it makes the modeling code so much more readable!
This looks great! Also pinging @patrickvonplaten for review as it's quite central.
Thanks for running the tests on it. It is indeed much more readable this way.
Looks very nice! Thanks for cleaning this up
Force-pushed from d39f916 to d43d258.
It seems like everyone is in agreement :) @Rocketknight1 can you have a final look, and approve if you agree with the change?
Rocketknight1 left a comment:
This looks great to me! Is the plan to wait and see how things go with BERT and, assuming no problems, to repeat those edits across all model classes?
Yeah! I'm also going to open an issue with the
What does this PR do?
Unpacks TF model inputs through a decorator, improving code clarity. To be read with issue #15908, which holds the description of the problem, the proposed solution, and future plan.
Closes #15908
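To illustrate the idea, here is a minimal, self-contained sketch of an input-unpacking decorator. It is not the actual `unpack_inputs` from `modeling_tf_utils.py` (which handles TF tensors, booleanization of config-dependent flags, and more); the names and behavior below are simplified assumptions for illustration only.

```python
# Hypothetical sketch of an input-unpacking decorator, NOT the real
# `unpack_inputs` from transformers. It shows the core trick: if `call`
# receives a Keras-style packed dict as its first argument, re-dispatch
# the dict entries as keyword arguments so the model body can use plain
# named parameters.
import functools
import inspect


def unpack_inputs(func):
    signature = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], dict):
            packed, remaining = args[0], args[1:]
            # Only forward keys the wrapped function actually accepts.
            accepted = set(signature.parameters) - {"self"}
            unpacked = {k: v for k, v in packed.items() if k in accepted}
            return func(self, *remaining, **{**unpacked, **kwargs})
        return func(self, *args, **kwargs)

    return wrapper


class ToyModel:
    @unpack_inputs
    def call(self, input_ids=None, attention_mask=None, training=False):
        # The body sees clean named arguments either way.
        return {"input_ids": input_ids, "attention_mask": attention_mask}


model = ToyModel()
# Packed (Keras-style dict) and unpacked calls now behave identically:
packed = model.call({"input_ids": [1, 2], "attention_mask": [1, 1]})
unpacked = model.call(input_ids=[1, 2], attention_mask=[1, 1])
assert packed == unpacked
```

The benefit mirrored in the PR: the argument-normalization boilerplate lives once in the decorator, so each model's `call` body stays readable.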
How to review: start with `modeling_tf_utils.py`, then check the changes in the models. I've run `RUN_SLOW=1 py.test -vv tests/model_name/test_modeling_tf_model_name.py` for all involved models. The changes were applied to BERT (plus its copy-linked architectures) and Speech2Text, showing that the approach works across multiple modalities.