Add flag torch_jit_disable_warning_prints to allow disabling all warnings.warn #49313
gmagogsfm wants to merge 1 commit into pytorch:master from gmagogsfm:disable_warning
Conversation
facebook-github-bot
left a comment
@gmagogsfm has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Codecov Report
@@            Coverage Diff            @@
##           master   #49313     +/-   ##
=========================================
  Coverage   80.56%   80.56%
=========================================
  Files        1874     1874
  Lines      202776   202777      +1
=========================================
+ Hits       163362   163363      +1
  Misses      39414    39414
Why not?
It's a bit of a long story; there are a couple of reasons:
I am not entirely happy with the current approach because it completely suppresses all warnings, but I couldn't come up with a good alternative. Let me know if you know of good options for implementing the functionality of "limiting the number of warnings fired per process regardless of how many independent inference calls are made".
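For reference, CPython itself has a process-wide analogue of "fire a warning only once per process regardless of call site": the `"once"` action of the `warnings` filter, which dedupes on the message text and category rather than on the call location. This is a minimal sketch of that behavior (the `op_a`/`op_b` helpers are hypothetical stand-ins for two operators emitting the same warning), not something TorchScript's interpreter currently does:

```python
import warnings

def op_a():
    warnings.warn("this overload is deprecated", DeprecationWarning)

def op_b():
    # A different call site emitting the same warning text.
    warnings.warn("this overload is deprecated", DeprecationWarning)

with warnings.catch_warnings(record=True) as caught:
    # "once" dedupes on (message, category) process-wide, so the warning
    # fires once total across both call sites and all loop iterations.
    warnings.simplefilter("once")
    for _ in range(3):
        op_a()
        op_b()

print(len(caught))  # 1
```

Replicating this in the TorchScript interpreter would require a process-global registry keyed by warning text, which is the kind of alternative the comment above is asking about.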
@gmagogsfm merged this pull request in 7518f54.
…ings.warn (pytorch#49313)

Summary: Adding a flag torch_jit_disable_warning_prints to optimize interpreter performance by suppressing a potentially large number of warnings.warn calls. This works around TorchScript's warning-behavior mismatch with Python: Python by default triggers a warning once per location, but TorchScript doesn't support that, so the same warning triggers and prints once per inference run, hurting performance.

Pull Request resolved: pytorch#49313
Reviewed By: SplitInfinity
Differential Revision: D25534274
Pulled By: gmagogsfm
fbshipit-source-id: eaeb57a335c3e6c7eb259671645db05d781e80a2
Adding a flag torch_jit_disable_warning_prints to optimize interpreter performance by suppressing a potentially large number of warnings.warn calls. This works around TorchScript's warning-behavior mismatch with Python: Python by default triggers a warning once per location, but TorchScript doesn't support that, so the same warning triggers and prints once per inference run, hurting performance.
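The once-per-location behavior the PR description refers to can be seen in plain Python. A minimal sketch (the `noisy` helper is hypothetical, standing in for a warning raised inside one scripted operator; the `"always"` filter is used here only to mimic the warn-every-run behavior described above):

```python
import warnings

def noisy():
    # Single call site, mimicking a warning inside one operator.
    warnings.warn("operator is deprecated", UserWarning)

# Python's stock "default" filter fires once per call site per process:
# repeated calls from the same line are suppressed after the first.
with warnings.catch_warnings(record=True) as caught_default:
    warnings.simplefilter("default")
    for _ in range(5):
        noisy()

# An "always" filter fires on every call, which is roughly the repeated
# per-inference-run printing this PR's flag exists to silence.
with warnings.catch_warnings(record=True) as caught_always:
    warnings.simplefilter("always")
    for _ in range(5):
        noisy()

print(len(caught_default), len(caught_always))  # 1 5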