Conversation

@ssnl ssnl commented Jul 17, 2019

Fixes #22131
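
For reference, a minimal usage sketch of the argument this PR adds, assuming the merged API (which accepts either a start-method name or a context object). The toy dataset below is illustrative, not part of the PR:

import multiprocessing

import torch
from torch.utils.data import DataLoader, Dataset

class SquaresDataset(Dataset):
    # Toy dataset returning i**2; defined at module level so spawn workers can pickle it.
    def __len__(self):
        return 16

    def __getitem__(self, i):
        return torch.tensor(i * i)

if __name__ == "__main__":
    # Pass a start-method name ...
    loader = DataLoader(SquaresDataset(), batch_size=4, num_workers=2,
                        multiprocessing_context="spawn")
    # ... or a context object from the standard multiprocessing module.
    ctx = multiprocessing.get_context("spawn")
    loader_from_ctx = DataLoader(SquaresDataset(), batch_size=4, num_workers=2,
                                 multiprocessing_context=ctx)
    for batch in loader:
        print(batch)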

@pytorchbot pytorchbot added the module: dataloader (Related to torch.utils.data.DataLoader and Sampler) and module: multiprocessing (Related to torch.multiprocessing) labels Jul 17, 2019
@zhangguanheng66 zhangguanheng66 added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Jul 18, 2019

ssnl commented Jul 19, 2019

@pytorchbot rebase this please


ssnl commented Jul 26, 2019

@apaszke could you take a look? :) thanks

@apaszke (Contributor) left a comment

Blah, another argument to the DataLoader 😩

LGTM, but might be better to make the attribute always set, with the default being the multiprocessing module. That would make for simpler code.

assert self.num_workers > 0

if loader.multiprocessing_context is None:
    multiprocessing_context = multiprocessing

Can't we initialize the attribute with the multiprocessing module by default? You can make an exception for the isinstance() rule if it matches the module by identity.
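
A rough sketch of the simplification suggested here: initialize the attribute to the multiprocessing module itself so it is always set, and special-case that module by identity when validating other values. The class name and error messages are illustrative, not PyTorch's actual code:

import multiprocessing
import multiprocessing.context

class _LoaderSketch:
    def __init__(self, num_workers=0, multiprocessing_context=None):
        self.num_workers = num_workers
        if multiprocessing_context is None:
            # Always-set default: the top-level multiprocessing module, so worker
            # startup can call self.multiprocessing_context.Process(...) without
            # branching on None.
            multiprocessing_context = multiprocessing
        elif isinstance(multiprocessing_context, str):
            valid = multiprocessing.get_all_start_methods()
            if multiprocessing_context not in valid:
                raise ValueError("multiprocessing_context should be one of {}, got {!r}"
                                 .format(valid, multiprocessing_context))
            multiprocessing_context = multiprocessing.get_context(multiprocessing_context)
        elif multiprocessing_context is not multiprocessing and not isinstance(
                multiprocessing_context, multiprocessing.context.BaseContext):
            # Identity exception to the isinstance() rule: the bare module passes,
            # anything else must be a BaseContext instance.
            raise TypeError("multiprocessing_context should be a start method name, "
                            "a context object, or the multiprocessing module")
        self.multiprocessing_context = multiprocessing_context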

@facebook-github-bot left a comment

@soumith is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot left a comment

@colesbury has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot

@colesbury merged this pull request in e982e46.

@ssnl ssnl deleted the dl_ctx branch July 31, 2019 17:08
facebook-github-bot pushed a commit that referenced this pull request Aug 25, 2020
Summary:
#22990 added a multiprocessing_context argument to DataLoader, but a typo in the test causes the wrong DataLoader class to be used.

Pull Request resolved: #43343

Reviewed By: glaringlee

Differential Revision: D23299452

Pulled By: malfet

fbshipit-source-id: 9489c48b83bce36f46d350cad902f7ad96e1eec4
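
For context on the follow-up commit above, a hedged sketch of the kind of test that exercises the argument across start methods (this is not the actual test from #43343; RangeDataset and the test class are illustrative):

import multiprocessing
import unittest

import torch
from torch.utils.data import DataLoader, Dataset

class RangeDataset(Dataset):
    # Tiny picklable dataset so workers can rebuild it under any start method.
    def __len__(self):
        return 8

    def __getitem__(self, i):
        return torch.tensor(i)

class TestMultiprocessingContext(unittest.TestCase):
    def test_all_start_methods(self):
        for method in multiprocessing.get_all_start_methods():
            loader = DataLoader(RangeDataset(), batch_size=2, num_workers=2,
                                multiprocessing_context=method)
            seen = sorted(int(v) for batch in loader for v in batch)
            self.assertEqual(seen, list(range(8)))

if __name__ == "__main__":
    unittest.main()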

Labels

Merged
module: dataloader (Related to torch.utils.data.DataLoader and Sampler)
module: multiprocessing (Related to torch.multiprocessing)
open source
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Development

Successfully merging this pull request may close these issues.

[dataloader] Add a context= argument for multiprocessing

7 participants