
Missing argument and argument-ordering mismatch in torch.distributed.rpc.init_model_parallel doc #29905

@Kytabyte

Description

📚 Documentation

Regarding the torch.distributed.rpc.init_model_parallel function on the RPC documentation page in the master branch (here), I think a few points could be revised or polished:

  1. The argument worker_name_to_id is missing from the doc.
  2. The order of arguments in the doc does not match the function signature, which creates ambiguity and inconvenience for readers.
  3. The repr shown for the backend argument is too long to read and contains a lot of information readers don't need. It is arguable, but it might be better to give the class a concise repr (see the sketch after this list).
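
To illustrate points 2 and 3, here is a minimal sketch. The `RpcBackend` class and the `init_model_parallel` signature below are hypothetical stand-ins, not the actual torch.distributed.rpc API; the idea is only to show how a documented parameter list can be checked against the real signature with `inspect`, and what a concise `__repr__` could look like.

```python
import inspect
from enum import Enum


# Hypothetical stand-in for the RPC backend class; the real name, members,
# and default repr in torch.distributed.rpc may differ.
class RpcBackend(Enum):
    PROCESS_GROUP = "PROCESS_GROUP"

    def __repr__(self):
        # Concise repr, e.g. "RpcBackend.PROCESS_GROUP", instead of a
        # verbose fully-qualified dump.
        return f"{self.__class__.__name__}.{self.name}"


def documented_order_matches(func, documented_params):
    """Return True if the doc lists exactly the function's parameters,
    in the same order as the actual signature."""
    actual = list(inspect.signature(func).parameters)
    return documented_params == actual


# Toy signature standing in for init_model_parallel (not the real one).
def init_model_parallel(self_name, backend=RpcBackend.PROCESS_GROUP,
                        init_method=None, self_rank=-1,
                        worker_name_to_id=None):
    pass


print(repr(RpcBackend.PROCESS_GROUP))
# RpcBackend.PROCESS_GROUP

print(documented_order_matches(
    init_model_parallel,
    ["self_name", "backend", "init_method", "self_rank"]))
# False: worker_name_to_id is missing from the documented list
```

A check along these lines could even run as part of the doc build to keep the documented argument table in sync with the actual signature.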

cc @pietern @mrshenli @pritamdamania87 @zhaojuanmao @satgera @rohan-varma @gqchen @aazzolini @xush6528

Labels

    module: rpc (Related to RPC, distributed autograd, RRef, and distributed optimizer)
    oncall: distributed (Add this issue/PR to distributed oncall triage queue)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)