Pruning doesn't affect speed or memory usage #36214

@Cyril9227

Description

Hello,

I'm experimenting with the new pruning feature in PyTorch and going through the tutorial. My issue is that after pruning the weights of a ResNet, the inference time and the memory footprint are exactly the same.

How do I get a speed-up and a reduced memory footprint using pruning in PyTorch?

My code for pruning is pretty simple so far:

import torch
import torch.nn.utils.prune as prune

for name, module in model.named_modules():
    # prune 20% of connections in all 2D-conv layers
    if isinstance(module, torch.nn.Conv2d):
        prune.l1_unstructured(module, name='weight', amount=0.2)
        prune.remove(module, 'weight')  # make it permanent
    # prune 40% of connections in all linear layers
    elif isinstance(module, torch.nn.Linear):
        prune.l1_unstructured(module, name='weight', amount=0.4)
        prune.remove(module, 'weight')
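For context, here is a minimal sketch (a single toy `Conv2d` standing in for the ResNet layers) showing what the loop above actually does to a layer: `l1_unstructured` followed by `prune.remove` zeroes out the smallest-magnitude entries, but the weight tensor stays dense, with the same shape and element count as before.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(16, 32, kernel_size=3)
n_elements = conv.weight.nelement()

# same calls as in the loop above
prune.l1_unstructured(conv, name='weight', amount=0.2)
prune.remove(conv, 'weight')  # make it permanent

# roughly 20% of the entries are now exactly zero...
zeros = int((conv.weight == 0).sum())
print(zeros / n_elements)  # roughly 0.2

# ...but the tensor is still dense: same shape, same number of elements
print(conv.weight.nelement() == n_elements)
```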

For timing I use:

import time

start = time.time()
output = model(inputs)  # get predictions
end = time.time()
print(end - start)
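For what it's worth, a single wall-clock measurement like the one above is quite noisy. A sketch of a slightly more robust harness (the toy `Conv2d` and input shape here are placeholders for the actual ResNet and batch; on GPU you would also need `torch.cuda.synchronize()` before reading the clock):

```python
import time
import torch
import torch.nn as nn

model = nn.Conv2d(3, 16, kernel_size=3)  # stand-in for the real model
model.eval()
inputs = torch.randn(1, 3, 64, 64)

with torch.no_grad():
    for _ in range(5):  # warm-up runs, excluded from timing
        model(inputs)

    n_runs = 20
    start = time.perf_counter()
    for _ in range(n_runs):
        model(inputs)
    elapsed = (time.perf_counter() - start) / n_runs

print(f"{elapsed * 1e3:.3f} ms per forward pass")
```

`time.perf_counter()` is preferable to `time.time()` for short intervals, and averaging over several runs after a warm-up smooths out one-off overheads.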

Any help is welcome,
thanks

Metadata


    Labels

    module: pruning, triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
