Forgetting retain_variables=True causes segfault in conv2d #1288

@adamlerer

Description

Usually, calling backward twice gives this error message:

RuntimeError: Trying to backward through the graph second time, but the buffers have already been freed. Please specify retain_variables=True when calling backward for the first time.

But conv2d just segfaults.

import torch
import torch.nn as nn
from torch.autograd import Variable

l1 = nn.Conv2d(3, 10, 4, 2, 1, bias=False).cuda()
input = Variable(torch.cuda.FloatTensor(5, 3, 8, 8))

o1 = l1(input)

print("A")
o1.sum().backward()
print("B")
o1.sum().backward()
print("C")

$ pt test.py
A
B
Segmentation fault (core dumped)
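For contrast, a minimal CPU sketch of the *expected* behavior: a second backward() without retaining the graph should raise the RuntimeError quoted above, not segfault, and passing the retain flag on the first call makes the second pass legal. This uses nn.Linear instead of Conv2d/CUDA so it runs anywhere; note the flag was named retain_variables=True at the time of this issue, while current PyTorch calls it retain_graph=True.

```python
import torch
import torch.nn as nn

l1 = nn.Linear(3, 10, bias=False)
x = torch.randn(5, 3, requires_grad=True)

# Without retaining the graph, the second backward() should raise
# RuntimeError ("Trying to backward through the graph a second time...").
o1 = l1(x)
o1.sum().backward()
try:
    o1.sum().backward()
except RuntimeError as e:
    print("Got expected RuntimeError:", e)

# Retaining the graph on the first call permits a second backward().
# (retain_variables=True in the PyTorch version this issue targets;
# retain_graph=True in current releases.)
o2 = l1(x)
o2.sum().backward(retain_graph=True)
o2.sum().backward()
print("second backward succeeded")
```

The bug report is that the CUDA conv2d path skips this error check entirely and crashes instead.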
