If I have a tensor x of size (100000000,) and I save a slice of it with torch.save(x[:10]), it takes a lot of space on disk (the same space as saving the entire tensor). However, torch.save(x[:10].clone()) seems to serialize only the relevant part of the tensor.
Is this expected behaviour?
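For reference, here is a minimal sketch that reproduces what I'm seeing, measuring serialized size in memory with io.BytesIO instead of writing to disk (and using 1,000,000 elements instead of 100,000,000 so it runs quickly):

```python
import io
import torch

# A large 1-D tensor (scaled down from the original 100M elements).
x = torch.arange(1_000_000, dtype=torch.float32)

def saved_size(t: torch.Tensor) -> int:
    """Return the number of bytes torch.save would write for t."""
    buf = io.BytesIO()
    torch.save(t, buf)
    return buf.getbuffer().nbytes

# x[:10] is a view: it shares the full underlying storage with x,
# and torch.save serializes that whole storage.
view_size = saved_size(x[:10])

# .clone() allocates a fresh storage holding only the 10 elements.
clone_size = saved_size(x[:10].clone())

print(view_size, clone_size)  # the view is vastly larger than the clone
```

On my understanding, this happens because a slice is a view that keeps a reference to the original storage, and torch.save serializes storages, not just the visible elements.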