
Conversation

lantiga (Contributor) commented on May 2, 2017

This PR is a resubmission of #1405 rebased onto master after the autograd refactor merge.

This PR demonstrates a possible way to expose custom attributes from C++ functions.
Motivation: right now functions implemented in C++ do not expose function-specific attributes or methods (e.g. stride for ConvForward). This makes it impossible to get such attributes while traversing a computation graph.

Currently only convolution attributes have been exposed, with the aim of illustrating the approach. If the proposal gets a thumbs up, I'll proceed to expose attributes for the rest of the C++ functions.
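To make the motivation concrete, here is a hypothetical usage sketch (not taken from the PR). It assumes the graph node is reached through the output's creator/grad_fn attribute and that ConvForward exposes fields such as stride, padding, transposed and output_padding once this lands; exact attribute names and availability depend on the PyTorch version.

```python
import torch
import torch.nn.functional as F
from torch.autograd import Variable

x = Variable(torch.randn(1, 3, 8, 8), requires_grad=True)
w = Variable(torch.randn(4, 3, 3, 3), requires_grad=True)
y = F.conv2d(x, w, stride=2, padding=1)

# The node that produced `y` is the C++ convolution function; with this
# change its parameters become readable while walking the graph.
fn = getattr(y, 'grad_fn', None) or y.creator  # name differs across versions
print(fn.stride, fn.padding, fn.transposed, fn.output_padding)
```

The same pattern would apply to any other C++ function once its attributes are registered through the proposed getters.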

apaszke (Contributor) left a comment

Looks great now 👍

Code under review (generic attribute getter template):

  registerCppFunction(typeid(C), &type);
}

// Generic getter: reads the data member `ptr` from the function's C++
// object (cast to T) and returns it as a Python tuple, converting each
// element with Convert.
template<typename T, typename M, typename P, M P::*ptr, typename V, PyObject* (*Convert)(V)>
PyObject* getTupleAttr(PyObject* obj, void* _unused)
{
  THPCppFunction* self = (THPCppFunction*)obj;
  auto& arr = std::static_pointer_cast<T>(self->cdata).get()->*ptr;
  // ... (allocation of py_tuple and num_elems elided in the excerpt)
  for (size_t i = 0; i < num_elems; ++i) {
    PyTuple_SET_ITEM(py_tuple.get(), i, Convert(arr[i]));
  }
  return py_tuple.release();

lantiga (Contributor, Author) commented on May 3, 2017

Thanks for the review @apaszke! At this point I'll complete the PR with BatchNorm, the only other C++ function with parameters. BatchNorm has a thpp::Tensor as a parameter, so I'll write another getter in the process.
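As an aside, here is a rough sketch of what such a tensor-valued getter would enable from Python; the attribute names below (running_mean, running_var) are assumptions based on BatchNorm's saved parameters, not the final API of this PR.

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

bn = nn.BatchNorm2d(3)
x = Variable(torch.randn(2, 3, 8, 8), requires_grad=True)
y = bn(x)

# BatchNormForward is implemented in C++; a thpp::Tensor getter would let
# its saved running statistics be read off the graph node, for example:
fn = getattr(y, 'grad_fn', None) or y.creator
# print(fn.running_mean, fn.running_var)  # hypothetical attribute names
```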

Further excerpts under review: a getter body wrapped in HANDLE_TH_ERRORS, and two entries from the ConvForward attribute table instantiating the getter templates:

{
  HANDLE_TH_ERRORS
  THPCppFunction* self = (THPCppFunction*)obj;
  auto& arr = (T*)(self->cdata.get())->*ptr;

{(char*)"transposed", (getter)getValueAttr<ConvForward, bool, ConvParams,
    &ConvParams::transposed, long, PyBool_FromLong>, NULL, NULL, NULL},
{(char*)"output_padding", (getter)getTupleAttr<ConvForward, std::vector<int>, ConvParams,
    &ConvParams::output_padding, long, PyInt_FromLong>, NULL, NULL, NULL},


apaszke (Contributor) commented on May 7, 2017

@pytorchbot test this plesae

apaszke (Contributor) commented on May 7, 2017

@pytorchbot test this please

apaszke merged commit 23b556e into pytorch:master on May 7, 2017
apaszke (Contributor) commented on May 7, 2017

Thanks Luca!

caogang added a commit to caogang/pytorch that referenced this pull request on May 8, 2017:
* master:
  Add F.normalize (pytorch#1467)
  Expose custom attributes from C++ functions (pytorch#1430)
  Add high order gradient support for Sigmoid (pytorch#1496)
lantiga deleted the autograd-attributes branch on May 9, 2017.
Jiaming-Liu pushed a commit to Jiaming-Liu/pytorch that referenced this pull request on May 18, 2017.