Conversation

@gchanan (Contributor) commented Jul 27, 2018

CUDA LAPACK functions generally don't work unless `has_magma` is true.
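The fix amounts to skipping CUDA LAPACK-backed tests when MAGMA is unavailable. A minimal sketch of that skip pattern is below; note that `has_magma` is a hard-coded stand-in for the real `torch.cuda.has_magma` flag (an assumption made so the sketch runs without a GPU or a MAGMA build):

```python
import unittest

# Stand-in for torch.cuda.has_magma; hard-coded False here purely
# for illustration, so the skip path is exercised.
has_magma = False

class TestCudaLapack(unittest.TestCase):
    @unittest.skipIf(not has_magma, "MAGMA not available: CUDA LAPACK ops unsupported")
    def test_lapack_backed_op(self):
        # A real test would call a CUDA LAPACK-backed op here,
        # e.g. a matrix inverse on a CUDA tensor.
        pass

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestCudaLapack)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(len(result.skipped))  # prints 1: the test was skipped, not failed
```

Skipping (rather than letting the op fail) keeps CI green on CUDA builds compiled without MAGMA while still running the tests where MAGMA is present.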

@facebook-github-bot (Contributor) left a comment


gchanan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
Summary:
CUDA lapack functions generally don't work unless has_magma is true.
Pull Request resolved: pytorch#9936

Differential Revision: D9028579

Pulled By: gchanan

fbshipit-source-id: 9b77e3b05253fd49bcabf604d0924ffa0e116055
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
Summary:
CUDA lapack functions generally don't work unless has_magma is true.
Pull Request resolved: pytorch#9936

Differential Revision: D9028579

Pulled By: gchanan

fbshipit-source-id: 9b77e3b05253fd49bcabf604d0924ffa0e116055
@ezyang ezyang added the merged label Jun 26, 2019


4 participants