
Have to_type behave the same for torch.Tensor and custom types #711

Open · wants to merge 1 commit into base: master
Conversation

@ducksoup commented Feb 10, 2020

Currently, to_type() behaves differently for torch.Tensor and custom types: it calls to() directly on a custom type object without checking whether its data is floating point and resides on a GPU.
This change makes custom types behave the same as standard tensors, correctly handling non-floating-point values, provided they also expose is_floating_point, is_cuda, and to.
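A minimal sketch of the unified behavior described above (hypothetical, not the project's actual implementation): the same guard is applied to tensors and to any custom type exposing the three attributes, so only floating-point data residing on a GPU is converted. The `CustomType` class below is a toy stand-in for such a custom type.

```python
def to_type(dtype, t):
    """Sketch of the unified behavior described in this PR (assumed, not
    the project's code): convert only floating-point data on a GPU.

    Works for torch.Tensor and for any custom type that exposes
    is_floating_point(), is_cuda, and to().
    """
    if t.is_cuda and t.is_floating_point():
        return t.to(dtype)
    return t


class CustomType:
    """Toy custom type (hypothetical) exposing the tensor-like interface."""

    def __init__(self, data, floating, cuda):
        self.data = data
        self._floating = floating
        self.is_cuda = cuda

    def is_floating_point(self):
        return self._floating

    def to(self, dtype):
        # Convert the payload to the requested dtype (here a plain
        # Python callable such as float, standing in for a torch dtype).
        return CustomType(dtype(self.data), self._floating, self.is_cuda)
```

With this guard, an integer-valued custom object on the GPU is returned unchanged instead of being blindly converted, which is the bug the PR fixes.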
