Support for bfloat16 #7527

@philippwitte

Description

Are there plans to support the bfloat16 data type in the near future? This data type is becoming increasingly popular in LLM training, but it currently appears to be unsupported. For example, calling y = cp.asarray(x), where x is a torch tensor of type torch.bfloat16, raises "TypeError: Got unsupported ScalarType BFloat16". Are there any recommended workarounds in the meantime?
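One hedged workaround (not an official CuPy recommendation): upcast the tensor to float32 on the PyTorch side before handing it to CuPy, e.g. cp.asarray(x.float()). This is lossless because bfloat16 is, by definition, the top 16 bits of an IEEE-754 float32. The pure-Python sketch below demonstrates that bit relationship; the function names are illustrative, not from any library.

```python
import struct

def bf16_bits_to_f32(bits16: int) -> float:
    # bfloat16 is the high 16 bits of a float32, so shifting the
    # 16-bit pattern left by 16 reconstructs the exact float32 value.
    return struct.unpack("<f", struct.pack("<I", bits16 << 16))[0]

def f32_to_bf16_bits(value: float) -> int:
    # Keep only the top 16 bits of the float32 pattern (truncation;
    # hardware conversions typically round to nearest even instead).
    (bits32,) = struct.unpack("<I", struct.pack("<f", value))
    return bits32 >> 16

# 1.0 as float32 is 0x3F800000, so its bfloat16 bit pattern is 0x3F80.
print(hex(f32_to_bf16_bits(1.0)))
print(bf16_bits_to_f32(0x3F80))
```

Because widening bfloat16 to float32 is exact, the upcast-then-transfer route preserves the values; the cost is 2x the memory traffic and an extra copy.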

