ValueError: Pointer argument (at 3) cannot be accessed from Triton #136078

@foreverpiano

Description

🐛 Describe the bug

[image attachment]
Dao-AILab/flash-attention#523
This happens randomly when I try to use FlexAttention. How can I fix it?

Versions

PyTorch: nightly
Triton: 2.1.0
GPU: H100

cc @ezyang @chauhang @penguinwu @zou3519 @ydwu4 @bdhirsh @Chillee @drisspg @yanboliang @BoyuanFeng

Metadata

    Labels

    module: flex attention
    module: higher order operators (torch.cond and similar)
    module: pt2-dispatcher (PT2 dispatcher-related issues, e.g. aotdispatch, functionalization, faketensor, custom-op)
    oncall: pt2
    triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
