
Conversation

@ydshieh (Collaborator) commented Feb 4, 2022

What does this PR do?

Add cross attentions to the returned outputs of TF T5/LED, aligning them with the PyTorch versions.

(This also required a fix to undo padding. The same undo-padding fix is needed for Longformer, but that is handled in a separate PR.)

@patrickvonplaten
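
For context, a minimal sketch of what this change enables: after the fix, requesting attentions from the TF model should also populate `cross_attentions` in the output, matching PyTorch behavior. The checkpoint name and the printed shapes below are illustrative assumptions, not taken from the PR itself.

```python
import tensorflow as tf
from transformers import T5Tokenizer, TFT5ForConditionalGeneration

# Illustrative checkpoint; any T5 checkpoint should behave the same way.
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = TFT5ForConditionalGeneration.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: Hello", return_tensors="tf")
decoder_input_ids = tf.constant([[model.config.decoder_start_token_id]])

outputs = model(
    input_ids=inputs.input_ids,
    decoder_input_ids=decoder_input_ids,
    output_attentions=True,
    return_dict=True,
)

# With this PR, `cross_attentions` is populated for the TF model as well:
# one tensor per decoder layer, each of shape
# (batch_size, num_heads, decoder_seq_len, encoder_seq_len).
print(len(outputs.cross_attentions), outputs.cross_attentions[0].shape)
```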

@HuggingFaceDocBuilder commented Feb 4, 2022

The documentation is not available anymore as the PR was closed or merged.

@patrickvonplaten (Contributor)

Great PR - thanks a lot!

@patrickvonplaten (Contributor)

The CI failure is unrelated.

@patrickvonplaten patrickvonplaten merged commit 131e258 into huggingface:master Feb 7, 2022
@ydshieh ydshieh deleted the fix_tf_t5_missing_cross_attn branch February 7, 2022 18:51
stevhliu pushed a commit to stevhliu/transformers that referenced this pull request Feb 18, 2022
* add cross attn to outputs

* add cross attn to outputs for TFLED

* add undo padding

* remove unused import

* fix style

Co-authored-by: ydshieh <[email protected]>
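
As a rough illustration of the "undo padding" step mentioned in the description and commits: TFLED pads inputs up to a multiple of the attention window size, so tensors returned to the user have to be sliced back to the original sequence length. The helper below is a hypothetical sketch of that idea, not the actual code from the PR.

```python
import tensorflow as tf

# Hypothetical helper: strip the padding that was appended to make the
# sequence length a multiple of the attention window.
def undo_padding(tensor: tf.Tensor, padding_len: int) -> tf.Tensor:
    # tensor: (batch_size, padded_seq_len, ...); padding was added at the end.
    if padding_len > 0:
        return tensor[:, :-padding_len]
    return tensor
```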