
Conversation

@Deleter-D
Collaborator

When MTP is used with tie_word_embeddings=true, the eh_proj weight may be transposed unexpectedly.

The root cause is that MTP's eh_proj layer reuses the ParallelLMHead class. When MTP is enabled, ParallelLMHead's internal handling of tie_word_embeddings transposes the eh_proj weight, which is not intended. This PR therefore gives MTP's eh_proj a dedicated layer, separate from ParallelLMHead, which also simplifies subsequent development. A minimal sketch of the failure mode follows.
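A minimal sketch of the issue, assuming ParallelLMHead transposes tied weights at load time as the description implies. The class names `SimplifiedLMHead` and `MTPEHProj` and the `load_weight` helper below are hypothetical stand-ins for illustration, not the repository's actual API:

```python
import paddle
import paddle.nn as nn


class SimplifiedLMHead(nn.Layer):
    """Stand-in for ParallelLMHead's assumed weight-loading behavior."""

    def __init__(self, hidden_size, vocab_size, tie_word_embeddings=False):
        super().__init__()
        self.tie_word_embeddings = tie_word_embeddings
        self.linear = nn.Linear(hidden_size, vocab_size, bias_attr=False)

    def load_weight(self, weight):
        if self.tie_word_embeddings:
            # A tied LM head receives the embedding table
            # [vocab_size, hidden_size] and transposes it into Paddle's
            # linear layout [in_features, out_features].
            weight = weight.t()
        self.linear.weight.set_value(weight)


class MTPEHProj(nn.Layer):
    """Dedicated eh_proj layer: no tie_word_embeddings logic at all."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features, bias_attr=False)

    def load_weight(self, weight):
        # eh_proj checkpoint weights are already in linear layout,
        # so they must be loaded as-is, never transposed.
        self.linear.weight.set_value(weight)


hidden, vocab = 8, 32

# Correct for a real LM head: embedding table gets transposed on load.
head = SimplifiedLMHead(hidden, vocab, tie_word_embeddings=True)
head.load_weight(paddle.randn([vocab, hidden]))

# eh_proj (in DeepSeek-style MTP, a projection from the concatenated
# [hidden; embedding] of width 2*hidden back to hidden) routed through a
# dedicated layer loads cleanly; the ParallelLMHead path above would have
# transposed this weight whenever tie_word_embeddings=true.
eh = MTPEHProj(2 * hidden, hidden)
eh.load_weight(paddle.randn([2 * hidden, hidden]))
```

The design point is simply that the tie_word_embeddings branch belongs to the LM head's load path, so any layer that borrows ParallelLMHead inherits it; a plain dedicated projection avoids the coupling.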

@paddle-bot

paddle-bot bot commented Jul 4, 2025

Thanks for your contribution!

@CLAassistant

CLAassistant commented Jul 4, 2025

CLA assistant check
All committers have signed the CLA.

@paddle-bot added the `contributor` (External developers) label on Jul 4, 2025
@freeliuzc merged commit e7fa57e into PaddlePaddle:develop on Jul 4, 2025
2 checks passed
@Deleter-D deleted the fix_mtp_linear branch on August 11, 2025