remove redundant file linking for tie_word_embeddings #506

Merged

HenryNdubuaku merged 1 commit into main from remove_file_linking on Mar 9, 2026

Conversation

@jakmro (Collaborator) commented Mar 7, 2026

No description provided.

Copilot AI review requested due to automatic review settings March 7, 2026 19:42
Copilot AI (Contributor) left a comment

Pull request overview

This PR removes filesystem-level hardlink creation for output_weight.weights when tie_word_embeddings is enabled, relying on the runtime’s tie_word_embeddings behavior instead of duplicating/linking weight files during conversion.

Changes:

  • Removed os.link()-based linking from token_embeddings.weights to output_weight.weights under tie_word_embeddings.
  • Simplified the tied-embedding branch to only mark output projection tensors as “handled” in saved_tensor_full_names.

Comments suppressed due to low confidence (2)

python/src/converter.py:222

  • With the hardlinking removed, tie_word_embeddings=True + embedding_found=False now silently results in no output projection being written and no warning being emitted. Consider restoring an explicit warning/error in this branch when embeddings weren’t exported (since the runtime will not have an output weight to use).
    if tie_word_embeddings:
        if embedding_found:
            for name in OUTPUT_NAMES:
                if name in state_dict:
                    saved_tensor_full_names.add(name)
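The reviewer's first suggestion could look roughly like the sketch below. The function name `mark_tied_outputs` and the standalone signature are hypothetical for illustration; only the names `tie_word_embeddings`, `embedding_found`, `OUTPUT_NAMES`, `state_dict`, and `saved_tensor_full_names` come from the snippet above, and the actual converter may structure this differently.

```python
import warnings

# Hypothetical stand-in for the converter's OUTPUT_NAMES constant.
OUTPUT_NAMES = ("output_weight", "lm_head.weight")

def mark_tied_outputs(tie_word_embeddings, embedding_found,
                      state_dict, saved_tensor_full_names):
    """Mark output-projection tensors as handled when embeddings are tied,
    warning explicitly when no embeddings were exported (so the runtime
    would have no output weight to tie to)."""
    if not tie_word_embeddings:
        return
    if embedding_found:
        for name in OUTPUT_NAMES:
            if name in state_dict:
                saved_tensor_full_names.add(name)
    else:
        # The branch the review flags as silent: surface it instead.
        warnings.warn(
            "tie_word_embeddings is set but no token embeddings were "
            "exported; the runtime will have no output projection weight."
        )
```

With this shape, the happy path is unchanged, while the previously silent `embedding_found=False` case emits a visible warning.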

python/src/converter.py:223

  • When tie_word_embeddings is enabled, the converter no longer overwrites/removes any pre-existing output_weight.weights in the output directory. If a previous (partial) conversion left that file behind, it will remain and can make the output directory inconsistent. Consider deleting output_weight.weights when tie_word_embeddings is true (or ensuring the output dir is cleaned before conversion).
    if tie_word_embeddings:
        if embedding_found:
            for name in OUTPUT_NAMES:
                if name in state_dict:
                    saved_tensor_full_names.add(name)
    else:
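The second suggestion, removing a stale `output_weight.weights` left behind by a previous partial conversion, might be sketched as follows. The helper name `remove_stale_output_weight` and its signature are hypothetical; only the file name `output_weight.weights` and the `tie_word_embeddings` flag come from the review.

```python
from pathlib import Path

def remove_stale_output_weight(output_dir, tie_word_embeddings):
    """Delete a leftover output_weight.weights from the output directory
    when embeddings are tied, so a partial earlier conversion cannot leave
    the directory in an inconsistent state."""
    stale = Path(output_dir) / "output_weight.weights"
    if tie_word_embeddings and stale.exists():
        stale.unlink()
```

Alternatively, as the review notes, cleaning the entire output directory before conversion would address the same inconsistency without a per-file check.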

@HenryNdubuaku merged commit 59ad684 into main on Mar 9, 2026
5 of 6 checks passed
3 participants