
Conversation

@AshAnand34 (Contributor)

What does this PR do?

This PR creates a model card for the XLM-RoBERTa model, following the guidelines in #36979.

  • Reorganized shield icons for better visual alignment.
  • Added detailed usage examples for masked language modeling with both Pipeline and AutoModel (see the sketch after this list).
  • Updated model overview and key features for clarity and completeness.
  • Included links to original checkpoints and relevant resources for user convenience.
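
For reference, a minimal sketch of what those examples cover, assuming the FacebookAI/xlm-roberta-base checkpoint and its `<mask>` token (an illustration, not the exact snippet added to the card):

```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForMaskedLM

# Pipeline: fill-mask with XLM-RoBERTa (checkpoint name assumed)
fill_mask = pipeline("fill-mask", model="FacebookAI/xlm-roberta-base")
print(fill_mask("Bonjour, je suis un modèle <mask>.")[0]["token_str"])

# AutoModel: the same task with AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("FacebookAI/xlm-roberta-base")

inputs = tokenizer("Bonjour, je suis un modèle <mask>.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the <mask> position
mask_index = (inputs.input_ids == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]
print(tokenizer.decode(logits[0, mask_index].argmax(dim=-1)))
```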

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).

Who can review?

@stevhliu

@stevhliu (Member) left a comment


Good start, but check your other PR for some of the same comments you should apply here!

@stevhliu (Member) left a comment


Some additional comments about using the correct mask token and checkpoint
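
For context, the point sketched in code: XLM-RoBERTa uses `<mask>` rather than BERT-style `[MASK]`, and the checkpoint name below (FacebookAI/xlm-roberta-base) is assumed for illustration:

```python
from transformers import AutoTokenizer

# XLM-RoBERTa's mask token is "<mask>", not "[MASK]" (checkpoint name assumed)
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-roberta-base")
print(tokenizer.mask_token)  # -> <mask>
```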

@stevhliu (Member) left a comment


Thanks!

@stevhliu force-pushed the xlm-roberta-model-card branch from fab1617 to 4590094 on June 9, 2025 at 18:57
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@stevhliu merged commit e594e75 into huggingface:main on Jun 9, 2025
10 checks passed
bvantuan pushed a commit to bvantuan/transformers that referenced this pull request on Jun 12, 2025
Update XLM-RoBERTa model documentation with enhanced usage examples and improved layout (huggingface#38596)

* Update XLM-RoBERTa model documentation with enhanced usage examples and improved layout

* Added CLI command example and quantization example for XLM RoBERTa model card.

* Minor change to transformers CLI and quantization example for XLM roberta model card
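
As a rough illustration of the quantization example mentioned in these commits, a sketch assuming bitsandbytes 4-bit loading and the FacebookAI/xlm-roberta-large checkpoint (not the exact snippet from the card):

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, BitsAndBytesConfig

# Load XLM-RoBERTa with 4-bit quantization via bitsandbytes (assumed configuration)
quantization_config = BitsAndBytesConfig(load_in_4bit=True)

tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-roberta-large")
model = AutoModelForMaskedLM.from_pretrained(
    "FacebookAI/xlm-roberta-large",
    quantization_config=quantization_config,
    device_map="auto",
)
```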
