
Fix up the mammoth max length issue. #335

Merged
sam-scale merged 4 commits into main from ss/mammoth on Oct 20, 2023

Conversation

@sam-scale
Contributor

No description provided.

if "mistral" in model_name:
max_num_batched_tokens = 8000
max_model_len = 8000
max_num_batched_tokens: Optional[int] = 2560 # vLLM's default
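For context, these two knobs map directly onto real vLLM engine arguments. A hedged sketch of how the override might be wired up (the function name and surrounding structure are assumptions, not code from this PR):

```python
from typing import Optional

from vllm import EngineArgs  # max_model_len / max_num_batched_tokens are EngineArgs fields

def build_engine_args(model_name: str) -> EngineArgs:
    # 2560 is vLLM's default batch budget (per the comment in the diff);
    # leaving max_model_len as None falls back to the model config's value.
    max_num_batched_tokens: Optional[int] = 2560
    max_model_len: Optional[int] = None
    if "mistral" in model_name:
        # Mistral models support an 8k context window, so raise both limits.
        max_num_batched_tokens = 8000
        max_model_len = 8000
    return EngineArgs(
        model=model_name,
        max_model_len=max_model_len,
        max_num_batched_tokens=max_num_batched_tokens,
    )
```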
Contributor


When is max_num_batched_tokens None?

Contributor Author


Never; it's because of the typing of the override dict: _VLLM_MODEL_LENGTH_OVERRIDES: Dict[str, Dict[str, Optional[int]]]

I can try a better solution.
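For reference, a minimal sketch of the typing issue and one possible tightening; the entry keys and the dataclass alternative are hypothetical, not what was merged:

```python
from dataclasses import dataclass
from typing import Dict, Optional

# Current shape: every value is typed Optional[int] only because the dict's
# value type must cover all entries, not because any value is ever None.
_VLLM_MODEL_LENGTH_OVERRIDES: Dict[str, Dict[str, Optional[int]]] = {
    "mistral": {"max_model_len": 8000, "max_num_batched_tokens": 8000},  # keys assumed
}

# One way to drop the Optional: a small per-model override record
# (hypothetical; the PR merged without this change).
@dataclass(frozen=True)
class ModelLengthOverride:
    max_model_len: int
    max_num_batched_tokens: int

_OVERRIDES: Dict[str, ModelLengthOverride] = {
    "mistral": ModelLengthOverride(max_model_len=8000, max_num_batched_tokens=8000),
}
```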

sam-scale merged commit 49eb538 into main on Oct 20, 2023
sam-scale deleted the ss/mammoth branch on October 20, 2023 at 18:40
