Found a bug in the codellama vllm model_len logic. (#380)
* Found a bug in the codellama vllm model_len logic.
Also, let's just avoid the vLLM error by making sure max_num_batched_tokens >= max_model_len
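A minimal sketch of the constraint described above, assuming a hypothetical helper (the function name is illustrative, not the repo's actual code): vLLM raises an error when `max_num_batched_tokens` is smaller than `max_model_len`, so the value can simply be clamped upward.

```python
def resolve_max_num_batched_tokens(max_model_len: int,
                                   max_num_batched_tokens: int) -> int:
    # vLLM errors out when max_num_batched_tokens < max_model_len,
    # so never return a value below the model's context length.
    return max(max_num_batched_tokens, max_model_len)

# With a 16k-context model and a smaller batching budget,
# the budget is raised to the context length.
print(resolve_max_num_batched_tokens(16384, 4096))  # 16384
```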
* Never mind; I realized that the if statement will never be reached here.
Diff at line 179:

-# Based on config here: https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-7B/blob/main/config.json#L12
-# Can also see 13B, 34B there too
+# Based on config here: https://huggingface.co/codellama/CodeLlama-7b-hf/blob/main/config.json#L12
+# Can also see 13B, 34B there too. Note, codellama is one word.
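As a hedged illustration of what the comments point at: the linked config.json files carry the model's context length, which a loader might read as sketched below. The helper name and the inline config fragment are assumptions, not the repo's code; 16384 matches CodeLlama's published `max_position_embeddings`.

```python
import json

def read_max_model_len(config_text: str) -> int:
    # Hugging Face config.json files like the ones linked above expose
    # the context length under "max_position_embeddings".
    return json.loads(config_text)["max_position_embeddings"]

# Abbreviated config fragment in the shape of the linked files (assumed).
example_config = '{"model_type": "llama", "max_position_embeddings": 16384}'
print(read_max_model_len(example_config))  # 16384
```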