feat: add DeepSeek V3.1 model to Chutes AI provider #7295
Conversation
- Added deepseek-ai/DeepSeek-V3.1 to the ChutesModelId type
- Added model configuration with a 163,840-token context window and 32,768 max output tokens
- Added test coverage for the new model

Fixes #7294
Reviewing my own code. Found zero bugs. Suspicious.
According to https://openrouter.ai/deepseek/deepseek-chat-v3.1, the total context is 163.8K.
daniel-lxs left a comment
LGTM
Summary
This PR addresses Issue #7294 by adding DeepSeek V3.1 to the Chutes AI provider dropdown. Feedback and guidance are welcome.
Changes
- Added deepseek-ai/DeepSeek-V3.1 to the ChutesModelId type union

Testing
Related Issue
Fixes #7294
Notes
The DeepSeek V3.1 model uses the same configuration parameters as DeepSeek V3, with the standard temperature (0.5) for non-R1 DeepSeek models.
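The change described above can be sketched as follows. This is a hypothetical illustration, not the actual contents of chutes.ts: the ModelInfo field names, the temperature field, and the sibling model entry shown here are assumptions; only the model ID, context window (163,840), max tokens (32,768), and temperature (0.5) come from this PR.

```typescript
// Hypothetical sketch of adding a model to a provider's type union and
// configuration record (field names are assumptions, not Roo Code's real shape).
type ChutesModelId =
  | "deepseek-ai/DeepSeek-V3"
  | "deepseek-ai/DeepSeek-V3.1"; // newly added entry

interface ModelInfo {
  maxTokens: number;      // max output tokens
  contextWindow: number;  // total context size in tokens
  temperature: number;    // default sampling temperature
  supportsImages: boolean;
}

const chutesModels: Record<ChutesModelId, ModelInfo> = {
  "deepseek-ai/DeepSeek-V3": {
    maxTokens: 32768,
    contextWindow: 163840,
    temperature: 0.5, // standard for non-R1 DeepSeek models
    supportsImages: false,
  },
  "deepseek-ai/DeepSeek-V3.1": {
    maxTokens: 32768,      // from the PR description
    contextWindow: 163840, // 163.8K total context per OpenRouter
    temperature: 0.5,      // same default as DeepSeek V3
    supportsImages: false,
  },
};
```

Because the record is typed as `Record<ChutesModelId, ModelInfo>`, forgetting to add a configuration entry for a new union member is a compile-time error, which is why the type union and the config are extended together.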
Important
Add DeepSeek V3.1 model to Chutes AI provider with configuration and test coverage.
- Added deepseek-ai/DeepSeek-V3.1 to ChutesModelId in chutes.ts.
- Added model configuration for the new model in chutes.ts.
- Added tests in chutes.spec.ts to verify model configuration.

This description was auto-generated for commit b395046.