Incorrect model is used in Prompt Generation. #22541
Closed
Labels
cloud (when the version is cloud and it is a bug report), 🐞 bug (something isn't working)
Description
Self Checks
- I have read the Contributing Guide and Language Policy.
- This is only for bug report, if you would like to ask a question, please head to Discussions.
- I have searched for existing issues, including closed ones.
- I confirm that I am using English to submit this report, otherwise it will be closed.
- [Chinese users & non-English users] Please submit in English, otherwise the report will be closed :)
- Please do not modify this template :) and fill in all the required fields.
Dify version
1.6.0
Cloud or Self Hosted
Cloud
Steps to reproduce
Install the langgenius/tongyi model plugin (v0.0.33).
Set system reasoning model to qvq-max-latest in Settings/Model Provider/System Reasoning Model.
In Prompt Generator, choose the model qwen-max and generate the prompt with any instruction.
✔️ Expected Behavior
Prompt generated with qwen-max.
❌ Actual Behavior
Error message shown:
Failed to generate rule config. Error: [tongyi] Error: PluginInvokeError: {"args": {}, "error_type": "ServiceUnavailableError", "message": "Failed to invoke model qvq-max-latest, status code: 400, message: <400> InternalError.Algo.InvalidParameter: The incremental_output parameter of this model cannot be set to False."}
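The error shows the Prompt Generator invoking the system reasoning model (qvq-max-latest) instead of the user-selected qwen-max, and the upstream service rejecting the call because QvQ-class models apparently only accept incremental (streamed) output. A minimal sketch of the rejection behavior implied by the 400 error, with a hypothetical `validate_call` helper (not Dify or DashScope code; names are illustrative):

```python
# Hypothetical sketch of the server-side check implied by the 400 error.
# validate_call is an illustrative helper, not part of Dify or DashScope.

def validate_call(model: str, incremental_output: bool) -> None:
    """Reject non-incremental calls for QvQ-class reasoning models,
    mirroring the InvalidParameter error in the report."""
    if model.startswith("qvq") and not incremental_output:
        raise ValueError(
            "InternalError.Algo.InvalidParameter: The incremental_output "
            "parameter of this model cannot be set to False."
        )

# The Prompt Generator sent qvq-max-latest (the system reasoning model)
# with incremental_output=False, triggering the rejection:
try:
    validate_call("qvq-max-latest", incremental_output=False)
except ValueError as e:
    print("rejected:", e)

# The model the user actually selected would have been accepted:
validate_call("qwen-max", incremental_output=False)
```

If the generator had used the selected qwen-max, the non-incremental call would have succeeded, which is consistent with the expected behavior above.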