fix(agentflow): correct streaming field default of chat model configs in LLM node #5856

tianwei-liu merged 5 commits into main
Conversation
Co-Authored-By: Claude Sonnet 4.6 <[email protected]>
Summary of Changes

Hello @tianwei-liu, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request addresses a bug in the Agentflow LLM node where the streaming behavior was not consistently determined. It introduces a more precise mechanism to evaluate the effective streaming setting, ensuring that the default streaming preference is correctly applied while allowing explicit configuration from the model to override it, thereby fixing potential issues with streamable responses.
Code Review
This pull request refactors the logic for determining whether to use streaming in the LLM node. The change correctly considers the default streaming setting from the model's node definition when it's not explicitly configured by the user. My feedback includes a suggestion to make the implementation more concise and robust by also handling null values for the streaming configuration.
Verified working correctly. Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
the same applies for
Root cause
LLM Agentflow determines whether to call `.stream()` or `.invoke()` using a check on `modelConfig.streaming`. `modelConfig` comes from `llmModelConfig`, which is populated in the UI when the user selects a model. Flowise stores `""` (empty string) for any boolean input field the user never explicitly toggled — the field's `default` value is only used for display, not to pre-populate the stored value.

So for any model with `streaming: { type: 'boolean', default: false }` that the user never touches, `modelConfig.streaming === ""`, and `"" !== false` evaluates to `true` — streaming is enabled regardless of the model's declared default.

Why existing models aren't affected
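Every model whose streaming toggle was never touched stores the same `""`, so the check resolves to "streaming enabled" for all of them. A minimal sketch of the comparison (types and variable names are illustrative, not the actual Flowise code):

```typescript
// Demonstration of the buggy check (names are illustrative).
type ModelConfig = { streaming?: boolean | string }

// What llmModelConfig actually holds for ANY model whose streaming
// toggle was never touched: the declared default is not stored.
const untouched: ModelConfig = { streaming: '' }

// The LLM node treats anything other than an explicit `false` as
// "streaming enabled":
const isStreamable = untouched.streaming !== false

console.log(isStreamable) // true, even for a node declared with default: false
```

For a model that genuinely supports streaming this wrong `true` is harmless, which is why the bug stayed hidden until a non-streaming node appeared.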
Models that support streaming (e.g. ChatOpenAI) also get `streaming: ""` in `llmModelConfig`. But since they actually support streaming, the `.stream()` call succeeds — the bug is invisible.

The problem surfaces when a new chat model node declares `streaming: { default: false }` because it does not support streaming. The `""` value causes LLM Agentflow to attempt streaming anyway, the model's endpoint rejects or silently fails the stream request, and no response is shown in the UI.

Why the fix must be in LLM.ts, not in the chat model node
The `isStreamable` decision is made entirely inside LLM Agentflow, based on `llmModelConfig` — a snapshot taken at UI config time. The chat model node's `init()` only returns a model object; it has no way to reach back and influence the streaming decision in LLM Agentflow.

Fix
When `streamingConfig` is `""` or `undefined` (user never set it), fall back to the `default` declared in the node's own input definition:

- `default: false` → respects it
- `default: true` → respects it
- no `streaming` input → defaults to `true` (backward compatible)
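The fallback logic can be sketched as follows. This is an illustration, not the actual LLM.ts code: the function and field names are assumptions, and it also treats `null` as "unset", in line with the review suggestion above.

```typescript
// Shape of a node input definition (simplified; names are assumptions).
interface NodeInput {
  name: string
  type: string
  default?: unknown
}

// Resolve the effective streaming setting for the LLM node.
function resolveStreaming(streamingConfig: unknown, nodeInputs: NodeInput[]): boolean {
  // An explicit boolean set by the user wins.
  if (typeof streamingConfig === 'boolean') return streamingConfig

  // "", undefined, or null: fall back to the default declared on the
  // node's own `streaming` input definition, if it has one.
  const streamingInput = nodeInputs.find((i) => i.name === 'streaming')
  if (streamingInput && typeof streamingInput.default === 'boolean') {
    return streamingInput.default
  }

  // Node declares no `streaming` input at all: keep the old behavior.
  return true
}

// Untouched field ("") on a node declaring default: false → no streaming
console.log(resolveStreaming('', [{ name: 'streaming', type: 'boolean', default: false }])) // false
// Node with no streaming input at all → backward-compatible true
console.log(resolveStreaming(undefined, [])) // true
```

Checking `typeof streamingConfig === 'boolean'` instead of comparing against `false` is what makes `""`, `undefined`, and `null` all take the fallback path.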