
fix(agentflow): correct streaming field default of chat model configs in LLM node#5856

Merged
tianwei-liu merged 5 commits into main from streaming-config on Feb 27, 2026

Conversation

Contributor

@tianwei-liu tianwei-liu commented Feb 26, 2026

Root cause

LLM Agentflow determines whether to call .stream() or .invoke() using:

const isStreamable = ... && modelConfig?.streaming !== false && ...

modelConfig comes from llmModelConfig, which is populated in the UI when the user selects a model. Flowise stores "" (empty string) for any boolean input field the user never explicitly toggled — the field's default value is only used for display, not to pre-populate the stored value.

So for any model with streaming: { type: 'boolean', default: false } that the user never touches, modelConfig.streaming === "", and "" !== false evaluates to true — streaming is enabled regardless of the model's declared default.
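
A minimal TypeScript illustration of that comparison (the variable names are just for the example; the stored value mirrors what the description above says Flowise keeps):

    // What llmModelConfig stores for a boolean field the user never toggled:
    const modelConfig: { streaming?: boolean | '' } = { streaming: '' }

    // The old check: "" !== false evaluates to true, so streaming is treated
    // as enabled even though the node declared `default: false`.
    const streamingEnabled = modelConfig?.streaming !== false
    console.log(streamingEnabled) // true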

Why existing models aren't affected

Models that support streaming (e.g. ChatOpenAI) also get streaming: "" in llmModelConfig. But since they actually support streaming, the .stream() call succeeds — the bug is invisible.

The problem surfaces when a new chat model node declares streaming: { default: false } because it does not support streaming. The "" value causes LLM Agentflow to attempt streaming anyway, the model's endpoint rejects or silently fails the stream request, and no response is shown in the UI.
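
For context, the declaration on the chat model node's side looks roughly like this (an abbreviated sketch following the streaming: { type: 'boolean', default: false } pattern quoted above, not the exact node):

    // As assigned in the node's constructor. The default here is used for
    // display only — llmModelConfig still stores "" until the user toggles it.
    const inputs = [
        {
            label: 'Streaming',
            name: 'streaming',
            type: 'boolean',
            default: false,
            optional: true
        }
    ]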

Why the fix must be in LLM.ts, not in the chat model node

The isStreamable decision is made entirely inside LLM Agentflow, based on llmModelConfig — a snapshot taken at UI config time. The chat model node's init() only returns a model object; it has no way to reach back and influence the streaming decision in LLM Agentflow.

Fix

When streamingConfig is "" or undefined (the user never set it), fall back to the default declared in the node's own input definition; see the sketch after the list below.

  • User never set streaming → uses node's declared default
  • User explicitly set streaming to false → respects it
  • User explicitly set streaming to true → respects it
  • Node has no streaming input → defaults to true (backward compatible)
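
A sketch of that fallback in LLM.ts — the effectiveStreaming name follows the review summary below, but this is an approximation of the logic, not the verbatim patch:

    // Hypothetical stand-ins for values LLM.ts already has in scope:
    // - modelConfig: the saved llmModelConfig for the selected chat model
    // - chatModelInputs: the input definitions declared by that node
    declare const modelConfig: { streaming?: boolean | '' } | undefined
    declare const chatModelInputs: { name: string; default?: unknown }[] | undefined

    const streamingConfig = modelConfig?.streaming
    const streamingInput = chatModelInputs?.find((input) => input.name === 'streaming')

    let effectiveStreaming: boolean
    if (streamingConfig === '' || streamingConfig === undefined) {
        // Never set by the user: fall back to the node's declared default,
        // or true when the node has no streaming input (backward compatible).
        effectiveStreaming = streamingInput ? streamingInput.default === true : true
    } else {
        // Explicitly set to true or false: respect it.
        effectiveStreaming = streamingConfig === true
    }

    // Streaming then only happens when the effective flag allows it:
    // const isStreamable = ...other existing conditions... && effectiveStreaming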

@gemini-code-assist
Contributor

Summary of Changes

Hello @tianwei-liu, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a bug in the Agentflow LLM node where the streaming behavior was not consistently determined. It introduces a more precise mechanism to evaluate the effective streaming setting, ensuring that the default streaming preference is correctly applied while allowing explicit configuration from the model to override it, thereby fixing potential issues with streamable responses.

Highlights

  • Streaming Default Logic Refinement: Refactored the determination of whether an LLM node is streamable by introducing an effectiveStreaming variable. This variable now correctly considers the node's streaming input default (falling back to true if not explicitly set) and then applies modelConfig?.streaming if it's not an empty string or undefined, ensuring more robust streaming behavior.
Changelog
  • packages/components/nodes/agentflow/LLM/LLM.ts
    • Adjusted the logic for determining if an LLM node supports streaming, ensuring proper default handling and configuration override.
Activity
  • No specific activity (comments, reviews) has been recorded for this pull request yet.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request refactors the logic for determining whether to use streaming in the LLM node. The change correctly considers the default streaming setting from the model's node definition when it's not explicitly configured by the user. My feedback includes a suggestion to make the implementation more concise and robust by also handling null values for the streaming configuration.
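
A compact way to fold in that null-handling suggestion, building on the hypothetical names from the fix sketch above (== null matches both null and undefined):

    // Treat null the same as "" and undefined: the user never set the field.
    const userNeverSet = streamingConfig == null || streamingConfig === ''
    const effectiveStreaming = userNeverSet
        ? (streamingInput ? streamingInput.default === true : true)
        : streamingConfig === true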

@tianwei-liu tianwei-liu changed the title from "fix(agentflow): chatmodel's streaming default in LLM node" to "fix(agentflow): streaming field default of chat model configs in LLM node" on Feb 26, 2026
@tianwei-liu tianwei-liu changed the title from "fix(agentflow): streaming field default of chat model configs in LLM node" to "fix(agentflow): correct streaming field default of chat model configs in LLM node" on Feb 26, 2026
@tianwei-liu tianwei-liu marked this pull request as ready for review February 26, 2026 21:35
tianwei-liu and others added 3 commits February 26, 2026 13:44
Verified working correctly.

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
@HenryHengZJ
Contributor

the same applies for Agent.ts

@tianwei-liu tianwei-liu merged commit c24fbe5 into main Feb 27, 2026
7 checks passed