Slack - Support of LLM streaming #2073

@david1542

Description

Hey everyone,

Currently, LLM APIs (like the OpenAI API) stream the response token by token, since waiting for the entire response usually takes ~7-10 seconds.

Is there any intention of supporting streaming on the Slack platform? Specifically, I'd like to build a chatbot in Slack and stream its answers to users. Currently there's no easy way to do that, so I simply post the answer once the LLM has finished.

I was thinking of issuing multiple edits to the original message, but I'm afraid that would get rate-limited due to the large number of API calls.
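One common workaround for the rate-limit concern is to buffer the streamed tokens and only edit the message at a fixed minimum interval (e.g. once per second), rather than on every token. Here's a minimal sketch of that throttling idea; `post_update` is a hypothetical callback that would wrap a real Slack `chat.update` call, and the one-second interval is an assumption, not an official limit:

```python
import time
from typing import Callable, Iterable


def stream_to_message(
    tokens: Iterable[str],
    post_update: Callable[[str], None],
    min_interval: float = 1.0,
) -> None:
    """Accumulate streamed tokens and call post_update with the text so far,
    at most once every min_interval seconds, plus a final flush at the end."""
    buffer: list[str] = []
    last_flush = 0.0
    for token in tokens:
        buffer.append(token)
        now = time.monotonic()
        if now - last_flush >= min_interval:
            post_update("".join(buffer))
            last_flush = now
    # Final flush so the message always ends with the complete answer.
    post_update("".join(buffer))


# Usage with a fake token stream; in a real bot, post_update would call
# Slack's chat.update with the channel and timestamp of the original message.
updates: list[str] = []
stream_to_message(["Hello", " ", "world"], updates.append, min_interval=0.0)
```

With this approach the number of `chat.update` calls is bounded by the response duration divided by `min_interval`, independent of how many tokens the model emits.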

If you can shed some light on this issue, that would be great :)

Metadata

Assignees

No one assigned

    Labels

    enhancement (M-T: A feature request for new functionality), question (M-T: User needs support to use the project), server-side-issue
