Labels: enhancement (M-T: A feature request for new functionality), question (M-T: User needs support to use the project), server-side-issue
Description
Hey everyone,
Currently, LLM APIs (like the OpenAI API) stream the LLM response token by token, since waiting for the entire response usually takes ~7-10 seconds.
Is there any intention of supporting streaming in the Slack platform? Namely, I'd like to build a chatbot in Slack and stream its answers to users. Currently there's no easy way to do that, so I simply post the answer once the LLM has finished.
I was thinking of making multiple "edits" to the original message, but I'm afraid it would be rate-limited due to the large number of API calls.
If you can shed some light on this issue, that would be great :)
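For reference, the "multiple edits" workaround mentioned above can be sketched with a simple throttle: buffer the streamed tokens and only push an update to the message at a fixed minimum interval, then flush once at the end. This is a minimal sketch, not an official Slack pattern; the `update_fn` callback is a hypothetical stand-in for whatever actually edits the message (e.g. a call to Slack's `chat.update` Web API method), and the interval you can safely use depends on that method's rate-limit tier.

```python
import time

def stream_to_message(tokens, update_fn, min_interval=1.0):
    """Accumulate streamed LLM tokens and call update_fn with the full
    text so far, at most once per min_interval seconds, to avoid
    hammering the message-edit API on every token."""
    buffer = []
    last_update = float("-inf")  # force an update on the first token
    for tok in tokens:
        buffer.append(tok)
        now = time.monotonic()
        if now - last_update >= min_interval:
            update_fn("".join(buffer))
            last_update = now
    # Final flush so the completed answer is always posted,
    # even if the last tokens arrived inside the throttle window.
    update_fn("".join(buffer))
```

With `min_interval` around one second, a 10-second generation costs roughly a dozen edit calls rather than one per token, which keeps the approach well clear of typical per-minute rate limits.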