Streaming not working with OpenAI block

Is OpenAI response streaming not working for anyone else? I believe I have set up my typebot correctly, but streaming is still not working.
Do I need some other configuration? I'm currently on the latest version (2.26.1).
Attachment: image.png
That looks right to me 🤔
Can I try it out? Can you give me access to the bot?
Sure thing, I will look into getting you access.
@Baptiste You should have received an invitation in your email. The link is editor.enchatted.com
@Baptiste did you get a chance to look at this?
Hey, this is streaming on my end 🤔
Can you send a recording of your screen?
Indeed, my bad, it's not streaming
I'm currently debugging and will let you know
Ok, it will be fixed in v2.27. It should be released today 🙂
@Baptiste Hello, it appears that as of v2.27 this is still not resolved. Streaming is not working in either test mode or published mode.
How did you deploy it? Docker?
I believe the self-hosted version isn't always on the latest release and can take time to receive the latest updates.

I believe the self-hosted version only receives updates once a month.
The Docker installation is at the latest version (v2.27).
Can you tell me more about your deployment setup?
The only difference is that when we set up Typebot there was no .env file back then, so all the config was put directly into the docker-compose file.
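For context, that just means the variables live under environment: in the compose services instead of coming from a .env file. A minimal sketch of what that looks like, with variable names as commonly used in the Typebot self-hosting docs and placeholder values:

```yaml
# Sketch only: Typebot config inlined in docker-compose instead of a .env file.
# Variable names follow the Typebot self-hosting docs; values are placeholders.
typebot-builder:
  image: baptistearno/typebot-builder:latest
  environment:
    - DATABASE_URL=postgresql://postgres:typebot@typebot-db:5432/typebot
    - ENCRYPTION_SECRET=<24-character-secret>
    - NEXTAUTH_URL=https://typebot.domain.com
    - NEXT_PUBLIC_VIEWER_URL=https://bot.domain.com
```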
@Baptiste can you help us with that issue please?
Hi, is there any update on this? Does streaming work with the OpenAI block?
Can you invite me to your app (baptiste@typebot.io)? @Stergios @William so that I can check what's wrong
You should already have access to our deployment. It is located at
We are on the same team with @Retr0. Awaiting your valuable feedback @Baptiste. Thanks in advance.
I just need another example because I am thinking this might be due to a proxy you have set up
Maybe Cloudflare?
Yes we are using CF for our installation. If that's the cause of the issue, is there a way to mitigate it?
In my case, we are not using any proxy; in fact it's a plain installation as described on the Typebot site, with caddy-gen as the reverse proxy. Let me know if you need access to the application @Baptiste
Will investigate it ASAP most likely today. Tracking this here: https://github.com/baptisteArno/typebot.io/issues/1701
I have sent you an invite from our workspace to support@get-typebot.com
I've also created a bot in there to help you reproduce the use case
Can you invite baptiste@typebot.io instead?
I just tried the docker-compose setup without a proxy, and streaming is working
@Retr0 @Stergios Does it work if you disable proxy on Cloudflare?
Attachment: CleanShot_2024-08-16_at_16.09.032x.jpg
Ok, I tried again on https://editor.enchatted.com/typebots and for long messages it seems the response is still streamed, just in 2 big chunks
So I think the reverse proxy is buffering the response from the server by default. There should be a way to tell it not to do that
Did you set caddy-gen service for the reverse proxy?
Can we see your docker-compose file?
Sent you an invite, please do check
I think we need to set flush_interval to -1 and then it should work
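For reference, flush_interval is a sub-directive of Caddy's reverse_proxy, and setting it to -1 tells Caddy to flush each chunk to the client immediately instead of buffering the response. In a plain Caddyfile (without caddy-gen) the same idea would look roughly like this; the hostname and upstream are placeholders:

```
bot.domain.com {
    reverse_proxy typebot-viewer:3000 {
        flush_interval -1
    }
}
```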
Confirmed it's working now
I have updated the following in the docker-compose file:

```yaml
typebot-builder:
  labels:
    virtual.host: 'typebot.domain.com' # change to your domain name
    virtual.port: '3000'
    virtual.tls-email: '[email protected]' # change to your email
    virtual.proxy.directives: |
      flush_interval -1

typebot-viewer:
  labels:
    virtual.host: 'bot.domain.com' # change to your domain name
    virtual.port: '3000'
    virtual.tls-email: '[email protected]' # change to your email
    virtual.proxy.directives: |
      flush_interval -1
```
The new setting added is:

```yaml
virtual.proxy.directives: |
  flush_interval -1
```
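One practical note for anyone applying the same change: after editing the labels you'll presumably need to run docker compose up -d again so the containers are recreated and caddy-gen regenerates its Caddy config with the new directive.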
We can also confirm that setting flush_interval to -1 fixes the streaming issue for our deployment.
As William mentioned, the docs should at least be updated to reflect this.
Awesome, thank you for the help 👌