"Could not create stream" - OpenAI integration issue
"Could not create stream" - OpenAI integration issue
At a glance
The community member hit a "Could not create stream" error when using the OpenAI integration in their self-hosted Typebot instance. Other community members initially suspected the problem was limited to the preview, but the chat also stopped working in the published bot. The logs show the failure at the /api/v2/sessions/[session-id]/streamMessage endpoint, which returns a 500 error, while the v1 /continueChat endpoint works fine; the request to the failing endpoint has an unusually small content-length of 2 bytes. The solution: the ENCRYPTION_SECRET values in the docker-compose.yml file differed between the builder and viewer services, so the AES cipher failed with a key mismatch and the services could not encrypt and decrypt each other's data. Setting both services to the same secret and restarting them resolved the error.
I've captured the logs showing the exact error sequence. The failure happens at the /api/v2/sessions/[session-id]/streamMessage endpoint, which returns a 500 error, while the v1 endpoints like /continueChat are working fine. The request to the failing endpoint has a content-length of only 2 bytes, which seems unusually small. Could this be related to how the streaming request is being formatted?
The ENCRYPTION_SECRET values in docker-compose.yml were different between builder and viewer services. Matching these values and restarting the services resolved the error.
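For reference, a minimal sketch of the relevant part of docker-compose.yml. The service names (typebot-builder, typebot-viewer) follow the stock Typebot compose file and the secret value is a placeholder; the key point is simply that both services receive the identical string.

```yaml
services:
  typebot-builder:
    environment:
      # Must be the exact same value in both services.
      - ENCRYPTION_SECRET=<same-secret-for-both-services>
  typebot-viewer:
    environment:
      - ENCRYPTION_SECRET=<same-secret-for-both-services>
```

After aligning the values, recreate the containers (for example with docker compose up -d) so both services pick up the updated environment.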
This was caused by the encryption key mismatch: with different secrets, the AES cipher fails, so data encrypted by one service cannot be decrypted by the other.
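To illustrate why a mismatched secret surfaces as a hard failure rather than garbled data, here is a small Node/TypeScript sketch (not Typebot's actual code) assuming AES-256-GCM for credential storage. Decrypting with a different key throws instead of returning plaintext:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Hypothetical 32-byte keys standing in for each service's ENCRYPTION_SECRET.
const builderSecret = Buffer.from("a".repeat(32)); // key the first service encrypts with
const viewerSecret = Buffer.from("b".repeat(32));  // mismatched key the second service uses

// Encrypt a value with the first key.
const iv = randomBytes(12);
const cipher = createCipheriv("aes-256-gcm", builderSecret, iv);
const encrypted = Buffer.concat([
  cipher.update("some-stored-secret", "utf8"),
  cipher.final(),
]);
const authTag = cipher.getAuthTag();

// Attempt to decrypt with the other service's different secret.
try {
  const decipher = createDecipheriv("aes-256-gcm", viewerSecret, iv);
  decipher.setAuthTag(authTag);
  Buffer.concat([decipher.update(encrypted), decipher.final()]);
} catch (err) {
  // GCM authentication fails with the wrong key, so decryption throws.
  console.error("Decryption failed:", (err as Error).message);
}
```

This matches the failure mode described above: until the secrets were aligned, the two services could not read each other's encrypted data.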