
Max tokens in OpenAI assistant

Hey,

First and foremost thank you for the amazing tool Baptiste!

Quick question: is there any way to set `max_tokens` in the OpenAI block?

My assistant is using 19k tokens per query which is a bit too expensive…
Thank you 🙏
(On a similar topic I think that there is an issue with the total tokens in save response)
I am not sure how `max_tokens` is helpful here,
because it could greatly impact the user experience.
It means the assistant's reply could be cut off if the max tokens limit is reached.
I would advise you to instruct the model not to generate long answers instead.
Isn’t `max_tokens` affecting the input memory instead of the output?
Also, conversely: I know it’s possible to check for a long message sent by the user and ask them to write a shorter one, but is it on your radar to add a limit within the Send message block? Otherwise a user has to retype their message if they send one that is too long.
The maximum number of tokens that can be generated in the chat completion.
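To illustrate that docs quote: in the Chat Completions API, `max_tokens` caps only the generated reply, not the input messages. A minimal sketch of a request body (model name and values are illustrative, not from this thread):

```python
import json

# Illustrative Chat Completions request body.
# max_tokens caps the *completion* (the assistant's reply); it does not
# truncate or limit the input messages, which are billed separately as
# prompt tokens. If the reply hits the cap, it is cut off mid-answer.
payload = {
    "model": "gpt-4o-mini",  # assumed model name for illustration
    "messages": [
        # Instructing the model to be brief (as advised above) usually
        # gives a better experience than a hard cut-off.
        {"role": "system", "content": "Answer in at most two short sentences."},
        {"role": "user", "content": "What does max_tokens control?"},
    ],
    "max_tokens": 150,  # hard cap on completion tokens
}

print(json.dumps(payload, indent=2))
```

Note the trade-off discussed above: a low `max_tokens` saves cost on the output side, but an abruptly truncated reply can read worse than a prompt-level instruction to keep answers short.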
Yeah, I could add such a feature.