
Help Needed for Creating a Custom Block in Typebot for Ollama Models (LLaMA 3.2)

Hi everyone,

I’m currently working on integrating Ollama models (specifically LLaMA 3.2) into Typebot by creating a custom block. My goal is to allow users to interact with the LLaMA model directly within a Typebot conversation.

I’d really appreciate some guidance on the following:

1. How do I create a custom block in Typebot that can make API requests to Ollama models?
2. Is there a specific process or best practices for handling API calls (e.g., sending user input, receiving responses) in Typebot's custom blocks?
3. If anyone has done something similar, especially integrating an external AI model like Ollama, I'd love to learn from your experience.

Any code examples or tips would be super helpful!
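For context, here is a minimal sketch of the kind of call I'm trying to make from the block. It assumes a local Ollama server on its default port (`http://localhost:11434`) with `llama3.2` already pulled; the function names are just illustrative, not part of Typebot's API:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body that Ollama's /api/chat endpoint expects.
// stream: false asks Ollama for a single JSON response instead of a stream.
function buildChatRequest(userInput: string, model = "llama3.2") {
  return {
    model,
    messages: [{ role: "user", content: userInput }] as ChatMessage[],
    stream: false,
  };
}

// Send the user's input to Ollama and return the assistant's reply text.
async function askOllama(userInput: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(userInput)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // Non-streaming responses carry the reply in data.message.content
  return data.message.content;
}
```

My open question is where code like this should live inside a Typebot custom block and how the block should expose the reply back to the conversation flow.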

Thanks in advance for your support!