Help Needed for Creating a Custom Block in Typebot for Ollama Models (LLaMA 3.2)
At a glance
The community member is integrating the LLaMA 3.2 model from Ollama into Typebot by creating a custom block. They are asking how to create a custom block that can make API requests to Ollama models, and what the best practices are for handling API calls (e.g., sending user input, receiving responses) in Typebot's custom blocks. They would also like to hear from anyone who has done something similar, especially integrating an external AI model like Ollama. The community member has provided a link to the Typebot documentation on contributing to the platform.
I’m currently working on integrating Ollama models (specifically LLaMA 3.2) into Typebot by creating a custom block. My goal is to allow users to interact with the LLaMA model directly within a Typebot conversation.
I’d really appreciate some guidance on the following:
- How do I create a custom block in Typebot that can make API requests to the Ollama models?
- Is there a specific process or best practices for handling API calls (e.g., sending user input, receiving responses) in Typebot's custom blocks?
- If anyone has done something similar, especially integrating an external AI model like Ollama, I'd love to learn from your experience. Any code examples or tips would be super helpful!
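For context, this is roughly the call I'm trying to wrap inside the block: a minimal TypeScript sketch against Ollama's /api/chat endpoint with streaming disabled. It assumes Ollama is running locally on the default port with llama3.2 pulled; the helper name and types are just placeholders, and the actual Typebot block wiring (the part I need help with) isn't shown.

```typescript
// Minimal sketch: call a local Ollama model (llama3.2) and return its reply.
// Assumes Ollama is listening on the default port 11434.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function askLlama(messages: ChatMessage[]): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      messages,
      stream: false, // single JSON response instead of a stream of chunks
    }),
  });

  if (!response.ok) {
    throw new Error(`Ollama request failed: ${response.status}`);
  }

  const data = await response.json();
  return data.message.content; // the assistant's reply text
}

// Example usage:
// const reply = await askLlama([{ role: "user", content: "Hello, LLaMA!" }]);
```

What I'm unsure about is where this kind of call should live in a custom block (and how to pass the user's input in and map the response back to a variable), so any pointers on that side would be great.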