Chat with local Llama3 model via Ollama in KNIME
----
Run this in a terminal window to start Ollama. You can also try other models (https://ollama.com):
ollama run llama3:instruct
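Once the model is running, Ollama serves a local REST API (by default on port 11434) that KNIME can call, for example from a POST Request or Python Script node. The sketch below is an assumption about one way to do this, not the workflow's exact nodes; it uses Ollama's documented /api/generate endpoint with a non-streaming request.

```python
# Sketch: query the local Ollama server from Python.
# Assumes Ollama is running on the default port with llama3:instruct pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def build_payload(prompt, model="llama3:instruct"):
    """Build the JSON body for a non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def extract_answer(response_text):
    """Pull the model's reply out of Ollama's JSON response."""
    return json.loads(response_text)["response"]

def chat(prompt):
    """Send a prompt to the local Ollama server and return the reply."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return extract_answer(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(chat("Why is the sky blue?"))
```

In a KNIME Python Script node, the same request/response handling applies; the prompt would typically come from an input table column and the reply would be written back to the output table.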
Medium - Chat with local Llama3 Model via Ollama in KNIME Analytics Platform - Also extract Logs into structured JSON Files
https://medium.com/p/aca61e4a690a
P.S.: Yes, I am aware of the large empty white space, but I have no idea how to remove it in KNIME 4 and have already contacted KNIME support.